Intermittent DNS errors. IT team not able to diagnose
-
Intermittent DNS errors are showing up in GSC for our fashion portal, www.AJIO.com. Our IT team can't find any issues at our end. Every time I write to them, they come back saying 'DNS is resolving fine in all servers'.
How do we resolve this? Please help.
-
There are a few issues with the DNS that I can see. First off, the email address in the SOA record is wrongly formatted, so it won't receive mail. While that isn't the source of your issue, your IT people should change it so that they can receive messages about the DNS. Change this:
jiodns.admins@ril.com.
to this:
jiodns\.admins.ril.com.
(Copy all of that exactly, including the dot at the end of the line.)
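As background, the RNAME field of an SOA record encodes a mailbox address by replacing the @ with a dot and backslash-escaping any dots that belong to the mailbox name itself (per RFC 1035). Here is a throwaway Python sketch of that rule, purely for illustration:

def email_to_soa_rname(email):
    # RFC 1035 convention: the '@' becomes an ordinary label dot, and dots
    # inside the mailbox part are escaped so they aren't read as label breaks.
    mailbox, domain = email.split("@")
    return mailbox.replace(".", "\\.") + "." + domain + "."

print(email_to_soa_rname("jiodns.admins@ril.com"))  # -> jiodns\.admins.ril.com.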
Secondly, the nameservers are serving different SOA serial numbers. There are four nameservers for ajio.com: ns1.ajio.com, ns2.ajio.com, ns3.ajio.com, and ns4.ajio.com. When I query ns1, ns2, or ns4, I get back a line like this:
ajio.com. 3600 IN SOA ns1.ajio.com. jiodns.admins@ril.com. 2016052708 10800 900 1209600 86400
But from ns3 I get this line:
ajio.com. 3600 IN SOA ns1.ajio.com. jiodns.admins@ril.com. 2016041301 10800 900 1209600 86400
The difference is the serial number (the bit after ril.com.): it's 2016052708 in the first case and 2016041301 in the second. These numbers should always be the same. Judging from the refresh value in that line, ns3 should be checking for a newer serial every 10800 seconds (three hours), yet it clearly isn't: it still carries a serial number from April 2016, while the other three nameservers have one from May 2016. On its own that still isn't enough to explain your issue, but your IT team needs to investigate the health of their nameservers, because this may be the symptom of a wider problem.
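If your IT team want a quick way to verify this for themselves, here is a minimal sketch using the third-party dnspython library (an assumption on my part that a recent version is installed, e.g. via pip install dnspython; the hostnames are the four listed above). It asks each nameserver directly for the zone's SOA serial; any disagreement means the zone isn't in sync:

import dns.message
import dns.query
import dns.rdatatype
import dns.resolver

NAMESERVERS = ["ns1.ajio.com", "ns2.ajio.com", "ns3.ajio.com", "ns4.ajio.com"]

serials = {}
for ns in NAMESERVERS:
    # Look up the nameserver's own IP, then query it directly for the SOA.
    ip = dns.resolver.resolve(ns, "A")[0].address
    query = dns.message.make_query("ajio.com", dns.rdatatype.SOA)
    response = dns.query.udp(query, ip, timeout=5)
    serials[ns] = response.answer[0][0].serial
    print(ns, "serial:", serials[ns])

# Every authoritative nameserver should report the same serial number.
if len(set(serials.values())) > 1:
    print("WARNING: serials disagree; replication between nameservers is failing.")

The same spot-check can be done with dig, e.g. dig @ns3.ajio.com ajio.com SOA +short.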
Good places for them to start are here and here. The first of those two links is the Pingdom test that Nicolas linked to, but with the important difference that it's checking ajio.com, rather than www.ajio.com.
-
Try the first tool available, http://dnscheck.pingdom.com/?domain=www.AJIO.com, and it seems your domain has errors.
Related Questions
-
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, my first time posting here. I am just looking for some feedback on an indexation issue we have with a client, and on possible next steps or items I may have overlooked.

To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx, and so on. In addition, each microsite has duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so you have this duplicate-content scenario across all microsites. This situation is being addressed in the medium-term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.

The issue: about 6 weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com); over a period of 2-3 weeks pretty much all our terms dropped out of the rankings and search visibility dropped to essentially 0. I can see that pages from the websites are still indexed, but oddly it is the duplicate-content pages, so bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx is still indexed; similarly, on the bu2.corewebsite microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites.

Logging into webmaster tools, I can see the error "Google couldn't crawl your site because we were unable to access your site's robots.txt file." This was a bit odd, as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in technicalseo.com's robots.txt tool. Also, because there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory for both microsites. After this we saw a small pickup in site visibility; a few terms popped into our Moz campaign rankings but dropped out again pretty quickly. The error message in GSC also persisted.

Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to microsites BU1/BU2 have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Looking into this issue, we saw some people had similar issues when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client; I can see Googlebot successfully pinging the site and not getting 500 response codes from the server, but I couldn't see any instance of it trying to index microsite BU1/BU2 content.

So it seems to me that the issue could be something server side, but I'm at a bit of a loss as to next steps. Any advice at all is much appreciated!
Intermediate & Advanced SEO | ImpericMedia
-
Do non-critical AMP errors prevent you from being featured in the Top Stories carousel?
Consider site A, a news publishing site that has valid AMP pages with non-critical AMP errors (as notified within Search Console). Site A also publishes news articles from site B (its partner site) and posts them on site A; these have AMP pages too, but most of them are not valid AMP pages and carry critical AMP errors. For brand terms like "Economic Times", Google does show a Top Stories carousel for all articles published by Economic Times; however, it doesn't look the same for site A (in spite of it having valid AMP pages). Image link: http://tinypic.com/r/219bh9j/9

Now that there are valid AMP pages from site A and invalid AMP pages from site B on site A, there have been instances wherein a news article from site A features in the Top Stories carousel on desktop for a certain query but doesn't feature in the mobile SERPs, in spite of the page being a valid AMP page. For example, as shown in the screenshot below: Business Today ranks in the Top Stories carousel for a term like "jio news" on desktop, but on mobile, although the page is a valid AMP page, it doesn't show as an AMP page within the Top Stories carousel. Image link: http://tinypic.com/r/11sc8j6/9

There have also been cases where, although the page is featured in the top carousel on desktop, the same article doesn't show up in the mobile version for the same query in the Top Stories carousel. What could be the reason behind this? Also, would it be necessary to fix both critical and non-critical errors on site A (including those on articles published from site B on site A)?
Intermediate & Advanced SEO | Starcom_Search
-
Should I keep looking at the IP address location of my backlink prospects?
Hello everyone, I'm doing backlink research and we are considering how to define a strategy. My website is located in Spain and I am focusing on ranking it on Google.es for an audience coming from Spain. Until now I used to look carefully at the IP address of my backlink prospects, to check whether the website was really hosted in Spain. I know Matt Cutts said very long ago that the hosting location of a website was important. But nowadays, with cloud servers, CDNs, AnyCast and similar technologies, it's no longer possible to accurately identify the location of a website, especially if the website's CDN uses AnyCast, so that one IP address can be used by different machines in different locations. Do you guys think I should keep looking at the IP address location of my backlink prospects?
Intermediate & Advanced SEO | C.A
-
Rankings gone, no WMT errors, help!
Hi, a client's Google rankings have been seriously hit. We have done everything we know of to see why this is the case, and there is no obvious explanation. The client used to dominate their search terms, and they are now down on page 7/8 for them. There are no errors in WMT, so we cannot resubmit for reconsideration. This is a genuine client and their business has been seriously affected. Can anybody offer help? Thanks in advance!
Intermediate & Advanced SEO | roadjan
-
What is the best practice to eliminate my IP address content from showing in SERPs?
Our eCommerce platform provider has our site load-balanced in a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center). The problem is that Google is showing our IP addresses in the SERPs with what I would assume is bad duplicate content (our own, at that). I brought this to the attention of our provider and they say they must keep the IP addresses open to allow their site-monitoring software to work. Their solution was to add robots.txt files for both IP addresses with site-wide/root disallows. As a side note, we just added canonical tags, so the pages indexed under the IP addresses ultimately show the correct (non-IP-address) URL via the canonical. So here are my questions: Is there a better way? If not, is there anything else we need to do to get Google to drop the several hundred thousand indexed pages at the IP-address level? Or do we sit back and wait now?
Intermediate & Advanced SEO | ovenbird
-
How important is it to fix Server Errors?
I know it is important to fix server errors. We are trying to figure out how important, because after our last build we have over 19,646 of them, and since Google only gives us 1,000 at a time, the fastest way to tell them we have fixed them all is to use the API etc., which will take time. We are trying to decide: is it more important to fix all these errors right now, or to focus on other issues and fix these errors when we have time? They are mostly AJAX errors. Could this hurt our rankings? Any thoughts would be great!
Intermediate & Advanced SEO | DoRM
-
SEO-friendly way to redirect users based on IP address
Hi there, I have an issue where I need to redirect all visitors from a specific country to a subdomain, which has content explaining to those users that our service is not offered in that country. I would like to know the most SEO-friendly way to do this. Thanks, Sadie
Intermediate & Advanced SEO | dancape
-
SEO issues with IP-based content delivery
Hi, I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in either country. Website A can't be accessed if you are in the US; similarly, website B can't be accessed if you are in the UK. This was a decision made by the client a long time ago, as they don't want to offer promotions etc. in the US and therefore don't want the US audience to be able to purchase items from the UK site.

Now, the problem is that both websites have the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can. I have the following options:

1. Write different product descriptions for the US website, to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this is going to be a time-consuming and expensive option, as there are several hundred products common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends with '.co.uk' and website B has a different name ending with '.com', so website A can't be used for the US audience. Also, website A is older and more authoritative than the new website B, and is pretty popular among the UK audience with the .co.uk address, so website B can't be used to target the UK audience.
3. You tell me.
Intermediate & Advanced SEO | DevakiPhatak