New server update + wrong robots.txt = lost SERP rankings
-
Over the weekend, we updated our store to a new server. Before the switch, we had a robots.txt file on the new server that disallowed its contents from being indexed (we didn't want duplicate pages from both old and new servers).
When we finally made the switch, we somehow forgot to remove that robots.txt file, so the new pages weren't indexed. We quickly put our good robots.txt in place, and we submitted a request for a re-crawl of the site.
The problem is that many of our search rankings have changed. We were ranking #2 for some keywords, and now we're not showing up at all. Is there anything we can do? Google Webmaster Tools says that the next crawl could take weeks! Any suggestions would be much appreciated.
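For anyone in the same spot, it's worth verifying exactly what your robots.txt actually blocks before and after the fix. Python's standard library can do this; here's a minimal sketch (the URLs and rules are placeholders, not our real file):

```python
from urllib.robotparser import RobotFileParser

# The kind of disallow-everything file that gets left behind
# on a staging or new server by mistake.
blocking = RobotFileParser()
blocking.parse([
    "User-agent: *",
    "Disallow: /",
])

# Under this policy, Googlebot may not fetch any page.
print(blocking.can_fetch("Googlebot", "https://example.com/products/widget"))  # -> False

# A corrected file: an empty Disallow value allows everything.
fixed = RobotFileParser()
fixed.parse([
    "User-agent: *",
    "Disallow:",
])
print(fixed.can_fetch("Googlebot", "https://example.com/products/widget"))  # -> True
```

Running this against the live file (via `RobotFileParser.set_url` and `read()`) is a quick sanity check that the fix actually went out.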
-
Dr. Pete,
I just ran across one of your webinars yesterday, and you brought up some great ideas. You earned a few points in my book.
Too often, SEOs see changes in the rankings and react to counteract the change. Most of the time these bounces are actually a GOOD sign: it means Google saw your changes and is adjusting to them. If your changes were positive, you should see positive results. I have rarely seen a case where someone made a positive change and got a negative result from Google. Patience is a virtue.
-
Thanks everyone for the help! Fortunately we remedied the problem almost immediately, so it only took about a day to get our rankings back. I think the sitemap and fixed robots.txt were the most important factors.
-
I agree: let Google re-index first, then re-evaluate the situation.
-
I hate to say it, but @inhouseninja is right - there's not a lot you can do, and over-reacting could be very dangerous. In other words - don't make a ton of changes just to offset this - Google will re-index.
A few minor steps that are safe:
(1) Re-submit your XML sitemap
(2) Build a few new links (authoritative ones, especially)
(3) Hit social media with your new URLs
All 3 are at least nudges to re-index. They aren't magic bullets, but you need to get Google's attention.
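On point (1), a sitemap doesn't need to be elaborate to be worth re-submitting. As a sketch of how little it takes, here's a minimal generator using Python's standard library (the URLs are placeholders for the migrated store's pages):

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml document listing the given page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for the newly migrated site.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/products/",
])
print(sitemap)
```

Upload the result to the site root and re-submit it in Webmaster Tools so Google has a fresh list of the URLs you want re-crawled.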
-
Remain calm. You should be just fine; it just takes time for Google to digest the new robots.txt. Adopt a rule not to freak out at Google until you've given a problem 14 days to resolve, and only start worrying if things haven't changed in 3-4 weeks. Sometimes Google moves things around, and this is natural.
If you want Google to crawl your site faster, build some links and do some social media. That will encourage Google to speed it up.
-
If this is all that happened the next crawl should fix it. Just sit tight and they should bounce up again in a week or so.
-
That does not sound fun at all... So you just changed servers, a complete copy?
My first question: other than the server, did anything else change (copy or URLs)?
My second question: is the old server still up and live to the internet?