Severe rank drop due to overwritten robots.txt
-
Hi,
Last week we updated Drupal core on our website and accidentally overwrote our custom robots.txt, which blocked hundreds of pages, with the default Drupal robots.txt. We didn't catch the mistake, and within hours our rankings dropped from mostly first and second positions in Google organic results to the middle and bottom of the first page.
Basically, I believe we flooded the index with very low-quality pages all at once, which raised a red flag and got us de-ranked.
We have since fixed the robots.txt and have been re-crawled, but we have not seen our rankings return.
Is this a safe assumption about what happened? I haven't seen any other sites in the retail vertical get hit by anything like a Panda 2.3 update yet.
Will we see our rankings return anytime soon?
Thanks,
Justin
-
Your present approach is correct. Ensure all of these pages are tagged noindex for now, then remove the block from robots.txt and let Google and Bing crawl them.
I would suggest waiting until you are confident all the pages have been removed from Google's index, then checking Yahoo and Bing. If you decide that blocking via robots.txt is the best solution for your company, you can reinstate the disallows after confirming your site is no longer affected by these pages.
I would also suggest that, going forward, you ensure any new pages on your site that you do not want indexed always include the appropriate meta tag. If this issue happens again, you will have a layer of protection in place.
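For reference, the tag belongs in the head section of each page you want kept out of the index. A minimal example:

<meta name="robots" content="noindex, follow">

The "follow" value tells crawlers they may still follow the links on the page even though the page itself should not be indexed.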
-
We're pretty confident at this point that we flooded the index with about 15,000 low-quality URLs all at once. Something similar happened a few years back, although we didn't flood the index then; those were newer pages that were low quality and could have been seen as spam, since they had no real content other than AdSense, so we removed them with a disallow in robots.txt.
We are adding the meta noindex to all of these pages. You're saying we should remove the disallow in robots.txt so Googlebot can crawl these pages and see the meta noindex?
We are a very large site and we're crawled often. We're a PR7 site, and our Moz Domain Authority is 79/100, down from 82.
We're hoping these URLs will be removed quickly; I don't think there is a way to remove 15,000 URLs in Google Webmaster Tools without setting off flags, either.
-
There is no easy answer for how long it will take.
If your theory is correct that the ranking drop was caused by these pages being added, then your site should improve as the pages are removed from Google's index. The timeline depends on the size of your site, your site's DA, the PA and links of these particular pages, and so on.
If it were my site, I would mark the calendar for August 1st to review the issue. I would check all the pages that were mistakenly indexed to be certain they were removed, and then check the rankings.
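If you would rather verify at scale than spot-check by hand, a short script can confirm each formerly blocked URL now serves the noindex tag. This is only a rough sketch — urls.txt (a one-URL-per-line list of the affected pages) and the simple string match are assumptions, not a production tool:

import requests

# urls.txt is a hypothetical file listing one affected URL per line.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"ERROR   {url}: {exc}")
        continue
    # Crude check: look for "noindex" anywhere in the returned HTML.
    # A real audit would parse the robots meta tag properly and also
    # check for an X-Robots-Tag response header.
    if "noindex" in response.text.lower():
        print(f"OK      {url}")
    else:
        print(f"MISSING {url}")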
-
Hi Ryan,
Thanks for your response. You are correct: we have found that some of the pages that should have been excluded are still indexed. We are now going to use the "noindex, follow" meta tag on these pages, because we can't afford to have them indexed; they are meant for clients/users only, are very low quality, and have been flagged before.
Now, how long until we see our rankings move back? That's the real question.
Thanks so much for your help.
Justin
-
That's a great answer, Ryan. Just out of curiosity, would it hurt to look at the cached version of the pages that are indexed? I'd be curious whether the date they were cached falls right around when the robots.txt was changed. I know it wouldn't alter his course of action, but it might add further confirmation that this caused the problem.
-
Justin,
Based on the information you provided, it's not possible to determine whether the robots.txt file was part of the issue; you need to investigate further. Using Google, enter a query to try to find some of the previously blocked content. For example, let's assume your site is about SEO, but you shared a blog article reviewing the latest Harry Potter movie, and you used robots.txt to block that article because it is unrelated to your site's focus. Perform a search for "Harry Potter site:mysite.com", replacing mysite.com with your main web address. If the search returns your article, then you know the content was indexed. Try this approach for several of the previously blocked areas of your website.
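For directory-level blocks, the inurl: operator can help as well. As a hypothetical example (mysite.com and "clients" are placeholders):

site:mysite.com inurl:clients

Any results returned are pages Google has indexed from that blocked area of the site.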
If you find this content in the SERPs, then you need to have it removed. The best approach is to add the "noindex, follow" meta tag to all of these pages, and then remove the block from your robots.txt file.
The problem is that with the block in place in your robots.txt file, Google cannot see the new meta tag and does not know to remove the content from its index.
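As a hypothetical illustration (the /clients/ path is a placeholder), suppose your robots.txt currently contains:

User-agent: *
Disallow: /clients/

With that block in place, Googlebot never fetches anything under /clients/, so it never sees the noindex tag on those pages. Removing the Disallow line (after the meta tags are in place) lets the crawler revisit the pages, see the tag, and drop them from the index; once they are gone, you can reinstate the block if you still want it.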
One last item to mention: Google does have a URL removal tool, but it would not be appropriate in this instance. That tool is designed to remove pages that cause direct damage by being in the index, such as trade secrets or other confidential information.