New CMS system - 100,000 old URLs - use robots.txt to block?
-
Hello.
My website has recently switched to a new CMS system.
Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in a lot of URLs.
Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical".
Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing "older" URLs to their new counterparts. However, according to Google Webmaster Tools' 'Not Found' report, there are literally over 100,000 additional URLs out there that it's still trying to find.
My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently, we allow everything, only using page-level robots tags to disallow where necessary.
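For illustration, the kind of blanket rules I have in mind are sketched below; the directory names are only placeholders for our old CMS paths, and I've been using Python's built-in robotparser to sanity-check which URLs they would block.

# A quick sanity check of the rules I'm considering, using Python's built-in
# robotparser. The /old-cms-1/ and /old-cms-2/ directories are placeholders.
import urllib.robotparser

proposed_rules = """\
User-agent: *
Disallow: /old-cms-1/
Disallow: /old-cms-2/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(proposed_rules.splitlines())

# Old-CMS URLs should come back False (blocked); current pages should stay True.
print(rp.can_fetch("*", "http://www.example.com/old-cms-1/some-page.html"))  # False
print(rp.can_fetch("*", "http://www.example.com/products/widget"))           # True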
Thanks!
-
Great stuff. Thanks again for your advice - much appreciated!
-
It can be really tough to gauge the impact - it depends on how suddenly the 404s popped up, how many you're seeing (webmaster tools, for Google and Bing, is probably the best place to check) and how that number compares to your overall index. In most cases, it's a temporary problem and the engines will sort it out and de-index the 404'ed pages.
I'd just make sure that all of these 404s are intentional and none are valuable pages or occurring because of issues with the new CMS itself. It's easy to overlook something when you're talking about 100K pages, and it could be more than just a big chunk of 404s.
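If it helps, one rough way to audit at that scale is something like the sketch below (assuming you've exported the 'Not Found' URLs from Webmaster Tools into a plain text file): re-request a sample and flag anything that isn't a clean 404, since a 5xx or an odd status can point to a problem with the new CMS rather than an intentionally retired page.

# Rough audit sketch: re-check a sample of the 'Not Found' URLs exported from
# Webmaster Tools and flag anything that isn't a clean 404/410. A 200 usually
# just means a redirect is already in place; 5xx or timeouts are worth a look.
import urllib.request
import urllib.error

def status_of(url):
    try:
        return urllib.request.urlopen(url, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        return e.code
    except OSError:
        return None  # connection problem or timeout

with open("not_found_urls.txt") as f:   # hypothetical export file
    urls = [line.strip() for line in f if line.strip()]

for url in urls[:500]:                  # sample first; 100K requests is a lot
    code = status_of(url)
    if code not in (404, 410):
        print(code, url)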
-
Thanks for the advice! The previous website did have a robots.txt file with a few wildcards declared. A lot of the URLs I'm seeing are NOT indexed anymore and haven't been for many years.
So, I think the 'stop the bleeding' method will work, and I'll just have to proceed with investigating and applying 301s as necessary.
Any idea what kind of impact this is having on our rankings? I submitted a valid sitemap, crawl paths are good, and major 301s are in place. We've been hit particularly hard in Bing.
Thanks!
-
I've honestly had mixed luck with using Robots.txt to block pages that have already been indexed. It tends to be unreliable at a large scale (good for prevention, poor for cures). I endorsed @Optimize, though, because if Robots.txt is your only option, it can help "stop the bleeding". Sometimes, you use the best you have.
It's a bit trickier with 404s ("Not Found"). Technically, there's nothing wrong with having 404s (it's a perfectly valid signal for search engines), but if you create 100,000 all at once, that can sometimes raise red flags with Google. Some kind of mass removal may prevent problems caused by Google crawling thousands of Not Founds all at once.
If these pages are isolated in a folder, then you can use Google Webmaster Tools to remove the entire folder (after you block it). This is MUCH faster than Robots.txt alone, but you need to make sure everything in the folder can be dumped out of the index.
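If it's useful, here's a rough way to double-check that before you submit the removal request; it's only a sketch, and it assumes your updated robots.txt is already live and that you have the 'Not Found' export saved to a text file.

# Sketch: before requesting a folder removal in Webmaster Tools, confirm that
# the live robots.txt really blocks every URL under that folder.
import urllib.robotparser

FOLDER = "/old-cms-1/"   # hypothetical folder slated for removal

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

with open("not_found_urls.txt") as f:   # hypothetical Webmaster Tools export
    urls = [line.strip() for line in f if line.strip()]

still_crawlable = [u for u in urls if FOLDER in u and rp.can_fetch("Googlebot", u)]
print(len(still_crawlable), "URLs under", FOLDER, "are not blocked yet")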
-
Absolutely. Not Founds and pages with no content are a concern, so cleaning them up will help your ranking.
-
Thanks a lot! I should have been a little more specific, but my exact question would be: if I move the crawlers' attention away from these 'Not Found' pages, will that benefit the indexation of the now-valid pages? Are the 'Not Founds' really a concern? Will this help my indexation and/or ranking?
Thanks!
-
It's a loaded question without knowing exactly what you are doing, but let me offer this advice: stop the bleeding with robots.txt. This is the easiest way to quickly resolve that many "not founds".
Then you can slowly pick away at the issue and figure out whether some of the "not founds" really do have content and are simply pointing to the wrong place.
On a recent project we had over 200,000 additional URLs coming up "not found". We stopped the bleeding and then, over the course of a month, spending a couple of hours a week, found another 5,000 pages of content that we redirected correctly and then removed from the robots.txt block.
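If it helps, the sort of throwaway script below is roughly what we used to turn the rescued pages into redirect rules; it's only a sketch, and the CSV of old-to-new mappings and the Apache-style output are assumptions about your setup.

# Sketch: turn a hand-built CSV of "old path,new path" rows into Apache-style
# Redirect 301 lines that can be pasted into the server config or .htaccess.
import csv

with open("rescued_urls.csv", newline="") as src, open("redirects.conf", "w") as out:
    for old_path, new_path in csv.reader(src):
        out.write("Redirect 301 {} {}\n".format(old_path, new_path))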
Good luck.