Is there a limit to how many URLs you can put in a robots.txt file?
-
We have a site with far too many URLs, caused by our crawlable faceted navigation. We are trying to purge 90% of our URLs from the index. We put noindex tags on the URL combinations that we do not want indexed anymore, but it is taking Google far too long to find the noindex tags. Meanwhile we are getting hit with excessive-URL warnings and have been hit by Panda.
Would it help speed up the process of purging URLs if we added them to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs, but never purge them from the index? The list could be in excess of 100MM URLs.
-
Hi Kristen,
I did this recently and it worked. The important part is that you need to block the pages in robots.txt or add a noindex tag to the pages to stop them from being indexed again.
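For reference, the noindex tag mentioned above is just a meta element in the page head (a sketch; the X-Robots-Tag response header is the equivalent for non-HTML files). Keep in mind that Googlebot has to be able to crawl the page in order to see it:

```html
<!-- In the <head> of each page that should drop out of the index -->
<meta name="robots" content="noindex">
```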
I hope this helps.
-
Hi all, Google Webmaster Tools has a great tool for this: go into WMT and select "Google Index", then "Remove URLs". You can use regex to remove a large batch of URLs, then block them in robots.txt to make sure they stay out of the index.
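As a sketch of the robots.txt side (the parameter names here are made up; the real patterns would come from the site's faceted-navigation URLs):

```
User-agent: *
# Block faceted-navigation combinations (example parameter names)
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=
```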
I hope this helps.
-
Great, thanks for the input. Per Kristen's post, I am worried that it could just block the URLs altogether and they will never get purged from the index.
-
Yes, we have done that and are seeing traction on those URLs, but we can't get rid of these old URLs as fast as we would like.
Thanks for your input.
-
Thanks Kristen, that's what I was afraid of. Other than Fetch, is there a way to send Google these URLs en masse? There are over 100 million URLs, so Fetch is not scalable. They are picking them up slowly, but at the current pace it will take a few months, and I would like to find a way to purge faster.
-
You could add them to robots.txt, but you have to remember that Google will only read the first 500 KB (source). As far as I understand, with the number of URLs you want to block, you'll pass this limit.
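A minimal sketch of checking a generated robots.txt against that limit (the 500 KB figure is taken from the paragraph above, so treat the exact byte count as approximate):

```python
# Rough check that a robots.txt file stays under Google's ~500 KB read limit.
# The exact limit value is an assumption based on the figure quoted above.
GOOGLE_LIMIT_BYTES = 500 * 1024

def fits_google_limit(robots_txt: str) -> bool:
    """Return True if the file content is within the assumed size limit."""
    return len(robots_txt.encode("utf-8")) <= GOOGLE_LIMIT_BYTES
```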
As Googlebot is able to understand basic wildcard patterns (`*` and a trailing `$`), it's probably better to use those: you will likely be able to block all of these URLs with just a few lines.
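To illustrate how such wildcard rules match, here is a small sketch of Google-style pattern matching, where `*` matches any sequence of characters and a trailing `$` anchors the end of the URL path. This is an illustration, not Googlebot's actual implementation:

```python
import re

def compile_rule(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern (with * and $) into a regex."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape every character literally except '*', which becomes '.*'.
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile("^" + body + ("$" if anchored else ""))

def is_blocked(path: str, disallow_patterns: list) -> bool:
    """Return True if any Disallow pattern matches the URL path."""
    return any(compile_rule(p).match(path) for p in disallow_patterns)
```

So a handful of patterns like `/*?color=` can cover millions of faceted URLs without listing them individually.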
More info here & on Moz: https://moz.com/blog/interactive-guide-to-robots-txt
Dirk