Large robots.txt file
-
We're looking at potentially creating a robots.txt with 1,450 lines in it. This would remove 100k+ pages from the crawl, all of them old pages (I know the ideal would be to delete/noindex them, but unfortunately that isn't viable).
The issue I'm anticipating is that a large robots.txt will either stop the file from being followed at all or will slow our crawl rate down.
Does anybody have any experience with a robots.txt of that size?
-
Answered my own question:
https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt?csw=1#file-format
"A maximum file size may be enforced per crawler. Content which is after the maximum file size may be ignored. Google currently enforces a size limit of 500kb."
Related Questions
-
How to get a large number of urls out of Google's Index when there are no pages to noindex tag?
Hi, I'm working with a site that has created a large group of urls (150,000) that have crept into Google's index. If these urls actually existed as pages, which they don't, I'd just noindex tag them and over time the number would drift down. The thing is, they were created through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. GoogleBot would crawl a link that looks like it points to the client's own domain and wind up on Amazon or somewhere else with some affiliate code. GoogleBot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex tag. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex tag, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S. All 150K urls seem to share the same url pattern... exmpledomain.com/item/... so /item/ is common to all of them, if that helps.
Intermediate & Advanced SEO | | 945010 -
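Since all of the URLs share the /item/ prefix, a single Disallow rule would at least stop GoogleBot recrawling them, and the stdlib robotparser can sanity-check the rule before it ships. A hedged sketch (the domain is the placeholder from the question, and note that blocking crawl alone doesn't remove URLs already in the index):

```python
# Sketch: one prefix rule covers all 150K URLs because they share
# /item/. urllib.robotparser (stdlib) does plain prefix matching,
# which is all this rule needs. The domain is the question's
# placeholder, and the query string is an invented affiliate example.
import urllib.robotparser

rules = "User-agent: *\nDisallow: /item/\n"
parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

blocked = not parser.can_fetch("*", "https://exmpledomain.com/item/B00EXAMPLE?tag=affid-20")
allowed = parser.can_fetch("*", "https://exmpledomain.com/about/")
```

Here `blocked` comes back True for anything under /item/ while ordinary pages stay crawlable.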
Canonicals for Splitting up large pagination pages
Hi there, Our dev team are looking at speeding up load times and making pages easier to browse by splitting up our pagination pages to 10 items per page rather than 1000s (exact number to be determined) - sounds like a great idea, but we're a little concerned about the canonicals on this one. At the moment we rel canonical (self) plus prev and next, so b is rel b, prev a and next c - continuing for each letter. Now the url structure will be a1, a(n+), b1, b(n+), c1, c(n+). Should we keep the canonicals looping through the whole new structure, or should we loop each letter within itself? Either b1 rel b1, prev a(n+), next b2 - even though they're not strictly continuing the sequence. Or a1 rel a1, next a2. a2 rel a2, prev a1, next a3 | b1 rel b1, next b2, b2 rel b2, prev b1, next b3 etc. Would love to hear your points of view, hope that all made sense 🙂 I'm leaning towards the first one even though it's not continuing the letter sequence, because it loops alphabetically, which is currently working for us already. This is an example of the page we're hoping to split up: https://www.world-airport-codes.com/alphabetical/airport-name/b.html
Intermediate & Advanced SEO | | Fubra0 -
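As a minimal sketch of the second scheme described in the question (self-canonical per page, with prev/next chained only within each letter) - the URL pattern is borrowed from the example page, and the page counts per letter are hypothetical:

```python
# Sketch: each split page canonicalises to itself, and rel prev/next
# chains only within its own letter (a1 <-> a2 <-> a3, then b1 starts
# a fresh chain). The URL pattern mimics the example page from the
# question; last_page would come from the real item counts.

def pagination_links(letter: str, page: int, last_page: int) -> dict:
    links = {"canonical": f"/alphabetical/airport-name/{letter}{page}.html"}
    if page > 1:
        links["prev"] = f"/alphabetical/airport-name/{letter}{page - 1}.html"
    if page < last_page:
        links["next"] = f"/alphabetical/airport-name/{letter}{page + 1}.html"
    return links

links = pagination_links("b", 1, 3)
```

So b1 gets only a canonical and a next link, never a prev pointing into the a-pages.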
How important is the file extension in the URL for images?
I know that descriptive image file names are important for SEO. But how important is it to include .png, .jpg, .gif (or whatever file extension) in the url path? i.e. https://example.com/images/golden-retriever vs. https://example.com/images/golden-retriever.jpg Furthermore, since you can set the filename in the Content-Disposition response header, is there any need to include the descriptive filename in the URL path? Since I'm pulling most of our images from a database, it'd be much simpler to not care about simulating a filename, and just reference an image id in my templates. Example: 1. Browser requests GET /images/123456
2. Server responds with the image, setting both Content-Disposition and Link (canonical) headers: Content-Disposition: inline; filename="golden-retriever"
Link: <https://example.com/images/123456>; rel="canonical"
Intermediate & Advanced SEO | | dsbud1 -
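As a sketch, the header pair in step 2 might be emitted like this server-side. The filename and id are the question's own examples, the Link header syntax follows RFC 8288, and whether Google honours a canonical Link header on image responses is exactly the open question here:

```python
# Sketch of the headers from step 2, as a plain dict a server-side
# handler might emit for GET /images/123456. The image id, filename,
# and domain are the question's examples; Content-Type is an assumed
# placeholder.

def image_headers(image_id: int, filename: str, domain: str) -> dict:
    return {
        "Content-Type": "image/jpeg",
        "Content-Disposition": f'inline; filename="{filename}"',
        "Link": f'<https://{domain}/images/{image_id}>; rel="canonical"',
    }

headers = image_headers(123456, "golden-retriever", "example.com")
```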
Advanced: SEO best practice for a large forum to minimise risk...?
Hi Hope someone can offer some insight here. We have a site with an active forum. The transactional side of the site is about 300 pages total, and the forum is well over 100,000 (and growing daily), meaning the 'important' pages account for less than 0.5% of all pages on the site. Rankings are pretty good and we're ticking lots of boxes with the main site, with good natural links, logical architecture, and appropriate keyword targeting. I'm worried about the following: crawl budget, PR flow, and Panda. We actively moderate the forum for spam and generally the content is good (for a forum anyway), so I'm just looking for any best practice tips for minimising risk. I've contemplated moving the forum to a subdomain so there's that separation, or even noindexing the forum completely, although it does pull in traffic. Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | | iProspect_Manchester1 -
Huge increase in server errors and robots.txt
Hi Moz community! Wondering if someone can help? One of my clients (an online fashion retailer) has been seeing a huge increase in server errors (500s and 503s) over the last 6 weeks, and it has got to the point where people cannot access the site because of them. The client recently changed hosting companies to deal with this, and they have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again. These errors also correlate with a huge decrease in pages blocked by the robots.txt file, which makes me think someone has perhaps changed it and not told anyone... Anyone have any ideas here? It would be greatly appreciated! 🙂 I've been chasing this up with the dev agency and the hosting company for weeks, to no avail. Massive thanks in advance 🙂
Intermediate & Advanced SEO | | labelPR0 -
Soft 404's from pages blocked by robots.txt -- cause for concern?
We're seeing soft 404 errors appear in our google webmaster tools section on pages that are blocked by robots.txt (our search result pages). Should we be concerned? Is there anything we can do about this?
Intermediate & Advanced SEO | | nicole.healthline4 -
Should we block urls like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors - roughly 9,000 - on our eCommerce website. This site was built in Magento, and we are using search-friendly urls; however, most of our errors were duplicate content/titles due to urls like: domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?" ? Any help would be much appreciated 🙂
Intermediate & Advanced SEO | | MonsterWeb280 -
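Crawlers that follow Google's robots.txt extensions support * and $ in rule paths, so a rule like Disallow: /*? would cut off everything after the "?". Python's stdlib robotparser treats * literally, so here's a hedged sketch of a pattern matcher for testing such a rule before deploying it (the faceted URL is from the question):

```python
# Sketch: translate a Google-style robots.txt path pattern (with
# * wildcards and an optional $ end anchor) into a regex, to test
# whether a rule like "Disallow: /*?" would block a given path.
# The faceted URL below is from the question; the clean URL is its
# parameter-free counterpart.
import re

def pattern_blocks(pattern: str, path: str) -> bool:
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

faceted = "/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1"
clean = "/shop/leather-chairs.html"

faceted_blocked = pattern_blocks("/*?", faceted)
clean_blocked = pattern_blocks("/*?", clean)
```

With this rule the parameterised variants match while the clean category URL stays crawlable, which would stop the duplicate-content errors at the source.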
Page load increases with Video File - SEO Effects
We're trying to use a flash video as a product image, so the size increase will be significant. We're talking somewhere around 1.5 to 2 MB on a page that is about 400 KB before the video. There is an SEO concern with page speed, and we're thinking perhaps having the flash video inside an iframe might overcome the speed issues. We want to provide a better experience with the video, but the increase in page size, and therefore load time, will be significant. The rest of the page will load, including a fallback static image, so we're really trying to understand how to mitigate the page load speed impact of the video. Any thoughts?
Intermediate & Advanced SEO | | SEO-Team0