Have I constructed my robots.txt file correctly for sitemap autodiscovery?
-
Hi,
Here is my sitemap:
User-agent: *
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml
# Directories
Disallow: /sendfriend/
Disallow: /catalog/product_compare/
Disallow: /media/catalog/product/cache/
Disallow: /checkout/
Disallow: /categories/
Disallow: /blog/index.php/
Disallow: /catalogsearch/result/index/
Disallow: /links.html

I'm using Magento and want to make sure I have constructed my robots.txt file correctly for sitemap autodiscovery.
thanks,
-
Hey, thanks for the response. There are about 14,000 URLs in the sitemap. It shouldn't freeze up; please could you try again?
http://www.bedsite.co.uk/sitemaps/sitemap.xml
I know what you mean about the allow-all.
-
Also, here is the best place to get your questions answered.
From Google: "The Test robots.txt tool will show you if your robots.txt file is accidentally blocking Googlebot from a file or directory on your site, or if it's permitting Googlebot to crawl files that should not appear on the web." You can find it in Google Webmaster Tools.
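Alongside Google's tester, the same kind of check can be scripted with Python's built-in robots.txt parser. This is only a sketch using a trimmed copy of the rules from the thread; the test URLs are invented for illustration.

```python
from urllib.robotparser import RobotFileParser

# A trimmed copy of the robots.txt from the thread (kept as one record,
# with the "Directories" heading as a proper comment).
ROBOTS_TXT = """\
User-agent: *
# Directories
Disallow: /sendfriend/
Disallow: /checkout/
Disallow: /catalogsearch/result/index/
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A path under a disallowed directory is blocked for every bot...
print(parser.can_fetch("*", "http://www.bedsite.co.uk/checkout/cart/"))   # False
# ...while a path that no rule matches stays crawlable.
print(parser.can_fetch("*", "http://www.bedsite.co.uk/divan-beds.html"))  # True
```

Handy for spot-checking a handful of URLs without waiting on a crawl.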
-
The robots.txt looks fine. I always add an explicit allow-all; I know it is not necessary, but it makes me feel better, lol.
The problem you have is with the sitemap itself. How big is it? I cannot tell how many links you have because it locks up every time I open it, in both Chrome and Firefox.
I tried a tool that is designed to pull sitemaps the way the search engines do, and it also freezes up.
How many links do you have?
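If a sitemap is too big to open in a browser, one way to count its links is to stream-parse it rather than load the whole thing; the sitemap protocol allows up to 50,000 URLs per file, so 14,000 is well within limits. A sketch in Python, where the inline sample sitemap is invented for illustration (in practice you would pass a downloaded copy of the real file):

```python
import io
import xml.etree.ElementTree as ET

# A tiny stand-in sitemap; real sitemaps use this same namespace.
SAMPLE = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.bedsite.co.uk/</loc></url>
  <url><loc>http://www.bedsite.co.uk/beds/</loc></url>
  <url><loc>http://www.bedsite.co.uk/mattresses/</loc></url>
</urlset>
"""

def count_sitemap_urls(source):
    """Stream-parse a sitemap and count <url> entries without holding it all in memory."""
    count = 0
    for _event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "{http://www.sitemaps.org/schemas/sitemap/0.9}url":
            count += 1
            elem.clear()  # free each element as soon as it is counted
    return count

print(count_sitemap_urls(io.BytesIO(SAMPLE)))  # 3
```

Because iterparse streams the document, even a sitemap with tens of thousands of URLs stays cheap to count.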
Related Questions
-
Little confused regarding robots.txt
Hi there Mozzers! As a newbie, I have a question: what would happen if I wrote my robots.txt file like this? User-agent: * Allow: / Disallow: /abc-1/ Disallow: /bcd/ Disallow: /agd1/ User-agent: * Disallow: / Hope to hear from you...
Technical SEO | DenorL
-
Htaccess file help
Hi, thanks for looking. I am trying (and failing) to write an htaccess rule for the following scenario: http://www.gardening-services-edinburgh.com/index.html http://www.gardening-services-edinburgh.com http://www.gardening-services-edinburgh.com/ so that all of these destinations go to the one resource. Any ideas? Thanks, Andy
Technical SEO | McSEO
-
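For the htaccess question above, a sketch of one common approach (untested, and assuming Apache with mod_rewrite enabled). Note that `http://example.com` and `http://example.com/` already resolve to the same resource, so only the `index.html` variant needs redirecting:

```apache
RewriteEngine On
# 301-redirect any direct request for index.html to the root,
# so the homepage has a single canonical URL.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.gardening-services-edinburgh.com/ [R=301,L]
```

Matching against THE_REQUEST (the raw request line) avoids a redirect loop when the server internally serves index.html for `/`.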
Do I have a robots.txt problem?
I have the little yellow exclamation point under my robots.txt fetch, as you can see here: http://imgur.com/wuWdtvO This version shows no errors or warnings: http://imgur.com/uqbmbug Under the tester I can currently see the latest version. This site hasn't changed URLs recently, and we haven't made any changes to the robots.txt file for two years. The problem just started in the last month. Should I worry?
Technical SEO | EcommerceSite
-
Sitemap errors have disappeared from my Google Webmaster tools
Hi all, A week ago I had 66 sitemap errors related to hreflang in my GWT. Now all the errors are gone, and it shows no errors. We have not done any work to fix them. I wonder if anybody has experienced the same thing: Google suddenly changing the criteria, or the way it reports errors, in Google Webmaster Tools. I would appreciate any insights from the community! Best regards, Peru
Technical SEO | SMVSEO
-
When to file a Reconsideration Request
Hi all, I don't have any manual penalties from Google, but I did get an unnatural-links message from them back in 2012. We have removed some of the spammy links over the last two years, and we're now making a further effort and will use the disavow tool once we've done so. Will this be enough once I submit the file, or should I (can I?) submit a Reconsideration Request as well? Do I have to have a manual penalty item in my Webmaster account to be able to submit a request? Thanks everyone!
Technical SEO | KerryK
-
Duplicate Video Onsite - How do you treat this in Sitemap?
How would you handle multiple pages using the same video content? Sometimes it does not make sense to produce a new video for every product, so you repurpose one. Will you still get the effect in search results if the thumbnail and video location are duplicated across some product URLs?
Technical SEO | andrewv
-
Use of Robots.txt file on a job site
We are performing SEO on a large niche job board. My question revolves around nofollowing all the actual job postings from their clients, as they only last for 30 to 60 days. Does anybody have any ideas on the best way to handle this?
Technical SEO | WebTalent
-
Blocking other engines in robots.txt
If the primary target of your business is not in China, is there any benefit to blocking Chinese search robots in robots.txt?
Technical SEO | Romancing
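To make the question above concrete: Baidu's main crawler identifies itself as Baiduspider, so blocking it would look like the fragment below. Keep in mind robots.txt is advisory; compliant bots will skip the site, but it does not enforce anything.

```
User-agent: Baiduspider
Disallow: /
```

Whether this saves anything meaningful in crawl load is debatable; the directive mainly keeps the site out of that engine's index.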