How to write a robots.txt file to point to your sitemap
-
Good afternoon from still wet and humid Wetherby, UK...
I want to write a robots.txt file that instructs bots to index everything and gives the specific location of the sitemap. The sitemap URL is: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx
Is this correct:
User-agent: *
Disallow:
SITEMAP: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx
Any insight welcome.
-
Thank you so much for all your replies
[CASE CLOSED] -
Ryan's answer is correct. I just wanted to jump in to say that I know from first-hand experience that Google and Bing are both able to read the sitemap file even if it has a different extension and even if you can't name it sitemap.xml.
-
Yes, your example is correct.
A great page for learning about robots.txt is: http://en.wikipedia.org/wiki/Robots_exclusion_standard#Sitemap
I will note that the official method of declaring your sitemap location capitalizes only the first letter (i.e. Sitemap, not SITEMAP), but I am almost certain it does not make a difference.
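For anyone who wants to double-check rules like these, Python's standard-library robots.txt parser can confirm that an empty Disallow allows everything and that the Sitemap line is picked up (a quick sketch; the URLs are the ones from the question):

```python
# Sanity-check the proposed robots.txt with Python's standard-library parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow:
Sitemap: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow means every URL is crawlable for every bot.
print(parser.can_fetch("*", "http://business.leedscityregion.gov.uk/any/page"))  # True

# site_maps() (Python 3.8+) returns the declared sitemap URLs.
print(parser.site_maps())
```

Note that `urllib.robotparser` itself also accepts the all-caps SITEMAP spelling or not depending on the Python version, which is another reason to stick with the conventional capitalization.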
A few other suggestions which are best practices but do not have to be followed:
-
use all lowercase letters in URLs
-
name the sitemap file "sitemap" not "GoogleSiteMap"
-
submit XML sitemaps when possible. I am, again, almost certain Google can read other formats, so if all you care about is Google then you're fine, but otherwise I would suggest just using XML files.
example: business.leedscityregion.gov.uk/cmspages/sitemap.xml
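To illustrate what such an XML sitemap contains, here is a minimal, hypothetical generator built with the standard library (the paths are example placeholders; a real generator would pull URLs from the CMS):

```python
# Build a minimal XML sitemap in the sitemaps.org 0.9 format.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Example paths only, for illustration.
for path in ("/", "/about", "/contact"):
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = "http://business.leedscityregion.gov.uk" + path

# Serialize with an XML declaration (xml_declaration requires Python 3.8+).
xml_bytes = ET.tostring(urlset, encoding="utf-8", xml_declaration=True)
print(xml_bytes.decode("utf-8"))
```

The only required element per URL is `<loc>`; optional children such as `<lastmod>` can be added the same way with `ET.SubElement`.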
Another helpful link:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=183668