I have two sitemaps that partly duplicate each other - one is blocked by robots.txt, but I can't figure out why!
-
Hi, I've just found two sitemaps - one of them is a .php file and represents part of the site structure. The second is a .txt file which lists every page on the website. The .txt file is blocked via the robots exclusion protocol (which doesn't seem very logical, as it's the only full sitemap). Any ideas why a developer might have done that?
-
There are standards for .txt and .xml sitemaps, whereas there are no standards for the HTML variety. Neither guarantees the listed pages will be crawled, though. An HTML sitemap has the advantage of potentially passing PageRank, which the .txt and .xml varieties don't.
These days, XML sitemaps may be more common than .txt sitemaps, but both perform the same function.
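For reference, a plain-text sitemap is literally just a list of absolute URLs, one per line, and nothing else; the XML variety wraps the same URLs in markup defined by the sitemaps.org protocol (example.com is a placeholder here):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about/</loc></url>
</urlset>
```

The equivalent sitemap.txt would contain just those two URLs, each on its own line, with no other content.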
-
Yes, sitemap.txt is blocked for some strange reason. I know SEOs do this sometimes for various reasons, but in this case it just doesn't make sense - not to me, anyway.
-
Thanks for the useful feedback, Chris - much appreciated. Is it good practice to use both? I guess it's a good idea if the onsite version only includes top-level pages. PS: Just checking the nature of the block!
-
Luke,
The .php one would have been created as a navigation tool to help users find what they're looking for faster, as well as to provide HTML links to search engine spiders to help them reach all pages on the site. On small sites, such sitemaps often include all pages; on large ones, they might include just the high-level pages. The .txt file is not HTML and exists to provide search engines with a full list of URLs on the site, for the sole purpose of helping them index all of the site's pages.
The robots.txt file can also be used to specify the location of the sitemap.txt file, such as:
sitemap: http://www.example.com/sitemap_location.txt
Are you sure the sitemap is being blocked by the robots.txt file, or is the robots.txt file just listing the location of the sitemap.txt?
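If you want to verify what the robots.txt file is actually doing, Python's standard robotparser module can check it directly. This is a minimal sketch with made-up robots.txt contents; swap in your site's real rules (or point RobotFileParser at the file's URL):

```python
from urllib import robotparser

# Hypothetical robots.txt contents -- a Disallow rule blocks crawling,
# while a Sitemap line merely announces where the sitemap lives.
ROBOTS_TXT = [
    "User-agent: *",
    "Disallow: /sitemap.txt",
    "Sitemap: http://www.example.com/sitemap.txt",
]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT)

print(rp.can_fetch("*", "http://www.example.com/sitemap.txt"))  # False: blocked
print(rp.can_fetch("*", "http://www.example.com/sitemap.php"))  # True: allowed
```

If can_fetch comes back True for the sitemap URL, the file is only being listed in robots.txt, not blocked by it.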