Sitemap Warnings
-
Due to an issue with our CMS, I had a bunch of URL aliases that were being indexed and causing duplicate content issues.
I disallowed the bad URLs in robots.txt (they all shared a similar URL structure, so that was easy) as a stopgap until I could clean them up.
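To illustrate, the rule was roughly along these lines (the path here is a made-up stand-in, not our actual pattern):

```
# robots.txt (illustrative only; our real Disallow pattern matched the alias structure)
User-agent: *
Disallow: /alias/
```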
I then received a bunch of sitemap warnings telling me that URLs I had blocked with robots.txt were still included in the sitemap.
Isn't this the point of robots.txt? Why am I getting warnings and how can I get rid of them?
-
Irving,
OK, so we took the restriction out of robots.txt while IT tries to fix the issue of URLs showing up in the sitemap that shouldn't be there.
The warnings haven't fallen off, and our sitemap is now a day behind; it has been stuck in pending for almost a full day.
Any thoughts on what might be causing this? I'm assuming it's affecting what gets indexed and hurting our site.
-
Irving,
Totally get that and we're working to ensure they are no longer included in the sitemap.
Thanks,
Lisa
-
The purpose of your sitemap is to tell Google to go out and crawl and index the pages you specify. The purpose of robots.txt is to tell Google not to crawl the pages you block. The warning is likely just a precaution to let you know that you may have accidentally blocked something in robots.txt that you also asked Google to visit. If you remove the URLs from your submitted sitemap, the warnings should disappear. If you leave them in, you will keep seeing warnings, but Google should not crawl the content since you blocked it in robots.txt.
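To make the conflict concrete: if your submitted sitemap contains an entry like the hypothetical one below while robots.txt disallows that same path, Google receives both instructions at once, and that contradiction is what the warning is flagging.

```xml
<!-- hypothetical sitemap.xml entry for a path that robots.txt also disallows -->
<url>
  <loc>http://www.example.com/alias/some-page</loc>
</url>
```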
-
You are not supposed to include blocked URLs in your sitemap.xml files; Google considers it a waste of their crawl time. Are these automatically generated sitemap.xml files?
You're basically saying "come index these pages I've listed, but don't index them!"
Remove the blocked URLs from the sitemaps (or regenerate the files) and resubmit them, and the warnings will go away.
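If the sitemaps are generated files and you need a stopgap until the generator is fixed, a small script can strip the blocked entries before you resubmit. Here is a rough sketch in Python, assuming the blocked URLs all share one path prefix; the prefix and file names are placeholders, not anything from your actual setup:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
BLOCKED_PREFIX = "/alias/"  # placeholder: whatever prefix your robots.txt disallows


def strip_blocked_urls(in_path: str, out_path: str) -> int:
    """Rewrite a sitemap, dropping every <url> whose <loc> contains the blocked prefix."""
    ET.register_namespace("", SITEMAP_NS)  # keep the default namespace on output
    tree = ET.parse(in_path)
    urlset = tree.getroot()
    removed = 0
    for url in list(urlset.findall(f"{{{SITEMAP_NS}}}url")):
        loc = url.find(f"{{{SITEMAP_NS}}}loc")
        if loc is not None and BLOCKED_PREFIX in (loc.text or ""):
            urlset.remove(url)
            removed += 1
    tree.write(out_path, encoding="utf-8", xml_declaration=True)
    return removed


if __name__ == "__main__":
    count = strip_blocked_urls("sitemap.xml", "sitemap.clean.xml")  # placeholder file names
    print(f"Removed {count} blocked URLs")
```

Once IT fixes the generator so the aliases never make it into the file, the post-processing step can go away.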
Related Questions
-
Sitemap Best Practices
My question is regarding URL structure best practices for a sitemap. My website lets users reach a product any number of ways, i.e.:
1. http://www.website.com/category/subcategory/product
2. http://www.website.com/subcategory/product
3. http://www.website.com/product
However, I am not sure which structure to use in the sitemap (which is being written manually). I know that for SEO purposes the 3rd option is best, as the link is more relevant to the individual product, but the Moz tool states that the home page should have fewer than 100 links (although Google doesn't penalise for having more), and writing my entire site the 3rd way would result in a lot more links hanging directly off the home page. It is either the 2nd or 3rd option, I think, as the 1st level (category) is not keyword specific (rather a generic term, i.e. novelties). Does anyone have experience with this?
Moz Pro | moon-boots
Issues with Moz producing 404 Errors from sitemap.xml files recently.
My last campaign crawl produced over 4k 404 errors resulting from Moz not being able to read some of the URLs in our sitemap.xml file. This is the first time we've seen this error, and we've been running campaigns for almost 2 months now; no changes were made to the sitemap.xml file. The file isn't UTF-8 encoded, but rather Content-Type: text/xml; charset=iso-8859-1 (which is what Movable Type uses). Just wondering if anyone has had a similar issue?
Moz Pro | BriceSMG
I have had to resubmit my sitemap to Google, Bing & Yahoo. Does SEOmoz automatically pick that up?
Hi there, I am monitoring this website for a client: www.smsquality.com. Someone on their side blocked the sitemap from being crawled and, in some form or another, removed it as well (confusing, I know). However, I have recreated the sitemap for these guys, allowed robots to crawl the site, and resubmitted it to all the major search engines. My question is: will SEOmoz be able to crawl the site like it usually does and give me proper results for the keywords placed into my Keywords campaign, as well as on-site page crawls using these keywords? Thanks in advance, Ray
Moz Pro | RayHay
Is there a quick and easy way to fix 8776 errors, 19131 warnings and 164 notices on a campaign?
My account dashboard shows several types of errors, warnings, and notices. I am just asking if there is a quick way to fix them.
Moz Pro | Jchapman
"Too many on-page links" warning on all of my Wordpress pages
Hey guys, I've got like 120 "Too many on-page links" warnings in my crawl diagnostics section. They're all pages of my WordPress blog. Is this an acceptable and expected warning for WordPress, or does something need to be better optimized? Thanks.
Moz Pro | SheffieldMarketing
What is the best practice for replacing an old xml sitemap?
I have an existing xml sitemap that my website developer loaded; however, I don't think it's set up properly. What is the best practice for replacing an old xml sitemap? Is there anything I should be concerned about?
Moz Pro | webestate
WordPress-related warnings
As proposed by a number of people here, I have moved from WordPress.com to a self-hosted WordPress blog. I have also installed the All in One SEO plugin. This has been up and running for a month or so. My problem is that it is generating many (thousands of) warnings through my PRO Dashboard for Crawl Diagnostics. Specifically, I have a huge number of "Overly-dynamic URL" warnings. A typical URL is as follows: http://www.wednet.com/blog/2011/10/07/do-us-a-favor-dont/?utm_source=rss&utm_medium=rss&utm_campaign=do-us-a-favor-dont This has three querystring parameters, all generated by WordPress automatically. Here's another significant issue: with the All in One SEO plugin I can control the SEO-related parameters for each post (title, meta description, etc.). However, WordPress generates a ton of virtual URLs which I can't (as far as I know) directly control. For example, the following page is a category page with all the posts for a single category. It is generating warnings because the meta description is missing, but I do not know how to control such parameters since the page is automatically generated: http://www.wednet.com/blog/category/ceremony/ These types of warnings dominate the stats I see through my dashboard. How can I resolve them? Thanks. Mark
Moz Pro | MarkWill
Anyone have a free tool to create an XML sitemap?
Or can I use the Custom Crawl tool to help create this? The domain I'm working with has 1,300+ pages, so most free tools I've used in the past won't capture that many pages.
Moz Pro | JonClark15
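If you can export a flat list of the site's URLs (from the CMS, a database query, or a crawl), a short script can build the file itself. A minimal sketch in Python, assuming one URL per line in a plain-text file; the file names are placeholders:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(url_list_path: str, out_path: str) -> None:
    """Build a basic sitemap.xml from a plain-text file with one URL per line."""
    ET.register_namespace("", SITEMAP_NS)  # emit the standard sitemap namespace
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    with open(url_list_path, encoding="utf-8") as handle:
        for line in handle:
            page_url = line.strip()
            if not page_url:
                continue  # skip blank lines
            entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
            ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = page_url
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)


if __name__ == "__main__":
    build_sitemap("urls.txt", "sitemap.xml")  # placeholder file names
```

A single sitemap file can hold up to 50,000 URLs, so 1,300+ pages fit comfortably in one file.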