Sitemap Warnings
-
Due to an issue with our CMS, I had a bunch of URL aliases that were being indexed and causing duplicate content issues.
I disallowed the bad URLs in robots.txt (they all shared a similar URL structure, so that was easy) until I could clean them up.
I then received a bunch of sitemap warnings saying that URLs I had blocked with robots.txt were still listed in the sitemap.
Isn't this the point of robots.txt? Why am I getting warnings, and how can I get rid of them?
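For reference, a pattern-based block like the one described might look like this in robots.txt (the `/alias/` path is a hypothetical stand-in for the shared URL structure). Note that `Disallow` prevents crawling, not indexing: pages blocked this way can still appear in the index if they're linked from elsewhere.

```
User-agent: *
Disallow: /alias/
```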
-
Irving -
Ok, so we took the restriction out of robots.txt while IT tries to fix the issue of URLs showing up in the sitemap that shouldn't.
The warnings haven't fallen off, and our sitemap is now a day behind, as it's been stuck in pending for almost a full day.
Any thoughts on what might be causing this? I'm assuming this is impacting what's indexed and hurting our site.
-
Irving,
Totally get that, and we're working to ensure they are no longer included in the sitemap.
Thanks,
Lisa
-
The purpose of your sitemap is to tell Google to go out and index the pages you specify. The purpose of robots.txt is to tell Google not to crawl the pages you specify. The warning is likely just a precaution to let you know that you may have accidentally blocked something in robots.txt. If you remove the URLs from your submitted sitemap, the warnings should disappear. If you leave them, you will still see warnings, but Google should not index the content since you blocked it in robots.txt.
-
You are not supposed to include blocked URLs in your sitemap.xml files, or Google considers it a waste of their crawl budget. Are these automated sitemap.xml files?
You're basically saying "come index these pages I've listed, but don't index them!"
Remove the blocked URLs (or regenerate the files) and resubmit the sitemaps, and the warnings will go away.
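One way to catch this mismatch before resubmitting is to cross-check the sitemap against robots.txt with the standard library's `urllib.robotparser`. A minimal Python sketch (the example.com URLs and the `/alias/` path are hypothetical placeholders, not from the thread):

```python
# Sketch: find sitemap URLs that robots.txt simultaneously disallows,
# i.e. the exact condition that triggers the "blocked by robots.txt" warning.
import urllib.robotparser
import xml.etree.ElementTree as ET

def blocked_sitemap_urls(robots_txt: str, sitemap_xml: str) -> list[str]:
    """Return sitemap <loc> URLs that robots.txt disallows for all user agents."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())

    # Sitemap files use the sitemaps.org namespace on every element.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]
    return [u for u in urls if not rp.can_fetch("*", u)]

robots = """\
User-agent: *
Disallow: /alias/
"""

sitemap = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/products</loc></url>
  <url><loc>https://www.example.com/alias/products</loc></url>
</urlset>
"""

print(blocked_sitemap_urls(robots, sitemap))
# -> ['https://www.example.com/alias/products']
```

Any URL this prints should be dropped from the sitemap (or unblocked in robots.txt) before resubmitting.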