XML Sitemap Issue or not?
-
Hi Everyone,
I submitted a sitemap in Google Webmaster Tools and got a warning listing 38 issues.
Issue: URL blocked by robots.txt.
Description: Sitemap contains URLs which are blocked by robots.txt.
Example: the URLs given are ones we don't want indexed anyway. Sitemap: www.example.org/author.xml
Value: http://www.example.org/author/admin/
My concern is that the number of indexed URLs is pretty low, and I know that a misconfigured robots.txt can be harmful, especially if it blocks URLs that need to be indexed. The blocked URLs shown do seem to be ones we don't want indexed, but the report doesn't display all of the URLs that are blocked.
Do you think I have a major problem, or is everything fine? What should I do? How can I fix it?
FYI: we use WordPress for our website.
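If you want to sanity-check which sitemap URLs your robots.txt actually blocks without waiting on Webmaster Tools, Python's standard `urllib.robotparser` can do it locally. The rules and URLs below are made-up examples mirroring the warning above; substitute your site's real robots.txt and sitemap URLs:

```python
import urllib.robotparser

# Hypothetical robots.txt content -- paste in your site's actual file.
robots_txt = """
User-agent: *
Disallow: /author/
Disallow: /wp-admin/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt)

# URLs taken from the sitemap; /author/admin/ matches the warning in GWT.
for url in ["http://www.example.org/author/admin/",
            "http://www.example.org/about/"]:
    allowed = rp.can_fetch("*", url)
    print(url, "->", "allowed" if allowed else "blocked by robots.txt")
```

Any URL this reports as blocked but that you *do* want indexed points to a rule you should remove.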
Thanks
-
Hi Dan
Thanks for your answer. Would you really recommend using the plugin instead of just uploading the XML sitemap directly to the website's root directory? If so, why?
Thanks
-
Lisa
I would honestly switch to the Yoast SEO plugin. It handles the SEO (and robots.txt) a lot better, as well as the XML sitemaps all within that one plugin.
I'd check out my guide for setting up WordPress for SEO on the moz blog.
Most WP robots.txt files will look like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
And that's it.
You could always just try changing yours to the above setting first, before switching to Yoast SEO - I bet that would clear up the sitemap issues.
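To verify that a minimal robots.txt like the one above would clear the warning, you can cross-check a sitemap against it locally before touching the live site. This is a sketch using only Python's standard library; the sitemap fragment is invented for illustration, so load your real sitemap file instead:

```python
import urllib.robotparser
import xml.etree.ElementTree as ET

# The minimal WordPress robots.txt suggested above.
robots_txt = """
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt)

# A made-up sitemap fragment for illustration; read yours from disk or URL.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.org/</loc></url>
  <url><loc>http://www.example.org/author/admin/</loc></url>
</urlset>"""

# Sitemap files use the sitemaps.org namespace, so findall needs it too.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
blocked = [loc.text for loc in root.findall(".//sm:loc", ns)
           if not rp.can_fetch("*", loc.text)]
print("URLs in the sitemap but blocked by robots.txt:", blocked)
```

With only `/wp-admin/` and `/wp-includes/` disallowed, the list comes back empty, which is exactly what should make the GWT warning go away.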
Hope that helps!
-Dan
-
Lisa, try checking manually which URLs are not getting indexed in Google. Make sure you do not have any nofollow or noindex tags on those pages. If all the pages are connected/linked together, then Google will crawl your whole site eventually; it's just a matter of time.
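One way to do that nofollow/noindex check in bulk is to scan each page's meta robots tag. Here is a minimal sketch with Python's built-in HTML parser; the sample HTML is hypothetical, so fetch the source of your real unindexed pages instead:

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

# Hypothetical page source -- in practice, fetch the HTML of each
# URL that is in the sitemap but missing from Google's index.
html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
p = MetaRobotsParser()
p.feed(html)
print("robots directives found:", p.directives)
```

Any page reporting `noindex` here will never show up in Google no matter how often you resubmit the sitemap.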
-
Hi
When generating a sitemap, 46 URLs are detected by xml-sitemaps.com, but when I add the sitemap to WMT only 12 get submitted and 5 are indexed, which is really worrying me. This might be because of the XML sitemap plugin that I installed. Maybe something is wrong with my settings (docs attached, 1 & 2).
I am kind of lost, especially since SEOmoz hasn't detected any URLs blocked by robots.txt.
It would be great if you could tell me what I should do next.
Thanks
-
The first question I would ask is how big the difference is. If there is a large gap between the number of pages on your site and the number indexed by Google, then you have an issue. The blocked pages might be the ones linking to the pages that have not been indexed, which would cause problems. Try removing the nofollow on those pages, then resubmit your sitemap and see if that fixes it. Also double-check your sitemap to make sure you have correctly added all the pages to it.
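Comparing the two lists is just a set difference once you have exported the sitemap URLs and the indexed URLs (e.g. from WMT or `site:` queries). A quick sketch with made-up URLs:

```python
# Hypothetical URL lists: what the sitemap contains vs. what Google
# reports as indexed. Replace these with your exported lists.
sitemap_urls = {
    "http://www.example.org/",
    "http://www.example.org/services/",
    "http://www.example.org/contact/",
}
indexed_urls = {
    "http://www.example.org/",
    "http://www.example.org/contact/",
}

# Pages in the sitemap that Google has not indexed yet.
missing = sorted(sitemap_urls - indexed_urls)
print(f"{len(missing)} of {len(sitemap_urls)} sitemap URLs are not indexed:")
for url in missing:
    print(" ", url)
```

The resulting list is exactly the set of pages to inspect for noindex tags, robots.txt blocks, or a lack of internal links.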