Image Sitemap Indexing Issue
-
Hello Folks,
I've been running into some strange issues with our XML sitemaps:

1. The XML sitemaps won't open in a browser; the following error is thrown instead. Sample sitemap: www.veer.com/sitemap/images/Sitemap0.xml.gz

"XML Parsing Error: no element found
Location: http://www.veer.com/sitemap/images/Sitemap0.xml
Line Number 1, Column 1"

2. Image files are not getting indexed. For instance, the sitemap www.veer.com/sitemap/images/Sitemap0.xml.gz has 6,000 URLs and 6,000 images, but only 3,481 URLs and 25 images are getting indexed. The sitemap formatting seems good, but I can't figure out why Google is de-indexing the images and indexing only 50-60% of the URLs.

Thank you for your help!
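Since the raw .xml URL throws "no element found" at line 1, column 1 (which usually means the parser received an empty document), one quick local check is to decompress the .gz file yourself and confirm it contains well-formed XML with the expected entry counts. A minimal sketch in Python, assuming the file is on disk (the function name and counting logic are illustrative, not from the thread):

```python
import gzip
import xml.etree.ElementTree as ET

def check_sitemap_gz(path):
    """Decompress a .gz sitemap and verify it is well-formed XML.

    Returns (url_count, image_count) so the totals can be compared
    against what GWT reports.
    """
    with gzip.open(path, "rb") as f:
        data = f.read()
    if not data:
        # An empty payload is exactly what produces
        # "no element found ... Line Number 1, Column 1" in the browser.
        raise ValueError("sitemap is empty after decompression")
    root = ET.fromstring(data)  # raises ParseError if the XML is malformed
    ns = {
        "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
        "image": "http://www.google.com/schemas/sitemap-image/1.1",
    }
    urls = root.findall("sm:url", ns)
    images = root.findall(".//image:image", ns)
    return len(urls), len(images)
```

If this raises on the decompressed file, the problem is in how the file is generated or served, not in how Google is treating it.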
-
Hi Cyrus,
Thank you for your note, and my apologies for the delay in responding.
The indexation number is from Google Webmaster Tools.
The two files are identical. I've also tested other gzipped XML sitemap files, and they opened fine in the browser without being unzipped or prompting a download. We uploaded the sitemaps to GWT as .gz files only, since we have many pages to submit.
I'll check with our Dev Team regarding the XML parsing error.
Please let me know what other areas we need to look into based on my answers to your questions. Thank you for your help; I greatly appreciate it!
-
Some possible suggestions:
- Make sure every image has width and height attributes defined in the HTML. Images are much more likely to be indexed this way.
- Do the same with the "alt" attribute.
- Make sure your image subdirectory isn't blocked (in robots.txt, for example).
- Do the same check for the pages themselves.
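The first two checks can be automated against a saved copy of a page. A rough stdlib-only sketch that flags `<img>` tags missing width, height, or alt (the class name and sample markup are made up for illustration):

```python
from html.parser import HTMLParser

class ImgAuditor(HTMLParser):
    """Collects (src, missing-attributes) pairs for every <img> tag."""

    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        missing = [a for a in ("width", "height", "alt") if a not in attrs]
        if missing:
            self.problems.append((attrs.get("src", "?"), missing))

auditor = ImgAuditor()
auditor.feed('<img src="a.jpg" width="100" height="80" alt="a">'
             '<img src="b.jpg">')
print(auditor.problems)  # → [('b.jpg', ['width', 'height', 'alt'])]
```

Feed it the HTML of each page in the sitemap and you get a quick hit list of images that are less likely to be indexed.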
It may be that Google actually is indexing those images but not reporting them in GWT. Do an image search and narrow the results to your site to see whether your images actually appear.
Aside from accessibility issues, make sure the images sit on well-linked pages. An image is much more likely to be indexed on a page with good link metrics and no crawl problems.
-
@Cyrus
You have given a very good explanation, but I have a similar issue with an image sitemap. The crawling-to-indexing ratio is actually quite good; you can see more detail in the attachment.
You can check the syntax of the image sitemap at the following URL:
http://www.vistastores.com/patio_umbrellas_sitemap.xml
Can you give me some input: how can I improve crawling and indexing for images?
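One concrete thing worth verifying for an image sitemap like this is that robots.txt doesn't block the image URLs being submitted. A small stdlib sketch with made-up rules and URLs (in practice you would point it at your real robots.txt and feed it the URLs from your sitemap):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# In practice: rp.set_url("http://www.vistastores.com/robots.txt"); rp.read()
# Here we parse hypothetical rules directly, for illustration:
rp.parse([
    "User-agent: *",
    "Disallow: /private-images/",
])

image_urls = [
    "http://example.com/media/umbrella-01.jpg",
    "http://example.com/private-images/umbrella-02.jpg",
]
for url in image_urls:
    status = "OK" if rp.can_fetch("Googlebot-Image", url) else "BLOCKED"
    print(url, "->", status)
```

Any URL that comes back BLOCKED is in the sitemap but unreachable for Google's image crawler, which would depress the indexing numbers.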
-
Hi Corbis,
Man, you've got some tough questions! I may have to call in some outside support on this one if we can't figure it out.
First of all, are you getting the indexation numbers from Google Webmaster Tools? What I mean is: is Google saying there are 6,000 URLs in your sitemap but only indexing 3,481 of them?
When I unzipped the compressed sitemap file, it opened fine in my browser, while the second, uncompressed file did not. Are the two identical? And have you submitted both to Google?
There could be many reasons why you're getting the XML parsing error. One issue might be the second line, which references http://www.google.com/schemas/sitemap-image/1.1/ as a schema location; that URL is an HTML webpage, not an XML schema or DTD file. You might try removing the reference to this URL and see if that helps.
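As a quick way to test that suggestion, here is a minimal image-sitemap snippet with only the two xmlns declarations and no schemaLocation reference; it parses cleanly with a standard XML parser (the URLs are placeholders, not taken from the thread):

```python
import xml.etree.ElementTree as ET

minimal = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/page.html</loc>
    <image:image>
      <image:loc>http://example.com/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>"""

root = ET.fromstring(minimal)
# The parser expands the default namespace into the tag name:
print(root.tag)  # → {http://www.sitemaps.org/schemas/sitemap/0.9}urlset
```

If your generator emits this shape and the file still fails to open, the problem is more likely in how the server delivers the file than in the markup itself.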
Otherwise, if Google is reporting the correct number of URLs and images, then you know it is aware of those URLs, and the problem may not be with the sitemap. Google doesn't necessarily index every URL in a sitemap; instead, it bases its indexing on factors like your domain authority, link structure, and crawl allowance. Addressing these issues will usually get more pages indexed than a sitemap alone.
So if you can fix internal crawl errors and duplicate-content issues, and make sure your site has a good navigational architecture, you should see a solid rise in indexation.
-
Hi Folks,
Just following up on this query. Any insights? Thank you for your help!
-Corbis