Google doesn't index image slideshow
-
Hi,
My articles are indexed. The full-size images are also exposed via a meta tag in the body, but the images in the slideshow are not indexed. Do you have any idea why? Could it be a problem with the JS?
Example: http://www.parismatch.com/People/Television/Sport-a-la-tele-les-femmes-a-l-abordage-962989
Thank you in advance
Julien
-
You can do a "site:" search directly in Google, like this, and here is what I currently see --> http://screencast.com/t/ZVqq5iumQ. You can do a site: search on the whole domain, a subfolder, or a specific page.
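For example, queries along these lines (the folder below is just an example; adjust to whatever you want to check) can be run in Google Images to see which image URLs are indexed:

```text
site:parismatch.com                  (whole domain)
site:parismatch.com/People/          (a subfolder)
site:parismatch.com/People/Television/Sport-a-la-tele-les-femmes-a-l-abordage-962989   (a specific article)
```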
-
OK, what is the best method you'd recommend to verify image indexation directly in Google?
I'll post a message explaining the results after we change the sitemaps.
Thanks for everything.
-
Thanks! OK, yes I'd make your Sitemap and HTML image URLs the same.
Also, that's a LOT of images, so I'm not surprised Google is taking time to index them.
Also, there can sometimes be a delay in Search Console data. You can always check Google itself to see which files are indexed.
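For illustration, an image sitemap entry generally looks like the snippet below, and the <image:loc> URL should be exactly the same URL the article's HTML uses. The URLs here are placeholders, not your real ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The article page -->
    <loc>http://www.parismatch.com/example-article</loc>
    <image:image>
      <!-- This URL should match exactly the one used in the article's HTML -->
      <image:loc>http://cdn.example.com/images/example-full-size.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```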
-
Not really, it seems to be OK.
-
Thanks! Hmmm did it clear Search Console without any errors? I see an error in my browser --> http://screencast.com/t/VLWhg8EyR3Dd
-
The images are here:
http://www.parismatch.com/var/exports/sitemaps/sitemap_images_parismatch-10.xml
-
Is this your current sitemap?
http://www.parismatch.com/var/exports/sitemaps/sitemap_parismatch-index.xml
What is the direct address of the image sitemap(s)?
Thanks!
-
Thanks Dan. Unfortunately, we have changed the image host; the images are now on a different CDN...
Before the redesign, we used exactly this configuration, visible on this page (it's just an article, we don't have a slideshow example):
http://www.parismatch.com/Chroniques/Art-de-vivre/Lodge-Story-925785
We perhaps have a problem with the image sitemaps, because in the Google sitemaps we have:
<image:loc>http://cdn-parismatch.ladmedia.fr/var/news/storage/images/paris-match/culture/cinema/le-fils-de-saul-la-critique-763334/8067828-1-fre-FR/Le-Fils-de-Saul-la-critique.jpg</image:loc>
and in the HTML source the image URL is different. Should we perhaps put the same image URLs in the sitemaps as are used in the HTML?
Many thanks for your help!
-
I see, thanks. Hmmm... did anything else change besides the redesign? Did the image URLs change, or did where they were hosted change?
The current implementation doesn't show any issues, but I wonder if things were properly done in moving to the new design. Did you always have a slideshow format? Did the code change or just the design?
-
Thanks Dan!
I agree with you. It's a problem because, since the website redesign, we have recorded a drop in image traffic from Google.
-
Hi There
There do not appear to be any accessibility issues; I can crawl and access the images just fine with my crawler.
My guess is that since the images are duplicates, and they also exist on other websites, Google may be avoiding indexing them, since they are already indexed elsewhere and they are technically not being linked to with a normal tag.
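By a "normal tag" I mean something like the first snippet below, as opposed to an image whose full-size URL only exists in a data attribute and is swapped in by the slideshow's JavaScript. This is simplified, hypothetical markup, not your actual code:

```html
<!-- Crawlable: the full-size image is linked with a plain img tag in the HTML -->
<img src="http://cdn.example.com/images/photo-full-size.jpg" alt="Photo">

<!-- Harder for Google: the full-size URL only lives in a data attribute
     and is injected by the slideshow's JavaScript at runtime -->
<a class="slideshow-item" data-full-src="http://cdn.example.com/images/photo-full-size.jpg">
  <img src="http://cdn.example.com/images/photo-thumb.jpg" alt="Photo">
</a>
```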
Is this causing a particular issue for the site? Or is it just a pesky technical bug?
-
The display image is resized and indexed, while the full-size image is referenced in a META tag but not indexed.
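For illustration, a full-size image exposed via a meta tag typically looks something like this (hypothetical Open Graph markup with a placeholder URL; the site may use a different tag):

```html
<!-- Hypothetical example; the actual tag and URL on the site may differ -->
<meta property="og:image" content="http://cdn.example.com/images/example-full-size.jpg" />
```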
-
How are your images being fed into the site? Are you using a CDN?
-Andy
-
The robots.txt file doesn't block the images; I checked it. The website runs on eZ Publish.
-
Hi Julien,
I always start with robots.txt in these cases, but that looks OK.
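For reference, rules along these lines in robots.txt are what would block image crawling; these are hypothetical examples only, not something in your file:

```text
# Hypothetical rules that WOULD block image crawling (not present in this robots.txt)
User-agent: Googlebot-Image
Disallow: /

User-agent: *
Disallow: /images/
```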
Is anything being blocked by JS? Something else to look at: if you are using something like WordPress, there are plugins that can block access to these without you realising.
Looking at the URL of the image, this appears to be hosted on a 3rd party site?
-Andy
Related Questions
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of that junk came from the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1st June 2012. Therefore my dynamic sitemap now contains only articles with a release date between 1st June 2012 and now, and any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual article content.
The question is how I can remove all this junk, which is no longer on the site but still appears in Google results, from the Google index as fast as possible. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals, but the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back into the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because the search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in Webmaster Tools. Should I submit a deleted-items sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think that is only for Custom Search Engines, not for the generic Google search engine. Unfortunately the site doesn't use any kind of folder hierarchy in its URLs, just ugly GET params, and a folder-based pattern is impossible, since all articles (removed junk and real articles alike) have the form http://www.example.com/docid=123456. So, how can I bulk remove all this junk from the Google index relatively fast?
Intermediate & Advanced SEO | ioannisa
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot, although I can see the site fine. I have checked the source code and checked robots.txt; it did have a sitemap parameter, but I removed it for testing. GWMT shows 'unreachable' if I submit a sitemap or do a fetch. Any ideas on how to remove this error? Many thanks in advance.
Intermediate & Advanced SEO | SolveWebMedia
-
"Null" appearing as top keyword in "Content Keywords" under Google index in Google Search Console
Hi, "null" is appearing as the top keyword in Google Search Console > Google Index > Content Keywords for our site http://goo.gl/cKaQ4K. We do not use "null" as a keyword on the site, and we cannot work out why Google is treating "null" as a keyword for it. Is anyone else facing this issue? Thanks & Regards
Intermediate & Advanced SEO | vivekrathore
-
Pages are Indexed but not Cached by Google. Why?
Here's an example: I get a 404 error for http://webcache.googleusercontent.com/search?q=cache:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all, but a search for "qjamba restaurant coupons" gives a clear result, as does site:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all. What is going on? How can this page be indexed but not in the Google cache? I should make clear that the page is not showing any kind of error in Webmaster Tools, and Google has been crawling pages just fine. This particular page was fetched by Google yesterday with no problems, and was even crawled again twice today, yet there is still no cache.
Intermediate & Advanced SEO | friendoffood
-
Is there a way to get a list of Total Indexed pages from Google Webmaster Tools?
I'm doing a detailed analysis of how Google sees and indexes our website, and we have found that there are 240,256 pages in the index, which is way too many. It's an e-commerce site that needs some tidying up. I'm working with an SEO specialist to set up URL parameters and put information into the robots.txt file so the excess pages aren't indexed (we shouldn't have any more than around 3,000-4,000 pages), but we're struggling to find a way to get a list of these 240,256 pages, as it would be helpful in deciding what to put in the robots.txt file and which URLs we should ask Google to remove. Is there a way to get a list of the indexed URLs? We can't find it in Google Webmaster Tools.
Intermediate & Advanced SEO | sparrowdog
-
How can I get a list of every URL of a site in Google's index?
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 are indexed, and a search on Google shows 33,000. I'd like to find out what the difference is. Is there a way to get an Excel sheet with every URL Google has indexed for a site? Thanks... Mike
Intermediate & Advanced SEO | 94501
-
Best way to permanently remove URLs from the Google index?
We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't put them behind a login, because our clients want to be able to view these applications without needing to log in. What is the next best solution?
Intermediate & Advanced SEO | nicole.healthline
-
Disallowed Pages Still Showing Up in Google Index. What do we do?
We recently disallowed a wide variety of pages for www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically we don't want to spread our link juice around to all these pages that are never going to rank; we want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed, when really it should only be showing ~5,000. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" into our robots.txt properly, so this should not be happening; Google should be showing 0 results. Any ideas on how we get Google to pay attention and re-index our site properly?
Intermediate & Advanced SEO | udemy