All images are noindex: will removing it all at once be an issue?
-
Hi,
Not sure how, but a few months ago all my images were set to noindex, which I only realized last week. We have 20K images that were indexed fine, but now when I check site:sitename it shows only 10 or 12 results, and when I inspect the pages in Chrome I can see noindex is set for all the images. We have been renaming the images and adding alt attributes to most of them. Would it be an issue if we remove the noindex in one shot, or should we do it a few at a time?
Thanks
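Before flipping all 20K pages in one shot, it may help to audit which URLs actually carry the directive. A minimal sketch using only Python's standard library, assuming the noindex is set via a robots meta tag in the HTML (if it is sent as an X-Robots-Tag HTTP header, you would check response headers instead); the sample page string is made up:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

def has_noindex(html_text):
    """True if any robots meta tag on the page contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html_text)
    return any("noindex" in directive for directive in parser.robots_directives)

# Made-up page content for illustration.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```

Running `has_noindex` over the fetched HTML of each image page would give a before/after list to compare once the tags are removed.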
-
Hi,
I can share from my personal experience: we also had noindex on images earlier, though far fewer than yours, just 1,500 images. After removing the noindex there was no issue at all.
The images started showing in Google Images after 2 weeks, and after about 6 weeks we could see most of them. You are doing the right thing by using the alt attribute; the only other thing is naming the image files correctly (for example, if someone searches for "honda city", the image should be named honda-city.jpg or similar, along with the alt text).
-
Thanks
Image labelling is being done as image-name-that-it-represents.jpg.
Would 5+ words in the file name and 5+ words in the alt attribute be fine?
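If the renaming is being done by hand, a small script can keep the file names consistent. A sketch (the function and its rules are illustrative, not from the thread): lowercase everything, replace runs of non-alphanumeric characters with hyphens, and strip stray hyphens:

```python
import re

def image_filename(description, ext="jpg"):
    """Turn a plain-language image description into a lowercase, hyphenated filename."""
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{slug}.{ext}"

print(image_filename("Honda City 2015 interior"))   # honda-city-2015-interior.jpg
print(image_filename("Ultimate Pet Liner (rear)"))  # ultimate-pet-liner-rear.jpg
```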
-
I don't think anything bad will happen. Your conversion rate may drop, though, because 20K images will now be available in image search. I remember once I started ranking for the keyword "car": I saw a big increase in organic traffic but a drop in conversions. Be careful how you label your images. Also, add an annotation in Google Analytics so you can compare before and after.
Related Questions
-
Image Sitemap
I currently use a program to create our sitemap (XML). It doesn't offer creating an image sitemap. Can someone suggest a program that would create an image sitemap? Thanks.
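If no off-the-shelf program turns up, the image sitemap format is simple enough to generate directly. A minimal Python sketch using the standard sitemap and Google image-sitemap namespaces (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: mapping of page URL -> list of image URLs appearing on that page."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for image_url in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = image_url
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_image_sitemap({
    "https://example.com/page.html": ["https://example.com/photo.jpg"],
})
print(sitemap_xml)
```

Feeding it the real page-to-image mapping and writing the result to a file would produce a sitemap you can submit in Search Console alongside the existing one.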
Technical SEO | Kdruckenbrod
-
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that has many different variations based on a vehicle's make, model, and year. Currently we sell this product on one page, www.cargoliner.com/products.php?did=10001, and we show a modal to sort through each make, model, and year. This is important because, based on the make, model, and year, we have different prices/configurations. For example, for the Jeep Wrangler and Jeep Cherokee we might have different products:
Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350
Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350
Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400
Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year; we have the same content and images for each. When the customer selects their make, model, and year, we just search and replace the text. For example, when a customer selects 2015 Jeep Wrangler from the modal, the page keeps the same URL (www.cargoliner.com/products.php?did=10001) but the product title will say "2015 Jeep Wrangler".
Here's my problem: we want all of these individual products to have their own unique URLs (cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers and, ideally, start creating unique content for them. The catch is that there will be hundreds of them, and they don't have unique content other than the swapped-in product title and variants. Also, we don't want www.cargoliner.com/products.php?did=10001 to lose its link juice.
Here's my question: my assumption is that I should keep www.cargoliner.com/products.php?did=10001 and let customers sort through the products on that page. Then I should make individual URLs for each of these products (i.e. cargoliner.com/products/2015-jeep-wrangler) but add a "noindex, nofollow" to those pages. Is this what I should do? How reliable is a "noindex, nofollow" on a webpage? Does Google still index it? Am I at risk for duplicate content penalties? Thanks!
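For what it's worth, if the per-variant URLs do get created, they can be generated mechanically from the variant data. A sketch following the URL pattern given in the question (the function itself is hypothetical):

```python
def variant_path(year, make, model):
    """Build a per-variant product path like /products/2015-jeep-wrangler."""
    slug = f"{year}-{make}-{model}".lower().replace(" ", "-")
    return f"/products/{slug}"

print(variant_path(2015, "Jeep", "Wrangler"))  # /products/2015-jeep-wrangler
```

Deriving the slugs this way keeps them predictable, which matters if they are going to be referenced in customer emails.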
Technical SEO | kirbyfike
-
RSS feed issue
My WordPress blog's RSS feed is not working correctly and I can't figure out why. This is the error I am getting in my sidebar where the RSS feed used to work properly. My blog is http://www.seadwellers.com/key-largo-diving-blog/
RSS Error: This XML document is invalid, likely due to invalid characters. XML error: not well-formed (invalid token) at line 274, column 32
Any insight would be appreciated greatly! Rob
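Python's standard XML parser reports the same kind of (line, column) position as the feed error above, which makes it easy to locate the offending character in the feed source. A small sketch with made-up feed text:

```python
import xml.etree.ElementTree as ET

def find_xml_error(xml_text):
    """Return (line, column) of the first well-formedness error, or None if the XML is valid."""
    try:
        ET.fromstring(xml_text)
        return None
    except ET.ParseError as err:
        return err.position  # (line, 0-based column)

print(find_xml_error("<rss><item>ok</item></rss>"))         # None
print(find_xml_error("<rss><item>bad & char</item></rss>"))  # points at the bare ampersand
```

Fetching the raw feed URL and running it through `find_xml_error` would confirm which character at line 274 is breaking the feed (often a bare `&` or a control character pasted into a post).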
Technical SEO | sdwellers
-
Product Documentation Causing 23-40K issues
One of my biggest hurdles at my company is our product documentation library, which houses thousands of pages of publicly accessible, indexed content on old and new versions of our product. Every time a product name changes, the URL changes, causing a 404, so I typically get hundreds of 404s every few months from this site. It's housed off our main domain. We have 23,000+ duplicate pages, 40,000 missing meta descriptions, and 38,000 due to this library. It is not built the same way as our main content, with page titles and meta descriptions, so everything is defaulted and duplicate. I'm trying to make the case that this is an issue, especially as we migrate our site next year to a new CMS. Does anyone have suggestions for dealing with this in the short term and long term? Is it worth asking the owners of this section of content to write page titles and meta descriptions for 40,000 pieces of content? They do not see the value of SEO or the issues this can cause. The library needs to be publicly accessible, but it's not highly ranked content; it's really for customers who want to know more about the product. But I worry it is hurting other parts of our site, with the absurd amount of duplicate content, meta, and page title issues.
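On the short-term side, one low-effort option (an assumption on my part, not something the poster mentioned) is to backfill the defaulted titles mechanically from the URL slugs so that each page is at least unique. A sketch with a made-up path and product name:

```python
def title_from_path(path, product="Product"):
    """Derive a readable, unique page title from a docs URL path."""
    slug = path.rstrip("/").rsplit("/", 1)[-1]
    words = slug.replace("-", " ").replace("_", " ").split()
    return " ".join(word.capitalize() for word in words) + f" | {product} Docs"

print(title_from_path("/docs/installing-the-agent", "ERP"))  # Installing The Agent | ERP Docs
```

Slug-derived titles are crude, but they remove the duplicate-title issue at scale without asking the content owners to hand-write 40,000 entries.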
Technical SEO | QAD_ERP
-
IP redirect possible SEO issues?
Hi, I help manage a small eCommerce business that sells worldwide. We currently have 4 stores: www.tidy-books.com (US), www.tidy-books.co.uk (UK), www.tidy-books.eu (EU), and www.tidy-books.com.au (AU). We auto-redirect based on the country of your IP, with the .com as the root address. I have decided to focus all my energies on link building etc. for the .com, as it seems to carry the most weight, and being a very small team we don't have the resources for separate campaigns for all the websites at the moment. I've done on-site SEO with each country in mind (e.g. for the US, "kids" instead of "children"). I'm just wondering if you can see any pitfalls in this approach. Ideally I would like a team to focus on each store separately and treat them as separate domains, but with the IP redirect we at least rank well in Google for the .com and customers get sent to the relevant store. Any thoughts would be great.
Technical SEO | tidybooks
-
Impact issues when switching from .com to uk
Buon giorno from Wetherby, UK, 5 degrees C and rivers bursting their banks everywhere 😞 This site, http://www.sandtoft.com/, has requested a switch in URL forwarding: when you enter the .com URL it should forward to the .co.uk domain (the opposite of today, where entering .co.uk switches you to the .com URL). So my question is: will changing the forwarding from .com to .co.uk affect SERPs in any significant manner? My view is that the impact will be a minor dip in the SERPs followed by a recovery. Any insights welcome 🙂
Technical SEO | Nightwing
-
Crawling issues in google
Hi everyone, I think I have crawling issues with one of my sites. It has vanished from the Google rankings; it used to rank for all the services I offer, but it doesn't anymore, ever since September 29th. I have resubmitted to Google twice and they came back with the same answer: "We reviewed your site and found no manual actions by the web spam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team. Of course, there may be other issues with your site that affect your site's ranking. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages. This article has a list of other potential reasons your site may not be doing well in search."
How I detected that it may be a crawling issue: two weeks ago I changed my meta tags, and they are very slow to get updated; for some of my pages they never did. Do you know any good tools to check for bad code that could slow down crawling? I really don't know where else to look. I validated the website with the W3C validator, ran Xenu, and cleaned up the issues they found, but my website is still down. Any ideas are appreciated.
Technical SEO | CMTM
-
Google SERPs and NoIndex directives.
We have pages that have been added to robots.txt as URL patterns under Disallow. We also have meta noindex tags on the pages themselves. But we are still finding the pages in the index. I don't think they rank highly, and they don't have descriptions, previews, or cached pages. Why does Google show these pages? Could it be due to internal or external linking?
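One commonly documented interaction that may explain this (worth verifying for these specific URLs): when robots.txt disallows a URL, Googlebot never fetches the page, so it never sees the meta noindex, and a blocked URL that is linked internally or externally can still appear in the index as a bare listing. Python's standard library can check whether a given URL is blocked; a minimal sketch with a hypothetical robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the setup described:
# the pages carrying meta noindex also match a Disallow pattern.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked: the crawler never fetches this page, so an on-page meta noindex
# is never seen, and the bare URL can still be indexed via links to it.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False

# Not blocked: the crawler can fetch the page and honor its noindex tag.
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

If the noindexed pages come back False here, removing them from robots.txt (so the noindex can be crawled) is the usual fix to test first.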
Technical SEO | gaganc