How does Google view duplicate photo content?
-
Now that we can search by image on Google and see every site that is using the same photo, I assume that Google is going to use this as a signal for ranking as well. Is that already happening?
I ask because I have sold many photos over the years with first-use only rights, where I retain the copyright. So I have photos on my site that I own the copyright for that are on other sites (and were there first). I am not sure if I should make an effort to remove these photos from my site or if I can wait another couple years.
-
Hi Lina
What Google wants is unique markup and tagging for that image. It relies on things like image optimization (descriptive filenames and alt text), Schema markup, and image sitemaps to better understand what the photo represents, so it can be returned in search results.
You can learn more about reverse image search here.
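For reference, a bare-bones image sitemap entry looks something like this (the URLs below are placeholders, not your site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the photo appears on -->
    <loc>https://example.com/travel/lisbon/</loc>
    <!-- The photo itself; repeat image:image for each photo on the page -->
    <image:image>
      <image:loc>https://example.com/photos/lisbon-tram.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```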
Hope this helps! Good luck!
-
Well, I'm not that good at it. Overall, it's not a big deal but some of the photos are from places that are far away and that I'm not likely to go back to soon. And now I need to go through the whole site and see which ones might be on other sites. In the future, I know to keep my best photos for my own use!
I just watched that whiteboard video and realized that I have an awful lot to work on.
-
Lina
I would look at it glass half full. I can't take a photo, so photography costs me or my clients thousands, if not tens of thousands. You clearly can, so it's cost-effective and you can control what goes onto your site. You are in a great position. Upsell original photography...
I also think that while it is a factor, it is not a heavily weighted one (yet!).
I also found a great Whiteboard Friday for you: https://moz.com/blog/panda-optimization-whiteboard-friday - it states the position better than I can!
Good luck - photography is a great talent to have.
-
It's a shame, because many of the photos were included with CNN articles, so they have been scraped and are on hundreds of sites. The photos all have my name on the photo itself as the copyright holder, but that isn't going to mean anything to Google when I used the same photo two years later. This sort of means that photographers won't be able to resell photos, and that stock photography is a terrible idea!
-
Yes, that is already happening.
Most assume that even though reverse image search is a "public" feature, behind the scenes it also forms part of the Google algorithm. Google wants originality, and it seems only natural to use reverse image matching as an indicator.
If it is one photo on a few sites, I would not get too excited; but if it is on a lot of sites and is not difficult to change, I would suggest you do.
Test your image with Google reverse image search... always a good start.
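As a rough illustration, you can build a search-by-image link for any hosted photo programmatically. (The endpoint and parameter below reflect how Google's search-by-image links are commonly structured; they are not a documented API, so treat this as a sketch.)

```python
from urllib.parse import urlencode


def reverse_image_search_url(image_url: str) -> str:
    """Build a Google search-by-image URL for a photo hosted at image_url."""
    base = "https://www.google.com/searchbyimage"
    return base + "?" + urlencode({"image_url": image_url})


# Example with a placeholder photo URL:
print(reverse_image_search_url("https://example.com/photos/my-photo.jpg"))
```

Opening the resulting URL in a browser shows which other pages Google has indexed with that photo.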
Related Questions
-
Categories VS Tag Duplicate Content
Hello Moz community, I have a question about categories and tags. Our customer www.elshow.pe just had a redesign of its website. We use the same categories as before; the only change was that two subcategories were added (these subcategories were popular tags before). So now I have two URLs covering the same content.

The first is the URL of the subcategory: www.elshow.pe/realitys/combate/
The second is the URL generated by the tag "combate": www.elshow.pe/noticias/combate/

I have the same situation with the second subcategory, "Esto es guerra":
www.elshow.pe/realitys/esto-es-guerra/
www.elshow.pe/noticias/esto-es-guerra/

The problem is that when I search the keyword "combate" in my country (Perú), the URL that ranks on the first page is the tag URL. But when I search for "esto es guerra", the URL that ranks (on the second page) is the subcategory. I also checked both in OSE, and the subcategories do better than the tags. So what do you guys recommend: a 301 redirect? Canonicals? Any comment is welcome. Thanks a lot for your time. Italo
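If the subcategory URLs are the ones to keep, a 301 from each tag URL to its subcategory equivalent (paths taken from the question; the .htaccess approach assumes an Apache server, which is an assumption) might look like:

```apache
# .htaccess - permanently redirect the tag URLs to the matching subcategory URLs
RewriteEngine On
RewriteRule ^noticias/combate/?$ /realitys/combate/ [R=301,L]
RewriteRule ^noticias/esto-es-guerra/?$ /realitys/esto-es-guerra/ [R=301,L]
```

A rel=canonical on the tag pages is the softer alternative if the tag URLs must keep resolving.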
Technical SEO | neoconsulting
-
174 Duplicate Content Errors
How do I go about fixing these errors? They are all related to my tags. Thank you in advance for any help! Lisa
Technical SEO | lisarein
-
Duplicate content in product listing
We have "duplicate content" warnings in our Moz report, which mostly revolve around our product listings (eCommerce site), where various filters return 0 results (and hence show the same content on the page). Do you think those need to be addressed, and if so, how would you prevent product listing filters from appearing as duplicate content pages? Should we use rel=canonical, or actually change the content on the page?
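One common pattern is to point every filtered variant back at the unfiltered listing with rel=canonical (the path and parameter names below are made up for illustration):

```html
<!-- Rendered in the <head> of /category/shoes?color=teal&size=15,
     a filter combination that returns no results -->
<link rel="canonical" href="https://example.com/category/shoes/" />
```

That way the filter URLs can still exist for users, but Google consolidates them onto one indexable page.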
Technical SEO | erangalp
-
Content and url duplication?
One of the campaign tools flags one of my clients' sites as having lots of duplicates. This is true in the sense that the content is sort of boilerplate, but with the wording changed for different countries. The same goes for the URLs, though they are different in the sense that a couple of words have changed in each. So it's not the case of a CMS or server issue, as SEOmoz advises; it doesn't need 301s!

The thing is, in this niche (freight, transport operators, shipping) I can see many other sites doing the same thing, and those sites have lots of similar pages ranking very well. In fact, one site has over 300 keywords ranked on pages 1-2, but it is a large site with a 12-year-old domain, which clearly helps. Of course, having every page's content unique is important; however, I suppose it is better than copy-and-paste from other sites, so it's unique in that sense. I'm hoping to convince the site owner to change the content over time for every country - a long process.

My biggest problem in understanding duplication issues is that every tabloid or broadsheet media website would be canned by Google, as quite often they scrape Reuters or re-publish standard press releases on their sites as newsworthy content. So I have great doubt that there is a penalty for it. You only have to look and you can see media sites' duplication everywhere, every day, but they get ranked. I just think that Google doesn't rank the worst cases of spammy duplication - they still index them, though, I notice.

So, considering that this business niche has very much the same replicated content layout, which ranks well, is this duplicate flag such a great worry? Many businesses sell the same service to many locations, and it's virtually impossible to rewrite the services in a dozen or so different ways.
Technical SEO | xtopher66
-
Duplicate Content on Navigation Structures
Hello SEOMoz Team,

My organization is making a push to have a seamless navigation across all of its domains. Each of the domains publishes distinctly different content about various subjects. We want each of the domains to have its own separate identity as viewed by Google. It has been suggested internally that we keep the exact same navigation structure (40-50 links in the header) across the header of each of our 15 domains to ensure "unity" among all of the sites. Will this create a problem with duplicate content in the form of the menu structure, and will this cause Google to not consider the domains as being separate from each other?

Thanks,
Richard Robbins
Technical SEO | LDS-SEO
-
Duplicate Homepage In Google
Hi, I just found through my SEO dashboard that Google has two versions of the same homepage: the root page plus the index.html page, causing duplicate content across both. What is the best option to ensure Google only has one version of the homepage listed?
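The usual fix is a 301 from index.html to the root. A minimal sketch, assuming an Apache server with mod_rewrite (the THE_REQUEST guard avoids a loop when Apache internally serves index.html for "/"):

```apache
# .htaccess - collapse /index.html onto / so only one homepage URL is indexed
RewriteEngine On
# Only act on explicit external requests for /index.html, not internal rewrites
RewriteCond %{THE_REQUEST} \s/index\.html[\s?] [NC]
RewriteRule ^index\.html$ / [R=301,L]
```

A rel=canonical from index.html to the root achieves a similar consolidation if redirects aren't an option.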
Technical SEO | rfksolutionsltd
-
SEO with duplicate content for 3 geographies
The client would like us to do SEO for these 3 sites: http://www.cablecalc.com/, http://www.solutionselectrical.com.au, and http://www.calculatecablesizes.co.uk/. The sites have to be targeted at the US, Australia, and the UK respectively. All of the above sites have identical content. Will Google penalise the sites? Shall we change the content completely? How do we approach this issue?
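One standard approach for same-language regional sites is hreflang annotations, which tell Google the pages are regional alternates rather than duplicates. A sketch using the three homepages from the question (each site's `<head>` carries the full set, all pointing at one another):

```html
<link rel="alternate" hreflang="en-us" href="http://www.cablecalc.com/" />
<link rel="alternate" hreflang="en-au" href="http://www.solutionselectrical.com.au/" />
<link rel="alternate" hreflang="en-gb" href="http://www.calculatecablesizes.co.uk/" />
```

The annotations need to be reciprocal and repeated for every equivalent page, not just the homepages.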
Technical SEO | seoug_2005
-
Large Scale Ecommerce. How To Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacked-on additions, which has led to duplicate pages and weird dynamic parameters being indexed. The inbound link structure is spread over duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters which need to be blocked; I'm just not entirely sure of the best method.

Dynamic parameters: I can see the option in GWT to block these - is it that simple? (Do I need to ensure they are deindexed and 301'd?)

Duplicate pages: would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled?

Thanks for your help.
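For a mass 301 at that scale, it can help to generate the redirect rules from a duplicate-to-canonical mapping rather than writing thousands of rules by hand. A minimal sketch (the URL paths are hypothetical examples, not from the site in question):

```python
# Generate Apache "Redirect 301" directives from a {duplicate: canonical}
# mapping, e.g. one exported from a crawl spreadsheet.

def build_redirects(mapping: dict) -> str:
    """Return one 'Redirect 301 old new' line per entry, sorted by old path."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )


duplicates = {
    "/old-category/widget-42.html": "/products/widget-42/",
    "/shop/widget-42": "/products/widget-42/",
}
print(build_redirects(duplicates))
```

Note that plain `Redirect` directives match URL paths only; duplicates that differ by query string (the dynamic parameters) need mod_rewrite rules with `RewriteCond %{QUERY_STRING}` instead, or parameter handling in GWT.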
Technical SEO | LukeyJamo