Do search engines penalise you for watermarked images?
-
Our site contains a library of thousands of images which we are thinking of watermarking. Does anyone know if Google penalises sites for this, or is it best practice in order to protect revenues?
As watermarking these images makes them less shareable (but protects revenues), I was thinking Google might then penalise us, which might affect traffic.
Any ideas?
-
http://googlewebmastercentral.blogspot.co.uk/2009/11/pros-and-cons-of-watermarked-images.html
"Will Google rank an image differently just because it's watermarked?
Peter: Nope. The presence of a watermark doesn't itself cause an image to be ranked higher or lower."
-
Thanks Russ, very helpful.
-
Nope. If they did, I'd be in big trouble.
-
I am fairly certain Google does not penalize sites for watermarking images. However, if the watermarks make your images less shareable, losing those shares and the links that come with them could indirectly cost you rankings.
One option would be to offer a section of your images free of charge under a link-attribution policy. That can earn you natural links (don't manipulate the alt text; just use the name of the image or your service).
Also, have you checked how often your images are being used without permission? You can use a reverse image search service like tineye.com to find out.
Related Questions
-
Not all images indexed in Google
Hi all, we recently hit an unusual issue with images in Google's index. We have more than 1,500 images in our sitemap, but according to Search Console only 273 of those are indexed. If I check Google image search directly, I find more images in the index, but still not all of them. For example, one post has 28 images and only 17 are indexed in Google Images. This is happening on other posts as well. I've checked all the likely causes (missing alt text, images loaded as backgrounds, file size, fetch and render in Search Console), but none of them apply in our case. So everything looks fine, yet not all images are in the index. Any ideas on this issue? Your feedback is much appreciated, thanks.
Technical SEO | flo_seo1
-
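When Search Console's indexed count and your own count disagree, a useful first step is to verify exactly how many image URLs the sitemap actually declares. A quick sketch using only Python's standard library (the sitemap here would be your real file, fetched or read from disk):

```python
import xml.etree.ElementTree as ET

# Standard sitemap and Google image-sitemap namespaces.
NS = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "image": "http://www.google.com/schemas/sitemap-image/1.1",
}

def count_sitemap_images(sitemap_xml: str) -> int:
    """Count <image:loc> entries across all <url> nodes in an image sitemap."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(".//image:image/image:loc", NS))
```

Comparing that number against Search Console's report tells you whether the gap is a sitemap problem (images never declared) or an indexing problem (declared but skipped).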
Does Google only understand images in img src, and not data-src?
Hello Moz Team. On my ecommerce site's product pages, each product has many images. My developer put the first image in a normal img src attribute, but the other images load after a click, using data-src and data-big attributes for their paths. My questions: 1) Does Google only understand images referenced in img src, and not in data-src? If so, does that mean only the img src images will be indexed? 2) In my image sitemap XML, should I submit only the img src image paths? 3) Currently my image sitemap XML contains all the images, both img src and data-src. What will Google do here: ignore the data-src images, or try to find them on the website? Will it waste bandwidth or crawl budget? Can you please guide me? Thanks!
Technical SEO | Johny123450
-
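On the sitemap part of this question: an image sitemap lets you declare lazy-loaded image URLs explicitly, so a crawler can discover them even if it never executes the click handler that swaps data-src into src. A minimal sketch with Python's standard library (the page and image URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages: dict[str, list[str]]) -> str:
    """Build an image-sitemap XML string from a page URL -> image URLs mapping."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page, images in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page
        for img in images:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = img
    return ET.tostring(urlset, encoding="unicode")
```

Each product page gets one url entry listing every image on it, including the ones that only load after a click.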
Google only indexed 19/94 images
I'm using Yoast SEO and have images (attachments) excluded from sitemaps, which is the recommended method (but could this be wrong?). Most of my images are in my posts; here's the sitemap for posts: https://edwardsturm.com/post-sitemap.xml I also appear on p1 for some good keywords, and my site is getting organic traffic, so I'm not sure why the images aren't being indexed. Here's an example of a well performing article: https://edwardsturm.com/best-games-youtube-2016/ Thanks!
Technical SEO | Edward_Sturm0
-
Why has my search traffic suddenly tanked?
On 6 June, Google search traffic to my WordPress travel blog http://www.travelnasia.com tanked completely. There are no warnings or indicators in Webmaster Tools that suggest why this happened. Traffic from search has remained at zero since 6 June and shows no sign of recovering.
Two things happened on or around 6 June: (1) I dropped my premium theme, which was proving to be not mobile friendly, and replaced it with the ColorMag theme, which is responsive. (2) I moved off my previous hosting service, which was showing long server lag times, to a faster host. Both of these should have improved my search performance, not tanked it.
There were some problems with the relocation to the new web host, which resulted in a lot of "out of memory" errors on the website for 3-4 days. The allowed memory was simply not enough for the complexity of the site and the volume of traffic. After a few days of trying to resolve these problems, I moved the site to another web host which allows more PHP memory, and the site now appears reliably accessible for both desktop and mobile. But my search traffic has not recovered.
I am wondering if in all of this I've done something that Google considers a cardinal sin and I can't see it. The clues I'm seeing include:
- Moz Pro was unable to crawl my site last Friday. It seems like every URL it tried to crawl was of the form http://www.travelnasia.com/wp-login.php?action=jetpack-sso&redirect_to=http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines, which resulted in a 500 status error. I don't know why this happened, but I have disabled the Jetpack login function completely, just in case it's the problem.
- GWT tells me that some of my resource files are not accessible to GoogleBot because my robots.txt file denies access to /wp-content/plugins/. I have removed this restriction after reading the latest advice from Yoast, but I still can't get GWT to fetch and render my posts without some resource errors.
- On 6 June, I see in Structured Data in GWT that "items" went from 319 to 1478 and "items with errors" went from 5 to 214. There seems to be a problem with both the hatom and hcard microformats, but when I look at the source code they seem OK. Each hcard has an empty node called "n [n]" which Google is warning about. This is because the author vcard URL class now says "url fn n", but I don't see why it says this or how to fix it. I also don't see how this would cause my search traffic to tank completely.
I wonder if anyone can see something I'm missing on the site. Why would Google completely deny search traffic to my site all of a sudden without notifying any kind of penalty? Note that I have NOT changed the content of the site in any significant way, and even if I had, it's unlikely that would result in a complete denial of traffic without some kind of warning.
Technical SEO | Gavin.Atkinson1
-
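For the robots.txt clue above, it's easy to check locally whether a given rule set blocks a crawler from a resource before (or after) deploying a change. A small sketch using Python's standard library robots.txt parser; the rule set and paths below are illustrative, matching the Disallow rule described in the question:

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Check whether a robots.txt rule set permits a crawler to fetch a path."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, path)

# The kind of rule set described in the question, before the restriction was lifted.
# It blocks plugin CSS/JS that Google may need to render pages properly.
OLD_RULES = """User-agent: *
Disallow: /wp-content/plugins/
"""
```

Running `is_allowed(OLD_RULES, "Googlebot", "/wp-content/plugins/jetpack/style.css")` against each blocked resource reported in GWT confirms which rule is responsible.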
No Google cached snapshot image... 'Text-only version' working.
We are having an issue with Google's cached image snapshots. Here is an example: http://webcache.googleusercontent.com/search?q=cache:IyvADsGi10gJ:shop.deliaonline.com/store/home-and-garden/kitchen/morphy-richards-48781-cooking/ean/5011832030948+&cd=308&hl=en&ct=clnk&gl=uk I wondered if anyone knows or can see the cause of this problem? Thanks
Technical SEO | pekler1
-
How should I structure a site with multiple addresses to optimize for local search?
Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is full of local keywords and has the address properly marked up (HTML5 and schema.org compliant, near the top of the page, etc.). It's all working quite well, but we're looking to expand to two more locations, and we're worried that adding more addresses and changing our current set-up will wreak havoc with our local search results, which quite frankly currently rock. My questions: 1) When it comes time to build sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 1a) Should we use subdomains instead of subfolders to keep Google from becoming confused? 2) Should we consider starting identically branded pages for the individual locations, hoping that exact-match location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand across multiple pages? I've tried to find examples of businesses that have done what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should that serve as a warning to me?
Technical SEO | LMDNYC0
-
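One common pattern for multi-location sites (a sketch, not a guaranteed recipe) is a subfolder per location, each page carrying its own schema.org LocalBusiness structured data. The names and addresses below are placeholders; a small generator keeps the markup consistent across location pages:

```python
import json

def local_business_jsonld(name: str, street: str, locality: str,
                          region: str, postal: str, telephone: str) -> str:
    """Render a schema.org LocalBusiness JSON-LD script tag for one location page."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": telephone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
            "addressRegion": region,
            "postalCode": postal,
        },
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")
```

Each location page (e.g. /locations/midtown/) would embed its own block, while the main site's markup points at the primary address only.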
Explain this search result
Hi folks, I came across a strange search result. Search on Google Australia for "income portfolio": http://www.google.com.au/search?sourceid=chrome&ie=UTF-8&q=income+portfolio See the first result? It's a login page, and in position #1! How is that search result showing? Where is it getting its title and description tags from? Does Google have a way to somehow see what is behind the login? Appreciate your thoughts.
Technical SEO | scotennis0
-
Site Disappeared off of Search
A friend of mine has a site (http://bit.ly/q4iWkM ) that was ranking number one for its keyword (Drimnagh) and has now completely disappeared from the rankings. I did some checking and can't see a problem. She does have duplicate meta descriptions and titles throughout, but that shouldn't be a punishable offence as far as I know, and it's something I am going to correct with a quick plugin install. I couldn't see any redirects or code blocking search either. When you do site:URL, the site shows up OK as well. She is a client of mine (for the website, not for SEO) and she is really upset about it, so any help from the forum would be appreciated. This isn't even a site I built, but you couldn't get a better person to work with, so I am eager to help where and if possible. Guinness all round if someone solves it; next time you are in Ireland.
Technical SEO | kdaly1000