Pictures 'being stolen'
-
Helping my wife with an ecommerce site selling clothes. Some photos are supplied by the producer, but at times they aren't very good, so we are taking some of our own photos, and I suspect people are copying them and using them on their own sites. Is there anything to do about this? Watermarking, of course, but can they be 'marked' in any way that links back to your site?
-
Thanks, I will have to try it out as well.
-
Google Images now has an option where you can upload a photo or point to a photo's URL, and it will act like TinEye and search for that image. I haven't compared the two services with the same image, so I can't say whether one is more complete, but it is something else to try.
-
Hi Dan,
There are a couple of simple ways you can deal with this problem. Basically all you need is the ability to edit a .htaccess file. So simple even I can do it!
Our head of programming wrote a blog post explaining both methods for you. Take a look here: Don't Steal My Images!
Hope that helps,
Sha
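-
For anyone who can't reach the post, the usual .htaccess approach is hotlink protection: refuse to serve image files when the Referer header isn't your own domain. A minimal sketch, assuming Apache with mod_rewrite enabled (example.com is a placeholder for your own domain; adjust the extensions to match your images):

```apache
RewriteEngine On
# Allow requests with an empty Referer (direct visits, some proxies and privacy tools)
RewriteCond %{HTTP_REFERER} !^$
# Block any request for an image whose referring page is not on your domain
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jpe?g|png|gif)$ - [F,NC]
```

Bear in mind this only stops other sites embedding your hosted files; it does nothing against someone who downloads a copy and re-uploads it, and the Referer header can be stripped or spoofed.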
-
Besides watermarking, I would upload the photos to TinEye. This will at least help you find out if people are using your images elsewhere, and it helps show your ownership rights to the photos.
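-
If you just want to check a suspect file you've downloaded from another site against your original, a plain file hash is the simplest fingerprint. This is only a sketch: it catches byte-identical copies, whereas TinEye-style services use perceptual matching that also survives resizing and recompression.

```python
import hashlib


def image_fingerprint(data: bytes) -> str:
    """SHA-256 hex digest of the raw image bytes (exact-copy detection only)."""
    return hashlib.sha256(data).hexdigest()


def is_exact_copy(original: bytes, suspect: bytes) -> bool:
    """True when the suspect file is byte-for-byte identical to the original."""
    return image_fingerprint(original) == image_fingerprint(suspect)


if __name__ == "__main__":
    photo = b"\x89PNG...pretend image bytes..."
    recompressed = photo + b"\x00"  # even one changed byte defeats a file hash
    print(is_exact_copy(photo, photo))         # True
    print(is_exact_copy(photo, recompressed))  # False
```

Keeping a record of these digests (with dates) is also a cheap way to document that you held the originals first.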