Image and Content Management
-
My boss has decided that on the new website we are building, he wants all content and images protected by not allowing visitors to copy text or save images.
Some of the information and images are proprietary, though most are available for public viewing; nevertheless, he wants copying and saving prohibited. We would still want to keep the content indexable and use appropriate alt attributes, etc.
I wanted to find out whether there are any SEO reasons or facts as to why this would not be a good idea. Would implementing code to prohibit (or at least make it difficult) saving images and copying content penalize us?
-
If your boss really feels there is a site which offers such protection, please share the URL. There are sites which disable right-clicking, but that only removes a single method of copying content.
-
I totally concur. You can't stop anyone from jacking your content or graphics.
-
That was what I was thinking.
I knew it would be nearly impossible, but I wanted to make sure I wasn't out of the loop on something. I'm pretty sure he saw a website that had text selection disabled or something of that nature, but there are always workarounds. Thanks!
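To make the "workarounds" point concrete: the usual text-select and right-click blockers are just inline event handlers in the page source, and anyone who fetches that source can strip them with one line. A minimal sketch (the HTML string here is a made-up example, not from any real site):

```python
import re

# Hypothetical page source using the common inline "protection" handlers.
html = '<body oncontextmenu="return false" onselectstart="return false"><p>Proprietary text</p></body>'

# One regular expression removes every inline event handler, leaving the
# content fully selectable and copyable again.
clean = re.sub(r'\son\w+="[^"]*"', '', html)
print(clean)  # prints: <body><p>Proprietary text</p></body>
```

The same applies to JavaScript-attached handlers: disabling JavaScript entirely, or viewing the page source, bypasses them just as easily.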
-
Would implementing code to prohibit (or at least make it difficult) saving images and copying content penalize us?
It is not possible to prevent content theft. Anyone who can read your content can copy it, and the same applies to images. It is a mistake to try. At best, you will prevent a relatively small percentage of people with little technical understanding from copying your content, and that is not the group people normally wish to impede.
Let's say you did create a new technology which actually prevented most forms of copying. Anyone who can see your webpage can take a screenshot at the push of a single button, then paste that image into Photoshop. From there they can cut out your images and save them. They could also run OCR software to capture all the text. This can be done in 60 seconds.
There is a great deal of interest in achieving the goal you seek, and I am not aware of any organization that has come close to achieving it on any level. Even if it somehow were achieved, the page would not be readable by search engines. Search engines have to be able to read every character of text through their web crawlers; in effect, search engines scrape your content. Any technology that prevented copying text would prevent search engines from crawling it as well.
The best your boss can do is copyright the website and/or images. If the content is of high quality, it is highly likely his fears will be realized at some point and his content or images will be stolen; a copyright will provide him with the maximum amount of legal protection. You can also employ the services of companies that embed a tracking code of sorts into images. These companies then crawl the web each month looking for your protected content on other sites.
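The image-tracking idea can be sketched in a few lines. Below is a deliberately simplified least-significant-bit watermark applied to raw pixel bytes; all names are illustrative, and commercial services use far more robust marks that survive cropping, resizing, and re-encoding:

```python
def embed_id(pixels: bytearray, tag: bytes) -> bytearray:
    """Hide a tracking tag in the least-significant bits of pixel bytes.
    Simplified sketch of the idea behind image-tracking services."""
    # Unpack the tag into individual bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in tag for i in range(7, -1, -1)]
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        # Overwrite only the lowest bit of each pixel byte (visually invisible).
        out[i] = (out[i] & 0xFE) | bit
    return out

def extract_id(pixels: bytearray, length: int) -> bytes:
    """Recover a `length`-byte tag from the pixel bytes' low bits."""
    bits = [b & 1 for b in pixels[:length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[j:j + 8]))
        for j in range(0, len(bits), 8)
    )

img = bytearray(range(256)) * 4          # stand-in for raw pixel data
marked = embed_id(img, b"SITE-001")
print(extract_id(marked, 8))             # prints: b'SITE-001'
```

A crawler run by the tracking company can then look for the embedded tag in images found on other sites, which is what makes the monthly scans for stolen content possible.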