Image and Content Management
-
My boss has decided that on the new website we are building, all content and images should be managed by preventing visitors from copying content and/or saving images.
Some of the information and images are proprietary, but most are available for public viewing; nevertheless, he wants copying and saving prohibited. We would still want to keep the content indexable and use appropriate alt tags, etc.
I wanted to find out whether there are any SEO reasons or facts explaining why this would not be a good idea. Would implementing code to prohibit (or at least make it difficult to) save images and copy content penalize us?
-
If your boss really feels there is a site which offers such protection, please share the URL. There are sites which may disable right-clicking, but that only removes a single method of copying content.
-
I totally concur. You can't stop anyone from jacking your content or graphics.
-
That was what I was thinking.
I knew it would be nearly impossible, but I wanted to make sure I wasn't out of the loop on something. I am pretty sure he saw a website that had text selection disabled or something of that nature, but there are always workarounds. Thanks
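For reference, the "text selection disabled" effect a site like that uses is usually nothing more than a line or two of CSS like the sketch below (the properties are standard; the selectors are just illustrative). It only changes how the browser renders selection: the text is still fully present in the page source, and anyone can undo the rule with one click in devtools.

```css
/* Purely cosmetic "protection": hides text selection in the browser UI.
   The text remains in the HTML source and can be copied from there. */
body {
  -webkit-user-select: none; /* Safari / older Chrome */
  user-select: none;         /* standard property */
}
img {
  pointer-events: none;      /* stops right-click > "Save image as..." on the <img>,
                                but the image URL is still in the markup */
}
```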
-
Would implementing code to prohibit (or at least make it difficult to) save images and copy content penalize us?
It is not possible to prevent content theft. Anyone who can read your content can copy it. The same applies to images. It is a mistake to try. At best, you will prevent a relatively small percentage of people with little understanding from being able to copy your content, and that is not the group people normally wish to impede.
Let's say you did create a new technology which actually prevented most forms of copying content. If a person sees your webpage, they can capture an image of it at the push of a single button, then paste that image into Photoshop. From there they can cut out your images and save them. They could also use OCR to capture all the text. All of this can be done in 60 seconds.
There is great interest in achieving the goal you seek, and I am not aware of any organization that has come close at any level. Even if it were somehow achieved, the page would not be readable by search engines. Search engines have to be able to read every character of text through their web crawlers; in effect, search engines scrape your content. Any technology that prevents copying text would prevent search engines from crawling it as well.
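To make the crawler point concrete: any HTTP client receives the same raw HTML your visitors do, with any JavaScript or CSS "protections" riding along but inert. A minimal sketch, using only Python's standard library (the sample page markup is invented for illustration):

```python
from html.parser import HTMLParser

# A page that "protects" its content with right-click/selection blockers.
# Note the text still sits in plain view inside the HTML.
PROTECTED_PAGE = """
<html>
  <body oncontextmenu="return false" style="user-select: none">
    <h1>Proprietary Article</h1>
    <p>Every word here is delivered as plain text to every client.</p>
  </body>
</html>
"""

class TextScraper(HTMLParser):
    """Collects all text nodes, exactly as a search-engine crawler must."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

scraper = TextScraper()
scraper.feed(PROTECTED_PAGE)
print(scraper.chunks)
# The oncontextmenu/user-select "protection" never even runs here.
```

The same extraction works whether the client is Googlebot, a browser, or a one-off script, which is exactly why indexable content cannot also be uncopyable.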
The best your boss can do is copyright the website and/or images. It is highly likely your boss's fears will be realized at some point and his content or images will be stolen if they are of high quality. A copyright will provide him with the maximum amount of legal protection. You can also employ the services of companies which will embed a tracking code of sorts into images. These companies then crawl the web each month looking for your protected content on other sites.
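On that last point, image-tracking services embed an identifier invisibly in the pixel data. The production systems use robust perceptual watermarks that survive resizing and re-compression; the toy sketch below only illustrates the basic idea, hiding an ID in the least significant bit of raw pixel bytes (the function names and byte layout are invented for the example):

```python
def embed_id(pixels: bytearray, owner_id: bytes) -> bytearray:
    """Hide owner_id in the least significant bit of each pixel byte.
    Flipping the LSB changes a channel value by at most 1, which is
    imperceptible to the eye."""
    bits = [(byte >> i) & 1 for byte in owner_id for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for this ID")
    marked = bytearray(pixels)
    for i, bit in enumerate(bits):
        marked[i] = (marked[i] & 0xFE) | bit
    return marked

def extract_id(pixels: bytearray, id_length: int) -> bytes:
    """Recover id_length bytes of hidden ID from the pixel LSBs."""
    out = bytearray()
    for byte_index in range(id_length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_index * 8 + i] & 1) << i
        out.append(value)
    return bytes(out)

# Demo: a fake 8x8 grayscale image (64 bytes) and a 4-byte owner ID.
image = bytearray(range(64))
watermarked = embed_id(image, b"MOZ1")
print(extract_id(watermarked, 4))  # b'MOZ1'
```

A crawler belonging to the tracking service can then look for that recoverable ID in images it finds on other sites, which is how the monthly scans the answer describes work in principle.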