E-commerce store, in need of protecting our own content
-
Dear other Moz fans,
We have an e-commerce store in Norway. Our main conversions still happen in our physical store, but largely thanks to the descriptions and information we provide online.
To warn you before you click: our store is a boutique for "erotic items". A nice one, however, made by women for women and their men. We spend an enormous amount of time writing descriptions and information for (almost) every item online.
We really want to protect our content (the text information). What is the best practice for marking up "protection" of our hard-earned content?
Thank you for your time.
Regards from the Flirt girls in Norway. -
Thank you Tuzzel,
I will take a closer look at the article; there might be some ideas there. We have looked at the authorship options, but as you say, it's not what I'm looking for.
Thank you -
Thank you for your fast reply Remus,
But it's not what I'm looking for, I'm afraid. Still, a wrongly pointing URL was discovered, so thank you. We have been researching rel=author and rel="publisher", but as far as we can see these are more blog-related markups. Our Google+ page doesn't cover this either, because it is a page and not a profile.
I might be making this much more complicated than it is... but it is worth a shot.
Monica
-
You have several options. While you can never stop someone coming to your site and actively taking your content, you can attempt to trip them up, particularly if they are using automated tools like scrapers. There are a few articles out there (like this) that go into detail, but a common recommendation is to add links in your text and images that point to other pages on your site; sites stealing the content will then often inadvertently include links back to you in their pages. To avoid issues with low-quality links from these sources, you should probably make these links nofollow to be safe. Then there is authorship etc., although that's not quite right for product descriptions; you could still investigate its feasibility.
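To illustrate the internal-link tactic above, here is a rough Python sketch (the domain, markup, and function name are all made up for illustration, not from this thread) that rewrites relative links in a product description to absolute, nofollowed links, so a scraped copy ends up pointing back at your site:

```python
import re

# Placeholder for your own domain.
BASE_URL = "https://www.example.com"

def absolutize_links(html: str, base_url: str = BASE_URL) -> str:
    """Turn href="/page" into an absolute, nofollowed link.

    Scrapers that copy the raw HTML will then carry links back to
    your domain; nofollow avoids low-quality followed backlinks.
    """
    def repl(match: re.Match) -> str:
        return f'href="{base_url}{match.group(1)}" rel="nofollow"'
    return re.sub(r'href="(/[^"]*)"', repl, html)

print(absolutize_links('See our <a href="/care-guide">care guide</a>.'))
```

This is only a deterrent, of course: a careful scraper can strip the links back out before republishing.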
Other than that, there is enforcing your copyright, but to do so you need to locate the stolen content. Again, there are multiple tools out there, such as the Copyscape service Remus mentioned, but a quick and easy option is to set up Google Alerts to look for that content. Then you can contact the webmasters and use DMCA takedown requests etc. if necessary.
But if you are looking for methods to physically stop people taking your content, I'm not aware of a foolproof one, I'm afraid.
Hope this is helpful.
-
Hello,
Maybe Copyscape? They even have a tool called Copysentry, which monitors the web regularly for plagiarism.
Related Questions
-
Weight of content further down a page
Hi, A client is trying to justify a design decision by saying he needs the links for all his sub-pages on the top-level category page, as Google won't index them otherwise. However, the links are available on the sub-category pages, and the sub-categories are linked to from the top-level page, so I have argued that as long as Google can crawl the links through the pages, they will be indexed and won't be penalised. Am I correct? Additionally, the client has said those links need to be towards the top of the page because content further down the page carries less weight; I don't believe this is the case, but can you confirm? Thanks again, Craig.
Intermediate & Advanced SEO | | CSIMedia1 -
Removing duplicate content
Due to URL changes and parameters on our e-commerce sites, we have a massive amount of duplicate pages indexed by Google, sometimes up to 5 duplicate pages with different URLs. 1. We've instituted canonical tags site-wide. 2. We are using the parameters function in Webmaster Tools. 3. We are using 301 redirects on all of the obsolete URLs. 4. I have had many of the pages fetched so that Google can see and index the 301s and canonicals. 5. I created HTML sitemaps with the duplicate URLs and had Google fetch and index the sitemap so that the dupes would get crawled and deindexed. None of these seems to be terribly effective. Google is indexing pages with parameters in spite of the parameter (clicksource) being called out in GWT. Pages with obsolete URLs are indexed in spite of having 301 redirects. Google also appears to be ignoring many of our canonical tags, despite the pages being identical. Any ideas on how to clean up the mess?
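As an aside, when auditing how many parameterised variants collapse to the same page, a small Python sketch like this can help; the clicksource parameter is the one mentioned in the question, but the URLs and the idea of stripping parameters locally are just illustrative:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Tracking parameters you have declared in Webmaster Tools.
TRACKING_PARAMS = {"clicksource"}

def canonicalize(url: str) -> str:
    """Strip known tracking parameters, keeping meaningful ones."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("https://shop.example/item?clicksource=email&color=red"))
# https://shop.example/item?color=red
```

Running your indexed URL list through something like this shows which "duplicates" are really just one page, and therefore which canonical tags Google is ignoring.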
Intermediate & Advanced SEO | | AMHC0 -
Duplicate content - how to diagnose duplicate content from another domain before publishing pages?
Hi, 🙂 My company has a new distributor contract, and we are starting to sell products in our own webshop. Biotechnology is the industry in question, with over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to re-write them. With permission from our contractors we will import their product descriptions into our webshop. But I am concerned about being penalised by Google for duplicate content. If we re-write them we should be fine, I guess. But how can we be sure? Is there any good tool for comparing only text (because I don't want to publish the pages just to compare URLs)? What else should we be aware of besides checking the product descriptions for duplicate content? Duplicate content is a big issue for all of us; I hope these answers will be helpful for many of us. Keep up the hard work and thank you very much for your answers, Cheers, Dusan
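For comparing only text without publishing the pages, Python's standard library can give a rough overlap score. This is just a sketch, not a substitute for a dedicated tool like Copyscape:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Rough word-level overlap between two texts, 0.0 to 1.0."""
    return difflib.SequenceMatcher(
        None, a.lower().split(), b.lower().split()
    ).ratio()

supplier = "High-purity reagent for PCR amplification, supplied in 1 ml vials."
rewrite = "A high-purity PCR reagent, shipped in one-millilitre vials."
print(similarity(supplier, rewrite))
```

A ratio near 1.0 means the rewrite is still very close to the supplier's text; anything above roughly 0.8 is probably worth another rewriting pass (that threshold is a guess on my part, not a figure from Google).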
Intermediate & Advanced SEO | | Chemometec0 -
What is the best practice for URLs for E-commerce products in multiple categories?
Hello all! I have always worked successfully with SEO on e-commerce sites; however, we are currently revamping an older site for a client, so I thought I'd turn to the community to ask what best practices you are seeing for URL structures at the moment. Obviously we do not wish to create duplicate content, so the big question is: what would you do for the very best URL structure on an e-commerce site that has products in multiple categories? Let's imagine we are selling toy cars. I have a sports car for sale, so naturally it can go in the sports cars category, and it could also go into the convertibles category too. What is the best way you have found recently that works and increases rankings but does not create duplicate content? Thanks in advance! 🙂 Kind Regards, JDM
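One common approach (my own assumption, not something confirmed in this thread) is to give each product a single category-free URL and point any category-path variants at it with rel="canonical". A minimal sketch, with made-up URL shapes and domain:

```python
def product_url(slug: str) -> str:
    """One canonical, category-free URL per product."""
    return f"/products/{slug}"

def canonical_tag(slug: str) -> str:
    """Canonical link tag emitted on every variant of the product page."""
    return (f'<link rel="canonical" '
            f'href="https://toyshop.example{product_url(slug)}">')

# /toy-cars/sports/sports-car and /toy-cars/convertibles/sports-car
# would both carry this same tag:
print(canonical_tag("sports-car"))
```

That way the categories still exist for navigation, but only one URL per product competes in the index.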
Intermediate & Advanced SEO | | Hatfish0 -
Noindex Valuable duplicate content?
How could duplicate content be valuable, and why question noindexing it? My new client has a clever African safari route builder that you can use to plan your safari. The result is hundreds of pages that have different routes. Each page inevitably has overlapping content / destination descriptions; see the example links. To the point: I think it is foolish to noindex something like this. But is Google's algorithm sophisticated enough not to be triggered by something like this? http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-july-november
Intermediate & Advanced SEO | | Rich_Coffman
http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-december-june0 -
How to best handle expired content?
Similar to the eBay situation with "expired" content, what is the best way to approach this? Here are a few examples. With an e-commerce site, for a seasonal category like "Christmas", what's the best way to handle the category page after it's no longer valid? 404? 301? Leave it as-is and date it by year? Another example: if I have an RSS feed of videos from a big provider, say Vevo, what happens when Vevo tells me to "expire" a video because it's no longer available? Thank you!
Intermediate & Advanced SEO | | JDatSB0 -
404 for duplicate content?
Sorry, I think this is my third question today... but I have a lot of duplicated content on my site. I use Joomla, so there's a lot of unintentional duplication. For example, www.mysite.com/index.php exists, etc. Up till now, I thought I had to 301 redirect or rel=canonical these "duplicated pages." However, can I just 404 them? Is there anything wrong with this practice in regards to SEO?
Intermediate & Advanced SEO | | waltergah0 -
Please I need some optimism for this (not provided)
Does anyone see this getting any better? It is getting absolutely ridiculous, almost to the point where analytics will soon look pointless! Can Rand pull some connections and tell Google: "Hey, c'mon! This is ridiculous; we need to see at least a little bit more of these!" notprovided.jpg
Intermediate & Advanced SEO | | imageworks-2612900