How do I use public content without being penalized for duplication?
-
The NHTSA publishes a list of all automobile recalls. Their terms of use state that the information can be copied. I want to add it to our site so there's an up-to-date list for our audience to see. However, I'd just be copying and pasting. I'm allowed to according to the NHTSA, but Google will probably flag it, right? Is there a way to do this without being penalized?
Thanks,
Ruben
-
I didn't think about other sites, but that's a fabulous point. Best to play it safe.
Thanks for your input!
- Ruben
-
My gut says that your idea to keep the content noindexed is best. Even if the content is unique, it borders on auto-generated. Now, you might change my mind if a lot of users were actively interacting with most of these pages. If not, though, then you'll end up having a large portion of your site consisting of auto-generated content that doesn't seem useful. Plus...it's also possible that other sites are using this information, so you would end up with content that is duplicated on other sites too.
I could be wrong, but my gut says not to try to use this content for ranking purposes.
-
I appreciate the follow up, Marie. Please give me your thoughts on the following idea:
The NHTSA only posts the updates for the past month. If I noindex the page for now (which is what I'm doing) and wait five months, then what would happen? At that point, yes, the current month would be duplicated, but I'd have four months of "unique" content because the NHTSA deletes theirs. Plus, I could add pictures of all the automobiles, too. Do you think that would be enough to index it?
(I'm most likely going to keep it noindexed, because this borders on shady, or at least, I could see Google taking it that way. But just as a thought experiment, what do you think? Or anyone else?)
Thanks,
Ruben
-
To expand on EGOL's answer: if you are taking someone else's content (even with their permission) and want Google to index it, then Google can see that you have a large amount of copied content on your site. This can trigger the Panda filter and cause Google to consider your whole site low quality.
You can add a noindex tag as EGOL suggested, or you could use a canonical tag to show Google who originated the content, but the noindex tag is probably easiest.
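For what it's worth, a cross-domain canonical is just one line in the page's head. A minimal sketch, assuming you point it at the NHTSA source (the URL here is only an illustrative placeholder, not necessarily the real recall-page address):
<link rel="canonical" href="https://www.nhtsa.gov/recalls" />
Keep in mind Google treats cross-domain canonicals as a hint rather than a directive, which is another reason the noindex tag is the more predictable choice.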
There is one other option as well. If you think it is possible that you can add significant value to the content that is being provided then you can still keep it indexed. If you can combine the recall information with other valuable information then that might be ok to index. But, you have to truly be providing value, not just padding the page with words to make it look unique.
-
Alright, that sounds good. Thanks!
-
I had a bunch of republished articles on my website, done mostly at the request of government agencies and universities. That site got hit in one of the early Panda updates. So, I deleted a lot of that content, and to the rest I added this line in the head section of each page:
name="robots" content="noindex, follow" />
That tells Google not to index the page but to follow the links and let PageRank flow through. My site recovered a few weeks later.
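In case placement is unclear, here's a minimal sketch of a page head with the tag in place (the title is just a hypothetical placeholder):
<head>
  <meta name="robots" content="noindex, follow" />
  <title>Automobile Recall Listings</title>
</head>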