News sites & Duplicate content
-
Hi SEOMoz
I would like to know, in your opinion and according to industry best practice, how you get around duplicate content on a news site when all news sites buy their "news" from the same central source.
Let me give you some more insight into what I am talking about. My client has a website that focuses purely on news, specifically local news in one of the African countries.
Now, what we noticed over the past few months is that the site is not ranking to its full potential. We investigated: we checked our keyword research, our site structure, interlinking, site speed, code-to-HTML ratio, you name it. What we did pick up when looking at duplicate content is that the site is flagged by Google as duplicated, BUT so are most of the news sites, because they all get their content from the same place.
News gets sold by big companies in the US (no, I'm not from the US, so I can't say specifically where it comes from), and those companies usually attach disclaimers saying you can't change the headline or the story significantly. So we do have quite a few journalists who rewrite the news stories; they try to keep them as close to the original as possible, but they still change them to fit our targeted audience, which is where my second point comes in. Even though the content has been duplicated, our site is more relevant to what our users are searching for than the bigger news-related websites in the world, because we go hyper-local on everything: news, jobs, property, etc.
All we need to do is get past this duplicate content issue. In general, we rewrite content completely to make it unique if a site has duplication problems, but on a media site I'm a little bit lost, because I haven't dealt with something like this before. I would like to hear some thoughts on this.
Thanks,
Chris Captivate -
Ditto on what Donnie said. Purple Cow, if you want that site to be an authority, it needs to be authoritative. Why would anyone buy the Washington Post if it just copied all its articles from the New York Times? Get a few staff writers to combine and tweak articles, as Donnie mentioned, or to write original content.
Good luck!
-
Well, if you are going to duplicate someone's article, they deserve the credit. I would recommend combining two or three articles and adding some personal value to them. Create something unique that your audience will like. Also, linking to a reputable source shows Google that you are relevant to something; if you run an SEOmoz report card, it recommends an external link to a relevant source.
-
The problem is that this way I am still giving authority to another news source that I am competing against. I would be devaluing my own pages.
-
You must trackback (link back) to the original content; otherwise Google looks at your content as duplicate. As long as you give the original content credit, you can avoid the duplicate content issue.
However, if you don't want this content to appear in your site's index, but you do want it to appear on your site for your users, use: noindex, follow
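For example, a rough sketch of what that looks like in the <head> of a syndicated article page (where exactly this goes in your templates will depend on your CMS):
<!-- keep this page out of Google's index, but still let its links be followed -->
<meta name="robots" content="noindex, follow">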
-
Or you could use a rel=canonical tag and call out the original post as the canonical post.
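A minimal sketch of that tag in the <head> of the rewritten article (the URL is a made-up placeholder for the original publisher's story):
<!-- tells Google which URL to treat as the original version of this story -->
<link rel="canonical" href="https://www.original-publisher.example/world/story-headline/">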
Related Questions
-
Could another site copying my content hurt my ranking?
Earlier this week I asked why a page of mine might not be ranking locally (https://moz.com/community/q/what-could-be-stopping-us-from-ranking-locally). Maybe this might be part of the answer: another firm has copied huge chunks of my website copy. **My company:** https://idearocketanimation.com/video-production-company/ **The other company:** http://studio3dm.com/studio3dm-com/video/ Could this be causing my page not to rank? And is there anything I can do about it, other than huff and puff at the other firm? (Which I am already doing.)
Intermediate & Advanced SEO | Wagster
-
Country Code Top Level Domains & Duplicate Content
Hi, looking to launch in a new market. Currently we have a .com.au domain which is geo-targeted to Australia. We want to launch in New Zealand, which ends with .co.nz. If I duplicate the Australian site completely on the new .co.nz domain name, would I face duplicate content issues from an SEO standpoint, even though it's on a completely separate country code? Or is it still advised to set up hreflang tags across both of the domains? Cheers.
Intermediate & Advanced SEO | jayoliverwright
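A rough sketch of what hreflang annotations across the two ccTLDs could look like, placed in the <head> of each equivalent page on both sites (example.com.au and example.co.nz stand in for the real domains):
<!-- each page lists itself and its counterpart on the other ccTLD -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/some-page/">
<link rel="alternate" hreflang="en-nz" href="https://www.example.co.nz/some-page/">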
-
Duplicate content due to parked domains
I have a main ecommerce website with unique content and decent backlinks. I had a few domains parked on the main website, as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website www.maindomain.com had domain1.com and domain2.com parked on it, and domain3.com was parked on www.maindomain.com/product1. This caused a lot of duplicate content issues. 12 months ago, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from Google's index. Now I realize a few of the additional domains are still indexed and causing duplicate content. My question is what other steps I can take to avoid the duplicate content for my website:
1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, domains pointing to a specific URL cannot provide a change of address.
2. Submit a request in Google Search Console to remove the pages from Google's index. It is temporary and lasts 6 months. Even if the pages are removed from Google's index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit them to the Google index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google will eventually remove the content from domain1.com and domain2.com due to the canonical links. This will take time for Google to update its index.
5. Point these domains elsewhere to remove the duplicate content eventually. But it will take time for Google to update its index with new, non-duplicate content.
Which of these options are best suited to my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere. Any feedback would be greatly appreciated.
Intermediate & Advanced SEO | ajiabs
-
Ecommerce Site - Duplicate product descriptions & SKU pages
Hi, I have a couple of questions regarding the best way to optimise SKU pages on a large ecommerce site. At the moment we have two landing pages per product: one is the primary landing page with no SKU; the other includes the SKU in the URL so our sales people and customers can find it when using the search facility on the site. The SKU landing page has a canonical pointing to the primary page, as they're duplicates. Is this the best way, or is it better to have one page with the SKU in the URL? Also, we have loads of products with very similar product descriptions. I am working on trying to include a unique paragraph or a few sentences on these to improve the content. How dangerous is duplicate content within your own site? I know it's best to have totally unique content, but that won't be possible on a site with thousands of products and a small team. At the moment I am trying to prioritise the products to update. Thank you 🙂
Intermediate & Advanced SEO | BeckyKey
-
Problems with ecommerce filters causing duplicate content.
We have an ecommerce website with 700 pages. Due to the implementation of filters, we are seeing up to 11,000 pages being indexed, where the filter tag is appended to the URL. This is causing duplicate content issues across the site. We tried adding "nofollow" to all the filters, and we have also tried adding canonical tags, which it seems are being ignored. So how can we fix this? We are now toying with two other ideas:
1. adding "noindex" to all filtered pages
2. making the filters uncrawlable using JavaScript
Has anyone else encountered this issue? If so, what did you do to combat it, and was it successful?
Intermediate & Advanced SEO | Silkstream
-
How do I geo-target continents & avoid duplicate content?
Hi everyone, We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca
Link hreflang and the GWT geo-targeting option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices. What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent. Thanks!
Intermediate & Advanced SEO | AxialDev
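One possible sketch, since hreflang values can be a language on its own as well as a language-country pair (using the URLs from the question; whether this fits depends on the final site structure):
<!-- language-only "en" for the European English version, since no single country is targeted there -->
<link rel="alternate" hreflang="en-us" href="https://www.site.com/">
<link rel="alternate" hreflang="en" href="https://www.site.com/eu">
<link rel="alternate" hreflang="fr-ca" href="https://www.site.com/fr-ca">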
-
Artist Bios on Multiple Pages: Duplicate Content or not?
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print. My concern is that some artists have hundreds of prints on this site, and the bio is reprinted on every page, which makes sense from a usability standpoint, but I am concerned that it will trigger a duplicate content penalty from Google. Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future. Because it is just a section of text that is duplicated, while the rest of the text on each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that is a huge undertaking, and not the most elegant solution. Could I put the bio on a separate page with only the artist's info and then place that data on each print page using an iframe, and then put a noindex,nofollow in the robots.txt file? Is there a better solution? Is this effort even necessary? Thoughts?
Intermediate & Advanced SEO | sbaylor
-
Is SEOmoz.org creating duplicate content with their CDN subdomain?
Example URL: http://cdn.seomoz.org/q/help-with-getting-no-conversions The canonical is a RELATIVE link; it should be an absolute link pointing to the main domain: http://www.seomoz.org/q/help-with-getting-no-conversions
<link href='/q/help-with-getting-no-conversions' rel='canonical' />
13,400 pages are indexed in Google under the cdn subdomain. Go to Google > site:http://cdn.seomoz.org
https://www.google.com/#hl=en&output=search&sclient=psy-ab&q=site:http%3A%2F%2Fcdn.seomoz.org%2F&oq=site:http%3A%2F%2Fcdn.seomoz.org%2F&gs_l=hp.2...986.6227.0.6258.28.14.0.0.0.5.344.3526.2-10j2.12.0.les%3B..0.0...1c.Uprw7ko7jnU&pbx=1&bav=on.2,or.r_gc.r_pw.r_cp.r_qf.&fp=97577626a0fb6a97&biw=1920&bih=936
Intermediate & Advanced SEO | irvingw
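For reference, the relative-versus-absolute distinction raised in that last question comes down to this (the URLs mirror the example in the post):
<!-- relative canonical, resolved against whichever host serves the page (here the CDN subdomain) -->
<link rel="canonical" href="/q/help-with-getting-no-conversions" />
<!-- absolute canonical, always pointing back to the main domain -->
<link rel="canonical" href="http://www.seomoz.org/q/help-with-getting-no-conversions" />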