Duplicate content issue
-
Hi
I installed a wiki and a forum on subdomains of one of my sites. The crawl report shows me duplicate content on both the forum and the wiki. Will this hurt the main site, or the root domain? The site, by the way, is absolutely clean of errors. Thanks
-
Hi Dr Peter,
I just wanted to add some extra info for the readers - this was my goal. It's OK that the wiki is gone for the moment, and as for the forum, I will most likely do as you advise (noindex), since I don't seek traffic from it and I don't have time to manage everything. Thanks for your time
-
Sorry, I'm slightly confused. Are the wiki and forum duplicating each other, or are they each duplicating content on your root domain? Even in a sub-domain, they could be diluting content in your root domain, but it really depends a lot on the situation and the extent of the duplication.
You could use the canonical tag to point them to the source of the content, or you could block them (probably META NOINDEX), but I'd like to understand the goals a bit better.
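For readers who want to see what those two options look like in practice, here's a rough sketch of each (the URLs are placeholders, not from this thread):

```html
<!-- Option 1: canonical tag in the <head> of a duplicate page,
     pointing search engines at the original copy of the content -->
<link rel="canonical" href="https://www.example.com/original-article/" />

<!-- Option 2: META NOINDEX in the <head>, keeping the page out of
     the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

You'd use one or the other on a given page, not both: the canonical consolidates ranking signals to a preferred URL, while noindex simply removes the page from search results.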
-
Actually, the original poster mentioned a wiki, but said nothing about Wikipedia. I'm going to ask another associate to also look at this thread and chime in, to make sure we've got this figured out correctly.
-
Thanks!
-
Hey Nikos, if you are using a wiki and a forum on subdomains, then you may well see duplicate content issues in your webmaster reports, because a wiki typically takes the first few lines of a Wikipedia article and pastes them into your post, with a link to the corresponding article included.
But I think that as long as you have enough unique content to outweigh the few lines taken from the Wikipedia article, and you put that snippet at the end of the post, it shouldn't be a problem - many sites use snippets of content from others as references.
I suggest you make sure the main page doesn't link to the subdomains; the subdomains can still link to the main domain. I think that way your main domain will not be affected.
I'd also advise you to read the post "How Changes To The Way Google Handles Subdomains Impact SEO" - I'm sure you will find a solution there. Thanks, cheers!
-
Cool! Is there any way to eliminate them? Or is it not necessary? I read an article about duplicate content on forums, and it said that Google doesn't punish forums for internal duplicate content (pages, titles and such). Is that true, or has it changed? Personally, I don't care too much about the subdomains, because I am promoting the main site and the forum and wiki are just a bonus for my visitors! Oh, by the way Harald, another tactic I practise that is working is the following: I scan all the top 20 results and build the same backlinks, I mean from the same pages, as far as I can. If you analyse them all, you collect a few good ones; the rest I buy in order to pass my competitors. I am not an expert or anything, so any advice will be appreciated. Thanks
-
Hi Nikos, if these are subdomains, it will not hurt your main site -
subdomains are treated separately. Harry
-
I think the best solution is to use rel canonical in the head section. Have you tried it?
You will need to include the following inside the head tag of every page (the pages you want to be the official ones):
<link rel="canonical" href="http://www.example.com/index.html" />
where www.example.com/index.html is your page address. Good luck!
Related Questions
-
International SEO and duplicate content: what should I do when hreflangs are not enough?
Hi, A follow-up question from another one I asked a couple of months ago: it has been almost two months now since my hreflangs went in place. Google recognises them well and GSC is clean (no hreflang errors). Though I've seen some positive changes, I'm quite far from sorting out that duplicate content issue completely, and some entire sub-folders remain hidden from the SERP.
Intermediate & Advanced SEO | GhillC
I believe it happens for two reasons: 1. Fully mirrored content - as per the link to my previous question above, some parts of the site I'm working on are 100% similar. Quite a "gravity issue" here as there is nothing I can do to fix the site architecture nor to get bespoke content in place. 2. Sub-folders "authority". I'm guessing that Google prefers sub-folders over others due to their legacy traffic/history. Meaning that even with hreflangs in place, the older sub-folder would rank over the right one because Google believes it provides better results to its users. Two questions from these reasons:
1. Is the latter correct? Am I guessing correctly re "sub-folders" authority (if such thing exists) or am I simply wrong? 2. Can I solve this using canonical tags?
Instead of trying to fix and "promote" hidden sub-folders, I'm thinking to actually reinforce the results I'm getting from stronger sub-folders.
I.e.: if a user based in Belgium is Googling something relating to my site, the site.com/fr/ subfolder shows up instead of the site.com/be/fr/ sub-sub-folder.
Or if someone based in Belgium is using Dutch, they would get site.com/nl/ results instead of the site.com/be/nl/ sub-sub-folder. Therefore, I could canonicalise /be/fr/ to /fr/ and do something similar for the second case. I'd prefer traffic coming to the right part of the site for tracking and analytics reasons. However, instead of trying to move mountains by changing Google's behaviour (if I even could?), I'm thinking of reinforcing the current flow (also because it's not completely wrong, as it brings traffic to pages featuring the correct language no matter what). That second question is the main reason I'm looking for the Moz community's advice: am I going to damage the site badly by using canonical tags that way? Thank you so much!
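For context, hreflang annotations for the sub-folders described above would look roughly like this (site.com stands in for the real domain):

```html
<link rel="alternate" hreflang="fr"    href="https://site.com/fr/" />
<link rel="alternate" hreflang="fr-be" href="https://site.com/be/fr/" />
<link rel="alternate" hreflang="nl"    href="https://site.com/nl/" />
<link rel="alternate" hreflang="nl-be" href="https://site.com/be/nl/" />
```

One caveat worth flagging on the second question: if /be/fr/ is canonicalised to /fr/, Google generally ignores hreflang annotations on the canonicalised page, so the canonical and the hreflang signals would be pulling in opposite directions.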
-
Possible duplicate content issues on same page with urls to multiple tabs?
Hello everyone! I'm here for the first time, and glad to be part of the Moz community! Jumping right into my question: for one type of page we have on our website, there are multiple tabs on each page. To give an example, say a page holds information about a place called "Ladakh". The various URLs the page is accessible from can take the form of:
mywanderlust.in/place/ladakh/
mywanderlust.in/place/ladakh/photos/
mywanderlust.in/place/ladakh/places-to-visit/
and so on. To keep the UX smooth when the user switches from one tab to another, we load everything in advance with AJAX, but it remains hidden until the user switches to the required tab. Now, since the content is actually there in the HTML, does Google count it as duplicate content? I'm afraid this might be the case: when I Google for text that's visible only on one of the tabs, I still see all the tabs in the results. I also see internal links in GSC to a page (mywanderlust.in/questions) which is only supposed to be linked from one tab, but GSC reports internal links to it from all three tab URLs. Also, Moz Pro crawl reports informed me about duplicate content issues, although surprisingly only on a small fraction of our indexable pages. Is this hurting our SEO? Any suggestions on how we could handle the URL structure better to make it optimal for indexing? FWIW, we're using a fully responsive design, with the displayed content exactly the same on desktop and mobile web. Thanks a ton in advance!
Intermediate & Advanced SEO | atulgoyal
-
SEO for video content that is duplicated across a larger network
I have a website with lots of content (high-quality video clips for a particular niche). All the content gets fed out to 100+ other sites on various domains/subdomains which are reskinned for a given city, so the content on these other sites is 100% duplicate. I still want to generate SEO traffic, though. So my thought is that we: a) need canonical tags on all the other domains/subdomains that point back to the original post on the main site, and b) probably need to disallow search engine crawlers on all the other domains/subdomains. Is this on the right track? Am I missing anything important related to duplicate content? The idea is that after we get search engines crawling the content correctly, we'd use the visitor's IP address to redirect them to the best-suited domain/subdomain. Any thoughts on that approach? Thanks for your help!
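To make option (a) concrete: on each reskinned city site, every content page's head would carry a cross-domain canonical back to the original on the main site, along these lines (mainsite.com and the path are placeholders):

```html
<link rel="canonical" href="https://www.mainsite.com/videos/some-clip/" />
```

One caution on combining (a) and (b): if robots.txt blocks crawling of the duplicate domains entirely, search engines can never fetch those pages to see the canonical tag, so it's usually one approach or the other (or a META NOINDEX, which crawlers can still read).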
Intermediate & Advanced SEO | PlusROI
-
Duplicate content on URL trailing slash
Hello, Some time ago, we accidentally made changes to our site which modified the way URLs in links are generated. At once, trailing slashes were added to many URLs (only in links). Links that used to point to
example.com/webpage.html
were now linking to
example.com/webpage.html/
URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function, so that now all links on the site point to a URL without a trailing slash. However, Google had time to index these pages. Is implementing 301 redirects required in this case?
Intermediate & Advanced SEO | yacpro13
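For what it's worth, the redirect being asked about could be sketched like this on Apache with mod_rewrite (a hypothetical .htaccess fragment; the pattern assumes the affected URLs end in .html):

```apache
RewriteEngine On
# 301 any .html URL with a stray trailing slash back to the slash-less version
RewriteRule ^(.+\.html)/$ /$1 [R=301,L]
```

The 301 consolidates whatever signals the slashed duplicates picked up while they were indexed.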
Duplicate content issues from mirror subdomain: facebook.domainname.com
Hey Guys,
Need your suggestions. I have a website that has a duplicate content issue: a subdomain called facebook.asherstrategies .com comes from nowhere and is getting indexed.
Website link: asherstrategies .com
Subdomain link: facebook.asherstrategies .com
This subdomain is actually a mirror of the website, and I have no idea how it was created. I am trying to resolve the issue but could not find a clue.
Intermediate & Advanced SEO | b2bmarketer
-
Redirect issue launching duplicate product categories on another TLD
Dear Mozzerz We run this e-commerce website (superstar.dk) where we are selling all different kinds of wristwatches from different brand names (Casio, Garmin, Suunto etc). We just bought another website selling watches (xxx.com) and therefore we would like to move some of the content from superstar.dk to the new website xxx.com, making superstar.dk into a more niche website. So we are basically taking a brand with all the products in it and shutting it down on superstar.dk and instead launching it on xxx.com. Superstar.dk will still be running, just with a more niche product- and brand selection. So my question is, should we redirect all the old product categories that we are shutting down to the new website on another TLD where we are opening them again and the same for the products (e.g. superstar.dk/garmin -> xxx.com/garmin)? Or would it be better to keep the redirects within the same website/TLD (e.g. superstar.dk/garmin -> superstar.dk)? A few examples:
superstar.dk/garmin -> xxx.com/garmin
superstar.dk/suunto -> xxx.com/suunto
etc.
superstar.dk/product1 -> xxx.com/product1
superstar.dk/product2 -> xxx.com/product2
etc.
Intermediate & Advanced SEO | superstardenmark
-
How should I manage duplicate content caused by a guided navigation for my e-commerce site?
I am working with a company which uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by having the same products served under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know if there are any best practices for managing this type of navigation. Should I nofollow all of the URLs which have more than one refinement applied to a category, or should I allow the search engines to go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
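As one illustration of limiting crawl of deep refinements: if the refinements show up as URL parameters (the parameter name here is hypothetical; the real Endeca URL scheme would need checking), a robots.txt wildcard pattern could keep crawlers out of multi-refinement combinations:

```
# Block URLs carrying more than one refinement parameter (illustrative pattern)
User-agent: *
Disallow: /*?*refine=*refine=*
```

A meta robots noindex on multi-refinement pages is the gentler alternative, since URLs blocked in robots.txt can still end up indexed from links alone.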
Intermediate & Advanced SEO | FireMountainGems
-
Dealing with close content - duplicate issue for closed products
Hello, I'm dealing with some issues. Moz analysis is telling me that I have duplicates on some of my product pages. My issue is that: the pages concern very similar products; the products are from the same range; just the name and the PDF are different. Do you think I should use a canonical URL? Or would it be better to rewrite about 80 descriptions (even though the descriptions would be almost the same)? Best regards.
Intermediate & Advanced SEO | AymanH