Up to my you-know-what in duplicate content
-
Working on a forum site that has multiple versions of the URL indexed. The WWW version ranks in the top 3-5 of the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google, and the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google.
The dupe content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Should I get rid of the subdomain version and occasionally link between the two obviously related sites, or get rid of the highly targeted keyword domain? Also, what's better: having the targeted keyword on the front page of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages?
Thanks.
-
You've pretty much confirmed my suspicions. I can set the redirects up myself; it's just been about 5 years since I've done any SEO work. What I meant was: should I use mod_rewrite or "Redirect 301 /oldurl /newurl"? I've forgotten a lot of stuff that I used to do with ease. My own sites were always started off right and weren't as bad as the one I'm working on now, so I'm in unfamiliar territory. Thanks for your advice, I appreciate it.
-
I want to make sure that you are getting the proper advice. Can you provide me the URLs here, or PM them to me to keep them private? Once I see the problem firsthand, I can reply with the answer here for you. I am pretty sure my advice above is the way to go, but it doesn't hurt to double check!
You need to choose ONE domain for going forward. I don't care which one it is, but choose one. It makes sense to choose the one with the better rankings, at least from my perspective.
After that, you 301 redirect all versions of the URLs to the proper URL (which would be WWW if it were my choice).
Yes, mod_rewrite is an Apache module you can use for server-side redirects. Make sure whoever sets them up knows what he is doing. Having a ton of server-side redirects can increase load times and cause issues with site speed if it's not done properly. Don't be afraid of doing it, but make sure you know what you are doing, especially since you're dealing with thousands of URLs.
You want to use permanent 301 redirects, yes.
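To illustrate the two syntaxes the question asks about (all domains and paths here are placeholders, not the actual forum's URLs): `Redirect 301` from mod_alias maps one literal URL per line, while mod_rewrite matches patterns, so a single rule can cover many URLs. A minimal .htaccess sketch:

```apache
# mod_alias form: one explicit mapping per line.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# mod_rewrite form: pattern-based, so one rule can
# redirect a whole directory of URLs at once.
RewriteEngine On
RewriteRule ^old-forum/(.*)$ http://www.example.com/forum/$1 [R=301,L]
```

Either way the browser and search engines receive a 301 status, so the choice is mostly about how many URLs a single line needs to cover.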
-
Thanks, I appreciate the advice. So you don't think having two separate domains pointing (or redirecting) to each other occasionally will hurt anything? I already have 1,000+ URLs I need to redirect on the completely separate domain.com. As for the keyworddomain.com forum, I don't think I need too many redirects: just one from separate.domain.com to keyworddomain.com, and then one there from non-WWW to WWW, should fix all the broken URLs, right? When you say 301, do you mean "Redirect 301" or mod_rewrite? Thanks for the help.
-
I would first choose which version you want to use going forward. You have three versions: subdomain, non-WWW, and WWW. Don't use the subdomain; that is a given. I personally like using WWW instead of non-WWW, though there are reasons to use non-WWW over WWW. Given this scenario, it makes sense to use the WWW version. I know that the non-WWW version has more pages indexed, but pages indexed doesn't mean much in the grand scheme of things. Since WWW has good rankings and is more identifiable to a user, I would choose that. Of course, if you choose non-WWW, my advice below remains the same.
Now that you have chosen what version you want to use going forward, you need to do a few things:
-
Implement a .htaccess 301 server-side redirect from non-WWW to WWW (or vice versa if you so choose), and make sure it's permanent. Going forward, this fixes your non-WWW vs. WWW issue.
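As a minimal sketch of that rule, assuming Apache with mod_rewrite enabled and `example.com` standing in for the real domain:

```apache
RewriteEngine On
# If the request came in on the bare (non-WWW) host...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send a permanent 301 to the WWW host, keeping the path.
# Query strings are carried over by default.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

To redirect the other direction, swap the hostnames in the condition and the target.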
-
Next, you need to redirect all non-WWW indexed pages and URLs to their WWW versions. This is not easy, especially with thousands of pages. However, it must be done to preserve the PR and link juice so that as much as possible passes through. What I recommend is seeing if there is a plugin or extension for whatever forum software you use that can aid you in this effort, or hiring a programmer to build you one. It's actually not that complex; I have done it before in a similar situation, and it does work. If you need more advice on that, PM me.
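If no plugin exists, one possible approach (an assumption on my part, not the only way) is Apache's `RewriteMap`, which reads old-to-new pairs from a file a programmer can export from the forum's database, so you don't need thousands of individual `Redirect` lines. Note it must live in the main server or virtual-host config, not in .htaccess; all paths and domains below are placeholders:

```apache
RewriteEngine On
# Map file of "old-path new-URL" pairs, one per line, e.g.:
#   forum/thread-123.html http://www.example.com/forum/thread-123.html
RewriteMap oldurls txt:/etc/apache2/old-to-new.map
# If the requested path has an entry in the map, 301 there;
# unmapped paths fall through untouched.
RewriteCond ${oldurls:$1} !=""
RewriteRule ^/(.+)$ ${oldurls:$1} [R=301,L]
```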
-
You need to take care of the subdomain by setting up a permanent redirect to the main WWW version if someone goes to the subdomain, and also set up redirects for existing subdomain pages/URLs that have PR/rank/link juice.
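If the subdomain's paths mirror the main forum's, a single host-based rule (both hostnames below are placeholders) covers the homepage and every existing page in one pass; if the paths differ, you would fall back to a per-URL mapping instead:

```apache
RewriteEngine On
# Any request arriving on the old subdomain...
RewriteCond %{HTTP_HOST} ^forum\.separatedomain\.com$ [NC]
# ...gets a permanent 301 to the same path on the chosen WWW host.
RewriteRule ^(.*)$ http://www.keyworddomain.com/$1 [R=301,L]
```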
-
From there, make sure that you are utilizing sitemaps properly; that can greatly increase your indexing rate and volume.
I hope these help. If you need anything further, please do not hesitate to PM me or post here.
Good luck!
-
Related Questions
-
Query Strings causing Duplicate Content
I am working with a client that has multiple locations across the nation, and they recently merged all of the location sites into one site. To allow the lead capture forms to pre-populate the locations, they are using the query string /?location=cityname on every page. EXAMPLE - www.example.com/product www.example.com/product/?location=nashville www.example.com/product/?location=chicago There are thirty locations across the nation, so every page x 30 is being flagged as duplicate content... at least in the crawl through MOZ. Does using that query string actually cause a duplicate content problem?
Technical SEO | | Rooted1 -
Duplicate content and canonicalization confusion
Hello, http://bit.ly/1b48Lmp and http://bit.ly/1BuJkUR pages have same content and their canonical refers to the page itself. Yet, they rank in search engines. Is it because they have been targeted to different geographical locations? If so, still the content is same. Please help me clear this confusion. Regards
Technical SEO | | IM_Learner0 -
Duplicate Content Issue WWW and Non WWW
One of my sites got hit with duplicate content a while ago because Google seemed to be considering the http, https, www, and non-www versions of the site all different sites. We thought we fixed it, but for some reason https://www and just https:// are giving us duplicate content again. I can't seem to figure out why it keeps doing this. The url is https://bandsonabudget.com if any of you want to see if you can figure out why I am still having this issue.
Technical SEO | | Michael4g1 -
Content and url duplication?
One of the campaign tools flags one of my client's sites as having lots of duplicates. This is true in the sense that the content is sort of boilerplate, but with the wording changed for different countries. The same goes for the URLs: they differ only in that a couple of words have changed. So it's not a case of a CMS or server issue, as SEOmoz advises. It doesn't need 301s! Thing is, in the niche (freight, transport operators, shipping) I can see many other sites doing the same thing, and those sites have lots of similar pages ranking very well. In fact, one site has over 300 keywords ranked on pages 1-2, but it is a large site with a 12-year-old domain, which clearly helps. Of course having every page's content unique is important; however, I suppose it is better than copy-and-paste from other sites. So it's unique in that sense. I'm hoping to convince the site owner to change the content over time for every country. A long process. My biggest problem with understanding duplication issues is that every tabloid or broadsheet media website would be canned from Google, as quite often they scrape Reuters or re-publish standard press releases on their sites as newsworthy content. So I have great doubt that there is a penalty for it. You only have to look and you can see media-site duplication everywhere, every day, but they get ranked. I just think that Google doesn't rank the worst cases of spammy duplication. They still index them, though, I notice. So considering the business niche has very much the same replicated content layout, which ranks well, is this duplicate flag such a great worry? Many businesses sell the same service to many locations, and it's virtually impossible to rewrite the services in a dozen or so different ways.
Technical SEO | | xtopher660 -
Canonical usage and duplicate content
Hi We have a lot of pages about areas like "Mallorca" (domain.com/Spain/Mallorca), with tabbed pages like "excursions" (domain.com/spain/Mallorca/excursions) and "car rental" (domain.com/Spain/Mallorca/car-rental) etc. The text on e.g. the "car rental" page is very similar for Mallorca and Rhodos, and SEOmoz marks these as duplicate content. This happens on "car rental", "map", "weather" etc., which don't have a lot of text but have images and Google Maps inserted. Could I use rel=next/prev/canonical to gather the information from the tabbed pages? That could show Google that the Rhodos map page is related to Rhodos and not Mallorca. Is that all wrong, and/or is there a better way to do this? Thanks, Alsvik
Technical SEO | | alsvik0 -
Block Quotes and Citations for duplicate content
I've been reading about the proper use for block quotes and citations lately, and wanted to see if I was interpreting it the right way. This is what I read: http://www.pitstopmedia.com/sem/blockquote-cite-q-tags-seo So basically my question is, if I wanted to reference Amazon or another stores product reviews, could I use the block quote and citation tags around their content so it doesn't look like duplicate content? I think it would be great for my visitors, but also to the source as I am giving them credit. It would also be a good source to link to on my products pages, as I am not competing with the manufacturer for sales. I could also do this for product information right from the manufacturer. I want to do this for a contact lens site. I'd like to use Acuvue's reviews from their website, as well as some of their product descriptions. Of course I have my own user reviews and content for each product on my website, but I think some official copy could do well. Would this be the best method? Is this how Rottentomatoes.com does it? On every movie page they have 2-3 sentences from 50 or so reviews, and not much unique content of their own. Cheers, Vinnie
Technical SEO | | vforvinnie1 -
Duplicate Content
Hi - We are due to launch a .com version of our site, with the ability to put prices into local currency, whereas our .co.uk site will be solely £. If the content on both the .com and .co.uk sites is the same (at product level mainly), will we be penalised? What is the best way to get around this?
Technical SEO | | swgolf1230 -
WordPress Duplicate Content Issues
Everyone knows that WordPress has some duplicate content issues with tags, archive pages, category pages etc... My question is, how do you handle these issues? Is the smart strategy to use robots meta and add no follow/ no index category pages, archive pages tag pages etc? By doing this are you missing out on the additional internal links to your important pages from you category pages and tag pages? I hope this makes sense. Regards, Bill
Technical SEO | | wparlaman0