Duplicate Content: www vs. non-www and best practices
-
I have a customer who had prior help on his website, and I noticed a 301 redirect in his .htaccess. The rule (labelled "duplicate content removal: www.domain.com vs domain.com") is:
RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE.com [NC]
RewriteRule (.*) http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L,NC]
The result of this rule is that if I type MY-CUSTOMER-SITE.com in the browser, it redirects to www.MY-CUSTOMER-SITE.com.
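For reference, here is a minimal sketch of the same rule with comments added and the pattern tightened (dots escaped, host anchored) - the domain is the placeholder used above, not an actual site, and this is not the previous developer's exact file:
# Redirect any request for the bare domain to the www host, keeping the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE\.com$ [NC]
RewriteRule ^(.*)$ http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L]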
I wonder if this is causing issues in the SERPs. If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary, as it would seem that Googlebot is smart enough to know that these aren't two sites.
-----Can you comment on whether this is a best practice for all domains?
-----I've run a report for backlinks. If my thought is true that there are some pointing to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?
-
_If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary, as it would seem that Googlebot is smart enough to know that these aren't two sites._
Absolutely NOT, unfortunately. Search engines treat these two versions of the URL as two totally different sites. The redirect rule you currently have is in place specifically to correct this problem, so the two versions of your site (in the eyes of the engines) aren't competing with each other.
The previous developer knew what he was doing. Leave the redirect as-is. Just be careful that all links you create use the primary version of the URL - you'll retain a bit more "link juice" that way than having them go through the redirect (i.e. always write links as www.my-customer-site.com/whatever in content, menus, and incoming links where possible).
Paul
P.S. For proof that search engines consider those URLs different sites: Google's own Webmaster Tools has a setting where you can tell Google which version of the site URL you want to be primary. It's much better to do this with a proper 301 redirect, though, so that you tell ALL search engines, not just Google.
-
-----Can you comment on whether this is a best practice for all domains?
Yes, it is.
-----I've run a report for backlinks. If my thought is true that there are some pointing to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?
You shouldn't worry about that at all. 301s are just fine. They don't only redirect visitors; search engines like Google also follow them and pass authority signals on to the destination page.
-
You want to commit to one version and put a 301 on the other. Googlebot should be smart enough, but it isn't always. Some things are best not left to chance.
Here's the Moz 301 redirect article: http://moz.com/learn/seo/redirection
Edit: Here's another article about www.mysite.com vs mysite.com http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/#.UlbGl1Cko2s
-
Ideally, one version of the site should redirect to the other using a 301 to transfer any link juice from one version of the domain to the other. When both versions have links pointing to them, the best approach is to see which version has the higher domain authority and the most links, and use that as your preferred domain.
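If the bare (non-www) version turns out to be the stronger one, the rule is simply flipped. A minimal sketch, again using the placeholder domain from the question rather than a real site:
# Redirect any request for the www host to the bare domain, keeping the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.MY-CUSTOMER-SITE\.com$ [NC]
RewriteRule ^(.*)$ http://MY-CUSTOMER-SITE.com/$1 [R=301,L]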
Related Questions
-
Content Strategy/Duplicate Content Issue, rel=canonical question
Hi Mozzers: We have a client who regularly pays to have high-quality content produced for their company blog. When I say 'high quality' I mean 1000 - 2000 word posts written to a technical audience by a lawyer. We recently found out that, prior to the content going on their blog, they're shipping it off to two syndication sites, both of which slap rel=canonical on them. By the time the content makes it to the blog, it has probably appeared in two other places. What are some thoughts about how 'awful' a practice this is? Of course, I'm arguing to them that the ranking of the content on their blog is bound to be suffering and that, at least, they should post to their own site first and, if at all, only post to other sites several weeks out. Does anyone have deeper thinking about this?
Intermediate & Advanced SEO | Daaveey0
-
Robots.txt - Do I block bots from crawling the non-www version if I use www.site.com?
My site is set up at http://www.site.com and I have it redirected from non-www to www in the .htaccess file. My question is... what should my robots.txt file look like for the non-www site? Do you block robots from crawling the site like this, or do you leave it blank?
User-agent: *
Disallow: /
Sitemap: http://www.morganlindsayphotography.com/sitemap.xml
Sitemap: http://www.morganlindsayphotography.com/video-sitemap.xml
Intermediate & Advanced SEO | morg454540
-
Link-building best SEO practice (one-off vs. periodic blogging)
Hi all, Generally, what would be best when building a website's ranking through link building? Getting links from the same bloggers repeatedly, or receiving new links from different bloggers every time? A lot of SEO services offer 4-8 blog backlinks per month. Would it be best if these links came from different sources every time, or mostly from the same sources each month? I know there are a lot of factors, but I hope this question is clear. Happy holidays and thank you for your insightful feedback. Carlos
Intermediate & Advanced SEO | 90miLLA0
-
Artist Bios on Multiple Pages: Duplicate Content or not?
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print. My concern is that some artists have hundreds of prints on this site, and the bio is reprinted on every page, which makes sense from a usability standpoint, but I am concerned that it will trigger a duplicate content penalty from Google. Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future. Because it is just a section of text that is duplicated, while the rest of the text on each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that is a huge undertaking, and not the most elegant solution. Could I put the bio on a separate page with only the artist's info and then place that data on each print page using an iframe, and then put a noindex,nofollow in the robots.txt file? Is there a better solution? Is this effort even necessary? Thoughts?
Intermediate & Advanced SEO | sbaylor0
-
Duplicate content from development website
Hi all - I've been trawling for duplicate content and stumbled across a development URL, set up by a previous web developer, which nearly mirrors the current site (a few content and structure changes since then, but otherwise it's all virtually the same). The developer didn't take it down when the site was launched. I'm guessing the best thing to do is tell him to take down the development URL (which is specific to the pizza joint, btw) immediately. Is there anything else I should ask him to do? Thanks, Luke
Intermediate & Advanced SEO | McTaggart0
-
Press Release and Duplicate Content
Hello folks, We have been using press releases to promote our clients' businesses for a couple of years, and we have seen great results in both referral traffic and SEO. Recently one of our clients asked us to publish the PR on their website as well as blast it out using PRWeb and Marketwire. I think this is not going to be a duplicate content issue for our client's website, since I believe Google can recognize which content was published first, but I will be more than happy to get some of the Moz community's opinions. Thank you
Intermediate & Advanced SEO | Aviatech0
-
HTTPS Duplicate Content?
I just received an error notification because our website is available as both http and https: http://www.quicklearn.com & https://www.quicklearn.com. My tech tells me that this isn't actually a problem. Is that true? If not, how can I address the duplicate content issue?
Intermediate & Advanced SEO | QuickLearnTraining0
-
Duplicate page content
Hi. I am getting an error about duplicate content on my website, and the pages it is flagging are: www.mysitename.com and www.mysitename.com/index.html. To my best knowledge this is only one page. I know this can be solved with a canonical tag in the header, but I do not know how. Can anyone please tell me about that code, or any other way to get this solved? Thanks
Intermediate & Advanced SEO | onlinetraffic0