Duplicate content: www vs. non-www, and best practices
-
I have a customer who had prior help on his website and I noticed a 301 redirect in his .htaccess
Rule for duplicate content removal: www.domain.com vs. domain.com

RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE.com [NC]
RewriteRule (.*) http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L,NC]

The result of this rule is that I type MY-CUSTOMER-SITE.com in the browser and it redirects to www.MY-CUSTOMER-SITE.com.
I wonder if this is causing issues in the SERPs. If some inbound links point to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, I would think this rewrite isn't necessary, since Googlebot seems smart enough to know that these aren't two different sites.
-----Can you comment on whether this is a best practice for all domains?
-----I've run a report for backlinks. If my thought is true that some links point to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?
_If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary as it would seem that Googlebot is smart enough to know that these aren't two sites._
Absolutely NOT, unfortunately. Search engines treat these two versions of the URLs as two totally different sites. The redirect rule you currently have is in place specifically to correct this problem, so the two versions of your site (in the eyes of the engines) aren't competing with each other.
The previous developer knew what he was doing. Leave the redirect as-is. Just be careful that all links you create use the primary version of the URL; you'll retain a bit more "link juice" that way than by sending visitors through the redirect. (I.e., always write links as www.my-customer-site.com/whatever in content, menus, and incoming links where possible.)
Paul
P.S. For proof that search engines consider those URLs different sites: Google's own Webmaster Tools has a setting where you can tell Google which version of the site URL you want to be primary. It's much better to do this with a proper 301 redirect, though, so that you tell ALL search engines, not just Google.
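For reference, here is a slightly tightened version of that rule. This is only a sketch, assuming Apache with mod_rewrite enabled, and the domain is a placeholder for your real one. Escaping the dot and anchoring the pattern with $ makes the host match exact, and [NC] on the RewriteRule is unnecessary since hostnames are matched case-insensitively anyway:

```apache
# Redirect the bare domain to the www version with a permanent (301) redirect.
# Placeholder domain - substitute your own.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^my-customer-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.my-customer-site.com/$1 [R=301,L]
```

The behavior is the same as the existing rule; the stricter pattern just avoids accidentally matching other hosts that happen to start with the same string.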
-
-----Can you comment on whether this is a best practice for all domains?
Yes, it is.
-----I've run a report for backlinks. If my thought is true that some links point to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?
You shouldn't worry about that at all. 301s are just fine. They don't only redirect visitors; search engines like Google also follow them and pass authority signals to the redirect target.
-
You want to commit to one version and put a 301 on the other. Googlebot should be smart enough, but in practice it isn't. Some things are best not left to chance.
Here's the Moz 301 redirect article: http://moz.com/learn/seo/redirection
Edit: Here's another article about www.mysite.com vs. mysite.com: http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/
-
Ideally, one version of the site should 301-redirect to the other so that any link juice transfers from one version of the domain to the other. Where both versions have links pointing to them, the best solution is to see which version has the higher domain authority and the most links, and use that as your preferred domain.
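If the bare domain turns out to be the stronger version, the rule simply runs the other way. Again, this is a sketch assuming Apache with mod_rewrite, with a placeholder domain:

```apache
# Redirect the www version to the bare domain with a permanent (301) redirect.
# Placeholder domain - substitute your own.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.my-customer-site\.com$ [NC]
RewriteRule ^(.*)$ http://my-customer-site.com/$1 [R=301,L]
```

Whichever direction you choose, pick one version and stick with it everywhere: internal links, sitemaps, and any new inbound links you can influence.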