Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Best Practice for www and non-www
-
What is the best way to handle all the different variations of a website in terms of www | non-www | http | https?
In Google Search Console, I have all 4 versions and I have selected a preference.
In Open Site Explorer I can see that the www and non-www versions are treated differently, with one group of links pointing to each version of the same page. This gives a different PA score.
e.g.
- http://mydomain.com DA 25 PA 35
- http://www.mydomain.com DA 19 PA 21
Each version of the home page has its own set of links and scores.
Should I try and "consolidate" all the scores into one page?
Should I set up redirects to my preferred version of the website?
Thanks in advance
-
Thanks for your answer, that was helpful.
-
Thanks for taking the time to put together such a wonderfully detailed answer.
-
Hi Samantha,
What you have is what are called "canonical issues." By allowing multiple versions of your domain open and crawlable to search engines you "split" your ranking authority and result in the issues you are seeing right now.
The best practice is to choose one version of your domain as the "true canonical" and then 301 redirect the others at the server level by means of mod_rewrite code. Doing so will consolidate your content, incoming links and PageRank and greatly increase the root domain authority of your site.
If your site hasn't instituted 301 redirect commands at the server level, search engines will treat all of these versions of your site's home page as "separate pages," and each will accumulate authority individually:
http://yoursite.com/
http://www.yoursite.com/
http://yoursite.com/index.php
http://www.yoursite.com/index.php
https://yoursite.com
https://www.yoursite.com
You get the idea.
Most websites are run on one of three different types of servers...
- Unix-based servers running Apache.
- Unix-based servers running Nginx.
- Microsoft Windows-based servers running IIS or similar.
If you're unsure of what kind of server runs your site, ask your hosting company. Most sites are run on Unix-based servers with Apache. In that case, the server's behavior is configured using something called the .htaccess file.
If your site's root domain already contains an .htaccess file, you can simply scroll to the end of whatever code is already there and append your 301 redirect code at the bottom of the file, starting on a new line. While this may sound complicated, it's actually very simple to do. If you can upload files to and from your Web server, then chances are you'll have no trouble managing (i.e. altering or creating and uploading) your .htaccess file(s).
But yes, bottom line: you ALWAYS want to consolidate URLs and present one uniform "preferred" URL format to search engines and users. In your case, that would appear to be the non-www domain, which has the higher Domain Authority.
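As an illustration only, here is a minimal sketch of the kind of mod_rewrite code that could be appended to the end of the .htaccess file, assuming an Apache server with mod_rewrite enabled and "example.com" standing in for your real domain (if you prefer the www or HTTPS version instead, the target URL changes accordingly):

    # Send every www request to the non-www host with a 301, keeping the rest of the URL
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

Once that's uploaded, requesting http://www.example.com/ should answer with a 301 pointing at http://example.com/; if it doesn't, check the file with your hosting company before changing anything else.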
You can learn all about redirection best practices at the Moz resource here: https://moz.com/learn/seo/redirection
Related Questions
-
301 Redirect non-existent pages
Hi, I have hundreds of URLs appearing in Search Console, for example: ?p=1_1. These go on up to 5_200 etc. I have tried to do it with .htaccess and mod_rewrite is on, as I can redirect directories to the root, i.e. RewriteRule ^web_example(.*)$ /$1 [R=301,N,L]. However I have tried all kinds of variations to redirect ?p= and either it doesn't work at all or it crashes the website. Can anyone point me in the right direction to fix this?
Technical SEO | | Cocoonfxmedia0 -
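As a rough sketch only: RewriteRule patterns never see the query string, so a ?p= parameter has to be matched with a RewriteCond on %{QUERY_STRING}. Assuming Apache with mod_rewrite and that the ?p= values appear on the root URL, something along these lines would 301 them to the homepage and drop the parameter:

    # Match ?p=<number>_<number> on the root URL and redirect to / with no query string
    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^p=\d+_\d+$ [NC]
    RewriteRule ^$ /? [R=301,L]

The trailing ? in the substitution is what discards the original query string; testing on a staging copy first is worth it, since an over-broad pattern can easily cause the redirect loops behind the "crashes the website" symptom.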
Redirect non slash to slash
Hello SEO gurus, we have an issue here: www.xyz.com.au is returning 200 responses for both www.xyz.com.au and www.xyz.com.au/ (when I ran the crawl test I found this). We have been advised to do a 301 from non-slash to slash (as our other pages are showing up with a slash), and for consistency we decided to go with this, but our devs just couldn't do it. The error is a redirect loop, and this site is a WordPress one. Can anyone help us with this issue? Help is much appreciated.
Technical SEO | | Pack0 -
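A hedged sketch of the usual Apache approach, assuming the rule goes in .htaccess above the WordPress rewrite block and that real files should be left alone (a loop usually means another rule, or WordPress's own canonical redirect, is stripping the slash back off):

    # Add a trailing slash to URLs that are not real files
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.*[^/])$ /$1/ [R=301,L]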
Best practice for URL - Language/country
Hi, we are planning on having our website localized into more languages. We already have an English and a German version. The German version is currently a subdomain: www.example.com --> English version, de.example.com --> German version. Is this recommended? Or is it always better to have URLs with language prefixes such as www.example.com/de or www.example.com/es? Which is the better practice in terms of SEO?
Technical SEO | | Kilgray1 -
Duplicate Content Issue WWW and Non WWW
One of my sites got hit with duplicate content a while ago because Google seemed to be considering the http, https, www, and non-www versions of the site all different sites. We thought we fixed it, but for some reason https://www and just https:// are giving us duplicate content again. I can't seem to figure out why it keeps doing this. The URL is https://bandsonabudget.com if any of you want to see if you can figure out why I am still having this issue.
Technical SEO | | Michael4g1 -
Best strategy to handle over 100,000 404 errors.
I was recently given a site that has over one hundred thousand 404 error codes listed in Google Webmaster Tools. It is really odd because, according to Google Webmaster Tools, the pages that are linking to these 404 pages are also pages that no longer exist (they are 404 pages themselves). These errors were a result of a site migration. I'd appreciate any input on how one might go about auditing and repairing large amounts of 404 errors. Thank you.
Technical SEO | | SEO_Promenade0 -
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi guys, I'm working on an ecommerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires) and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what is the best approach. The URL structure for the products is as follows: http://www.xyz.com/products/product1-is-really-cool, http://www.xyz.com/products/product2-is-even-cooler, http://www.xyz.com/products/product3-is-the-coolest. Here are 2 approaches I was considering: 1. Just include the dynamic product URLs within the same sitemap as the static URLs, using just http://www.xyz.com/products/ - this is so spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product; OR 2. Create a separate automated sitemap that updates whenever a product is updated, with the change frequency set to hourly - this is so spiders always have as close to an up-to-date sitemap as possible when they crawl it. I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | | seekjobs0 -
Best free tool to check internal broken links
Question says it all, I guess. What would you recommend as the best free tool to check internal broken links?
Technical SEO | | RikkiD225 -
Which is the best WordPress sitemap plugin?
Does anyone have a recommendation for the best XML sitemap plugin for WordPress sites, or do you steer clear of plugins and use a sitemap generator, then upload the file to the root manually?
Technical SEO | | simoncmason0