Solved: Should I consolidate my "www" and "non-www" pages?
-
My page rank for www and non-www is the same. In one keyword instance, my www version performs SO much better.
I want to consolidate to one or the other. My question is whether all these issues would ultimately resolve to my chosen consolidated domain (i.e. www or non-www), regardless of which one I choose. Or would it be smarter to choose the one where I am already ranking high for this significant keyword phrase?
Thank you in advance for your help.
-
It may be that one version (www or non-www) has more historical links. You say your PageRank for both is the same, but how are you checking that? Google's public PageRank has not been updated in a decade or so.
Either way, I'd generally say that if you pick one version and stick to it (redirect the other, e.g. so every non-www URL points to its www equivalent), you should maintain all rankings. There is a theoretical advantage to picking the version with more links, but in my experience, in practice this type of migration tends to be smooth.
-
# Require the www (301-redirect any non-www request to the www host; askapache.com is the poster's example domain, so swap in your own)
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.askapache\.com$ [NC]
RewriteRule ^(.*)$ https://www.askapache.com/$1 [R=301,L]
-
Yes. I would recommend picking the version (either www or non-www) that has historical data showing it performs better than the other. Check the list of indexed pages for each version to compare. Ideally, both the www and non-www versions of the website will be indexed in Google, which will help you decide which version makes the most sense to consolidate to.
Once you identify the preferred version, set a 301 redirect from each non-preferred URL to its preferred equivalent (the version with more traffic, links, authority, etc.). This should be done site-wide so that all URLs are either www or non-www, not a mix of both. In my experience, between 90% and 99% of a site's SEO authority is preserved when a permanent 301 redirect is set.
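For reference, here is a minimal .htaccess sketch of the rule in the non-www direction (the mirror image of the snippet above), assuming Apache with mod_rewrite enabled; example.com is only a placeholder domain, not a recommendation:
# Minimal sketch: 301-redirect every www request to the bare (non-www) host.
# Assumes Apache with mod_rewrite; example.com is a placeholder domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
Whichever direction you pick, test a handful of deep URLs, not just the homepage, to confirm each one returns a single 301 hop to its preferred equivalent.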
-
@meditationbunny Sorry for the slow reply - but yes, I'd expect Page Authority to increase slightly, if the "other" version had any value to it.
For Page Optimization, yes. For example, for my own site I see:
http://tcapper.co.uk redirects to https://www.tcapper.co.uk/. This on-page analysis is for https://www.tcapper.co.uk/.
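For a site that also forces HTTPS like this, the host and protocol are usually canonicalized together. A minimal .htaccess sketch of that pattern, assuming Apache with mod_rewrite and using example.com as a placeholder (this is not the actual configuration behind tcapper.co.uk):
# Minimal sketch: force HTTPS and the www host with a single 301.
# Assumes Apache with mod_rewrite and TLS terminated on the same server;
# if TLS is terminated on a proxy/CDN, %{HTTPS} may always be "off", so check before deploying.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
The aim is a single hop from any variant (http or https, www or non-www) straight to the one canonical URL.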
-
@tom-capper
Thank you. Yes, I should be clearer: I am calling it page rank when I am actually referring to Moz's Domain Authority and Moz's keyword rankings. Still, I believe you answered my question. Under Page Optimization, I can see what appear to be duplicate listings of my pages, each with a different SERP ranking. It was confusing until I realized that one was the www version and the other was the non-www version. I have since added code to my .htaccess file that sends everything to www. Can I expect the Page Optimization section to now only show the www versions of the pages? Also, can I expect Page Authority to increase because it is no longer a mish-mash and everything heads to the same domain and the same pages (i.e. the www version)?