Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Solved: Should I consolidate my "www" and "non-www" pages?
-
My page rank for www and non-www is the same. In one keyword instance, my www version performs SO much better.
I want to consolidate to one or the other. My question is whether all of this would ultimately resolve to my chosen consolidated domain (i.e. www or non-www) regardless of which one I choose, or whether it would be smart to choose the one where I am already ranking high for this significant keyword phrase.
Thank you in advance for your help.
-
It may be that one version (www or non-www) has more historical links. You say your PageRank for both is the same, but how are you checking that? Google's public PageRank has not been updated in a decade or so.
Either way, I'd generally say that if you pick one version and stick to it (redirect the other, e.g. so every non-www URL points to its www equivalent), you should maintain all rankings. There is a theoretical advantage to picking the version with more links, but in my experience, in practice, this type of migration tends to be smooth.
-
# Require the www
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.askapache\.com$ [NC]
RewriteRule ^(.*)$ https://www.askapache.com/$1 [R=301,L]
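The rules above hardcode the askapache.com domain; a host-agnostic sketch of the same idea (assuming Apache with mod_rewrite enabled - adapt before using, this is not the original poster's exact code):

# Hedged sketch: force the www prefix for whatever host was requested
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}/$1 [R=301,L]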
-
Yes. I would recommend picking the version (either www or non-www) whose historical data shows it performs better. Check the list of indexed pages for each version to compare; ideally both the www and non-www versions of the website will be indexed in Google, which will help you decide which version makes the most sense to consolidate to.
Once you identify the preferred version, set 301 redirects from the non-preferred URLs to the preferred version of each URL (the one that has more traffic, links, authority, etc.). This should be done site-wide so that all URLs are either www or non-www, not a mix of both. In my experience, 90-99% of a site's SEO authority is preserved when setting a permanent 301 redirect.
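If the comparison instead favors the non-www version, a sketch of the reverse rule (again assuming Apache with mod_rewrite; a hedged example, not tested against any particular site):

# Hedged sketch: strip the www prefix when non-www is the preferred version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]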
-
@meditationbunny Sorry for the slow reply - but yes, I'd expect Page Authority to increase slightly, if the "other" version had any value to it.
For Page Optimization, yes. For example, for my own site I see:
http://tcapper.co.uk redirects to https://www.tcapper.co.uk/. This on-page analysis is for https://www.tcapper.co.uk/.
-
@tom-capper
Thank you. Yes, I should be clearer. I am calling it page rank when I am actually referring to Moz's Domain Authority and Moz's keyword rankings. Still, I believe you answered my question. Under Page Optimization, I can see what appear to be duplicate listings of my pages, each with a different SERP ranking. It was confusing until I realized that one was the www and the other was the non-www version. I have since added code to my .htaccess file that will send everything to www. Can I expect the Page Optimization section to now only show www versions of the pages? Also, can I expect Page Authority to increase because it is no longer a mish-mash and is all headed to the same domain and same pages (i.e. the www version)?
Related Questions
-
"Noindex, follow" for thin pages?
Hey there Mozzers, I have a question regarding thin pages. Unfortunately, we have thin pages - almost empty, to be honest. I am thinking of asking the dev team to apply "noindex, follow" to these pages. What do you think? Has someone faced this situation before? I will appreciate your input!
Technical SEO | Europarl_SEO_Team
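For reference, the "noindex, follow" the question proposes is a robots meta tag placed in each thin page's <head>; a minimal sketch (how it is wired into the templates depends on the dev team's stack):

<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
-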
Duplicate Content Issue WWW and Non WWW
One of my sites got hit with duplicate content a while ago because Google seemed to be treating the http, https, www, and non-www versions of the site as all different sites. We thought we fixed it, but for some reason https://www and plain https:// are giving us duplicate content again. I can't seem to figure out why it keeps doing this. The URL is https://bandsonabudget.com if any of you want to see if you can figure out why I am still having this issue.
Technical SEO | Michael4g
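A common fix for this pattern is a single ruleset that collapses scheme and host into one 301; a hedged sketch for the site mentioned (assumes Apache with mod_rewrite, and that https plus non-www is the preferred form; verify the HTTPS detection if the site sits behind a proxy):

# Hedged sketch: send http://, http://www and https://www to https:// in one hop
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://bandsonabudget.com/$1 [R=301,L]
-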
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages that come up with a 'zero' word count in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz
-
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in particular:

1. Google continuously crawls websites and stores each page it finds (let's call it the "page directory")
2. Google's "page directory" is a cache, so it isn't the "live" version of the page
3. Google has separate storage called "the index" which contains all the keywords searched; these keywords in "the index" point to the pages in the "page directory" that contain the same keywords
4. When someone searches a keyword, that keyword is accessed in the "index" and returns all relevant pages in the "page directory"
5. These returned pages are given ranks based on the algorithm

The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to know the effects of changing a page's URL by understanding how the search process works better.
Technical SEO | reidsteven75
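For what it's worth, the model described in the question is essentially a classic inverted index; a toy Python sketch of that structure (hypothetical URLs and text, a simplification that makes no claim about Google's actual internals):

from collections import defaultdict

# "Page directory": cached copies of crawled pages, keyed by the URL they were fetched from
page_directory = {
    "https://www.website.com/page1": "fresh roasted coffee beans",
    "https://www.website.com/page2": "coffee brewing guide",
}

# "Index": each keyword points at the URLs of the cached pages that contain it
index = defaultdict(set)
for url, cached_text in page_directory.items():
    for keyword in cached_text.split():
        index[keyword].add(url)

# A search looks the keyword up in the index, then retrieves the matching
# cached pages from the page directory for ranking
for url in index["coffee"]:
    print(url, "->", page_directory[url])
-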
Determining When to Break a Page Into Multiple Pages?
Suppose you have a page on your site that is a couple of thousand words long. How would you determine when to split the page into two, and are there any SEO advantages to doing this, like being more focused on a specific topic? I noticed the Beginner's Guide to SEO is split into several pages, although it would concentrate the link juice if it were all on one page. Suppose you have a lot of comments: is it better to move comments to a second page at a certain point? Sometimes the comments are not super focused on the topic of the page compared to the main text.
Technical SEO | ProjectLabs
-
Rel="external"
Hi all, I got a link from another site and it's marked up with rel="external". Is this a followed or nofollowed link? Does it pass link juice? Thanks
Technical SEO | Sharer
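For context, rel="external" is only descriptive markup flagging an off-site link; it is not the attribute that blocks link equity (that is rel="nofollow"). Side by side, with hypothetical URLs:

<!-- rel="external": descriptive only; the link is followed and can pass equity -->
<a href="https://example.com/" rel="external">External link</a>
<!-- rel="nofollow": the value that actually asks engines not to pass equity -->
<a href="https://example.com/" rel="nofollow">Nofollowed link</a>
-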
Hyphenated Domain Names - "Spammy" or Not?
Some say hyphenated domain names are "spammy". I have also noticed that Moz's On-Page Keyword Tool does NOT recognize keywords in a non-hyphenated domain name, so one would assume neither do the bots. I noticed obviously misleading words like "car" in carnival or "spa" in space or spatula embedded in domain names and pondered the effect. I took it a step further with non-hyphenated domain names: I experimented by selecting totally random three- or four-letter blocks. Example: randomfactgenerator.net - rand omf act gene rator. Each one of those clips returns copious results, AND the On-Page Report Card does not credit the domain name as containing "random facts" as keywords, whereas www.business-sales-sarasota.com does get credit for "business sales sarasota" in the URL. This seems an obvious situation - unhyphenated domains can scramble the keywords and confuse the bots as they search all possible combinations.

YES - I know the content should carry it, but I do not believe domain names are irrelevant, as many say, and I do not believe hyphenated domain names are less efficient than non-hyphenated ones - as long as you don't overdo it. I have also seen a weak site in an easy market quickly top the list because the hyphenated domain name matched the search term - I have done it (in my pre-SEOmoz days) with ft-myers-auto-air.com. I built the site in a couple of days and in a couple of weeks it was on page one. Any thoughts on this?
Technical SEO | dcmike
-
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers on the fact that the script generates a sitemap indicating that every URL in the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemaps I generate for other client sites can choose to use the server response or not. What is the best way to generate the sitemap? Last mod from the actual time modified, or all set to one date and time?
Technical SEO | ShaMenz
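For reference, a per-URL lastmod in a sitemap looks like the sketch below (hypothetical URL and date); the value should come from the page's actual modification time, e.g. the CMS record or the file's mtime, rather than the time the generation script ran:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page1</loc>
    <!-- Actual modification date of this page, not the script's run time -->
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>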