WWW vs. non-www
-
The canonical URLs (and all our link building efforts) are on the www version of the site.
However, the site is having a massive technical problem, and we need to redirect some links (some of which are very important) from the www to the non-www version of the site (for these pages, the canonical link is still the www version).
How big of an SEO problem is this?
Can you please explain the exact SEO dangers?
Thanks!
-
Thanks for all your responses - I will use this as the basis of my answer to the technical team.
-
I'm endorsing Stephen's idea, because if you really have no choice, I think it's a good potential alternative. THB's comments (which I thumbed up) are very important, though.
If you really have no choice, I do think the 302 is safer here - the canonical tag should override it. There is some risk, though, and it's definitely not ideal.
I'm not clear on the problem, but could you return a 503? It basically says "We've got a temporary problem - come back later" and, if it really is temporary, Google won't de-index the pages. If you're talking a couple of days, this may be a better solution. If you're talking a few weeks, you may have to take Stephen's advice. You might want to pull in expert help, though, because my gut reaction is that there's a better way to fix what's broken here.
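For what it's worth, here's a minimal sketch of the 503 approach in Python (stdlib only; the retry window and page body are made up for illustration). The key parts are the 503 status, which tells crawlers the outage is temporary, and the Retry-After header suggesting when to come back:

```python
# Hypothetical maintenance responder: returns 503 + Retry-After so that
# crawlers treat the outage as temporary rather than de-indexing pages.
from http.server import BaseHTTPRequestHandler

RETRY_AFTER_SECONDS = 3600  # ask crawlers to come back in an hour (made-up value)

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)  # "service temporarily unavailable"
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance - back soon</h1>")
```

In practice you'd do this at the web server or load balancer level rather than in application code, but the headers are the same either way.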
-
Hehe.
Generally speaking (and I've actually come across this quite a bit lately), it's better to put your effort into fixing the technical issues than to try to work around them with redirects and canonical tags. That's easy to say when it's not my technical problem, nor my money and time on the line to fix it! Still, that's always the best option in my opinion.
-
Agreed. It's a problem waiting to bite you in the proverbials....
-
I worry about setting up a canonical tag that points to a URL Google can't access (since any time Google tries to read the canonical URL, it just gets 302'd back to the non-www version). And since a canonical tag acts kinda sorta like a 301, you'd ultimately be 301'ing (kinda sorta) back to the www version, only to have a 302 header sent, 302'ing Google back to the non-www. An endless loop, so to speak. I'm not sure how Google would handle this.
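To make that worry concrete, here's a toy simulation of a crawler that follows both hints (the URL map is hypothetical). The 302 and the canonical tag just send it back and forth:

```python
# Toy crawler loop: a 302 (www -> non-www) plus a canonical tag
# (non-www -> www) gives the crawler no stable endpoint to settle on.
REDIRECTS = {  # the 302s currently in place
    "http://www.example.com/page": "http://example.com/page",
}
CANONICALS = {  # canonical tags still pointing back at www
    "http://example.com/page": "http://www.example.com/page",
}

def crawl(url, max_hops=10):
    """Follow redirect/canonical hints; return the trail of URLs visited."""
    trail = [url]
    for _ in range(max_hops):
        nxt = REDIRECTS.get(url) or CANONICALS.get(url)
        if nxt is None:
            break  # a real endpoint: stop here
        url = nxt
        trail.append(url)
    return trail
```

Starting from either version, the trail just alternates between the two URLs until the hop limit runs out, which is exactly the ambiguity Google would have to resolve on its own.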
How about just working 24/7 to resolve the "technical problem" that is causing this? I know, easy for me to say.
-
I'm no expert on this but I think you'll be fine IF you:
1 - 302 redirect (temporary redirect) to the non-www page
2 - Add a rel=canonical on the non-www page giving the www version the link credit.
When you've fixed your tech issues, remove the 302 redirect.
I THINK Google will play nice on this.
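A quick sketch of those two steps in Python (domain names are placeholders, and in practice you'd configure this in your web server, not application code):

```python
# Step 1: temporary 302 from the www URL to the non-www URL.
# Step 2: a canonical tag on the non-www page that still credits www.
from urllib.parse import urlsplit, urlunsplit

def www_to_non_www(url):
    """Return (status, location) for the temporary 302 hop, or None."""
    parts = urlsplit(url)
    if parts.netloc.startswith("www."):
        target = urlunsplit(parts._replace(netloc=parts.netloc[len("www."):]))
        return 302, target  # temporary: easy to remove once things are fixed
    return None  # non-www URLs are served directly

def canonical_tag(non_www_url):
    """Canonical on the non-www page keeps pointing at the www original."""
    parts = urlsplit(non_www_url)
    www = urlunsplit(parts._replace(netloc="www." + parts.netloc))
    return '<link rel="canonical" href="%s">' % www
```

The point of using a 302 rather than a 301 is that it signals "temporary", so the link credit should stay with the www URLs while the redirect is in place.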
Hope that helps
Steve