www vs. non-www
-
The canonical URLs (and all of our link building efforts) are on the www version of the site.
However, the site is having a massive technical problem, and we need to redirect some links (some of which are very important) from the www to the non-www version of the site (for these pages, the canonical link is still the www version).
How big of an SEO problem is this?
Can you please explain the exact SEO dangers?
Thanks!
-
Thanks for all your responses - I will use this as the basis of my answer to the technical team.
-
I'm endorsing Stephen's idea, because if you really have no choice, I think it's a good potential alternative. THB's comments (which I thumbed up) are very important, though.
If you really have no choice, I do think the 302 is safer here - the canonical tag should override it. There is some risk, though, and it's definitely not ideal.
I'm not clear on the problem, but could you return a 503? It basically says "We've got a temporary problem - come back later" and, if it really is temporary, Google won't de-index the pages. If you're talking a couple of days, this may be a better solution. If you're talking a few weeks, you may have to take Stephen's advice. You might want to pull in expert help, though, because my gut reaction is that there's a better way to fix what's broken here.
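If the outage really is that short-lived, the 503 route can be handled at the server level. Here's a hedged sketch for Apache (assuming mod_rewrite and mod_headers are enabled, and that /maintenance.html is a plain status page you've created - adjust for whatever server you're actually on):

```apache
# Serve a 503 (Service Unavailable) for everything except the maintenance page.
# Googlebot treats a 503 as temporary and retries later instead of de-indexing.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/maintenance\.html$
RewriteRule ^ - [R=503,L]

# Hint to crawlers when to come back (in seconds).
Header always set Retry-After "3600"

# Show a friendly page instead of the default server error.
ErrorDocument 503 /maintenance.html
```

The key point is that every affected URL keeps returning 503 (not 200 or a redirect) until the problem is actually fixed, so nothing gets re-indexed in a broken state.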
-
Hehe.
Generally speaking (and I've actually come across this quite a bit lately), it's better to just put your efforts towards fixing the technical issues than to try to manipulate the site using redirects and canonical tags. That's easy to say when it's not my technical problem, nor my money/time on the line to fix it! But fixing the root cause is always the best-case scenario in my opinion.
-
Agreed. It's a problem waiting to bite you in the proverbials....
-
I worry about setting up a canonical tag that points to a URL Google can't access (since it just gets 302'd back to the non-www version any time Google tries to read the canonical URL). And since a canonical tag is kinda sorta like a 301, you'd ultimately be 301'ing (kinda sorta) back to the www version, only to have a 302 header sent, 302'ing Google back to the non-www. An endless loop, so to speak. I'm not sure how Google would handle this.
How about just working 24/7 to resolve the "technical problem" that is causing this? I know, easy for me to say!
-
I'm no expert on this, but I think you'll be fine IF you:
1 - 302 redirect (temporary redirect) to the non-www page
2 - Add a rel canonical on the non-www page giving the www version the link credit.
When you've fixed your tech issues, remove the 302 redirect.
I THINK Google will play nice with this.
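To sketch what that might look like on an Apache server (assuming mod_rewrite is available, with example.com standing in for the real domain), the temporary redirect would be something like the following, while the existing `<link rel="canonical" href="http://www.example.com/...">` tags stay in place on the pages:

```apache
# Temporary (302) redirect from www to non-www while the technical issue is fixed.
# Using R=302 rather than R=301 signals that the move is not permanent, so the
# www URLs - and the canonical tags still pointing at them - keep their equity.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=302,L]
```

When the underlying problem is resolved, just delete these lines so the canonical www URLs resolve normally again.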
Hope that helps
Steve