HTTPS & 301s
-
Hi
- Like most sites, we have set up a redirect from HTTP to HTTPS.
- We also rebuilt our website and set up redirects from the old .ASP pages to the new PHP pages.
- We are now seeing two chained redirects in place across the whole website:
http://www.domain.com/oldwebpage.asp > (1) https://www.domain.com/oldwebpage.asp > (2) https://www.domain.com/newwebpage.php
The question is: is there any way of making this one redirect instead of two?
thanks
Enver -
Just to make sure I understand, can you clarify the sequence of the changes and how long each version was live? Do you know whether one set of URLs has links pointing to it, or was ever indexed?
Let me explain.
It sounds like you had a site that was on HTTP and built in ASP, so you had URLs like:
http://www.website.com/file.asp (we will call this URL type A)
You then converted to HTTPS, so the URLs became:
https://www.website.com/file.asp (we will call this URL type B)
You then rebuilt the site in PHP, so now the URLs look like this:
https://www.website.com/file.php (we will call this URL type C)
You can set up 301s to go from A to B and then another set to go from B to C. Your question is: can you set up a 301 to go from A to C? The answer is yes, and you should do this. Any time you can reduce the number of hops, the better.
What you need to think about is: what about the A-to-B and the B-to-C redirects? At a minimum, you need to eliminate the A-to-B 301s, as you have now decided to skip "B" and go straight to C. What about the B-to-C 301s? It depends. If version B of the website was live for a while, was indexed by Google, and has links built to its URLs, then yes, you need to leave the B-to-C redirects in place. You don't want to lose any of that equity.
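As a concrete sketch of the single-hop setup (a hypothetical example assuming an Apache server with mod_rewrite enabled; the domain and filenames are placeholders, and you should adapt this to your own config):

```apache
# .htaccess on www.website.com (hypothetical sketch)
RewriteEngine On

# Any .asp URL -- whether requested over http (type A) or https (type B) --
# goes straight to its https .php equivalent (type C) in a single 301 hop
RewriteRule ^(.*)\.asp$ https://www.website.com/$1.php [R=301,L]

# Everything else still on http redirects to https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
```

Because the `.asp` rule fires first and targets the https `.php` URL directly, it covers both the A-to-C and B-to-C cases in one hop.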
Likewise, let's say a version D of the site comes out a year later, with URLs like:
https://www.website.com/file.html (we will call this URL type D)
By then you have lots of links into the C version of the site. You then need the A URLs to 301 to the D URLs (and get rid of the A-to-C 301s), the B URLs to 301 to the D URLs, and so on. In other words, go through another round of cleaning up the 301s and reducing the hops.
Why do all this? Two reasons. First, there will still be links to the A, B, and C versions of the site. Google will still find and crawl them, and you want to get credit for those links to your site. Second, Google keeps an internal log of URLs and will check them from time to time, even if no one is linking to them, and you want Google to land on the right URL. In either case, if Google hits a version A URL, it would have to go to version B via a 301 and then on to version C. It can do that, but it would rather have one hop.
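The cleanup logic itself is mechanical: follow each chain to its final destination and rewrite every old URL to point there directly. Here is a small sketch of that idea (the URLs are the placeholder examples from above, not real rules):

```python
def collapse_redirects(redirects):
    """Collapse chained 301 mappings so every old URL points
    directly at its final destination (one hop instead of many)."""
    collapsed = {}
    for src in redirects:
        seen = {src}       # guard against redirect loops
        dst = redirects[src]
        # Follow the chain until we reach a URL that no longer redirects
        while dst in redirects and dst not in seen:
            seen.add(dst)
            dst = redirects[dst]
        collapsed[src] = dst
    return collapsed

# The A -> B -> C chain from the question: after collapsing,
# both old URLs point straight at the final .php page.
redirects = {
    "http://www.website.com/file.asp": "https://www.website.com/file.asp",   # A -> B
    "https://www.website.com/file.asp": "https://www.website.com/file.php",  # B -> C
}
collapsed = collapse_redirects(redirects)
```

Running this against your full redirect map is one way to generate a flattened rule set whenever a new site version (D, E, ...) ships.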
Side note: try not to use global 301s, where you just redirect a bunch of pages to the home page. That does nothing for you as far as link equity goes. Make the 301s a one-to-one relationship as much as possible.
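To illustrate the difference (again a hypothetical Apache sketch with placeholder paths):

```apache
# Good: one-to-one 301s, each old page maps to its closest new equivalent
Redirect 301 /old-services.asp /services.php
Redirect 301 /old-contact.asp  /contact.php

# Avoid: a global catch-all that dumps every old URL on the home page,
# throwing away the page-level link equity
# RedirectMatch 301 ^/.*$ https://www.website.com/
```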
Take a look at this video; it backs up what I just said. The number of hops is discussed at about the 3-minute mark, but the whole video is worth watching: https://www.youtube.com/watch?v=r1lVPrYoBkA
-
I'm not sure I understand. What is wrong with the ASP -> PHP redirect?