Wrong redirect used
-
Hi Folks,
I have a query & am looking for some opinions. Our site migrated to https://
Somewhere along the line between the developer & the hosting provider, a 302 redirect was implemented instead of the recommended 301 (the 301 rule was not being honoured in the .htaccess file).
A week passed and I noticed some of our key phrases disappear from the SERPs. When I investigated, I found that the incorrect redirect was the cause. The correct 301 redirect has now been implemented & is functioning correctly.
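For anyone hitting the same issue, the fix in .htaccess is typically a one-line flag change. A minimal sketch, assuming Apache with mod_rewrite enabled (example.com is a placeholder domain, not the poster's site):

```apache
# Assumes Apache with mod_rewrite; example.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTPS} off
# R=301 makes the redirect permanent; R=302 (or omitting the code) is the
# temporary redirect that caused the ranking drop described above.
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```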
- I have created a new https property in Webmaster Tools,
- Submitted the sitemap,
- Provided a link in the robots.txt file to the https sitemap,
- Set canonical tags to the correct https URLs.
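For anyone following the same checklist, the robots.txt sitemap reference and the canonical tag look roughly like this (the domain and page URL are placeholders):

```
# robots.txt (served at https://example.com/robots.txt; domain is a placeholder)
Sitemap: https://example.com/sitemap.xml

<!-- canonical tag, placed in the <head> of each page -->
<link rel="canonical" href="https://example.com/some-page/" />
```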
My gut feeling is that Google will take some time to recognise the problem & some further time to restore the search results we lost.
Has anyone experienced this before, or have any further thoughts on how to rectify this ASAP?
-
Hi Vettyy & Michael,
Thanks a million for your response.
Implemented both your suggestions. Screaming Frog showed the previous 302s are now 301s, so everything seems OK on that end. I have also updated our Google My Business listing to reflect the change. Using site:mydomain.com shows a mixture of https vs http, so I am guessing I just need to wait & monitor, cross my fingers & hope for the best.
-
It sounds like you have done all the right things. I agree with Vettyy that you should use something like Screaming Frog to crawl all the old URLs, just to double-check there are no hanging 404 pages or missed http pages. Switching to 301 will take a few days to filter through, so you could run cache:domain.com in Google on your most important pages to monitor when they are being crawled. Also, do you have a mix of http and https in Google at present? It may very well be something to just wait and monitor.
A good tool for sniffing URL headers is Fiddler.
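If you'd rather script the check than use a GUI tool, the same audit can be done over a crawl export. A minimal sketch in Python; the crawl results here are hypothetical, and a real run would use the status codes and redirect targets from a Screaming Frog export or similar:

```python
# Classify crawl results from an HTTPS migration. Each old URL maps to
# (status_code, redirect_target_or_None); anything that isn't a clean
# 301 to an https URL gets flagged.
def audit_migration(results):
    issues = {}
    for url, (status, target) in results.items():
        if status == 301 and target and target.startswith("https://"):
            continue  # clean permanent redirect to HTTPS: nothing to fix
        elif status == 302:
            issues[url] = "temporary redirect (should be 301)"
        elif status == 404:
            issues[url] = "broken page (no redirect in place)"
        elif status == 301:
            issues[url] = "301 but target is not HTTPS"
        else:
            issues[url] = f"unexpected status {status}"
    return issues

# Hypothetical crawl export: two clean redirects, one leftover 302, one 404.
crawl = {
    "http://example.com/":      (301, "https://example.com/"),
    "http://example.com/about": (301, "https://example.com/about"),
    "http://example.com/blog":  (302, "https://example.com/blog"),
    "http://example.com/old":   (404, None),
}
print(audit_migration(crawl))
```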
-
I normally do a fetch and render in Google Search Console (webmaster tools) and submit to index. It gives you the option of submitting an individual page but I normally fetch and render the main domain and then select 'crawl this url and its direct links'. Then if there are only a couple of important pages not being indexed or recognized by Google, fetch and render and submit those individual ones.
Type site:www.yourdomain.com into Google to monitor manually which pages are indexed as well.
If you have the time, I would pop a list of your original URLs into Screaming Frog to see which page they're redirecting to, or test a couple of pages using http://www.wheregoes.com/retracer.php to see how many redirects are in place.
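The redirect-chain check can also be done offline once you have a table of redirects. A rough Python sketch; the redirect map below is made up for illustration:

```python
# Follow a chain of redirects given a mapping of URL -> redirect target.
# Returns the list of hops and a status; stops on a loop or when a URL
# no longer redirects.
def trace_redirects(start, redirect_map, max_hops=10):
    chain = [start]
    seen = {start}
    url = start
    while url in redirect_map and len(chain) <= max_hops:
        url = redirect_map[url]
        if url in seen:
            chain.append(url)
            return chain, "loop detected"
        seen.add(url)
        chain.append(url)
    return chain, "ok"

# Hypothetical 3-hop chain: http -> https -> www -> final page.
redirects = {
    "http://example.com/page": "https://example.com/page",
    "https://example.com/page": "https://www.example.com/page",
    "https://www.example.com/page": "https://www.example.com/new-page",
}
chain, status = trace_redirects("http://example.com/page", redirects)
print(len(chain) - 1, "redirects:", " -> ".join(chain))
```

A chain like this is a candidate for flattening: the first URL should redirect straight to the final one.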
Related Questions
-
Redirecting Ecommerce Site
Hi, I'm working on a big site migration and I'm setting up redirects for all the old categories to point to the new ones. I'm doing this based on relevancy; the categories don't match up exactly, but I've tried to redirect to the most relevant alternative. Would this be the right approach?
Intermediate & Advanced SEO | BeckyKey
-
HTTPS & Redirects
Hi, we're moving to https imminently & I wondered if anyone has advice on redirects. Obviously we'll be redirecting all http versions to https, but should I be checking how many redirects are in each chain and amending accordingly? If there are 4-5 in a chain, remove the middle unnecessary URLs? Advice please 🙂
Intermediate & Advanced SEO | BeckyKey
-
What's wrong with the algorithm?
Is it possible that Google is penalising a specific page while at the same time showing an unrelated page in the search results? "rent luxury car florence" shows https://lurento.com/city/munich/ on the 2nd page (that's Munich, Germany) and at the same time completely ignores the related page https://lurento.com/city/florence/. How can I figure out if the specific page has been trashed, and why? Thanks,
Mike
Intermediate & Advanced SEO | lurento.com
-
Robots.txt and redirected backlinks
Hey there, since a client's global website has a very complex structure which led to big duplicate content problems, we decided to disallow crawler access and instead allow access to only a few relevant subdirectories. While indexing has improved since then, I was wondering if we might have cut off link juice. Since several backlinks point to the disallowed root directory and are from there redirected (301) to the allowed directory, I was wondering if this could cause any problems? Example: a backlink points to example.com (disallowed in robots.txt) and is redirected from there to example.com/uk/en (allowed in robots.txt). Would this cut off the link juice? Thanks a lot for your thoughts on this. Regards, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
-
Is it worth redirecting?
Hello! Is there any wisdom or non-wisdom in taking old websites and blogs that may not be very active, but still get some traffic, and redirecting them to a brand new website? The new website would be in the same industry, but not the same niche as the older websites. Would there be any SEO boost to the new website by doing this? Or would it just hurt the credibility of the new website?
Intermediate & Advanced SEO | dieselprogrammers
-
Google Ranking Wrong Page
The company I work for started with a website targeting one city. Soon after I started SEO for them, they expanded to two cities. Optimization was challenging, but we managed to rank highly in both cities for our keywords. A year or so later, the company expanded to two new locations, so now 4 total. At the time, we realized it was going to be tough to rank any one page for four different cities, so our new SEO strategy was to break the website into 5 sections or minisites consisting of 4 city-targeted sites, and our original site which will now be branded as more of a national website. Our URL structures now look something like this:
www.company.com
www.company.com/city-1
www.company.com/city-2
www.company.com/city-3
www.company.com/city-4
Now, in the present time, all is going well except for our original targeted city. The problem is that Google keeps ranking our original site (which is now national) instead of the new city-specific site we created. I realize that this is probably due to all of the past SEO we did optimizing for that city. My thoughts are that Google is confused as to which page to actually rank for this city's keyword terms, and I was wondering if canonical tags would be a possible solution here, since the pages are about 95% identical. Anyone have any insight? I'd really appreciate it!
Intermediate & Advanced SEO | cpapciak
-
How would you use this broken link building opportunity?
I've found a good opportunity to build some links and I'd love your opinions on my options here. There's a big event that happens once a year in my city. Let's say the event used to have a website called www.CityEvent.com. The event decided not to use this website anymore, but instead to put all of their event information on their Facebook page. It looks like they let their domain name expire and someone else snapped it up. It's now sitting as an empty WordPress blog with one line of text. This empty website has 1300 links pointing to it.

I can see two opportunities here:

1. Write a very thorough article on my website (that I am trying to build links to) describing the event and giving people all of the information that they need to know about it. (The amount of information on the Facebook page is minimal.)

2. Create a new website called www.EventCity.com and put up a static page with all of the information that people need to know. There would be a link on this page pointing to the site that I am trying to rank.

In both cases there would be much more information than is available on the Facebook page, including a collection of YouTube videos about the event and many helpful links for people who are interested in this type of event. Then the plan is to contact the sites who are linking to the dead page and invite them to link to my new page (either on my site or the new site that I could create).

I see a few pros and cons to each method. For option #2, I think people would be more likely to link to a more official-looking page rather than an article on a separate website. (My website has information about the city in question but is not closely related to the event at all.) However, I would only be getting one link back to my site. One negative is that the actual organizers of the event may not be pleased that someone has created an official-looking page. But then again, perhaps they would be happy to have a free website.

For option #1, I would possibly get more links from sites that are authoritative in my city that point directly to the site I am trying to rank. However, people would be less likely to link to us because we are not an official site for the event, but simply a very good article about the event. There are no other good articles for this event that are ranking on Google.

Hopefully that makes sense. What would you do? EDIT - Just thought of a third option - try to buy the domain.
Intermediate & Advanced SEO | MarieHaynes
-
301 Redirect - How Long Until Recovery?
How long after one moves a page and sets up the 301s should the site take to regain its previous rankings? Context: I've ported a site to a new framework. Along the way, several highly ranked pages needed to have new URLs set up, and the site moved from www.domain.com to simply domain.com. About 1 week after the change, the site's traffic went down 70% and has stayed there for about another 2 weeks. I suppose it could be something about the new framework that is causing problems, though according to SEOMoz tools, the new framework is checking out pretty well. I assume the problem is reconciling all those old www inbound links with the new non-www location. It is all 301'd however ... so it should be working, but is not. So my questions are: 1. How long should it take Google to reconcile these changes and put us back to our original SERP positions? 2. Is there something inherently problematic with switching from www to non-www?
Intermediate & Advanced SEO | NealCabage