Awful ranking after site redesign
-
Hello everyone,
I have a situation here and I’d like to have your opinion about it.
I am working on a site that was recently redesigned from scratch. In a nutshell: as soon as the new site went live, their rankings dropped and, of course, so did their visitors and everything else.
The guys who redesigned the site didn't do any 301 redirects whatsoever, so now the old pages are just 404s and blocked by robots.txt. My question is: if they set up 301 redirects now, do you think they could get their rankings back?
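Just so we're all picturing the same thing, here is a rough sketch of the kind of old-to-new mapping I believe they skipped. The URLs are made up purely for illustration; the real map would have to come from the old sitemap or a crawl:

```python
# Hypothetical old-URL -> new-URL map (paths invented for illustration).
OLD_TO_NEW = {
    "/products/widget-a.html": "/shop/widget-a",
    "/products/widget-b.html": "/shop/widget-b",
    "/about-us.html": "/about",
}

def resolve(old_path: str):
    """Return (status, target) for a request to an old URL."""
    new_path = OLD_TO_NEW.get(old_path)
    if new_path:
        return 301, new_path   # permanent redirect to the new equivalent
    return 404, None           # no equivalent page exists
```

In practice this lookup would live in the server config (Apache/nginx rewrite rules) rather than application code, but the logic is the same.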
One more thing: when they launched the new site, the indexed pages basically doubled overnight; there were 700 and now there are 1400. Do you think this could affect their ranking as well?
Thank you for your insights
Elio
-
Hello everyone and thank you for your answers. I sincerely appreciate it!
I didn’t follow the redesign phase, I’ve just jumped on board now so I actually have no idea why they didn’t go for the 301 solution.
As Monica pointed out, the 404ed pages were actually valuable pages, and, at least in my opinion, this is proved by the fact that their traffic is now close to 0. It literally dropped in a matter of days (kind of scary to see such a steep fall). I agree with Travis that only the valuable pages should be 301ed, but the thing is that they sell their products online, so hypothetically every product page is equally important. The pages were neither old nor poor quality; I guess they just skipped the 301 step. I will do some more research, but as you suggest, the best way forward is probably to 301 all those pages and see what happens.
I have no idea if they did anything on the social side but that’s worth investigating some more.
Thank you very much for now! I will keep you updated
Cheers
-
I would imagine that if the pages were previously ranking, they had value. My rule of thumb is to discard pages that aren't ranking on pages 1-3. Since there has been such a decrease in traffic, it is reasonable to assume that valuable pages were 404ed when they should have been 301ed.
I have migrated 7 sites over the past 5 years, so I feel reasonably comfortable saying the duplicated pages are causing the influx of indexed pages. Redirecting the 404 pages is the strongest strategy right now. They basically created 700 valueless pages that won't rank until they are fully indexed and gain some value in the engine's eyes, which could take months. It is starting over from zero, which is why 301 redirects are "normally" best practice.
Any 301 will lose a little bit of link juice: the page goes from holding its PageRank alone to diluting that value by passing it through another URL. So while redirecting will help salvage some of the site's juice, it won't put them on page 1 by itself.
You can wait for these pages to start ranking on their own, but that could take months depending on the level of on-page optimization and whether there are any good links pointing at those pages currently. I am not a fan of the wait-and-see game, so I try to do everything I can up front. The 301 redirects of the old pages would be best practice in this situation.
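If you want to prioritize rather than redirect everything at once, here's a rough sketch of how I'd pick candidates, assuming you can export each old URL's last-known ranking position (pages 1-3 of Google is roughly positions 1-30) and its organic sessions. The field names are made up; adapt them to whatever your analytics export gives you:

```python
def pick_redirect_candidates(pages, max_position=30):
    """pages: list of dicts like {"url": ..., "position": ..., "sessions": ...}.
    Keep anything that ranked on pages 1-3 or actually got traffic,
    and return the URLs to redirect, highest-traffic first."""
    keep = [p for p in pages if p["position"] <= max_position or p["sessions"] > 0]
    keep.sort(key=lambda p: p["sessions"], reverse=True)
    return [p["url"] for p in keep]
```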
-
You can recover page authority from a 404 page for a surprising amount of time. I once did a 301 redirect on pages that had been 404 for a couple of years, and they quickly regained rank. What was the thinking behind not redirecting the old pages? Were they poor quality? You don't have to redirect all of them at once; you can start with the best pages (and at least some of them must have been good, since you had traffic to lose).
-
When a page is a 404, The Googles will come back to it after an undisclosed period of time, just to make sure the page is really gone. Now, if the pages that are gone used to receive referral traffic, it would be super handy to get those pages back up soon, search engines aside. That way you're recovering links and pages for the right reasons.
Your first order of questioning should be whether those pages were worth anything to begin with. I can rank a site for 'left handed profession city st' overnight; it doesn't mean any of that is going to work for the client.
But if they didn't redirect any of the old pages to their new, relevant, equivalents - I highly doubt they took the time to block those pages via robots.txt. If they did, wow. I'll leave it at that.
The increase of indexed pages could be due to any number of things. Perhaps a site search function is misconfigured? Perhaps the site uses tags in a way I wouldn't recommend? Perhaps the CMS, if there is one, is prone to duplicate content.
That's pretty much the best I can do without a specific example. Anyone with more 'skeelz' than I would be guessing as well. But thanks much for your question.
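To make the duplicate-content guess concrete: one quick check is to take the list of indexed URLs (from a `site:` query export or a crawl) and see how many collapse to the same path once query strings and trailing slashes are stripped. This is just a sketch of the idea, not a full canonicalization audit:

```python
from urllib.parse import urlsplit

def duplicate_groups(urls):
    """Group indexed URLs that collapse to the same host+path once the
    query string and trailing slash are stripped -- a quick way to spot
    the kind of duplication (site search, tags, tracking parameters)
    that can double an index count overnight."""
    groups = {}
    for u in urls:
        parts = urlsplit(u)
        key = parts.netloc + parts.path.rstrip("/").lower()
        groups.setdefault(key, []).append(u)
    # Only the keys with more than one URL are interesting.
    return {k: v for k, v in groups.items() if len(v) > 1}
```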
-
Ugh... I hate when this happens. It is such a pain in the butt to fix.
First, you absolutely need those 301 redirects. Don't wait any longer to get them done; those 404s are affecting your rankings considerably at this point. You basically have 700 of them. Whoa.
Second, the doubled index is because you have 700 new pages added on top of the 700 old pages. You can wait it out if you want to, but I don't recommend it. Remove the robots.txt block on those old pages and 301 them so the rankings might be salvaged (Google can't see a redirect on a page it's blocked from crawling). Once the new pages start ranking on their own you can get rid of the 301s. But, for now, get them going.
The 301s pass a little bit of juice to the new pages, and that is a good thing. They matter because the old pages are still ranking and bringing traffic to your site; the new pages will then start to get some traffic of their own, which in turn will help their rankings.
Did you do anything on social with the site redesign? If you send out a post, you might be able to salvage some traffic from your followers. Social signals will also help the rankings of the new pages.
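One more practical note: before writing the redirects, it helps to crawl the old URL list and triage what each one returns today. The statuses would come from whatever crawler you use (curl, Screaming Frog, etc.); this is just a sketch of the sorting logic:

```python
def triage(url_status):
    """url_status: dict of {old_url: http_status_code}.
    Return a recommended action for each old URL."""
    actions = {}
    for url, status in url_status.items():
        if status in (404, 410):
            actions[url] = "needs 301 to its new equivalent"
        elif status in (301, 308):
            actions[url] = "already redirected -- verify the target is the right page"
        elif status == 200:
            actions[url] = "still live -- check for duplicate content with the new site"
        else:
            actions[url] = "investigate"
    return actions
```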