Bad site migration - what to do!
-
Hi Mozzers - I'm just looking at a site which has been damaged by a very poor site migration.
Basically, the old URLs were 301'd to a page on the new website (not a 404) telling everyone the page no longer existed.
They did not 301 old pages to equivalent new pages. So I just checked Google WMT and saw 1,000 crawl errors - basically the old URLs.
This migration was done back in February, and traffic to the website has never recovered since. Should I fix this now? Is it worth implementing the correct 301s now, after so much time has passed?
-
Thanks for the advice, Paul - I've left the pages with poor backlink profiles out of the redirects and 301'd the rest to relevant pages within the new website, and traffic has spiked up a bit. There were some very powerful links (national newspapers, etc.) that the new site was missing out on.
-
Gonna have to disagree with Rob a bit here on one point. Definitely agree that it's still worth implementing the correct 301 redirects. However, by only doing it for pages already getting traffic or having links, you're going to miss out on a potentially substantial amount of future traffic, especially long-tail (which is very valuable).
There can be many pages that Google has indexed and that can still appear in search results even though they have no incoming links. To my mind, it would be a shame to throw away that traffic. It is possible to collect a list of all the pages of your site Google has in its index, but it's often easier to just redirect them all correctly. Often a whole group of pages can be handled with a single pattern-based redirect, which cuts down the work involved.
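To illustrate, here's a minimal sketch of both kinds of redirect in Python/Flask - the URL patterns are hypothetical, and on Apache or Nginx the same mapping would just be a few Redirect 301 or rewrite rules:

```python
from flask import Flask, redirect

app = Flask(__name__)

# One-to-one redirects for pages that need individual mapping
# (hypothetical example paths).
EXACT_REDIRECTS = {
    "/old-about.html": "/about",
    "/old-contact.html": "/contact",
}

# Flask matches specific routes first, so this catch-all only sees
# URLs that don't exist on the new site.
@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    path = "/" + old_path
    if path in EXACT_REDIRECTS:
        return redirect(EXACT_REDIRECTS[path], code=301)
    # A whole group of pages handled by one pattern rule,
    # e.g. /products/123-widget.html -> /shop/123-widget
    if path.startswith("/products/"):
        slug = path[len("/products/"):].removesuffix(".html")
        return redirect("/shop/" + slug, code=301)
    # Anything unmatched falls through to a normal 404.
    return "Not found", 404
```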
At the very least, make certain you have implemented an effective 404 page, and also ensure you have included your Google Analytics tracking code on the 404 page. Then you can use a combination of Analytics and Webmaster Tools to catch any incoming 404s and get them redirected immediately.
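For example, a custom 404 handler might look something like this - a minimal sketch in Flask, assuming a Python-backed site; "G-XXXXXXX" is a placeholder Analytics measurement ID, and on other stacks the equivalent is an ErrorDocument or error_page directive pointing at a template containing the same tracking snippet:

```python
from flask import Flask

app = Flask(__name__)

# "G-XXXXXXX" is a placeholder measurement ID - swap in your own.
NOT_FOUND_PAGE = """<!doctype html>
<html>
<head>
  <title>Page not found</title>
  <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
  <script>
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}
    gtag('js', new Date());
    gtag('config', 'G-XXXXXXX');
  </script>
</head>
<body>
  <h1>Sorry, that page no longer exists.</h1>
  <p><a href="/">Back to the homepage</a></p>
</body>
</html>"""

@app.errorhandler(404)
def not_found(error):
    # Serve the page with a genuine 404 status so crawlers register the
    # error, while the tracking snippet records the hit in Analytics.
    return NOT_FOUND_PAGE, 404
```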
Does that make sense?
Paul
-
Thanks SErOb - good advice!
-
At this point, I'd only adjust 301s for the pages actual users are still trying to access and for pages with good backlinks (one way to find those is sketched below).
Google will always have a record of the old pages in WMT; don't fret over that.
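One quick way to find the pages real users are still requesting is to mine your server logs for 404s. A rough sketch, assuming a common/combined-format access log (the log path below is hypothetical):

```python
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path - use your own

# Matches request lines like: "GET /old-page.html HTTP/1.1" 404
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[^"]*" 404 ')

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if match:
            counts[match.group(1)] += 1

# The most-requested missing URLs are the ones to redirect first.
for url, hits in counts.most_common(20):
    print(f"{hits:6d}  {url}")
```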