Safest way to launch a redesign
-
Hey MozFolk,
I was wondering what the best and SAFEST way to handle this situation is: I am doing a redesign of our current website, but the new site will have different content. Should we just forward the entire root domain in the .htaccess file, or redirect each and every URL with a 301? I know these terms, but I've never actually done this myself, and I can't risk losing the SEO weight of this website.
How do I handle a group of pages that they don't want to keep using? Do I just leave those URLs be, or do I forward all of them to one new page (or the homepage) on the new site?
Please help me look like a rockstar and save the ship from sinking itself!
-
Hey, thanks! The site is less than 30 pages. But would it be just as acceptable to forward the entire domain, or is it better practice to 301 each individual page? And if I do 301 each page, do we ALSO need to forward the domain?
I appreciate your help!
Derek
-
It can be a pretty big job if you change the URL structure, depending on how large the website is.
Modesto Siotos wrote a blog post on it here.
If you change the structure then, ideally, you should forward every page to the closest replacement page on the new design using a 301.
Give the blog a good read, it should hopefully help.
Matt
-
How big is the site? It would be ideal to either keep the existing site structure so you don't need to worry about redirecting pages OR 301 as many pages as possible to the new URLs for those pages. If you have any pages on the current site that won't have a counterpart on the new site, then I recommend 301ing as many of those pages to the most relevant page on the new site as possible.
Of course, if your site is thousands or millions of pages large, then it's likely not feasible to manually go in and redirect every single page, so I'd recommend at least redirecting the most popular pages to their new counterparts and then possibly redirecting all of the remaining pages to the most relevant mid-level pages or even the homepage.
The last thing you want to do is simply redirect all of the pages to the root domain...
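To make that concrete, here is a minimal .htaccess sketch of the per-page approach for a site this size. Every path and domain below is invented; the real work is building the old-to-new URL mapping first.

    # Hypothetical mapping for a ~30-page redesign (all paths invented).
    # One permanent redirect per old URL, pointing at its closest
    # replacement on the new site:
    Redirect 301 /services/web-design https://www.example.com/design
    Redirect 301 /about-the-team https://www.example.com/about

    # Retired pages with no real counterpart go to the most relevant
    # surviving page, not blindly to the homepage:
    Redirect 301 /old-newsletter https://www.example.com/blog

    # Optional backstop for anything missed; mod_alias applies the
    # first matching rule, so keep this last:
    RedirectMatch 301 ^/(.*)$ https://www.example.com/$1

This also answers the "do we ALSO need to forward the domain" part: as long as the old domain keeps resolving to a server that serves these rules, every URL (mapped or missed) 301s somewhere sensible, so there is nothing extra to forward at the domain level.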
Related Questions
-
What is the fastest way to disassociate an old URL from a new domain name?
We have a client with an old domain which was spammy (bad links). Until two months ago, it was forwarding to his current domain and (I believe) causing a penalty. Two months ago we transferred ownership of the spammy domain to a third party and set up an unrelated blog for Google to pick up on. Google did pick up on it. After two months, Google Webmaster Tools is still showing 200 links from the old, spammy domain to the new domain. Also, when you search the company name, the spammy domain still appears in the results (page two). Is there a faster way to disassociate the old domain entirely from the business? E.g., just delete the domain, forward the domain to another website, etc.? If you have experience with this, I'd love to hear from you. Thanks!
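For anyone in a similar spot who still controls the old domain, one hedged option is to stop forwarding it anywhere and answer 410 Gone for every URL, which tells crawlers the content was removed deliberately. A minimal sketch, assuming the old domain runs Apache with mod_rewrite:

    # On the OLD (spammy) domain: no forwarding at all; every
    # request gets an HTTP 410 Gone response.
    RewriteEngine On
    RewriteRule ^ - [G]

This only addresses the domain itself; the reported links would still need handling separately (e.g., via a disavow file).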
Intermediate & Advanced SEO | mgordon
-
Only homepage is ranking after site re-launch
We've been moving all our sites over to a new platform (Demandware) this year. In the process, they've all gotten updated designs (from the same template), on-page optimizations, etc. Since they're all on the same platform and are essentially copies of one template, any technical issues found have been fixed across all sites. The problem I'm seeing is that a few sites haven't seen much, if any, recovery since launch, and these are sites that were done 4-5 months ago. One in particular is especially concerning, since it's showing issues that none of the other sites seem to have. In my Moz reports, it looks like only the https version of the homepage is ranking for any keywords (and from what I'm seeing, the https version wasn't picked up and ranked until the beginning of October, which was also when WMT showed a huge drop in clicks and impressions). I've crawled the site (Screaming Frog), done a site search in Google (all pages look to be indexed), etc., and I haven't come across any specific problems that would suggest a technical issue. We're wondering if it might be a link authority problem, since this site had the most dramatic change in navigation. The navigation used to be product-based (Boots, Shoes, etc.) and is now broken up by gender. I've noticed that a few other pages that are ranking are dual-gender pages that also existed on the old site, whereas the new gender categories aren't ranking at all, and I'm not seeing this happen with any of our other sites. I've gone down a bunch of different paths trying to figure this out, but I haven't come up with any concrete answers as to why this is happening or how to fix it. Any thoughts as to what else I can look into or try?
Intermediate & Advanced SEO | WWWSEO
-
Quickest way to deindex large parts of a website
Hey there, my client's website was set up with subdirectories for almost every country in the world, plus multiple languages in each country. The content in each subfolder is (almost) identical, so no surprise: they have a big problem with duplicate content and ranking fluctuations. Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder with robots.txt. However, before doing this we marked the content to be excluded with noindex, nofollow. It's only been 2 days, but I hardly notice any decline in the number of indexed pages. I was therefore wondering if it would speed things up if I marked the pages with just noindex instead of noindex, nofollow. It would be great if you could share your thoughts on that. Cheers, Jochen
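On the speed question, one caveat worth flagging: once robots.txt blocks those folders, crawlers can no longer fetch the pages at all, so they never see the noindex. A hedged .htaccess sketch of an alternative that keeps the pages fetchable while serving the directive over HTTP (the /de/ and /fr/ folder names are invented placeholders):

    # Serve an HTTP "noindex" for the duplicate language folders so
    # crawlers can still fetch the pages and see the directive.
    SetEnvIf Request_URI "^/(de|fr)/" duplicate_lang
    Header set X-Robots-Tag "noindex" env=duplicate_lang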
Intermediate & Advanced SEO | Online-Marketing-Guy
-
Drop in traffic after redesign
Is it common for a site to see slight traffic drops after a site redesign (containing cleaner code, more usability and basically just being more helpful for the end user)? A new site of ours went live last Wednesday and has experienced a drop in traffic. If you have seen this in your own site, how did you recover? And how long did the recovery take?
Intermediate & Advanced SEO | Gordian
-
Is there a way to keep sitemap.xml files from getting indexed?
Wow, I should know the answer to this question. Sitemap.xml files have to be accessible to the bots for crawling, so they can't be disallowed in robots.txt, and you can't block the folder at the server level. So how can you allow the bots to crawl these XML files but keep them from showing up in Google's index when doing a site: command search? Or is that even possible? Hmmm
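One approach that fits these constraints, sketched here for Apache (hedged; adjust for your server): leave sitemap.xml fully crawlable, but attach a noindex as an HTTP header. Unlike robots.txt, this doesn't block fetching; the bot has to fetch the file to see the header, which is exactly what you want.

    # .htaccess: bots may fetch sitemap.xml freely, but engines are
    # asked not to show it in search results.
    <Files "sitemap.xml">
      Header set X-Robots-Tag "noindex"
    </Files>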
Intermediate & Advanced SEO | irvingw
-
Easy way to get some do-follow links for a new site
I am launching a new website, and when I search for "list of do-follow websites" I find lots of people posting their lists. Rather than individually signing up for hundreds of sites, one link at a time, is there a tool that can automate this?
Intermediate & Advanced SEO | StreetwiseReports
-
What is the best way to allow content to be syndicated on other sites without risking duplicate content filters?
Cookstr appears to be syndicating content to shape.com and mensfitness.com:
a) They integrate their data into partner sites, with an attribution back to their site, skinned with the partner's look.
b) They link the image back to the image hosted on cookstr.com.
c) The partner page doesn't have microformats or as much data as their own page does, so their own page is better for SEO.
Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance!
Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta
Their syndicated content pages: http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta and http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
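One widely used addition here, assuming the partners are willing to change their templates: a cross-domain rel="canonical" on each syndicated copy, pointing back at the original, so engines consolidate ranking signals onto the Cookstr URL. A sketch of what the shape.com copy's head could carry (hypothetical markup; the partner has to implement it):

    <!-- On the syndicated copy: point engines at the original -->
    <link rel="canonical"
          href="http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta" />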
Intermediate & Advanced SEO | irvingw
-
Has anyone found a way to get sitelinks in the SERPs?
I am wanting to get some sitelinks in the SERPs to increase the size of my "space"; has anyone found a way of getting them? I know Google says it's automatic and only generated if they feel it would benefit searchers, but there must be a rule of thumb to follow. I was thinking along the lines of a tight categorical system, implemented throughout the site, that is clearly related to the content (how it should be, I guess)... Any comments or suggestions welcome.
Intermediate & Advanced SEO | CraigAddyman