Website Redesign & Ensuring Minimal Traffic/Ranking Loss
-
Hi there,
We have undergone a website redesign (mycompany.com) and our site is ready to go live. However, the new website is built on a different platform, so our blog pages will not be copied over, to avoid a large web development expense.
Our intention is therefore to leave all the blog pages as they are (on the old design) but move them to the domain blog.mycompany.com, with 301 redirects inserted on mycompany.com for each blog post pointing to the corresponding blog.mycompany.com URL. Is there anything else we should do to ensure minimal traffic/ranking loss?
Thank you so much for your help.
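The per-post 301 rule described above boils down to a simple path rewrite. As a rough sketch, assuming legacy posts live under a hypothetical /blog/ path on mycompany.com (the actual rule would be implemented in the web server or CMS):

```python
from urllib.parse import urlparse

def blog_redirect_target(old_url):
    """Return the blog.mycompany.com URL a legacy blog post should 301 to,
    or None when the URL is not a blog post.

    Assumes legacy posts lived under /blog/<slug> (a hypothetical prefix)."""
    parts = urlparse(old_url)
    if parts.netloc != "mycompany.com" or not parts.path.startswith("/blog/"):
        return None  # not a blog post, so no redirect is needed
    slug = parts.path[len("/blog/"):]
    return "https://blog.mycompany.com/" + slug
```

Each legacy post URL would then get a server-level 301 pointing at the value this function returns.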
-
Having performed upwards of 80 migrations without any real traffic loss lasting more than a week, I can say it's because I follow the rules very thoroughly. When you get to the bottom of this, please use one of the crawlers mentioned, and run a complete search and replace across the entire site when necessary, just to make sure everything's in place.
I don't know what type of website you're running, but if it is WordPress, or if you want to get some extra traffic, I would make sure that the blog is a subfolder. If it is WordPress, you can do this on a managed host platform like Pagely, Servebolt, or Kinsta for around $50 a month.
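The site-wide search and replace mentioned above can be scripted. A minimal sketch, assuming the site has been exported to a local directory of HTML files and that legacy blog URLs use a hypothetical mycompany.com/blog/ prefix:

```python
from pathlib import Path

OLD_PREFIX = "https://mycompany.com/blog/"   # hypothetical legacy URL prefix
NEW_PREFIX = "https://blog.mycompany.com/"   # new subdomain

def rewrite_blog_links(site_root):
    """Replace legacy blog URLs with subdomain URLs in every HTML file.

    Returns the number of files that were changed."""
    changed = 0
    for path in Path(site_root).rglob("*.html"):
        text = path.read_text(encoding="utf-8")
        if OLD_PREFIX in text:
            path.write_text(text.replace(OLD_PREFIX, NEW_PREFIX),
                            encoding="utf-8")
            changed += 1
    return changed
```

Running this once over the exported site, then re-crawling, is a quick way to confirm no internal links still point at the old blog URLs.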
Redirect mapping process
If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.
The redirect mapping file is a spreadsheet that includes the following two columns:
- Legacy site URL –> a page’s URL on the old site.
- New site URL –> a page’s URL on the new site.
When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.
Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part of the site migration cycle where things often go wrong.
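That pre-launch redirect testing can be partly automated. A minimal sketch using only the Python standard library, assuming the mapping spreadsheet has been exported as a CSV with legacy_url and new_url columns; the resolve parameter is injectable so the check can be dry-run without network access:

```python
import csv
import urllib.request

def final_url(url):
    """Follow redirects and return the URL we end up on (network call)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.url

def check_redirects(mapping_csv, resolve=final_url):
    """Verify each legacy URL ends up at its mapped destination.

    Returns a list of (legacy_url, problem) tuples for failures."""
    failures = []
    with open(mapping_csv, newline="") as f:
        for row in csv.DictReader(f):
            try:
                landed = resolve(row["legacy_url"])
                if landed.rstrip("/") != row["new_url"].rstrip("/"):
                    failures.append((row["legacy_url"], "landed on " + landed))
            except Exception as exc:
                failures.append((row["legacy_url"], str(exc)))
    return failures
```

Note this only confirms the final destination; checking that each hop is specifically a 301 (not a 302 or a chain) is best done in a crawler like the ones listed in the appendix below.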
Increasing efficiencies during the redirect mapping process
Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
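The attribute-based automation described above is essentially a join on a unique key. A hypothetical sketch matching on H1 headings, assuming both the legacy and new crawls have been exported as CSVs with url and h1 columns:

```python
import csv

def load_index(crawl_csv, key="h1"):
    """Index a crawl export by attribute value, discarding values that
    appear on more than one page: a repeated attribute would otherwise
    produce an incorrect mapping."""
    index, duplicated = {}, set()
    with open(crawl_csv, newline="") as f:
        for row in csv.DictReader(f):
            value = row[key].strip().lower()
            if value in index:
                duplicated.add(value)
            index[value] = row["url"]
    return {v: url for v, url in index.items() if v not in duplicated}

def build_redirect_map(legacy_csv, new_csv, key="h1"):
    """Pair each legacy URL with the new URL sharing the same attribute.

    Returns the mapping plus any legacy URLs needing manual attention."""
    legacy = load_index(legacy_csv, key)
    new = load_index(new_csv, key)
    mapping = {legacy[v]: new[v] for v in legacy if v in new}
    unmatched = sorted(legacy[v] for v in legacy if v not in new)
    return mapping, unmatched
```

Anything left in the unmatched list still has to be mapped by hand, ideally to the most relevant page or its parent category, as described above.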
Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.
https://moz.com/blog/website-migration-guide
Appendix: Useful tools
Crawlers
- Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
- Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
- DeepCrawl: Cloud-based crawler with the ability to crawl staging sites and compare different crawls. Copes well with large websites.
- Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
- On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.
Handy Chrome add-ons
- Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
- User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
- Ayima Redirect Path: A great header and redirect checker.
- SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
- Scraper: An easy way to scrape website data into a spreadsheet.
Site monitoring tools
- Uptime Robot: Free website uptime monitoring.
- Robotto: Free robots.txt monitoring tool.
- Pingdom Tools: Monitors site uptime and page speed from real users (RUM service).
- SEO Radar: Monitors all critical SEO elements and fires alerts when these change.
- UltraDNS Tools: Monitors changes to DNS.
Site performance tools
- New Relic: By far the most comprehensive site performance and monitoring tool listed. The price is very steep, however; it's my favorite tool, but that doesn't mean it's required.
- PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
- Lighthouse: Handy Chrome extension for performance, accessibility, and Progressive Web App audits. Can also be run from the command line, or as a Node module.
- Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
- DareBoost: Very helpful and accurate as well, covering everything you need to know.
Structured data testing tools
- Google’s structured data testing tool (and its Chrome extension)
- Bing’s markup validator
- Yandex structured data testing tool
- Google’s rich results testing tool
Mobile testing tools
Backlink data sources
I hope this helps,
Tom
-
You may want to:
1. Update your mycompany.com sitemap
2. Create an additional sitemap for your blog that sits on blog.mycompany.com
3. List both sitemaps or sitemap index files in your root robots.txt file
4. "Submit" the sitemaps to Google through Google Search Console. (I say "submit" because you really just point them to the URL. Their crawlers should find it regardless, however, this might make the discovery process swifter.)
In Google Search Console, you'll need to make sure you have claimed (and verified) ownership at the domain level. This will cover both your domain and the new subdomain. It's up to you whether you also want to claim ownership at the URL-prefix level so that blog.mycompany.com is broken out separately and can have the new blog sitemap added there. https://support.google.com/webmasters/answer/34592
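Steps 1–3 above can be sketched with the standard library. A minimal example that builds a sitemap index covering both properties and prints the corresponding robots.txt lines (the file locations are assumptions, not the actual paths on mycompany.com):

```python
import xml.etree.ElementTree as ET

SITEMAPS = [
    "https://mycompany.com/sitemap.xml",       # main-site sitemap (step 1)
    "https://blog.mycompany.com/sitemap.xml",  # blog-subdomain sitemap (step 2)
]

def sitemap_index(sitemap_urls):
    """Build a sitemap index document listing each sitemap URL."""
    root = ET.Element("sitemapindex",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in sitemap_urls:
        ET.SubElement(ET.SubElement(root, "sitemap"), "loc").text = url
    return ET.tostring(root, encoding="unicode")

print(sitemap_index(SITEMAPS))

# robots.txt entries (step 3): one Sitemap line per file
for url in SITEMAPS:
    print("Sitemap: " + url)
```

The index file itself can then be the single URL you point Google Search Console at in step 4.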
Hope this helps!