Website Redesign & Ensuring Minimal Traffic/Ranking Loss
-
Hi there,
We have undergone a website redesign (mycompany.com) and our site is ready to go live. However, the new website is built on a different platform, so our blog pages will not be copied over - this avoids a large web developer expense.
Our intention is to leave all the blog pages as they are (on the old web design) but move them to the subdomain blog.mycompany.com, with 301 redirects inserted on mycompany.com for each blog post pointing to the corresponding blog.mycompany.com URL. Is there anything else we should do to ensure minimal traffic/rankings are lost?
Thank you so much for your help.
-
I've performed maybe upwards of 80 migrations like this without any real traffic loss lasting more than a week. That's because I follow the rules very thoroughly. When you get to the bottom of this answer, please use one of the crawlers mentioned, and run a complete search-and-replace across the entire site when necessary, just to make sure everything's in place.
I don't know what type of website you're running, but if it is WordPress and you want to get some extra traffic, I would make sure the blog is a subfolder rather than a subdomain. If it is WordPress, you can do this on a managed host platform like Pagely, Servebolt, or Kinsta for just $50 a month.
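One common way to keep a blog on a separate platform while serving it from a subfolder is a reverse proxy - here's a minimal nginx sketch with a hypothetical backend hostname (WordPress would also need its site and home URLs set to the /blog/ address):

```
# nginx on www.mycompany.com - serve the separately hosted blog under /blog/
# (blog-backend.mycompany.com is a hypothetical origin for the blog platform)
location /blog/ {
    proxy_pass https://blog-backend.mycompany.com/;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto $scheme;
}
```

Managed hosts can usually set something like this up for you, so treat the snippet as an illustration of the approach rather than a drop-in config.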
Redirect mapping process
If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.
The redirect mapping file is a spreadsheet that includes the following two columns:
- Legacy site URL –> a page’s URL on the old site.
- New site URL –> a page’s URL on the new site.
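A few rows of such a file might look like this (the URLs are made up for illustration):

```
legacy_url,new_url
https://www.mycompany.com/blog/widget-buying-guide,https://blog.mycompany.com/widget-buying-guide
https://www.mycompany.com/blog/category/news/,https://blog.mycompany.com/category/news/
```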
When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.
Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.
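As a rough sketch of what that implementation can look like (assuming an Apache server; the paths are placeholders matching the example rows above), each row of the mapping file becomes a one-to-one 301 rule:

```
# .htaccess on the legacy host - one rule per row of the mapping spreadsheet
Redirect 301 /blog/widget-buying-guide https://blog.mycompany.com/widget-buying-guide
Redirect 301 /blog/category/news/ https://blog.mycompany.com/category/news/
```

Whatever the server technology, the key is that every redirect is a single hop straight to the mapped destination, which is exactly what you then verify when testing on staging.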
Increasing efficiencies during the redirect mapping process
Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
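As a minimal sketch of that kind of automation (assuming both the legacy crawl and the staging crawl have been exported to CSV with a unique attribute, such as an SKU, per URL - the file and column names here are hypothetical):

```python
import csv

def load_urls_by_attribute(path, url_col="url", key_col="sku"):
    """Map a unique page attribute (e.g. an SKU) to its URL from a crawl export."""
    mapping = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key = row[key_col].strip()
            if key in mapping:
                # Non-unique attribute: fail loudly rather than risk a wrong redirect
                raise ValueError(f"Attribute {key!r} is not unique; pick a better key")
            mapping[key] = row[url_col]
    return mapping

legacy = load_urls_by_attribute("legacy_crawl.csv")
new = load_urls_by_attribute("staging_crawl.csv")

with open("redirect_map.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["legacy_url", "new_url"])
    for key, legacy_url in legacy.items():
        # Fall back to manual mapping when no match exists on the new site
        writer.writerow([legacy_url, new.get(key, "NEEDS MANUAL MAPPING")])
```

Anything flagged for manual mapping can then be matched by hand to the most relevant page, per the guidance above.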
Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping.
https://moz.com/blog/website-migration-guide
Appendix: Useful tools
Crawlers
- Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
- Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
- Deep Crawl: Cloud-based crawler that can crawl staging sites, compare different crawls against each other, and cope well with large websites.
- Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
- On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.
Handy Chrome add-ons
- Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
- User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
- Ayima Redirect Path: A great header and redirect checker.
- SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
- Scraper: An easy way to scrape website data into a spreadsheet.
Site monitoring tools
- Uptime Robot: Free website uptime monitoring.
- Robotto: Free robots.txt monitoring tool.
- Pingdom tools: Monitors site uptime and page speed from real users (RUM service).
- SEO Radar: Monitors all critical SEO elements and fires alerts when these change.
- UltraDNS Tools: Free tools for checking DNS records and changes to DNS.
Site performance tools
- New Relic: This is by far the most comprehensive site performance and monitoring tool listed here. However, the price is very steep; it's my favorite tool, but that doesn't mean it's required.
- PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
- Lighthouse: Handy Chrome extension for performance, accessibility, Progressive Web Apps audits. Can also be run from the command line, or as a Node module.
- Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.
- DareBoost: Also very helpful and accurate; it surfaces most of what you need to know about a page's performance.
Structured data testing tools
- Google’s structured data testing tool (also available as a Chrome extension)
- Bing’s markup validator
- Yandex structured data testing tool
- Google’s rich results testing tool
Mobile testing tools & backlink data sources: see the full migration guide linked above for these lists.
I hope this helps,
Tom
-
You may want to:
1. Update your mycompany.com sitemap
2. Create an additional sitemap for your blog that sits on blog.mycompany.com
3. List both sitemaps or sitemap index files in your root robots.txt file
4. "Submit" the sitemaps to Google through Google Search Console. (I say "submit" because you really just point them to the URL. Their crawlers should find it regardless, however, this might make the discovery process swifter.)
In Google Search Console, you'll need to make sure you have claimed (and verified) ownership at the domain level. This will cover both your domain and the new subdomain. It's up to you whether to also claim a URL-prefix property so that blog.mycompany.com is broken out separately and can have the new blog sitemap added there. https://support.google.com/webmasters/answer/34592
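Domain-level verification is done via a DNS TXT record; a hypothetical example (the token value is made up):

```
; DNS TXT record for Search Console domain verification (illustrative token)
mycompany.com.  3600  IN  TXT  "google-site-verification=abc123exampletoken"
```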
Hope this helps!