Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
We switched the domain from www.blog.domain.com to domain.com/blog.
-
We moved our blog from www.blog.domain.com to domain.com/blog. The goal was to consolidate backlinks so they would benefit both our main website and our blog. Since the move, our organic traffic has dropped sharply and we appear to have lost the backlinks. All of the old URLs are being 301 redirected. Kindly suggest changes to bring back the traffic.
-
Hi Arun
It's certainly best practice to move the blog into a subdirectory of the root domain. As you say, visitors then come to one domain, not a subdomain. All you need to do is redirect page by page with a 301. When you say the old URLs are being 'redirected to 301 code', that is perfectly OK: the 301 status code just tells Google that the page has moved permanently.
It takes Google a short while to recognise the new pages as replacements for the old ones, and during that period you may see both old and new URLs in Google, causing a short spell of duplication which could affect rankings.
You just need to sit it out - by all means, do a Fetch in Search Console to help speed up the process (Search Console > Crawl > Fetch as Google).
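The page-by-page 301 approach described above can be sketched as a small script that generates Apache-style redirect rules. The old and new paths below are hypothetical examples, not taken from the question; substitute your real blog URLs:

```python
# A minimal sketch of generating page-by-page 301 redirect rules in
# Apache .htaccess syntax. The paths are hypothetical placeholders.

def redirect_rules(path_map, new_host="https://domain.com"):
    """Return one 'Redirect 301' line per old path -> new absolute URL."""
    return [f"Redirect 301 {old} {new_host}{new}" for old, new in path_map.items()]

rules = redirect_rules({
    "/2017/01/my-first-post/": "/blog/my-first-post/",
    "/about/": "/blog/about/",
})
for rule in rules:
    print(rule)
```

Each generated line maps one old subdomain path to its new subdirectory URL, which is exactly the page-by-page mapping a 301 migration needs.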
Regards
Nigel
-
Let's separate your case into parts so it is much easier to tackle the problem.
From my point of view, you made the right choice. But like any choice, it needs some level of preparation.
(This is a personal opinion based on my experience and knowledge. You probably already know some or all of the tools and processes I mention, but I prefer to state them rather than assume you know them.)
1- Subdomains Are Essentially a Different Website
When you use the blog.website.com subdomain solution, you are essentially setting up an entirely different website. And while it is true that Google will crawl and index both of them, you are limiting the full potential of your online marketing efforts.
When you separate your website and blog, it creates two separate entities that need your attention. And now, with things like Time On Site and Bounce rate contributing to your website rankings, you can’t let users spend their time on pages that Google sees as a different domain.
When your blog and website are properly integrated, on the other hand, Google will see that the traffic to your website as a whole continues to go up. This, to Google, translates as a website that has some obvious authority and deserves higher rankings.
As long as you keep your blog in a subdirectory or subfolder, it will keep the Google bots coming to your main website to recrawl and index your site over and over.
2- SEO Considerations For Any Website Migration
In my case, to ensure that any website migration goes smoothly and leads to improved business, I follow these essential recommendations. To improve the user experience of your website, make sure you're putting all of that valuable analytics data to use by reviewing:
- Top-viewed website content – Make sure you aren’t cutting content your audience loves.
- Least-viewed website content – Even the best sites have some junk, take this opportunity to drop it or improve it.
- Click maps – Looking at where people are clicking (or trying to click) can help you design an intuitive and frustration-free navigation interface.
- Paths to conversion – Regardless of what your website goals are (i.e. build subscribers, generate leads), understanding the paths which your visitors are taking to key conversion points can help to optimize these paths to make it easier and more enticing for visitors to convert into customers.
You can pull all of this data from your web analytics tools.
Map URL Redirects
If your website has been around for any amount of time, there's a good chance that you've built up search equity in the form of links and social shares. In addition to tight keyword optimization, these are the primary factors that help to increase the visibility of your content in search engines. Since they are tied to the URLs on your site, a migration in domain or URL structure can snuff out the valuable search equity you've spent time and effort building.
To avoid starting from SEO square one with your new website, it’s important to strategically implement 301 redirects from your old page URLs to the new ones, as this will effectively tell search engines where your new site pages are and that they are replacements for the old versions. In addition, it will ensure that people and bots who follow links to your old URLs will end up in the right place rather than an error page.
In order to map redirects effectively, start by documenting for all your existing pages:
- URL
- Page topic
- Target keyword
- Organic search traffic (I recommend looking at a minimum of 6 months of data)
- Links to page
- Keyword rank
Also document for your planned new site pages:
- URL
- Page topic
- Target keyword
Once you have these two lists compiled, the next step is to map each page on your current site to its planned new location on your soon-to-be-launched site. Redirect mapping isn't rocket science, but it does take some thought (when done correctly). Fortunately, the previous exercise should give you all the information you need.
Of primary concern is topic relevance, in particular for highly trafficked and linked-to pages. When planning redirects, always consider what the experience of a visitor would be if they ended up on the redirect page rather than the original. Would it serve their needs as well or better than the old page? Would it feel confusing? Ideally, the new page should be such a seamless transition that people don’t even notice the switch.
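The mapping step can be sketched in a few lines. The page records below are illustrative stand-ins for your real crawl and analytics export; matching on target keyword is one simple heuristic for topic relevance, with a hub page as a fallback for pages that have no close replacement:

```python
# A sketch of mapping old pages to new ones by shared target keyword.
# All URLs, keywords, and traffic figures here are made up; in practice
# they come from your analytics export (URL, topic, keyword, traffic,
# links, rank).

old_pages = [
    {"url": "/blog/seo-tips", "keyword": "seo tips", "traffic": 1200},
    {"url": "/blog/old-news", "keyword": "company news", "traffic": 15},
]
new_pages = [
    {"url": "/resources/seo-tips", "keyword": "seo tips"},
]

def map_redirects(old_pages, new_pages, fallback="/blog/"):
    """Match each old URL to a new URL targeting the same keyword;
    fall back to a relevant hub page when no direct match exists."""
    by_keyword = {p["keyword"]: p["url"] for p in new_pages}
    return {p["url"]: by_keyword.get(p["keyword"], fallback) for p in old_pages}

mapping = map_redirects(old_pages, new_pages)
```

A keyword match is only a starting point; as noted above, each pairing should still be sanity-checked for whether the new page actually serves the visitor's intent.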
Redirect mapping tools:
- OpenSiteExplorer – Links and social shares
- Google Analytics – Traffic
- SEMRush – Keyword rankings
- Microsoft Excel
Choose Ideal Timing
Even the best planned and executed website migrations come with some downtime and a temporary decrease in traffic (approx. 30%) and search rankings. It's a price worth paying, as a new and improved website can drive significant improvements in business over an outdated and clunky site. However, it's important to time the transition for when it's likely to have the least negative impact on your business.
The best time of year to implement a website migration is when business is likely to be slowest. Companies vary in the degree of seasonality they experience, but most have a 'slow season'. You probably already know when this is, but if not, take a look at your historic yearly web traffic or revenue patterns to determine when your slow season typically occurs.
As with time of year, it also makes sense to migrate your site on a slow day of the week during off hours. For many B2B focused websites, this is late on Friday or Saturday, but make sure to make the decision based on your own analytics, as every site and audience is different.
Analytics
As mentioned earlier, a temporary decrease of approximately 30% in website search traffic and visibility can be expected in the period immediately following a migration, but it's very important to monitor closely to make sure it is indeed temporary and that things are headed in the right direction. Make sure to keep a close eye on:
- Organic search traffic
- Visit bounce rate
- Conversion rates
- Keyword rankings
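One simple way to check that the dip stays temporary is to compare average weekly organic sessions before and after launch. The figures below are made up for illustration:

```python
# Quantify the post-migration dip: percent change in average weekly
# organic sessions versus the period before launch. Numbers are
# illustrative, not real analytics data.

def pct_change(before, after):
    """Percent change from `before` to `after`, rounded to 1 decimal."""
    return round((after - before) / before * 100, 1)

weekly_before = [5400, 5600, 5300, 5500]
weekly_after = [3900, 4100, 4600, 5200]  # recovering week over week

avg_before = sum(weekly_before) / len(weekly_before)  # 5450.0
avg_after = sum(weekly_after) / len(weekly_after)     # 4450.0
print(pct_change(avg_before, avg_after))              # -18.3
```

A dip in this range, trending back up week over week, is consistent with the expected temporary decrease; a flat or worsening line is the signal to re-check your redirects.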
Crawl Errors
Generally, crawl errors like broken links, 404 not found pages or duplicate content will be at their lowest levels on a brand new site, but it's still important to check and fix any errors, especially as these can indicate a mistake during the migration. There are many good automated crawl tools available, but make sure you use one that can find:
- Broken links and 400 error pages
- 500 error pages
- Duplicate content
- Inaccessible content
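As a sketch of what such a crawl report boils down to, here is a minimal audit over hard-coded crawl results; the URLs, status codes, and bodies are invented, and a real run would fetch each URL with a crawler such as Screaming Frog:

```python
# A sketch of post-migration crawl checks: client errors (4xx),
# server errors (5xx), and duplicate content detected by hashing
# page bodies. The crawl results below are hard-coded examples.

import hashlib
from collections import defaultdict

crawl = [
    ("/blog/post-a", 200, "unique body A"),
    ("/blog/post-b", 404, ""),
    ("/blog/post-c", 500, ""),
    ("/blog/post-d", 200, "same body"),
    ("/blog/post-e", 200, "same body"),
]

def audit(crawl):
    """Bucket crawled URLs into client errors, server errors, and
    groups of URLs whose bodies are byte-for-byte identical."""
    report = {"client_errors": [], "server_errors": [], "duplicates": []}
    seen = defaultdict(list)
    for url, status, body in crawl:
        if 400 <= status < 500:
            report["client_errors"].append(url)
        elif status >= 500:
            report["server_errors"].append(url)
        else:
            seen[hashlib.sha256(body.encode()).hexdigest()].append(url)
    report["duplicates"] = [urls for urls in seen.values() if len(urls) > 1]
    return report

report = audit(crawl)
```

Exact-hash matching only catches verbatim duplicates; near-duplicate detection needs fuzzier comparison, which dedicated crawl tools provide.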
In Summary
A website migration may seem like a lot of work, and it most certainly is (when done correctly). But the potential payoffs in an improved experience for your site visitors and increased business for you are more than worth the investment.