Merging several sites into one - best practice
-
I had 2 sites on the web (www.physicseditor.de, www.texturepacker.com) and decided to move them both under one single domain (www.codeandweb.com).
Both sites were ranking very well for several keywords.
I have now redirected the most important pages from the old domains with a 301 redirect to the new subpages (www.texturepacker.com => www.codeandweb.com/texturepacker).
Google still delivers the old domains in its results, but the redirect takes people directly to the new content.
I've already submitted the new sitemap to Google Webmaster Tools. The pages are already in the index but don't really show up in the search results.
How long does it take until Google accepts the new domain and delivers the new content in the search results?
Was what I did OK? Or is there some room for improvement?
SEOmoz will of course not find any information about the new pages, since the new domain is not yet directly linked in Google. But I can't get ranking information for the "old" pages either, since SEOmoz tells me it can't crawl the old domains...
-
Argh. It's too late for 2) - totally missed that.
Would it help to redirect the old sitemap to the new domain's sitemap instead?
-
Thanks - it seems Google picked up the new domain.
I had one bug - I thought PHP would do a 301 redirect, but it sends a 302 by default. This is why it was not updated for some time.
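For anyone hitting the same thing: PHP's header("Location: ...") call responds with 302 Found unless the status code is set explicitly. A minimal sketch (the target URL is just the example from this thread):

<?php
// A bare Location header sends a 302 Found response;
// passing 301 as the third argument makes it permanent.
header("Location: http://www.codeandweb.com/texturepacker", true, 301);
exit;
?>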
-
Hi Andreas,
Looks like you're doing everything right, but I want to make sure all the bases are covered. Depending on the size, link profile, link structure and domain authority of your site, it can take several weeks for Google and other search engines to completely process a domain migration. Here are some important processes not to overlook.
1. Did you file a change of address within Google Webmaster Tools? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=83106
2. When migrating domains, it's important to leave the old sitemap up, so that Google will keep crawling the old URLs and register the 301s (there's a redirect sketch after this list). If you neglect this step, it may take much longer for Google to recrawl the old URLs and see that they've moved.
3. As Joel pointed out, make sure to update as many internal and external links as possible.
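For reference, the host-to-subpath 301 the original poster describes can be done at the web server level; here is a sketch in nginx (assuming nginx serves the old domain; server names follow the example in the question):

# Every request to the old domain gets a permanent redirect to
# the matching path under the new domain's subdirectory.
server {
    listen 80;
    server_name texturepacker.com www.texturepacker.com;
    return 301 http://www.codeandweb.com/texturepacker$request_uri;
}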
That should cover the basics, but there are a million more details you can explore to make the process go more smoothly. For a detailed approach, here are a couple of excellent guides written by some very smart folks.
- https://seogadget.co.uk/domain-migration/
- http://www.seomoz.org/blog/web-site-migration-guide-tips-for-seos
Hope this helps. Best of luck with the migration!
-
You did the right thing with the 301 redirects and by submitting a new sitemap. Another thing to consider, if possible, is identifying the links that previously pointed to those two sites, contacting the site owners or webmasters, and politely asking them to change the old link to your new URL. You should also make sure you are promoting your new site on all social media, and you might consider a small PPC campaign. Also, analyze the keywords you had on both sites and see how they compare to your new site. Keyword density may not be as powerful as before, but it's still relevant. Hope this helps.
Related Questions
-
Our client's Magento 2 site has lots of obsolete categories. Advice on SEO best practice for setting server-level redirects so we can delete them?
Our client's Magento website has been running for at least a decade, so it has a lot of old legacy categories for brands they no longer carry. We're looking to trim down the number of unnecessary URL Redirects in Magento, so my question is: is there an SEO-efficient way to set up permanent redirects at the server level (nginx) that Google will crawl, allowing us at some point to delete the categories and the Magento URL Redirects? If this is good practice, can you then delete the server redirects at some point, once Google has registered them as permanent?
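As an illustration of the server-level approach the question describes, nginx can answer the retired URLs directly with 301s, so the entries can be removed from Magento (a sketch only; the URLs are invented):

# Inside the site's server block: each obsolete category URL
# gets an exact-match location returning a permanent redirect.
location = /brands/retired-brand.html {
    return 301 /brands/;
}
location = /brands/another-old-brand.html {
    return 301 /brands/current-alternative.html;
}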
Technical SEO | | Breemcc0 -
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing, and website design/development. I'm planning on building two websites, one for each market. But I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO Box for the other. Do you think there is a better approach?
Technical SEO | | VELV1 -
What is SEO best practice to implement a site logo as an SVG?
Since it is possible to give an SVG a description, it seems that could be used for the site name: <desc>sitename</desc>. There is also a title element for SVGs: <title>sitename</title>. I've read in a thread from 2015 that it sometimes gets confused with the title tag in the page header (at least by the Moz crawler), which might cause trouble. What is the state of the art here? Any experiences and/or case studies with either method? Either way, it seems to me that best practice, in terms of search engines being able to crawl the logo, is to load the SVG as an image and implement a proper alt attribute. What is your opinion about this? Thanks in advance.
Technical SEO | | twisme1 -
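To make the options in the question above concrete, a sketch of the two markup approaches discussed (all names are placeholders):

<!-- Inline SVG carrying its accessible name in title/desc: -->
<svg role="img" aria-labelledby="logo-title" viewBox="0 0 200 50">
  <title id="logo-title">sitename</title>
  <desc>sitename company logo</desc>
  <!-- ...logo paths... -->
</svg>

<!-- Or reference the SVG as an image and rely on the alt attribute: -->
<img src="/logo.svg" alt="sitename">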
One robots.txt file for multiple sites?
I have 2 sites hosted with Bluehost and was told to put the robots.txt in the root folder and just use the one robots.txt for both sites. Is this right? It seems wrong. I want to block certain things on one site. Thanks for the help, Rena
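For context, robots.txt is fetched per host, so each site would serve its own file from its own document root; a sketch with made-up rules:

# Served at http://site-one.example/robots.txt
User-agent: *
Disallow: /private/

# Served at http://site-two.example/robots.txt
User-agent: *
Disallow: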
Technical SEO | | renalynd270 -
What is the best way to refresh a webpage of a news site, SEO-wise?
Hello all, we have a client which is a sports website. In fact it is a very big website with a huge number of news items per day. This is mostly the reason why it refreshes some of its news-list pages every 420 seconds. We currently use a meta refresh. I have read here and elsewhere that meta refreshes should be avoided. But we don't use it to send visitors to another page or to pass any kind of page authority/juice. Is a JavaScript refresh better in this case? Is there any other better way? What do you think and suggest? Thank you!
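For reference, the two techniques in question might look like this (the 420-second interval is taken from the question):

<!-- Meta refresh: the browser reloads the page after 420 seconds -->
<meta http-equiv="refresh" content="420">

<!-- JavaScript alternative: reload the same URL after 420 seconds -->
<script>
  setTimeout(function () {
    window.location.reload();
  }, 420 * 1000);
</script>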
Technical SEO | | pkontopoulos0 -
Disallow: /404/ - Best Practice?
Hello Moz Community, My developer has added this to my robots.txt file: Disallow: /404/ Is this considered good practice in the world of SEO? Would you do it with your clients? I feel he has great development knowledge but isn't too well versed in SEO. Thank you in advance, Nico.
Technical SEO | | niconico1011 -
ECommerce: Best Practice for expired product pages
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages. We have thousands of products, and hundreds of our offers only exist for a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be: 1. When a product disappears, a 301 redirect is established to the category page it was in (i.e. a leash would redirect to dog accessories). 2. After a product disappears, a customized 404 page appears, listing similar products (but the server returns a 404). I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. But then again, returning lots of 404s to search engines is also not the best option. Do you know the best practice for large ecommerce sites where hundreds or even thousands of products appear and disappear on a frequent basis? What should be done with those obsolete URLs?
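A minimal sketch of option 1 at the web server level (nginx syntax; the product and category paths are invented for illustration):

# Inside the site's server block: each retired product URL
# gets an exact-match location that 301s to its old category.
location = /products/dog-leash-classic {
    return 301 /category/dog-accessories/;
}
location = /products/cat-tree-deluxe {
    return 301 /category/cat-furniture/;
}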
Technical SEO | | zeepartner1 -
What is best practice for redirecting "secondary" domain names?
For sites with multiple top-level domains that have been secured for a business or organization, I'm curious as to what is considered best practice for setting up 301 redirects for secondary domains. Is it best to do the 301 redirects at the registrar level or at the hosting level, so that .net, .biz, or other secondary domains funnel visitors to the correct primary/main domain name? I'm looking for the "best practice" answer and want to avoid duplicate content problems or penalties from the search engines. I'm not trying to game the system with dozens of domain names, simply the handful of domains that are important to the client. I've seen some registrars recommend hosting secondary domains and doing redirects at the hosting level (and they use a meta refresh for "domain forwarding," which I want to avoid). It seems rather wasteful to set up hosting for a secondary domain and then 301 each URL.
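For what it's worth, a host-level redirect at the web server catches every URL at once, so there is no need to 301 each one individually; a sketch in nginx (example.com stands in for the real domains):

# All requests to the secondary TLDs get a permanent redirect
# to the same path on the primary domain.
server {
    listen 80;
    server_name example.net www.example.net example.biz www.example.biz;
    return 301 http://www.example.com$request_uri;
}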
Technical SEO | | Scott-Thomas0