To merge or not to merge? That is the question.
-
I am planning to do something I've never done before, and I am wondering whether it's really a good idea.
I have four websites, all belonging to the same company, each with a different domain and different content:
- one has been the main official site for 16 years: 200 unique visitors per month, indexed for 134 keywords, Domain Authority 17, 13 linking root domains
- one was the main site from 2003 to 2006; it's focused on a specific business they have since discontinued. Still online, not updated since 2006: 500 unique visitors per month, indexed for 92 keywords, Domain Authority 13, 8 linking root domains
- another was built in 2010 and maintained for less than a year; it's focused on a business they never really started. Still online, not updated since 2010: 3,000 unique visitors per month, indexed for 557 keywords, Domain Authority 25, 84 linking root domains
- a fourth was also built in 2010 and is focused on a business they never really started. Still online, not updated since 2010: 100 unique visitors per month, indexed for 4 keywords, Domain Authority 6, 3 linking root domains
Each website has traffic and links, and all the links are natural: they never tried to gain links in any way, they never did on-page optimization, they never even thought about SEO. The sites are not even interlinked.
So my idea is to merge all of them, moving websites 2, 3 and 4 into subfolders of the main site and replicating the old content there, because those sites have traffic. Incredibly, one of the abandoned sites gets 3,000 unique visitors per month, while the main site gets just 200!
My doubts are:
- does it make sense to merge everything from an SEO perspective?
- apart from doing the 301 redirects correctly, what else should I be careful to do or not to do?
- website number 4 is really outdated, its content and structure are not easy to merge with the rest, and its traffic is really small. Is it worth spending the time to merge it?
Finally, I also have a problem: the customer didn't want to merge the sites at first. They have now agreed, but they don't want visitors of the main site to be able to navigate to the old ones. So once the content is moved and redirected, I would have to put the old pages in the main site's sitemap but avoid linking to them anywhere on the actual "main" site.
As far as I know, Google's crawler doesn't like finding pages in a sitemap that are not reachable through a linking path on the website. Is that correct? Is that going to make all the merging work useless?
Should I convince the client to at least put small links in the footer, or on a page linked from the footer?
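One way to quantify that risk before committing to the merge is to compare the sitemap's URLs against the set of URLs actually reachable by following internal links, and count the orphans. A minimal sketch (the sitemap content and the crawled link set here are made up for illustration; in practice `linked_urls` would come from crawling the merged site):

```python
import xml.etree.ElementTree as ET

# Default namespace used by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def orphan_urls(sitemap_xml, linked_urls):
    """Return sitemap URLs that no internal link points to."""
    in_sitemap = {loc.text.strip()
                  for loc in ET.fromstring(sitemap_xml).iter(SITEMAP_NS + "loc")}
    return sorted(in_sitemap - set(linked_urls))

sitemap = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://main.example/</loc></url>
  <url><loc>http://main.example/old-site/page/</loc></url>
</urlset>"""

linked = {"http://main.example/"}  # URLs discovered while following links
print(orphan_urls(sitemap, linked))
# ['http://main.example/old-site/page/']
```

Every URL this reports is a page Googlebot can only discover via the sitemap, which is exactly the situation the question is worried about.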
-
Thanks for your answer.
The first place Googlebot goes is the sitemap, yes, but is it true or not that finding a page in the sitemap without being able to reach that same page when crawling the website makes Google devalue the page's link juice?
As for the footer, I would just put a link to each subfolder, so only 3 or 4 links. What I don't know is whether small links in the footer are enough to keep Googlebot happy, or whether it would still devalue those pages, since they wouldn't have much interlinkage within the main website.
-
Well, from a business perspective, since site-2 and site-3 are about technologies that could still be used by my client's customers, even if they don't provide them anymore, in my mind they could bring in leads. Site-3 is still attracting natural links even though they haven't updated it in four years, so it must still be valuable to someone.
The 301 redirects don't scare me for site-2 and site-3 because they are WordPress installations: I will download the content and sitemaps, upload the content, and use the sitemaps to generate 1-1 redirect rules with a script. It's not complex.
Site-4 is an application and I have no idea where to start moving it; that's why I now think it's better to drop it.
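That sitemap-driven approach could be sketched like this: a small Python script that reads an old site's sitemap and emits one Apache `Redirect 301` rule per URL, pointing at the same path under the new subfolder. (The domain, subfolder name, and sample sitemap are assumptions for illustration, not the client's actual setup.)

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Default namespace used by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def redirect_rules(sitemap_xml, subfolder):
    """Turn every <loc> in an old site's sitemap into a 1-1
    Redirect 301 rule under a subfolder of the main site."""
    rules = []
    for loc in ET.fromstring(sitemap_xml).iter(SITEMAP_NS + "loc"):
        old_path = urlparse(loc.text.strip()).path or "/"
        new_path = "/" + subfolder.strip("/") + old_path
        rules.append("Redirect 301 %s %s" % (old_path, new_path))
    return rules

# Minimal made-up sitemap for an old WordPress site:
xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://old-site.example/about/</loc></url>
  <url><loc>http://old-site.example/products/widget/</loc></url>
</urlset>"""

for rule in redirect_rules(xml, "old-site"):
    print(rule)
# Redirect 301 /about/ /old-site/about/
# Redirect 301 /products/widget/ /old-site/products/widget/
```

The generated lines would go in the old domain's .htaccess (with the main domain prepended to the target if the redirect crosses domains), giving the 1-1 mapping the answers below recommend.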
-
Thanks. I think I will combine the three more closely related sites and discard the last one; its traffic is small and they wouldn't care about pennies from AdSense.
-
I understand that all the websites get traffic and rankings for different keywords. However, you are saying that some of the services and products are no longer active, and since that is the case (as stated above, one service never even launched), I would question the point of keeping that content alive: it will not convert any new clients, because the service is no longer provided. How would you get a return on investment for all the merge work?
Like sureshchowdary said above, making a list of all the pages and doing a 1-1 redirect is a lot of work (believe me, I know: in February 2014 I did the same thing for a client, redirecting an entire site of roughly 1,000 pages to a new location).
So if I were you, I would look at the effort needed to perform all the work, estimate what the investment would be, and compare it with the expected return. It might be wiser to add some content to the oldest site, redirect all the links, and leave the rest of the content behind.
Just my 2 cents, for your consideration.
Jarno
-
I agree. I would not mix diverse topics. But if they are related, I would combine them, as long as the content is meaningful to someone. I would monetize the traffic with AdSense or other ads.
-
That was also my concern, but I would say all four websites are related to the same branch of business.
It's basically a software house. Site 1 is just about them as a company. Sites 2 and 3 are about old technologies they no longer resell or implement. Site 4 is an online web service for exchanging messages anonymously, which they actually never launched. It's the furthest from what they do now, and I agree with them that it looks unfinished and not very professional.
-
How closely related are the topics of these websites?
Are you mixing fishing, knitting and hydraulic jacks?
-
Hi Max,
It's good to know that even though the websites are not optimized, the keywords are ranking and generating traffic.
1) It does make sense to merge, and all the link weight passes to the first website. You need to do a 1-1 mapping of all the URLs while redirecting, which I think is a big task. And as you said, the customer didn't originally want to merge them.
2) When you are doing a 301 redirect of a whole website, I don't see anything else in particular you need to watch out for.
3) If the keywords are ranking for the 4th website, then it's definitely worth redirecting it.
The first place Googlebot goes when it enters your website is the sitemap.
It's also not good practice to put many links in the footer section; limit the number of links there.