Need to move our highest-traffic content pages into a subdomain and want to minimize the loss of traffic - details inside!
-
Hi All!
So the company that I work for owns two very strong domains in the information security industry. There is a section on each site that draws a ton of long-tail SEO traffic.
On our corporate site we have a vulnerability database where people search for vulnerabilities to research them and find out how to remediate them. On our other website we have an exploit database where people can look up exploits in order to see how to patch against an attacker's attack path.
We are going to merge these into a super database under our corporate domain, and I want to maintain our traffic or at least minimize the loss. The exploit database, which is currently on our other domain, yields about three quarters of that domain's traffic. It is obviously OK if that traffic goes directly to this new subdomain.
What are my options to keep our search traffic steady for this content? There are thousands and thousands of these vulnerabilities and exploits so it would not make sense to 301 redirect all of them. What are some other options and what would you do?
-
Hello Pat,
I do not have experience merging a Ruby site with another type of site, but I think we are conflating issues here anyway. You can have content in a database that gets served up anywhere; you could pull that content into ten different websites if you wanted to. The database issue is almost irrelevant to the SEO issues, which mainly come down to loss of PageRank from URL changes and possible duplicate content. A 301 redirect from each old URL to its new one would take care of both of these issues.
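To make the redirect mechanics concrete, here is a minimal sketch of a pattern-based 301 mapping. The subdomain name (db.example.com) and the path patterns (/exploits/, /vulnerabilities/) are assumptions for illustration only; the real values depend on how your records are keyed.

```python
from urllib.parse import urlparse

# Hypothetical new subdomain -- substitute your real host.
NEW_HOST = "https://db.example.com"

def redirect_target(old_url):
    """Return the 301 Location for an old database URL, or None if no rule applies."""
    path = urlparse(old_url).path
    # Assumed path patterns; adjust to however your records are actually addressed.
    if path.startswith("/exploits/") or path.startswith("/vulnerabilities/"):
        # Preserve the path so every record maps 1:1 to its new URL.
        return NEW_HOST + path
    return None
```

Because the mapping is a path rewrite rather than a per-record lookup, a couple of rewrite rules on the old server can cover thousands of URLs; you never need to enumerate them individually.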
If you are unable to redirect all of the old content, my suggestion would be to figure out which URLs have external links and redirect those. Let all of the others return a 404 or 410 status code so they drop out of the index; the content will exist at the new URLs, and you don't want two URLs with the same content indexed simultaneously.
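A quick sketch of that triage, assuming you have exported (url, external link count) pairs from a backlink tool; the data shape and the zero-link threshold are assumptions you would adapt to your export:

```python
def triage_urls(rows):
    """Split (url, external_link_count) pairs into 301 and 410 buckets."""
    to_redirect, to_remove = [], []
    for url, link_count in rows:
        if int(link_count) > 0:
            to_redirect.append(url)   # has external links: worth a 301
        else:
            to_remove.append(url)     # serve 410 Gone so it drops out of the index
    return to_redirect, to_remove
```

The output gives you two lists: one to feed into your redirect rules, and one whose URLs can simply be allowed to return 410.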
Please let us know if we have misunderstood the question or if we can provide more help. You may want to post your Ruby question in another thread to ensure the right people see it.
Thanks!
-
Hi Chris,
Sorry for the confusion. The plan is to merge both databases (the vulnerability database on our corporate site and the exploit database on our other website) into one and place it on a subdomain of our corporate site. Right now the exploit database on our second website gets a LOT of traffic; it contributes about three quarters of that domain's traffic. I would like to minimize the loss of traffic when moving it to the subdomain and am looking for ways to do this.
@ryan - I am not sure exactly why, but our web producer told me that we need to use a subdomain and cannot put this on our domain. I will follow up with her to find out why.
Update - I guess one of the databases is written on a different platform (Ruby), so it cannot be hosted on the same server, and changes are harder to make as a result. I guess this could still be done, though it may be a little harder to update - anybody have experience with this?
Thanks for the help guys!
Pat
-
Would like to offer an opinion but can't quite figure out what you're saying in paragraph 3.
-
Not quite sure that I understand the need to put these on a subdomain. Why not have both of them reside on the corporate domain? One of them already exists on your corporate site, so you can keep that database/search there and move the other over to a similar location. Yes, that would require a ton of 301 redirects, but that should be OK given the scope of the project.
In my experience, when moving to a new domain or even a subdomain, you always experience some traffic loss that never really comes back (unless you are naturally growing anyway). Keep the main company domain going, put everything under a folder off the root, and don't worry about the subdomain issue.