New site or subdomain
-
What are the pros and cons of launching a new product site on its own domain, as opposed to placing it under a subdomain of the company site?
Will the new site be placed in the Google sandbox?
The main goal is to build credibility for the product, partly by placing it under the company site, which has been live for over 10 years. It is not a consumer product - it's aimed more at dealers - so people would either be pushed to the site directly or find it through the brochure.
-
I'm not sure there's a one-size-fits-all answer. From an SEO point of view, if the subdomain has a similar design and structure to the main domain, then the new product 'may' benefit from the domain authority of the root domain.
On a new domain, the site starts from scratch, although I wouldn't worry too much about a sandbox effect.
-
I think it comes down to the branding requirements of the new product. Is this a major product launch, or just another product amongst an existing product line the company has had for ages?
The advantage of having the new product under the current domain (as a subdomain, or a subdirectory as Kurt pointed out above) is that you can piggyback off the existing brand and online presence, and it would be easy for new customers to find the product, even if they weren't looking for it.
If the product is new, your company is investing a lot in it, and it deserves its own branded presence, then a new domain might be the way to go. Search engines and marketers have been moving towards "branding" for a while now, and if this product warrants its own name, personality, or community - that would probably be the way to go.
-
Hi Michael,
By making the product part of the parent site, I'd say you are conveying the quality and reputation of the main brand. I know people say that Google treats subdomains as separate sites, but personally, I think Google connects a subdomain to the main site more closely than a completely different domain. But my question would be: why not make it a subdirectory of the main site instead of a subdomain? Then it's definitely part of the main site for both the search engines and users.
As for the sandbox, you can rank even with a new site. The thing is that new sites usually don't have any links, shares or reputation whatsoever, so they don't rank well for much, if anything. On the other hand, if you put the new product in a subdirectory, it's part of the main site and you definitely won't have any risk of the sandbox.
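If it helps, here is a minimal sketch of the kind of check I mean - the URLs are hypothetical placeholders and it uses the third-party requests library - confirming that pages placed in a product subdirectory resolve on the main host and carry canonical tags that stay on that host:

```python
# Rough check: do pages under the product subdirectory resolve on the main
# host, and does each one's canonical tag keep it on the main domain?
# URLs below are hypothetical placeholders.
import re
from urllib.parse import urlparse

import requests  # pip install requests

product_pages = [
    "https://www.example.com/product/",
    "https://www.example.com/product/specs/",
    "https://www.example.com/product/dealers/",
]

# Naive canonical extractor; assumes rel appears before href in the tag.
canonical_re = re.compile(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in product_pages:
    resp = requests.get(url, timeout=10)
    match = canonical_re.search(resp.text)
    canonical = match.group(1) if match else "(none found)"
    on_main_host = urlparse(canonical).netloc == urlparse(url).netloc
    print(f"{url}: HTTP {resp.status_code}, canonical={canonical}, "
          f"same host={on_main_host}")
```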
Is there a reason you want to separate out the product from the company?
Kurt Steinbrueck
OurChurch.Com
-
Related Questions
-
Is "Author Rank," User Comments Driving Losses for YMYL Sites?
Hi, folks! So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design.
We were big winners with the August Medic update in 2018 and the subsequent update in September/October. However, the Medic update in March and the de-indexing bug in April were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag. Take a look at the attached Moz chart, which shows the jumps and falls around the various Medic updates. The pattern is very similar on many of our sites.
As per JT Williamson's stellar article on EAT, I feel like we've done a good job in meeting those criteria, which has left me wondering what isn't jibing with the new core updates. I have two theories I wanted to run past you all:
1. Are user comments on YMYL sites problematic for Google now? I was thinking that maybe user comments underneath health news and perspectives articles might be concerning on YMYL sites now. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, the people commenting -- how qualified are they? What if they are spouting off crazy ideas? Could Google's new update see user comments such as these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?
2. Is Google "Author Rank" finally happening, sort of? From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and are, according to Williamson, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way? It makes me wonder if we're doing enough to present our authors' credentials on our articles, for example. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress'ish). Even worse -- our resource pages don't even list the author.
Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back! Thanks!
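On the second theory, one concrete step a site can take is to expose author credentials in structured data. Below is a minimal sketch using standard schema.org Article/Person properties; the author name, job title, organization, and URLs are invented for illustration, not taken from the site in question:

```python
# Sketch: emit schema.org JSON-LD that surfaces an author's credentials.
# All names, titles, and URLs here are hypothetical placeholders.
import json

article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "New findings in rare-disease research",
    "datePublished": "2019-06-10",
    "author": {
        "@type": "Person",
        "name": "Magdalena Example",
        "honorificSuffix": "PhD",
        "jobTitle": "Research Scientist",
        "affiliation": {
            "@type": "Organization",
            "name": "Example Research Institute",
        },
        # Links that help establish the author's identity and expertise.
        "sameAs": [
            "https://scholar.example.com/magdalena-example",
            "https://www.example.com/authors/magdalena-example/",
        ],
    },
}

# Wrap the JSON-LD in the script tag that would go in the page <head>.
script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(article_markup, indent=2)
    + "\n</script>"
)
print(script_tag)
```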
Algorithm Updates | Michael_Nace
-
I'm Pulling Hairs! - Duplicate Content Issue on 3 Sites
Hi, I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I have read up on the Moz blog posts and other helpful articles that were flooded with information on how to fix duplicate content. However, I have tried using canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem. My Moz report indicated that we have thousands of duplicate content pages.
I understand that it's a common problem among e-commerce sites, and the way we create landing pages and apply dynamic search results pages kind of conflicts with our SEO progress. Sometimes we'll create landing pages with the same URLs as an older landing page that expired. Unfortunately, I can't get around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them?
Also, I tried to use self-referencing canonical tags, and canonical tags that point to the higher-authority page, on search results pages, and even though it worked for some pages on the site, it didn't work for a lot of the other search result pages. Is there something that we can do to these search result pages that will let Google understand that these search results pages on our site are original pages?
There are a lot of factors that I can't change, and I'm kind of concerned that the three sites won't rank as well and will also drive traffic that won't convert on the site. I understand that Google won't penalize your sites for duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we won't ever run out of duplicate content -- is it worth moving on to other priorities in SEO like keyword research and on-page/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else? I'm curious to know what you think. Thanks!
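On the "redirect the expired pages" option raised above, one workable pattern is to keep a mapping of expired landing-page paths to their current equivalents and generate redirect rules from it. A rough sketch follows; the CSV file name, paths, and domain are hypothetical, and the output is standard Apache mod_alias Redirect 301 rules:

```python
# Sketch: turn a CSV of expired -> current landing page paths into
# Apache mod_alias "Redirect 301" rules. File name and paths are made up.
import csv

# redirects.csv rows look like: /old-landing-page,/current-landing-page
with open("redirects.csv", newline="") as f:
    rows = list(csv.reader(f))

rules = []
for old_path, new_path in rows:
    # Redirect permanently so link equity consolidates on the live page.
    rules.append(f"Redirect 301 {old_path} https://www.example.com{new_path}")

# Paste the output into the site's .htaccess (or server config).
print("\n".join(rules))
```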
Algorithm Updates | drewstorys
-
Will Google penalize 2 sites for targeting "like" keyword phrases?
I own two different websites: an HTML site that has been live for 20 years and an ecommerce site that has been live for 7 years. We sell custom printed (branded) tents for use at trade shows and other indoor and outdoor events. While our ecommerce site targets "trade show" tents, our HTML site targets "event" tents. I believe that the keyword phrases are dissimilar enough that targeting "trade show tents" on one site and "event tents" on the other should not cause Google to penalize one or both sites for having similar content. The content is different on both sites. I'm wondering if anyone has experience with, or opinions on, my thoughts... either way. Thanks,
Terry Hepola
Algorithm Updates | terry_tradeshowstuff
-
New Website Old Domain - Still Poor Rankings after 1 Year - Tagging & Content the culprit?
I've run a live wedding band in Boston for almost 30 years, and the band's site used to rank very well in organic search. I was hit by the Panda updates in August of 2014, and rankings literally vanished. I hired an SEO company to rectify the situation and create a new WordPress website, which launched January 15, 2015. I kept my old domain: www.shineband.com. Rankings remained pretty much non-existent. I was then told that 10% of my links were bad. After lots of grunt work, I sent in a disavow request in early June via Google Webmaster Tools. It's now mid October, and rankings have remained pretty much non-existent.
Without much experience, I got Moz Pro to help take control of my own SEO and identify some problems (over 60 pages of medium-priority issues: title tag character length and meta descriptions). Some helpful reports by www.siteliner.com and www.feinternational.com also both mentioned a duplicate content issue.
I had old blog posts from a different domain (now 301 redirecting to the main site) migrated to my new website's internal blog, http://www.shineband.com/best-boston-wedding-band-blog/, as suggested by the SEO company I hired. It appears that by doing that, the older blog posts show as pages in the back end of WordPress with the poor meta and title issues, as well as probably creating a primary reason for the duplicate content issues (with links back to the site). Could this most likely be viewed as spamming or an (unofficial) SEO penalty? As SEO companies far and wide try daily to persuade me to hire them to fix my rankings, I can't say I trust much.
My plan: put most of the old blog posts into the Trash via WordPress, rather than try to optimize each page (over 60) by adjusting tagging, titles and duplicate content. Nobody really reads a quick post from 2009... I believe this could be beneficial and that those pages are more hurtful than helpful. Is that a bad idea, not knowing if those pages carry much juice? I realize my domain authority is not great. No grand expectations, but is this a good move? What would be my next step afterwards -- some kind of resubmitting of the site? This has been painful, business has fallen, and I can't throw more dough at this. THANK YOU!
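For reference on the disavow step mentioned above, the file Google accepts is plain text: one URL or "domain:" entry per line, with "#" for comments. A small sketch of writing one from a list of flagged links; the bad-link URLs below are made-up examples, not real findings for this site:

```python
# Sketch: build a Google disavow file (plain text, one entry per line,
# "domain:" prefix to disavow a whole domain, "#" for comments).
# The flagged links below are hypothetical examples.
from urllib.parse import urlparse

bad_links = [
    "http://spammy-directory.example/listing/shineband",
    "http://link-farm.example/widgets/page2.html",
]

# Disavow whole domains rather than single URLs, which is usually safer
# when an entire site is low quality.
domains = sorted({urlparse(link).netloc for link in bad_links})

lines = ["# Disavow file generated from the bad-link audit"]
lines += [f"domain:{domain}" for domain in domains]

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```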
Algorithm Updates | Shineband
-
Should I use subdomains?
I'm thinking of a little project website, but wonder whether I should use subdomains or just simply categorize the site. For example (I haven't chosen my domain yet), if I had www.flowers.com and wanted to produce pages for each type of flower, should I use rose.flowers.com or flowers.com/rose? For SEO purposes, or usability, does it matter? Thanks in advance.
Algorithm Updates | Gordon_Hall
-
How to optimise a news site? - tomorrow's chip paper terms
Are there any specific tips on how to gain traffic from very short-lived search terms? Say the site you are SEO/SEMing wants to go for searches related to things like the latest celebrity breakup, or a fashion event that lasts less than a week. The onsite stuff seems pretty good, as onsite SEO tools generally give it an A grade. Is it just a case of doing the same stuff as normal, but faster? 😉
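For short-lived terms, one mechanical piece is simply getting the articles discovered fast, and a Google News sitemap is a common route. A rough sketch of generating one; the article URLs, titles, and publication name are placeholders:

```python
# Sketch: build a Google News sitemap so fresh articles get picked up fast.
# Article URLs, titles, and the publication name are hypothetical.
from datetime import datetime, timezone

articles = [
    ("https://www.example.com/news/celebrity-breakup/", "Celebrity breakup: what we know"),
    ("https://www.example.com/news/fashion-week-day-one/", "Fashion week, day one"),
]

now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")

entries = ""
for loc, title in articles:
    entries += f"""  <url>
    <loc>{loc}</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>{now}</news:publication_date>
      <news:title>{title}</news:title>
    </news:news>
  </url>
"""

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
    + entries
    + "</urlset>"
)
print(sitemap)
```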
Algorithm Updates | Fammy
-
Data on Google Vs Bing, et al and changes to sites.
I am curious to know if anyone has data that correlates site/page changes -- like content, title tag, H1, etc. -- with subsequent movement in rankings on Google, Bing, and Yahoo. The equation is, for example: ABCSite.com/home-page/ makes a change to the H1 and H2, and one paragraph of content is changed. Over the next 6 to 12 weeks, changes in rankings across the 3 engines are tracked to see where it started and where it "stopped." Obviously, there are more factors than individual algorithms in play here. An example of that would be that a significant number of sites will be indexed in Google by a dev and not in the others. We see this regularly. So, at least from a timing standpoint, different sites are entering/leaving the fray at different rates. We are going to begin to track this, but I would love to see any data already around, or to speak with anyone involved in such a study about what they found. Thanks
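This kind of correlation is straightforward to collect in-house: keep a change log per URL and join it against periodic rank snapshots. A bare-bones sketch of the join, where the CSV file names and columns are made up for illustration:

```python
# Sketch: join an on-page change log against rank snapshots to see how a
# page moved after each change. File names and columns are hypothetical.
import csv
from datetime import date, timedelta

# changes.csv: url,change_date,description   (dates as YYYY-MM-DD)
# ranks.csv:   url,engine,check_date,position
def load(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

changes, ranks = load("changes.csv"), load("ranks.csv")

def snapshots(url, engine):
    rows = [r for r in ranks if r["url"] == url and r["engine"] == engine]
    return sorted(rows, key=lambda r: r["check_date"])

def position_before(url, engine, cutoff):
    """Last recorded position strictly before the change date."""
    rows = [r for r in snapshots(url, engine)
            if date.fromisoformat(r["check_date"]) < cutoff]
    return int(rows[-1]["position"]) if rows else None

def position_after(url, engine, cutoff):
    """First recorded position on or after the follow-up date."""
    rows = [r for r in snapshots(url, engine)
            if date.fromisoformat(r["check_date"]) >= cutoff]
    return int(rows[0]["position"]) if rows else None

for change in changes:
    changed = date.fromisoformat(change["change_date"])
    for engine in ("google", "bing", "yahoo"):
        before = position_before(change["url"], engine, changed)
        after = position_after(change["url"], engine, changed + timedelta(weeks=6))
        print(change["url"], engine, change["description"],
              "before:", before, "about 6 weeks after:", after)
```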
Algorithm Updates | RobertFisher
-
What's the best thing to do after rebuilding a site to get old rankings back?
A website changed its platform from the old one to Magento ecommerce. In Webmaster Tools, Google says that yesterday was the last time it crawled the site, but the old keyword rankings are gone, traffic went down big time, and now I'm not sure where to start working in order to bring everything back to the way it was. Any advice?
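In replatforming cases like this, the usual first check is whether the old URLs still resolve or 301 to their new Magento equivalents. A quick sketch that flags old URLs now returning 404 so they can be redirected; the input file name is hypothetical and it uses the third-party requests library:

```python
# Sketch: after a replatform, find old URLs that no longer resolve and
# therefore need 301 redirects to their new equivalents.
# "old_urls.txt" (one URL per line) is a hypothetical input file.
import requests  # pip install requests

with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

needs_redirect = []
for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.status_code == 404:
        needs_redirect.append(url)
    else:
        # Show where the URL ends up now (itself, or a redirect target).
        print(f"OK  {url} -> {resp.url} ({resp.status_code})")

print("\nOld URLs returning 404 (map these to new pages with 301s):")
print("\n".join(needs_redirect))
```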
Algorithm Updates | footballearnings