One Site vs. Many
-
This is a question that I'm not sure has a "right" answer. I'm just wondering what everyone's thoughts are, since I can see benefits on both sides of the coin.
In your opinion, is it better to have one large e-commerce site with all of your content on the same domain or is it better to have multiple more targeted domains with your content broken up into smaller chunks?
The reason I ask is that while multiple, more targeted sites certainly have the benefit of focus, aren't you splitting up all your traffic and content, leaving yourself with several sites that each likely get less traffic than one large site would?
All opinions welcome.
-
Sure, you can absolutely create brand pages and branch from there to product pages. A bigger site is more effective, and easier to maintain, promote, and analyze, than multiple small sites.
-
OK, we are a vitamin and supplement company, so all of our products fit within that scope. We have a couple of additional sites for more in-depth weight loss programs (our best-selling products); we sell the supporting supplements for those programs on our main site. These additional sites are branded for the programs and have lots of supporting content: blogs, recipes, etc.
I suppose we could create some heavily branded areas in our main site specific to those products and build all of that out further. We are in a highly competitive market so any ground we can gain is helpful. Also, we are a smaller company in comparison to many of our competitors, so my fear is that splitting all of our content into these different sites could put us at a disadvantage.
Thanks for the advice so far, still not sure what route to take... but appreciate you taking the time to respond.
-
One site if all somewhat related, several sites if completely different. You don't want shoes, car parts and landscaping on one site. You do want car parts, van parts and motorcycle parts on one site. The single site will be stronger than several smaller stand alone sites.
-
One Site To Rule Them All
*sorry, I had to, lol
-
That makes sense. In that case, I would analyze the competition level for your "money" keywords and topics and see if it makes sense to split them up. You would also need to think about whether there is enough content to write about, and to keep feeding, for that kind of tightly focused site.
It has been my experience that a site whose content is wholly focused on one topic will generally rank better for that topic's keywords than a larger site with a broader base of topics. This cuts both ways, though, because you will only be able to use the one site for roughly one topic.
The competition level will play a large factor in the decision IMO. That would be the next thing I would look at.
-
Thanks for the response.
Well, from an administration perspective, we have multiple sites already, so I certainly know the challenges and the amount of time that can take. I personally am leaning toward having only one site, but I'm looking for reasons why one strategy may be better than the other, mostly from an SEO perspective, since administration would only get simpler with one site.
I also agree that any answers I get will probably be sweeping generalizations without knowing all the specifics of our business and analytics, so I am taking that into account.
-
IMO, one site is challenge enough to manage all by itself.
Having multiple sites is very challenging just from an administration perspective, particularly once you go past a certain number. You would need full-time admins just to manage the sites, let alone the content of each one.
You would have to add up how much expense you have per site versus how much net revenue (profit) comes directly from that site. I think in a lot of cases the expenses per site will exceed the profits, but before making a sweeping generalization, do a study and make your best estimates before investing in the infrastructure, IT, and employees.
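That per-site expense-versus-profit comparison can be sketched in a few lines. This is purely illustrative Python; the site names and figures are made up, not real data:

```python
# Hypothetical annual figures per site; real numbers would come from
# your own accounting and analytics.
sites = {
    "mainsite.com":     {"expenses": 40_000, "profit": 150_000},
    "programsite1.com": {"expenses": 25_000, "profit": 18_000},
    "programsite2.com": {"expenses": 25_000, "profit": 31_000},
}

# A site pays for itself only when the profit directly attributable to it
# exceeds the cost of running it (hosting, admin time, content).
for name, figures in sites.items():
    net = figures["profit"] - figures["expenses"]
    verdict = "worth keeping" if net > 0 else "running at a loss"
    print(f"{name}: net {net:+,} ({verdict})")
```

In this made-up example, one satellite site loses money once its own admin and content costs are counted, which is exactly the kind of result the study described above is meant to surface.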
Related Questions
-
SEO question regarding rails app on www.site.com hosted on Heroku and www.site.com/blog at another host
Hi, I have a Rails app hosted on Heroku (www.site.com) and would much prefer to set up a WordPress blog with a different host serving www.site.com/blog, as opposed to using a gem within the actual app. What are people's thoughts regarding any ranking implications of implementing the setup described in this post on Stack Overflow: "What I would do is serve your Wordpress blog alongside your Rails app (so you've got a PHP and a Rails server running), and just have your /blog route point to a controller that redirects to your Wordpress app. Add something like this to your routes.rb: `get '/blog', to: 'blog#redirect'` and then have a redirect method in your BlogController that simply does this: `class BlogController < ApplicationController; def redirect; redirect_to "url_of_wordpress_blog"; end; end`. Now you can point at yourdomain.com/blog and it will take you to the Wordpress site."
Intermediate & Advanced SEO | Anward
-
Linking from a corporate site to a brand site.
Is there an SEO impact when a large corporation links from a corporate and/or divisional site to a specific brand site with its own top-level domain? We would like to keep the traffic coming, but not if it will be seen as a black-hat tactic. My guess is that Google will be smart enough to see that the corporation owns the brand and at least not penalize us, but I am wondering if anyone else has had this experience? Google Analytics is calling it a self-referral.
Intermediate & Advanced SEO | mrbobland
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings Pages: the pages where users apply various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the pages where users actually view the details of a given vehicle. These are served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.

We do not want #2, the Vehicle Details pages, indexed, as these pages appear and disappear all the time based on dealer inventory and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query. We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.

Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).

Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).

Noindex disadvantages:
- Difficult to implement: the vehicle details pages are served via Ajax, so they have no <head> in which to place a noindex meta tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on querystring variables, similar to a Stack Overflow solution I found. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.

Hash (#) URL advantages:
- By using hash (#) URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: the crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that got robots.txt-disallowed pages indexed are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?).
- Does not require complex Apache stuff.

Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?

Initially, we implemented robots.txt, the "sledgehammer solution." We figured we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow, in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed; it could easily get stuck or lost, it seems like a waste of resources, and in some shadowy way it feels bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these. Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
-
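To make the X-Robots-Tag idea in the question above concrete, here is a minimal sketch of the decision logic in Python. The querystring variable (`vehicle_id`) and URLs are hypothetical, and a real deployment would emit this header from Apache or the plugin itself rather than from a standalone function:

```python
from urllib.parse import urlparse, parse_qs

def x_robots_header(url):
    """Return the X-Robots-Tag header value a server might send for this URL,
    or None for pages that should remain indexable."""
    query = parse_qs(urlparse(url).query)
    if "vehicle_id" in query:
        # Vehicle Details pages: keep them out of the index entirely.
        return "noindex, nofollow"
    # Vehicle Listings pages (and everything else) stay indexable.
    return None

print(x_robots_header("http://example.com/listings?vehicle_id=12345"))
print(x_robots_header("http://example.com/listings?make=honda&model=civic"))
```

Note the tradeoff the poster describes still applies: for Googlebot to see this header at all, the detail URLs must not be disallowed in robots.txt.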
Removing A Blog From Site...
Hi Everyone, One of my clients, a big law firm I do marketing consulting for, has been paying someone for the past 3 years to write blog posts every day in hopes of improving search traffic to the site. The blog did indeed increase traffic, but analyzing the stats, the firm generates no leads (via form or phone) from any of the search traffic that lands on the blog. Furthermore, I'm seeing Google send many search queries to blog pages when it would be much more beneficial for that traffic to land on the main part of the website. In short, the law firm's blog provides little to no value to end users and was written entirely for SEO purposes. The law firm's website has 6,000 unique pages, and only 400 of them are non-blog pages (the good stuff, essentially). About 35% of the site's total traffic lands on the blog pages from search, but again, this traffic does not convert, has a very high bounce rate, and I doubt there is any branding benefit either. With all that said, would it be best to delete the blog, redirect the blog pages to other pages on the site, or something else? The firm has ceased writing new blog posts on my recommendation as well. I am afraid of doing something ill-advised with the blog, since it now accounts for 95% of the site's pages, but in my eyes it's useless drivel that adds no value: simply a misguided SEO effort from another marketer who heard blogs are good for SEO. I would certainly appreciate any guidance or advice on how best to handle this situation. Thank you for your kind help!
Intermediate & Advanced SEO | gbkevin
-
One Way Links vs Two Way Links
Hi, I was speaking to a client today and got asked how damaging two-way links are, i.e. domaina.com links to domainb.com and domainb.com links back to domaina.com. I need a nice, simple layman's explanation of whether and how damaging they are compared to one-way links. And please don't answer with "you lose link juice," as I'd then have the job of explaining link juice... I am explaining things to a non-techie! Thank you!!
Intermediate & Advanced SEO | JohnW-UK
-
Interesting site migration question.
Hi all. I'm looking for some thoughts on a migration option we have. At the moment we have two e-commerce sites ranking well for some of the same terms: an older site and a nice new site. The older site is ranking very well for category and product terms; the new one is slowly coming up. Ideally we would like to have just the nice new site and get rid of the old one. If I 301 the old site's URLs to the new site's, will that bring the new URLs into the same positions the old ones held? I'm just not sure how this affects sites that are already ranking well. Any ideas are welcome, but I'm really looking for a definitive answer. It's a big decision, after all.
Intermediate & Advanced SEO | PASSLtd
-
Merging three sites into one
Hi guys, I just wanted confirmation that this is the right way to go about it. I need to merge three websites, and I've never merged three websites into a brand new site before. OK, so we have:

- Sitex.com
- Sitey.com
- Sitez.com

We've created SiteB.com, which has:

- SiteB.com/SiteXCat
- SiteB.com/SiteYCat
- SiteB.com/SiteZCat

Each of X, Y, and Z has over 1,000 pages, but they only have about 10 pages each with Page Authority above 10, and the domains aren't that strong. What I plan to do is:

1. 301 redirect each site's domain (X, Y, Z) to its corresponding category, e.g. Sitex.com > SiteB.com/SiteXCat.
2. 301 redirect each page on X, Y, and Z with a Page Authority above 10 to its new page on SiteB.com.

Then I'm unsure whether I should 410 every other URL. I don't think it's worth 301ing every single URL if they aren't in search results much, but maybe it is if they have a lot of inbound links, even with low Page Authority? Any ideas, and does the above seem best practice? Thanks.
Intermediate & Advanced SEO | Profero
-
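The two 301 steps described in the question above amount to a simple mapping rule. Here's a sketch of that rule in Python, with hypothetical URLs and made-up Page Authority numbers; in practice the redirects would live in the server config, and PA would come from a tool like Moz:

```python
from urllib.parse import urlparse

# Made-up Page Authority scores for a couple of old pages.
PAGE_AUTHORITY = {
    "http://sitex.com/best-page": 24,
    "http://sitex.com/thin-page": 3,
}

# Each old domain maps to its category on the new site.
CATEGORY_ROOT = {
    "sitex.com": "http://siteb.com/SiteXCat",
    "sitey.com": "http://siteb.com/SiteYCat",
    "sitez.com": "http://siteb.com/SiteZCat",
}

def redirect_target(url):
    """Return the 301 target for an old URL, or '410' for pages not worth keeping."""
    parsed = urlparse(url)
    root = CATEGORY_ROOT[parsed.netloc]
    if parsed.path in ("", "/"):
        return root  # domain root -> its category root
    if PAGE_AUTHORITY.get(url, 0) > 10:
        return root + parsed.path  # strong page -> equivalent new page
    return "410"  # weak page: serve 410 Gone
```

As the poster suspects, a page with low PA but strong inbound links may still deserve a 301 instead of a 410, so a PA threshold alone shouldn't be the final word.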
Badges for a B2B Site
I love this SEO tactic, but it seems hard to get people to adopt it. Has anyone seen a successful badge campaign for a B2B site? Please provide examples if you can.
Intermediate & Advanced SEO | DavidKonigsberg