Multiple sites - ownership & link structure
-
Hi All
I am in the process of creating a number of sites within the garden products sector; each site will have unique, original content and there will be no crossover. So for example I will have one on lawn mowers, one on greenhouses, another on garden furniture, etc.
My original thinking was to create a single limited company that would own each of the domains, therefore all the registrant details will be identical.
Is this a sensible thing to do? (I want to be totally white hat)
And what, if any, are the linking opportunities between each of the sites? (16 in total). Not to increase ranking, more from an authoritative perspective.
And finally, how should I link between each site? Should I no follow the links? Should I use keyword contextual links?
Any advice or ideas would be appreciated.
Please note: It has been suggested that I just create one BIG site. I've decided against this as I want to use the keyword for each website in the domain name as I believe this still has value.
Thanks
-
Yes, one of my exact match domains outranks a much stronger domain even with thin content and zero marketing effort. I don't think it should, and I don't think it always will outrank stronger domains.
-
Thanks EGOL, much appreciated.
-
Sounds like it has to be an exact keyword match for the domain to be effective, according to EGOL. I have some exact keyword domains, so I will have to look into this too. It sounds like it is okay to add a link to a main site as long as it is nofollow, judging from the responses so far. Hopefully more info is forthcoming.
Boo
-
Most of the sites that I own are exact match keyword domains. I think that they still have some bonus value in the search engines but I believe that they were cut significantly in February 2011.
If you have domains like GardenFurniture.com or LawnMowers.com then I might agree with you and use the method that you propose.... however, if you have domains like LawnMowerReview.com then I would strongly vote for putting everything into one huge site.
Anybody can register a domain like LawnMowerBob or LawnMowerSomething... why should Google give extra credit for that?
In my opinion the "exact match" keyword domains are effective, but "keywords in the domain" are nothing special.
And... any linking that you would do between your sixteen sites would be more dangerous than valuable.
That's just one opinion... but my opinion on this is firm enough that I would use it as a rule for my business.
-
Thanks Brian... should have checked Webmaster Tools before posing the question!
As I replied to EGOL the reason for the individual sites is that I wanted to get the primary keyword for each in the domain (for example gardenfurniturereview.co.uk, or shedsreview.co.uk) and build focussed authoritative sites for each. Do you think the keyword in the domain is still relevant?
-
Thanks EGOL, the reason for the individual sites is that I wanted to get the primary keyword for each in the domain (for example gardenfurniturereview.co.uk, or shedsreview.co.uk) and build focussed authoritative sites for each. Do you think the keyword in the domain is still relevant?
-
If you go to Google Webmaster Central, you will see a mantra of nofollowing interlinks between sites under your control. The amount of risk you are undertaking with interlinking depends on how you do it and how your link profile looks overall. Do-follow contextual links in the footer would be the riskiest and a bad idea. If you put a single do-follow link to another store/site (using the store name in the anchor) in the "About Us" page, you may be safe, but I have read reports of sites getting a -10 for doing even this.
Ultimately, the more you leverage your sites for juice, the more risk you undertake (pardon me for stating the obvious).
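For clarity, here is what that single nofollowed "About Us" link might look like in the page markup (the domain name and anchor text below are just placeholders, not real sites):

```html
<!-- A nofollowed cross-site link on an "About Us" page.
     rel="nofollow" tells search engines not to pass link equity
     through this link. Domain and anchor text are examples only. -->
<p>
  We also run
  <a href="https://www.example-greenhouses.co.uk/" rel="nofollow">Example Greenhouses</a>,
  our sister site covering greenhouse reviews.
</p>
```

Note that the anchor uses the site/store name rather than a money keyword, which is the lower-risk choice described above.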
BTW, I concur with EGOL about doing one big site. I've done it both ways and everything is so much easier if you focus on a single store. Better to spread your icing nice and thick over a cupcake than thinly over a sheet cake.
-
If you create one big store your average shopping cart value will be higher because people who need garden furniture might also want something for their plants. And, one big site would impress the visitor more and be much less work for you to maintain.
And what, if any, are the linking opportunities between each of the sites? (16 in total). Not to increase ranking, more from an authoritative perspective.
huh? Authoritative perspective?
Wouldn't one big site be more authoritative?