Can I Point Multiple Exact Match Domains to a Primary Domain? (Avoiding Duplicate Content)
-
For example, let's say I have these 3 domains:
The first two domains will have very similar text content but different products. The product.com domain will have similar content, with all of the products in one place. Transactions would be handled through the primary domain (product.com).
The purpose of this would be to capitalize on exact-match domain (EMD) opportunities.
I found this seemingly old article: http://www.thesitewizard.com/domain/point-multiple-domains-one-website.shtml
The article states that you can avoid duplicate content issues and have all links attributed to the primary domain.
What do you guys think about this? Is it possible? Is there a better way of approaching this while still taking advantage of the EMD?
-
Sure. From what I know, the EMDs that were hit were spammy, but the thing is: why work on three sites and create partially duplicated copy in the hope that being EMDs gives them a boost, when you could create great, non-duplicate copy for one site and have it rank on the strength of the quality work and effort you put into it? Odds are you'll have better luck that way than you would linking between your own EMDs like that.
And if the plan is to 301 redirect the two EMDs to the primary, you'd really need a strong link profile on each of them for the redirects to help much. Instead, you could spend that time working on the primary site rather than spreading your effort across three sites.
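For what it's worth, the 301 redirect itself is a server-level setting, not anything on-page. A minimal sketch using Apache's mod_rewrite in an .htaccess file on one of the EMDs might look like this (emd-example.com is a placeholder name, since the actual EMDs weren't listed):

```apache
# Hypothetical .htaccess on the EMD (emd-example.com is a placeholder).
# Sends every request to the same path on the primary domain with a
# permanent (301) redirect, which is the signal that consolidates the
# EMD's links onto product.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?emd-example\.com$ [NC]
RewriteRule ^(.*)$ https://product.com/$1 [R=301,L]
```

The nginx equivalent is a `return 301 https://product.com$request_uri;` inside the EMD's server block. Either way, the redirect only passes along whatever link equity the EMD already has, which is the point above.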
-
It was my understanding that you can still benefit from an EMD that is a high-quality, relevant website.
I thought the EMDs that did take a hit were spammy, low-quality sites?
http://moz.com/blog/googles-emd-algo-update-early-data
What is your opinion about this?
Assuming this is correct, do you have any suggestions pertaining to the original question?
-
Exact-match domains took a hit in Google recently, so trying to capitalize on something like that won't necessarily help you. Instead of splitting your work across three sites hoping to make one rank better, you'd be better served by focusing your efforts on the main site and giving it the best, most relevant content you can while building its link equity.