Can I Point Multiple Exact Match Domains to a Primary Domain? (Avoiding Duplicate Content)
-
For example, let's say I have these 3 domains:
The first 2 domains would have very similar text content, but with different products. The product.com domain would have similar content, with all of the products in one place. Transactions would be handled through the primary domain (product.com).
The purpose of this would be to capitalize on exact-match domain (EMD) opportunities.
I found this seemingly old article: http://www.thesitewizard.com/domain/point-multiple-domains-one-website.shtml
The article states that you can avoid duplicate content issues and have all links attributed to the primary domain.
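For reference, the approach that article describes is usually implemented with server-side 301 (permanent) redirects, which tell search engines to attribute links and indexing to the primary domain. Below is a minimal sketch for Apache's mod_rewrite in an `.htaccess` file; the domain names are placeholders standing in for the domains in the example, not anything from the article itself.

```apache
# Hypothetical sketch: 301-redirect any request arriving on a
# secondary EMD to the same path on the primary domain, so link
# equity and indexing are consolidated on product.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?product\.com$ [NC]
RewriteRule ^(.*)$ https://product.com/$1 [R=301,L]
```

With a rule like this in place, the secondary domains never serve their own copy of the content, which is what avoids the duplicate content issue in the first place.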
What do you guys think about this? Is it possible? Is there a better way of approaching this while still taking advantage of the EMD?
-
Sure, from what I know the EMDs that were hit were spammy ones. But the thing is... why work on three sites and create partially duplicate copy in the hope that their being EMDs gives you a boost, when you could create great, non-duplicate copy for one site and have it rank on the quality of the work and effort you put into it? Odds are you'll have better luck that way than you would linking between your own EMDs like that.
And if the plan is to 301 redirect the two EMDs to the primary, you'd need a strong link profile on each of them for the redirects to help much. You could spend that time working on the primary site instead of spreading your effort across three sites.
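If the secondary EMDs had to keep serving their own near-duplicate pages rather than redirecting, the usual alternative is a cross-domain canonical tag, which asks Google to consolidate ranking signals to the equivalent page on the primary domain. A sketch, with placeholder URLs:

```html
<!-- Placed in the <head> of the near-duplicate page on a secondary
     EMD, pointing at the equivalent page on the primary domain -->
<link rel="canonical" href="https://product.com/products/" />
```

Note that unlike a 301, a canonical tag is treated as a hint rather than a directive, so consolidation isn't guaranteed; it also does nothing to make the EMD itself rank, since it explicitly hands its signals to the primary.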
-
It was my understanding that you can still benefit from an EMD that is a high quality, relevant website.
I thought the EMDs that did take a hit were spammy, low-quality sites?
http://moz.com/blog/googles-emd-algo-update-early-data
What is your opinion about this?
Assuming this is correct, do you have any suggestions pertaining to the original question?
-
Exact-match domains took a hit in a recent Google update, so trying to capitalize on them won't necessarily help you. Instead of splitting your work across 3 sites hoping to make one rank better, you'd be better served by focusing your efforts on the main site: give it the best, most relevant content you can while building its link equity.