Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Duplicate content due to parked domains
-
I have a main ecommerce website with unique content and decent backlinks. I also had a few domains parked on the main website, as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website www.maindomain.com had domain1.com and domain2.com parked on it, and domain3.com parked on www.maindomain.com/product1. This caused a lot of duplicate content issues.
12 months ago, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from the Google index. I now realize a few of the additional domains are still indexed and causing duplicate content. My question is: what other steps can I take to avoid duplicate content for my website?
1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, for domains pointing to a specific URL, I cannot provide a change of address.
2. Submit a "remove page from Google index" request in Google Search Console. This is temporary and lasts 6 months. Even if the pages are removed from the Google index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit it to the Google index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google will eventually remove content from domain1.com and domain2.com due to the canonical links. This will take time for Google to update its index.
5. Point these domains elsewhere to remove the duplicate content eventually. But it will take time for Google to update its index with the new, non-duplicate content.
Which of these options is best suited to my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere.
Any feedback would be greatly appreciated.
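For reference on the 301 setup the question describes, a minimal sketch of how a parked domain can be permanently redirected at the web-server level, assuming Apache with mod_alias; the domain names are the placeholders from the question, not a real configuration:

```apache
# Redirect an entire parked domain to the main site (domain1.com -> www.maindomain.com),
# preserving the requested path
<VirtualHost *:80>
    ServerName domain1.com
    ServerAlias www.domain1.com
    Redirect permanent / https://www.maindomain.com/
</VirtualHost>

# Redirect a product-specific parked domain to one deep URL (domain3.com -> /product1)
<VirtualHost *:80>
    ServerName domain3.com
    RedirectMatch permanent ^/.*$ https://www.maindomain.com/product1
</VirtualHost>
```

The equivalent can be done in nginx with a `return 301` in a per-domain `server` block; the key point in either case is that the response is a permanent (301) redirect rather than masking or a 302.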
-
Oh, wow - if you're talking a couple of years ago and major ranking drops, then definitely get aggressive. Remove as many as possible and Robots No-index them. If you've got the Robots.txt directives in place, Google shouldn't put them back (although, from past experience, I realize "shouldn't" isn't a guarantee). If you're down 90%, you've got very little to lose and clearly Google didn't like something about that set-up.
Unfortunately, that's about the most drastic, reasonable option. The next step would be to start over with a fresh domain and kill all of the old domains. That could be a lot more hazardous, though.
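The robots.txt directive mentioned above can be sanity-checked with Python's standard-library parser. This is a minimal sketch using domain1.com as a placeholder from the question; note that a `Disallow` rule blocks crawling, which is a prerequisite for the removal tool rather than a de-indexing directive by itself:

```python
from urllib import robotparser

# Hypothetical robots.txt served by a parked/duplicate domain:
# block all crawlers from every path.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# With a blanket Disallow, no URL on the domain may be fetched.
print(rp.can_fetch("Googlebot", "https://domain1.com/product1"))  # False
print(rp.can_fetch("Googlebot", "https://domain1.com/"))          # False
```

Running this against each duplicate domain's live robots.txt (via `rp.set_url(...)` and `rp.read()`) is a quick way to confirm the block is actually in place before requesting removal.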
-
Thank you, Dr. Peter.
A couple of years ago my search engine positions tanked by around 90% and have not picked back up yet. At the time I assumed it was due to the duplicate content on these domains, as they were parked (not 301s, just domain masking) at that point. To avoid that duplicate content problem I moved to 301 redirects. None of these domains have any link juice to speak of; some have some type-in traffic. I was just trying to capture that traffic rather than any link juice.
I did de-index most of the domains via Webmaster Tools in the past, but Google put them back after 90 days or so. The 301 redirects in place did not help that much.
If Google thinks there is a chance of abuse in the 301s from new domains, I will start removing those domains completely and point them elsewhere so that Google can see some new content.
Thank you,
Aji Abraham
-
Ugh... 75 is a chunk. The problem is that Google isn't a huge fan of 301-redirecting a bunch of new domains, because it's been too often abused in the past by people buying up domains with history and trying to consolidate PageRank. So, it's possible that (1) they're suspicious of these domains, or (2) they're just not crawling/caching them in a timely manner, since they used to be parked.
Personally, unless there's any link value at all to these, I'd consider completely de-indexing the duplicate domains - at this point that probably does mean removal in Google Search Console and adding Robots.txt (which might be a prerequisite of removal, but I can't recall).
Otherwise, your only real option is just to give the 301-redirects time. It may be a non-issue, and Google is just taking its time. Ultimately, the question is whether these are somehow harming the parent site. If Google is just indexing a few pages but you're not being harmed, I might leave it alone and let the 301s do their work over time. I checked some headers, and they seem to be set up properly.
If you're seeing harm or the wrong domains being returned in search, and if no one is linking to those other domains, then I'd probably be more aggressive and go for all-out removal.
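On checking headers: `curl -I` against the live domains does the job, but here is a self-contained Python sketch of the same check, using a throwaway local server to stand in for a parked domain's permanent redirect (the domain name is the placeholder from the question). It verifies the two things that matter: a 301 status and a `Location` header pointing at the main site.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

MAIN_SITE = "https://www.maindomain.com"  # placeholder main domain

class RedirectHandler(BaseHTTPRequestHandler):
    """Answers every request with a 301 to the main site, path preserved."""
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", MAIN_SITE + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The same check you would run against the real parked domain.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/product1")
resp = conn.getresponse()
print(resp.status)                  # 301
print(resp.getheader("Location"))   # https://www.maindomain.com/product1
server.shutdown()
```

Against the real domains, the only change is pointing the connection at the parked domain instead of the local server; anything other than a 301 (a 302, a 200 with masked content) would explain lingering index entries.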
-
Hello Dr. Peter,
Thank you for helping out.
There are around 75 or so domains pointing to the main website. When they were parked on the main site (prior to November 2014), they were added as additional domains, which were URL-masked. So at least 30 domains were indexed in Google with the same content as the main site.
12 months ago, I realized the duplicate content error and changed the domain parking to 301 redirects. I also used the "remove URL" functionality in Google Webmaster Tools. Even after 12 months, I noticed a number of domains still had duplicate content in the Google index.
So I removed the pages from the addon domains again using Google Webmaster Tools. To give you an idea: my main site with the original content/links is iscripts.com, and an addon domain, socialappster.com, is pointed to a product page at iscripts.com/socialware. If you do a site:socialappster.com search in Google, you find a few pages in the index, even though the 301 redirect has been in place for more than 12 months now. There are similar issues with other domains pointing to product pages as well as to the whole site.
Appreciate any direction you can provide to clean this mess.
Thanks
Aji Abraham
-
Oh, and how many domains are we talking (ballpark)?
-
What was happening when they were parked - were they 302-redirected or was it some kind of straight CNAME situation where, theoretically, Google shouldn't have even seen the parked domains? Trick, of course, is that Google is a registrar, so they can see a lot that isn't necessarily public or crawlable.
Did the additional domains get indexed while parked, or after you went to 301-redirects?