Concerns of Duplicative Content on Purchased Site
-
Recently I purchased a 50+ DA site (oldsite.com) that had been offline (returning 404s) for 9-12 months under the previous owner. The purchase included the domain and the content previously hosted on it. The backlink profile is 100% contextual and pristine.
Upon purchasing the domain, I did the following:
- Rehosted the old site and content that had been down for 9-12 months on oldsite.com
- Allowed a week or two for indexation on oldsite.com
- Hosted the old content on my newsite.com and then set up 100+ contextual 301 redirects from oldsite.com to newsite.com using direct and wildcard .htaccess rules
- Issued a Press Release declaring the acquisition of oldsite.com for newsite.com
- Performed a "Change of Address" in Google Search Console from oldsite.com to newsite.com
- Performed a site "Site Move" in Bing/Yahoo from oldsite.com to newsite.com
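The redirect step above (direct rules plus wildcards) could look roughly like this in .htaccess — a minimal sketch assuming Apache with mod_rewrite enabled, and hypothetical paths:

```apache
RewriteEngine On

# One-to-one redirect for a page whose path changed (hypothetical URLs)
RewriteRule ^old-guide\.html$ https://newsite.com/guides/new-guide/ [R=301,L]

# Wildcard rule: preserve paths for everything under /blog/
RewriteRule ^blog/(.*)$ https://newsite.com/blog/$1 [R=301,L]

# Catch-all: anything not matched above goes to the new homepage
RewriteRule ^(.*)$ https://newsite.com/ [R=301,L]
```

Order matters here: the specific rules must come before the catch-all, since mod_rewrite stops at the first match flagged `[L]`.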
It's been close to a month and while organic traffic is growing gradually, it's not what I would expect from a domain with 700+ referring contextual domains. My current concern is around original attribution of content on oldsite.com shifting to scraper sites during the year or so that it was offline.
For Example:
- Oldsite.com has full attribution prior to going offline
- Scraper sites scan the site and repost the content elsewhere (the effort is unsuccessful at the time because Google knows the original attribution)
- Oldsite.com goes offline
- Scraper sites continue hosting content
- Google drops the consumer-facing cache of oldsite.com (and potentially loses the original attribution of the content)
- Google reassigns original attribution to a scraper site
- Oldsite.com is hosted again, but Google no longer remembers its original attribution and thinks the content is stolen
- Google then silently punishes oldsite.com and newsite.com (which it redirects to)
QUESTIONS
- Does this sequence have any merit? Does Google keep track of original attribution after the content ceases to exist in Google's search cache?
- Are there any tools or ways to tell if you're being punished for content being posted elsewhere on the web, even if you originally had attribution?
- Unrelated: Are there any other steps you'd recommend for a site change as described above?
-
Hi, John.
There's a Q&A video of Matt Cutts answering a question about content "originality" — specifically, what happens when a bigger website copies content from a smaller author's site. (I can't find the link; maybe other Mozzers can help out here.) Matt said that yes, it's possible. So, as far as I understand, Google can reassign original attribution — especially if your website was offline for a long time.
At the same time, here's a Matt Cutts video about duplicate content as a penalizing factor: https://www.youtube.com/watch?v=mQZY7EmjbMA
According to that video, unless you're a very spammy scraper, you're going to be fine on the duplicate-content front.
About the slow gain in rankings - having lots of referring domains is no guarantee of fast or good rankings. It surely helps a lot, but it's not the only factor. Have you optimized your content, technical SEO, etc.? As for tools for penalties - check the Manual Actions section in Google Webmaster Tools. If nothing is listed there, you haven't been manually penalized by Google.
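On the "is my content credited elsewhere" question, there's no official Google tool for attribution, but a rough do-it-yourself check is to measure how much of your page's text appears verbatim on a suspected scraper page. A minimal sketch using word shingles (you'd paste in the body text from each page yourself — nothing here is a Google API):

```python
def shingles(text, n=8):
    """Return the set of n-word shingles (contiguous word runs) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(original, suspect, n=8):
    """Fraction of the original text's shingles that also appear in the suspect text."""
    orig = shingles(original, n)
    if not orig:
        return 0.0
    return len(orig & shingles(suspect, n)) / len(orig)
```

A ratio near 1.0 means a near-verbatim copy; you can also paste a distinctive 8-10 word phrase into Google in quotes and see which site ranks for it — if the scraper outranks you for your own sentence, that's a warning sign.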
As for other recommendations - well, as I said, update/optimize content where needed and get your technical SEO in order. Since you said rankings are growing and it's only been a month since you relaunched the website, you're doing pretty well. It always takes time, my friend.
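One technical-SEO item worth automating after a move like this: confirm that every old URL answers with a single-hop 301 straight to its final destination (redirect chains and 302s dilute the move). A minimal sketch with hypothetical URLs, using only the Python standard library — the chain-checking logic is kept separate from the fetching so it can be verified without network access:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can record each hop."""
    def redirect_request(self, *args, **kwargs):
        return None  # causes a 3xx to surface as an HTTPError

def fetch_chain(url, max_hops=5):
    """Follow redirects manually; return a list of (status, location) hops."""
    opener = urllib.request.build_opener(NoRedirect)
    chain = []
    for _ in range(max_hops):
        try:
            resp = opener.open(urllib.request.Request(url, method="HEAD"))
            chain.append((resp.status, None))  # final, non-redirect response
            break
        except urllib.error.HTTPError as e:
            loc = e.headers.get("Location")
            chain.append((e.code, loc))
            if not loc:
                break
            url = loc
    return chain

def is_clean_move(chain, expected_target):
    """True if the chain is exactly one 301 hop landing (200) on the expected URL."""
    return (len(chain) == 2
            and chain[0] == (301, expected_target)
            and chain[1][0] == 200)
```

Run `fetch_chain()` over a sample of your 100+ redirected URLs and flag anything where `is_clean_move()` is False — chained 301s, 302s, or redirects to the wrong target.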
Hope this helps.