Concerns of Duplicative Content on Purchased Site
-
Recently I purchased a 50+ DA site (oldsite.com) that had been offline (returning 404s) for 9-12 months under the previous owner. The purchase included the domain and the content previously hosted on it. The backlink profile is 100% contextual and pristine.
Upon purchasing the domain, I did the following:
- Rehosted the old site and content that had been down for 9-12 months on oldsite.com
- Allowed a week or two for indexation on oldsite.com
- Hosted the old content on my newsite.com and then performed 100+ contextual 301 redirects from oldsite.com to newsite.com using direct and wildcard .htaccess rules
- Issued a Press Release declaring the acquisition of oldsite.com for newsite.com
- Performed a "Change of Address" in Google Webmaster Tools from oldsite.com to newsite.com
- Performed a site "Site Move" in Bing/Yahoo from oldsite.com to newsite.com
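For reference, the direct and wildcard 301 rules mentioned above might look something like this in .htaccess (the domain names and paths here are purely illustrative, not the actual rules used):

```apache
RewriteEngine On

# Direct one-to-one 301 redirects for specific high-value URLs
RewriteRule ^old-article-name/?$ https://www.newsite.com/new-article-name/ [R=301,L]
RewriteRule ^about-us/?$ https://www.newsite.com/about/ [R=301,L]

# Wildcard catch-all: send everything not matched above to the
# equivalent path on the new domain, preserving the URL structure
RewriteRule ^(.*)$ https://www.newsite.com/$1 [R=301,L]
```

Note that the direct rules must appear before the wildcard catch-all: mod_rewrite processes rules in order, and the `[L]` flag stops processing at the first match.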
It's been close to a month and while organic traffic is growing gradually, it's not what I would expect from a domain with 700+ referring contextual domains. My current concern is around original attribution of content on oldsite.com shifting to scraper sites during the year or so that it was offline.
For Example:
- Oldsite.com has full attribution prior to going offline
- Scraper sites scan the site and repost content elsewhere (effort unsuccessful at the time because Google knew the original attribution)
- Oldsite.com goes offline
- Scraper sites continue hosting content
- Google loses the consumer-facing cache of oldsite.com (and potentially loses original attribution of the content)
- Google reassigns original attribution to a scraper site
- Oldsite.com is hosted again, and Google no longer remembers its original attribution and thinks the content is stolen
- Google then silently punishes oldsite.com and newsite.com (which it redirects to)
QUESTIONS
- Does this sequence have any merit? Does Google keep track of original attribution after the content ceases to exist in Google's search cache?
- Are there any tools or ways to tell if you're being punished for content being posted elsewhere on the web, even if you originally had attribution?
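One quick, low-tech way to check whether the content now surfaces elsewhere (and whether the original URLs outrank the copies) is to run exact-match quoted searches for distinctive sentences from your pages. A minimal sketch for generating those query URLs (the sample sentences are placeholders, not real content from the site):

```python
from urllib.parse import quote_plus

def exact_match_query_url(snippet: str) -> str:
    """Build a Google search URL for an exact-phrase (quoted) query."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')

# Distinctive sentences pulled verbatim from pages on the redirected site
snippets = [
    "a distinctive sentence copied verbatim from an old article",
    "another unique phrase from a second page",
]

for s in snippets:
    # Open each URL in a browser and check whether oldsite.com/newsite.com
    # ranks above any scraper copies for its own content
    print(exact_match_query_url(s))
```

If scraper copies consistently outrank the original pages for their own exact text, that is at least a signal worth investigating further.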
- Unrelated: Are there any other steps that are recommended for a site change as described above?
-
Hi, John.
OK, there is a Q&A video of Matt Cutts answering a question about "originality" of content when a bigger website copies content from a smaller author's website. (I can't find the link to it; maybe other Mozzers will help out here.) Matt said that yes, it's possible. So, as far as I understand, Google can reassign original attribution, especially if your website was offline for a long time.
At the same time, here is a Matt Cutts video about duplicate content as a penalizing factor - https://www.youtube.com/watch?v=mQZY7EmjbMA
According to that video, unless you're a very spammy scraper, you are going to be fine in terms of duplicate content.
About the slow gain of rankings - having lots of referring domains is not a guarantee of fast or good rankings. It certainly helps a lot, but it's not the only factor. Have you optimized your content, technical SEO, etc.? As for tools to detect penalties - check the Manual Actions section in Google Webmaster Tools. If there is nothing there, you haven't received a manual penalty from Google (though algorithmic demotions won't show up there).
As for other recommendations - well, as I said, update/optimize content if needed and get your technical SEO in order. Since you said rankings are growing and it has only been a month since you relaunched the site, you're doing pretty well. It always takes time, my friend.
Hope this helps.