How Does This Site Get Away With It?
-
The following site is huge in the movie trailer industry:
It ranks #3 in Google for "Movie Trailers" and has high rankings for multiple other major keywords in the industry.
Here's the thing: virtually all of their movie trailer pages contain copy/pasted content from other sites. The movie trailer descriptions are the ones supplied by the movie studios, and therefore the same content appears on thousands of websites and blogs.
We all know Google hates duplicate content at the moment... so how does this site get away with it?
Does its root-domain authority keep it up there?
-
I have seen instances where sites that have to use duplicate content will put it in iframes, nofollow any links to the framed URLs, and tell the robots.txt file to ignore them so they aren't analyzed. It actually works, but it's sloppy code to have an iframe on every page, sometimes multiple, at least for that purpose. I guess it's better than getting penalized for duplicate content, though.
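For reference, the pattern described above might look roughly like this. This is only a sketch: the domain, the `/framed/` path, and the file names are hypothetical, and whether it helps at all depends on Google's handling of framed content.

```html
<!-- On the movie page: load the duplicated studio blurb in an iframe
     served from a path that crawlers are told to skip.
     (example.com and all paths here are hypothetical) -->
<iframe src="https://example.com/framed/movie-description.html"
        width="600" height="200" title="Studio description"></iframe>

<!-- And in robots.txt at the site root, block the framed pages
     so the duplicated text is never crawled or analyzed:

     User-agent: *
     Disallow: /framed/
-->
```

The page's own unique content stays in the crawlable HTML; only the duplicated description is tucked behind the blocked path.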
Have a great night.
Matthew Boley
-
Hey Rhys.
A few questions.
Does this site have an affiliate feed?
For example, are other people copying the content from this site through an affiliate feed of some sort?
That could be one explanation here.
(Actually, they do; I just found it.)
In the case of duplicate content, Google looks at how trustworthy a site's content is based on its overall ranking. So if this site is putting out content every day and then pinging the Google bot to come crawl the site first, that's all they would need to be treated as the originator of the content. Other sites can then easily copy the content from this site, but as long as it gets indexed on this site first, they would not really have a problem.
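As a sketch of the "get indexed first" step: at the time, Google accepted sitemap pings at a fixed endpoint (`google.com/ping`, since deprecated), so a site could notify the crawler right after publishing. The sitemap URL below is hypothetical.

```python
from urllib.parse import urlencode

GOOGLE_PING = "https://www.google.com/ping"

def build_ping_url(sitemap_url: str) -> str:
    """Build the Google sitemap-ping URL for a freshly updated sitemap."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})

# After publishing new content, ping Google so the bot recrawls the sitemap:
ping_url = build_ping_url("https://example.com/sitemap.xml")
print(ping_url)
# urllib.request.urlopen(ping_url)  # uncomment to actually send the ping
```

Getting crawled first is what matters here; the ping just shortens the window in which a scraper could be indexed ahead of you.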
Same thing goes for a blog you write.
You would have to dig a little deeper into their link structure. Their overall ranking is pretty high; they have tons of spun links to each URL, about 100 linking pages per domain. A few good links from MTV and VH1, and mostly a lot of blogs.
But yeah, you're right, their search engine traffic is off the charts. They rank really well for some movie names as well.
What is your end goal in running a comparison against them?
Their root-domain authority is fairly high, and that plays a big factor in how well they rank for a lot of these keywords as well.
The site is about 5.3 years old, and its domain authority is around 74 to 79, depending on where you look.
It's PR6, but you might need to dig deeper.
-
Could be that the other key factors, like traffic and incoming links, are astronomical, so Google's duplicate-content penalty is outweighed.
It's a case of "doing it better" and not necessarily doing it first. While they may be scraping content and it's all duplicated, they simply must be getting huge traffic numbers and incoming links because all of that content is in one place.
I'm just assuming, of course; without doing an audit, who knows. But it must be frustrating for you if you are working on a campaign against them.