Does 'framing' a website create duplicate content?
-
Something I have not come across before, but I hope others here are able to offer advice based on experience:
A client has independently created a series of mini-sites aimed at targeting specific locations. The tactic has worked very well, and they have achieved a large amount of well-targeted traffic as a result.
Each mini-site is different, but if you use the nav to view prices or go to the booking page, the link takes you to what at first appears to be their main site.
However, you then notice that the URL actually stays on the mini-site. What they have done is 'frame' the main site, so it looks identical even as you navigate through this exact replica.
Checking the code, there is almost nothing there - in fact, no content at all. Below the head is this piece of code:
<frameset rows="*" framespacing="0" frameborder="0">
  <frame src="http://www.example.com" frameborder="0" marginwidth="0" marginheight="0">
  <noframes>Your browser does not support frames. Click <a href="http://www.example.com">here</a> to view.</noframes>
</frameset>
Given that the main site's content does not appear in the source code, do we have a duplicate content issue? What confuses me is that these 'referrals' are showing in Analytics even though the main site's content never appears in the mini-sites' source. They have done this without consultation, and I'm very concerned that it could be duplicating their ENTIRE main site across dozens of mini-sites. I should also add that there are no links to the mini-sites from the main site, so if you advise that this is creating duplicate content, I would not be worried about creating a link wheel by telling them to link directly to the main site rather than to the framed pages. Thanks!
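If they do keep the frames, one mitigation I've seen suggested is a cross-domain rel=canonical in the head of each framing page, pointing at the real page on the main site, so that any signals the framed URLs pick up are consolidated there. A sketch only - the URLs and the /booking/ path are placeholders, not the client's actual paths:

```html
<head>
  <!-- Hypothetical example: declare the real page on the main site
       as the canonical version of this framed copy -->
  <link rel="canonical" href="http://www.example.com/booking/">
</head>
```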
-
Still laughing about the frames. Man, I am old - frames were part of the web back in the day. Whoever is doing this work needs to put on their slippers and reading glasses and sit down in front of the fire with a glass of warm milk.
Frames, made my day I tells ya!
-
Hey, I can't see this approach working for long; it's exactly the kind of thing Google is trying to cut down on. Like you say, it shouldn't hurt the main site, but it would be interesting to see whether the mini-sites have taken a hit, as they are essentially low-quality, cookie-cutter garbage created just for the search engines.
I am unsure how Google handles frames. It is not technically duplicate content - it is just a window onto the main site itself - but it is manipulative to present one site's content inside another, especially when that other site is designed purely for search engine traffic and has identical content (bar the location keyword) to a bunch of others.
This whole approach is flawed.
-
Ha, unfortunately they are for real! I have to confess I've never seen this done before, and it immediately sets off my 'dodgy' sensor!
Good point regarding doorway pages. They are mini-sites with around 8 pages of their own, which link to the framed site from the nav and the odd text link. However, each of the mini-sites duplicates the same content, with only the location name changed wherever it appears. I assume, therefore, that you'd advise against linking to the main site?
The fact that the site has been framed raises a question, if Google does indeed punish this as duplicate content:
If I were a spiteful black-hatter, could I not just frame a competitor's site on loads of different domains and harm the original site's SERPs? I suppose I could do the same thing anyway by copying all the content, so there is a real problem with identifying which version of duplicated content is the original.
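One partial defence I've read about is for the original site to carry a self-referencing rel=canonical on every page, so that anyone who frames or scrapes the markup wholesale also copies a tag declaring the original URL canonical. A minimal sketch, with a placeholder URL:

```html
<head>
  <!-- Self-referencing canonical: if the markup is copied or framed
       wholesale, the copy still points back to the original URL -->
  <link rel="canonical" href="http://www.example.com/skip-hire-prices/">
</head>
```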
-
It's hard to say without seeing the mini-sites and just how 'mini' they are, but they could be classed as doorway pages if they have little or no original content and are designed purely to feed traffic to the main site.
If they are genuinely useful little sites, then linking back to the main site may help it rank better, but it's still not a whiter-than-white approach. Again, it's tough to comment in detail without seeing the sites in question.
On a personal snobbery level, Frames? Are they for real?
Related Questions
-
Duplicate Content due to CMS
The biggest offender for our website's duplicate content is an event calendar generated by our CMS. It creates a page for every day of every year, up to the year 2100. I am considering two solutions:
1. Include code that stops search engines from indexing any of the calendar pages.
2. Keep the calendar but re-route search engines to a more popular workshops page that contains better info (the workshops page doesn't duplicate the calendar pages).
Are these solutions possible? If so, how would each affect SEO? Are there other solutions I should consider?
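For solution 1, the usual mechanism is a robots meta tag emitted by the calendar page template (or the equivalent X-Robots-Tag HTTP header). A sketch of what the template's head might contain; 'noindex, follow' keeps the pages out of the index while still letting crawlers follow their links:

```html
<head>
  <!-- Keep auto-generated calendar pages out of the index,
       but still allow their links to be crawled -->
  <meta name="robots" content="noindex, follow">
</head>
```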
Technical SEO | ycheung
-
Advice on Duplicate Page Content
We have many pages on our website, all using the same template (we use a CMS); at the code level they are 90% the same, but the page content, title, meta description, and image are different for each. For example:
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages. Does Google look at them all as duplicate page content? If yes, how do we deal with this?
Technical SEO | jsmoz
-
Duplicate Content on Product Pages
Hello, I'm currently working on two sites and I have some general questions about duplicate content. For the first one, each page targets a different location, but the wording is identical on each; i.e., it says 'Instant Remote Support for Critical Issues, Same Day Onsite Support with a 3-4 hour response time', etc. Would I get penalized for this? Another question I have: we offer antivirus support for providers, i.e. Norton, AVG, BitDefender, etc. Would we get penalized for having the same first paragraph with only the provider's name changed on each page? My last question: we provide services for multiple cities and towns in various states. Will I get penalized for having the same content on each page, with only the towns and the products and services we provide changed? Thanks.
Technical SEO | ilyaelbert
-
Duplicate content - Quickest way to recover?
We've recently been approached by a new client who's had a 60%+ drop in organic traffic. One of the major issues we found was around 60k+ pages of content duplicated across 3 separate domains. After much discussion and negotiation with them, we 301'd all the pages across to the best domain, but traffic is increasing very slowly. Given that the old sites are 60k+ pages each and don't get crawled very often, is it best to notify Google of the domain change through Webmaster Tools to give it a 'nudge' to deindex the old pages and recover from the traffic loss as quickly and as fully as possible?
Technical SEO | Nathan.Smith
-
Duplicate Page Content and Titles
A few weeks ago my error count went up for Duplicate Page Content and Titles - 4 errors in all. A week later the errors were gone, but now they are back. I made changes to the web.config over a month ago, but nothing since. SEOmoz is telling me the duplicate content is between http://www.antiquebanknotes.com/ and http://www.antiquebanknotes.com. Thanks for any advice! This is the relevant web.config:

<rewrite>
  <rules>
    <rule name="CanonicalHostNameRule1">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^www.antiquebanknotes.com$" negate="true" />
      </conditions>
      <action type="Redirect" url="http://www.antiquebanknotes.com/{R:1}" />
    </rule>
    <rule name="Default Page" enabled="true" stopProcessing="true">
      <match url="^default.aspx$" />
      <conditions logicalGrouping="MatchAll">
        <add input="{REQUEST_METHOD}" pattern="GET" />
      </conditions>
      <action type="Redirect" url="/" />
    </rule>
  </rules>
</rewrite>

Technical SEO | Banknotes
-
Omniture tracking code URLs creating duplicate content
My ecommerce company uses Omniture tracking codes for a variety of tracking parameters, from promotional emails to third-party comparison shopping engines (CSEs). All of these tracking codes create URLs that look like www.domain.com/?s_cid=(tracking parameter), which are identical to the original page, and these dynamic tracking URLs are being indexed. The cached version is still the original page. For now, the duplicate versions do not appear to be affecting rankings, but as we ramp up with holiday sales, promotions, adding more CSEs, etc., there will be more and more tracking URLs that could potentially hurt us. What is the best solution? If we use robots.txt to block the ?s_cid versions, it may affect our listings on CSEs, as their bots will try to crawl the link to find product info/pricing but will be denied. Is this correct? Or do CSEs generally use other methods for gathering and verifying product information? So far the most comprehensive solution I can think of would be to add a rel=canonical tag to every unique static URL on our site, which should solve the duplicate content issues, but we have thousands of pages and this would take an eternity (unless someone knows a good way to do this automagically; I'm not a programmer, so maybe there's a way I don't know). Any help/advice/suggestions will be appreciated. If you have a solution, please explain why it would work, to help me understand on a deeper level in case something like this comes up again in the future. Thanks!
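Re the rel=canonical idea above: it usually doesn't require tagging thousands of pages by hand, because the tag can be emitted once from the shared page template, using the page's own URL stripped of query parameters. Assuming the www.domain.com placeholder from the question, a page requested as www.domain.com/?s_cid=email123 would then render something like:

```html
<head>
  <!-- Canonical points at the parameter-free URL, so every
       ?s_cid= tracking variant consolidates to the original page -->
  <link rel="canonical" href="http://www.domain.com/">
</head>
```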
Technical SEO | BrianCC
-
Duplicate content
Greetings! I have inherited a problem that I am not sure how to fix. The website I am working on had a 302 redirect from its original home URL (which holds all the link juice) to a newly designed page (with no real link juice). When the 302 redirect was removed, a duplicate content problem remained, since the new page had already been indexed by Google. What is the best way to handle the duplicate content? Thanks!
Technical SEO | shedontdiet