Advice on Duplicate Page Content
-
We have many pages on our website that all share the same template (we use a CMS), so at the code level they are about 90% identical. However, the page content, title, meta description, and image used are different on each of them.
For example -
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages.
Does Google look at them all as duplicate page content? If yes, how do we deal with this?
-
EGOL, Everett,
Thank you both for your very useful suggestions. Sounds like we should do something similar with our PDF documents to present them as the actual/canonical content on the page. We'll also look at our CMS to see how we might implement the unlinked page name in the breadcrumb. We have already done some work on adding structured data with schemas (including aggregate ratings), so that is hopefully yielding some results.
However, after an encouraging traffic spike that seemed to indicate we were on the right track, we saw a very worrisome dip last month... which then led to a lot of worried hand-wringing about Panda.
These suggestions are very helpful; thanks again and we'll try them out!
-
Thank you, Everett,
Nice to see you posting in Q&A.
Look forward to seeing you regularly.
-
Hello Sudhir,
Those two pages would not be seen as duplicates. Google is very capable of separating the template from the content.
On a side note, you should look into getting the name of the page/game into the breadcrumb, though it doesn't have to be linked like the previous two pages in the path. For example:
You are here: Home --> Common --> Find Easter Eggs
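If your CMS allows it, that breadcrumb could also carry Schema.org BreadcrumbList markup so the path shows up in search results. A minimal sketch, using the example page from the question (note that the last item can legitimately omit the "item" URL, which matches an unlinked final crumb):

```html
<!-- Hypothetical BreadcrumbList markup for the example page.
     The final item (the game name) has no "item" URL, mirroring
     an unlinked last crumb in the visible breadcrumb. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "http://www.jumpstart.com/" },
    { "@type": "ListItem", "position": 2, "name": "Common",
      "item": "http://www.jumpstart.com/common" },
    { "@type": "ListItem", "position": 3, "name": "Find Easter Eggs" }
  ]
}
</script>
```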
Allowing visitors to review and rate the games would provide useful, keyword-rich, natural content on an otherwise content-sparse page. Once reviews/ratings are implemented you could also use Schema.org markup to enhance your search engine results by showing star ratings next to each game.
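Once real ratings exist, the aggregate-rating markup could look roughly like this. The game name is taken from the example URL, and the rating numbers are made up purely for illustration:

```html
<!-- Sketch of AggregateRating markup; ratingValue and ratingCount
     are placeholder numbers, not real data. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Game",
  "name": "Find Easter Eggs",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "ratingCount": "27"
  }
}
</script>
```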
Good luck!
-
Google knows how to separate the template of the site from the content. So you have nothing to worry about if most of the code on your pages is the same code that is used on every other page.
I looked at your two sample pages and saw a few things that would concern me...
This page had very little content. If you have lots of pages with such a tiny amount of content, you could have Panda problems.
http://www.jumpstart.com/common/find-easter-eggs
You also have pages like this....
http://www.jumpstart.com/common/recognize-the-rs-view
These have very little content.
I have a site with lots of printable content, mainly images placed in .pdf documents to control the scale of the printing and the look of the printed page. The HTML pages used to present them to visitors and the PDF documents were all thin content, and my site had a Panda problem. That caused the rankings of every page on the site to fall and really damaged my traffic. I solved it by noindexing the HTML pages and applying rel=canonical to the PDF files using .htaccess.
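For anyone wanting to try the same approach, a rough sketch of what the .htaccess rules could look like on Apache with mod_headers enabled. The filenames and URL here are placeholders, not the actual files on my site:

```apache
# Sketch of the noindex + canonical-header approach (requires mod_headers).
# Filenames and the URL are placeholders -- adapt to your own pages.

# Keep the thin HTML wrapper page out of the index:
<Files "printable-page.html">
    Header set X-Robots-Tag "noindex, follow"
</Files>

# Send a rel=canonical HTTP header along with the PDF itself:
<Files "printable-page.pdf">
    Header set Link '<http://www.example.com/printable-page.pdf>; rel="canonical"'
</Files>
```

PDFs can't carry a `<link rel="canonical">` tag in their markup, which is why the canonical has to be expressed as an HTTP header like this.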
I can't say whether this will happen to you, but I would be uncomfortable if I had a site with so little content on its pages.