Duplicate Page Content on pages that appear to be different?
-
Hi Everyone! My name's Ross, and I work at CHARGED.fm. I worked with Luke, who has asked quite a few questions here, but he has since moved on to a new adventure, so I'm trying to step into his role. I'm very much a beginner at SEO and learning a lot of this on the fly, so bear with me if this is something simple.
In our latest Moz crawl, over 28K high-priority issues were detected, and they are all Duplicate Page Content issues. However, when looking at the issues laid out, the examples it gives for "Duplicate URLs" under each individual issue appear to be completely different pages. They have different page titles, different descriptions, etc. Here's an example.
For "LPGA Tickets", it is giving 19 Duplicate URLs. Here are a couple it lists when you expand those:
http://www.charged.fm/one-thousand-one-nights-tickets
http://www.charged.fm/trash-inferno-tickets
http://www.charged.fm/mylan-wtt-smash-hits-tickets
http://www.charged.fm/mickey-thomas-tickets

Internally, one reason we thought this might be happening is that even though the pages themselves are different, the structure is essentially identical, especially if there are no events listed or there isn't any content in the News/About sections. We are going to try to noindex pages that don't have events/new content on them as a temporary fix, but is there possibly a different underlying issue somewhere that would cause all of these duplicate page content issues to begin appearing?
Any help would be greatly appreciated!
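The "noindex pages without events/new content" approach described above first needs a way to decide which pages are thin. A minimal sketch of that check, using only Python's standard library (the 300-word threshold and the sample markup are assumptions for illustration, not anything from the thread):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text from a page, skipping <script>/<style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)

def is_thin(html, threshold=300):
    """Return True if the page's visible text has fewer than `threshold` words."""
    parser = TextExtractor()
    parser.feed(html)
    word_count = len(" ".join(parser.parts).split())
    return word_count < threshold

# A near-empty event page like the ones described would be flagged:
page = "<html><body><h1>LPGA Tickets</h1><p>Buy tickets here.</p></body></html>"
print(is_thin(page))  # → True
```

Pages flagged this way would get the temporary noindex, while pages with real event listings or News/About copy stay indexed.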
-
Nothing will positively affect this issue more than updating the content and giving searchers solid, informative, unique content to read.
One way to do that might be to aggregate some reviews for these individual shows, give a short, unique bio of the performers, or rate the venues. 500-800 words of unique content will go a long way in this case.
Something else to work on is the volume of internal links back and forth. When links are all a robot sees, they become part of your duplicate content issue too. You can't do much about that in this case: most of the links come from the nav bars, so the way to counter it is, again, adding great content.
-
Well, if it were one of my clients' sites… I wouldn't do that. While I understand your logic with a noindex, I wouldn't want to create a situation where the pages couldn't be found at all in search engines. Although it will drop your duplicate content numbers here on Moz, it's only a temporary fix. A good question to explore is how long you would need to keep them noindexed versus how long it would take to fix the content issues.
-
Hey Adam!
Thanks for the response; that kind of confirms what we were thinking. We're planning to put a noindex, follow on those pages while we work on adjusting the content/descriptions. Is that a good fix while we work on the pages, or is there something else we should be doing?
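For reference, the "noindex, follow" mentioned above is a robots meta tag placed in each page's `<head>`. A minimal example (where exactly it goes in CHARGED.fm's templates is outside this thread):

```html
<!-- Tells crawlers not to index this page, but still to follow
     its links so link equity continues to flow through them. -->
<meta name="robots" content="noindex, follow">
```

It would be removed from each page once that page has unique content worth indexing.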
-
Hey Ross!
Those pages are not "different" when it comes to search engines. Or maybe I should say, not different enough. The content is extremely thin and only switching out a word or two will absolutely make them come up as duplicate content. I would strongly suggest optimizing the page content and meta descriptions to be unique.