Duplicate content warnings
-
I have a ton of duplicate content warnings for my site poker-coaching.net, but I can't see which URLs are supposed to be duplicates of each other. I can't find any function that would let me check the original URL against the list of other URLs where the duplicate content appears.
-
Thanks for the help. I am trying to cover all bases here. Duplicate content was one concern; the others are excessive link density and bad incoming links.
I have downloaded a full backlinks report now using Majestic SEO (OSE only shows incoming links from 74 domains...).
I think I may have found the problem. I used to have a forum on that domain years ago which was hacked and used for a lot of spammy outgoing links for stuff like Cialis, Viagra, etc. Those guys also linked from other sites to my forum pages. Example:

from: http://www.grupoibira.com/foro/viewto...
anchor: buy flagyl in lincolnu... (3)
to: http://www.poker-coaching.net/phpbb3/...
When I closed the forum and deleted the forum plugin, I redirected all forum pages to my main page, which, under the circumstances, was a bad mistake I guess. Because of the redirect, all those spammy links now end up pointing to my main page, right? So as a first step, I have removed that redirect.
But the problem remains that I still have plenty of links from spam sites pointing to URLs of my domain that do not exist any more.
Is there anything else I can do to remove those links or have Google remove/disregard them, or do you think a reconsideration request explaining the situation would help? -
Honestly, with only 235 indexed pages, it's pretty doubtful that duplicate content caused you an outright penalty (such as being hit with Panda). Given your industry, it's much more likely you've got a link-based penalty or link quality issue in play.
You do have a chunk of spammy blog comments and some low-value article marketing, for example:
http://undertheinfluence.nationaljournal.com/2010/02/summit-attendees.php
A bit of that is fine (and happens in your industry a lot), but when it's too much of your link profile too soon, you could be getting yourself into penalty territory.
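For links you can't get removed at the source, one remediation path worth knowing about is Google's Disavow Links tool, which accepts a plain-text file of links to ignore. A minimal sketch of the format follows; the domains and URL are placeholders, not links from this site's actual profile:

```text
# Lines starting with "#" are comments.
# Disavow everything from a spammy referring domain:
domain:spam-example-domain.com
# Or disavow a single spammy page:
http://another-spam-example.com/some-page.html
```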
-
Hey There,
Just to clarify, to see the source of those errors, you’ll need to download your Full Crawl Diagnostics CSV and open it up in something like Excel. In the first column, perform a search for the URL of the page you are looking for. When you find the correct row, look in the last column labeled referrer. This tells you the referral URL of the page where our crawlers first found the target URL. You can then visit this URL to find the source of your errors. If you need more help with that, check out this link: http://seomoz.zendesk.com/entries/20895376-crawl-diagnostics-csv
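If the CSV is large, the same lookup can be scripted instead of done by hand in Excel. A rough sketch, assuming the export has the target URL in a column named "URL" and the referrer in a column named "referrer" (the exact header names in the real export may differ):

```python
import csv
import io

def find_referrers(rows, target_url):
    """Return the referrer value of every row whose URL column matches target_url."""
    reader = csv.DictReader(rows)
    return [row["referrer"] for row in reader if row["URL"] == target_url]

# Tiny fabricated example standing in for the real crawl-diagnostics export:
sample = io.StringIO(
    "URL,http status,referrer\n"
    "http://example.com/page-a,200,http://example.com/\n"
    "http://example.com/page-b,200,http://example.com/page-a\n"
)
print(find_referrers(sample, "http://example.com/page-b"))
# prints ['http://example.com/page-a']
```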
Hope that helps! I will also look at the issue on the back end to see whether those pages actually are duplicate content.
Have a great day,
Nick
-
Thanks for looking into this. Actually I checked the whole site by doing a batch search on Copyscape and there were only minor duplicate content issues. I resolved those by editing the content parts in question (on February 24th 2012).
Since I am desperately searching for the reasons why this site was penalized (and it definitely is...), it would be great to know why your duplicate content checker finds errors. It could only be related to multiple versions of one page existing on different URLs. I already have all http://mysitedotcom URLs redirected to www.mysitedotcom, and the trailing-slash/no-trailing-slash problem was also resolved by a redirect long ago, so I do not know where the problem lies.
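The redirect setup described above can be sanity-checked by writing down the canonicalization rule it is meant to enforce. A sketch, assuming the canonical form is the www host with no trailing slash (which variant is actually canonical on the real site is an assumption here):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Map a URL to its assumed canonical form: www host, no trailing slash."""
    parts = urlsplit(url)
    host = parts.netloc if parts.netloc.startswith("www.") else "www." + parts.netloc
    path = parts.path.rstrip("/") or "/"  # keep the bare root as "/"
    return urlunsplit((parts.scheme, host, path, parts.query, parts.fragment))

print(canonicalize("http://example.com/page/"))
# prints http://www.example.com/page
```

If a crawler ever sees a URL where `canonicalize(url) != url` without being redirected, that variant is a duplicate-content candidate.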
Thanks for the help! -
I think our system has roughly a 90-95% threshold for duplicate content. The pages I'm seeing in your campaign don't look that high, so something is up - I'm checking with support.
For now, use the "Duplicate Page Title" section - that'll tend to give you exact duplicates. The duplicate content detection also covers thin content and near duplicates.
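You can approximate that kind of threshold check locally. The crawler's actual similarity metric isn't documented here, so this is just an illustration using difflib's ratio on fabricated page text:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0.0-1.0 similarity ratio between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_a = "Learn poker strategy with our coaching videos and articles."
page_b = "Learn poker strategy with our coaching videos and lessons."
score = similarity(page_a, page_b)
print(round(score, 2))  # well above a 0.90 near-duplicate threshold for these two strings
```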
-
Yes, that is what I first thought too. If only it were that easy.
But when I do, I see a couple of URLs that definitely do not have any duplicate content. Could it be that the dupe content check counts text in sitewide modules (like the "Poker News" and "Tips for ...." modules on www.poker-coaching.net) as duplicate content, because they appear on all pages?
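One way to test that theory locally is to strip out text that appears on every page before comparing. A rough sketch, with fabricated page text standing in for real pages:

```python
def unique_text(pages):
    """Remove lines common to all pages (e.g. sitewide sidebar modules),
    keeping each page's own content."""
    shared = set.intersection(*(set(p.splitlines()) for p in pages))
    return ["\n".join(l for l in p.splitlines() if l not in shared) for p in pages]

pages = [
    "Poker News sidebar\nArticle about bluffing",
    "Poker News sidebar\nArticle about pot odds",
]
stripped = unique_text(pages)
print(stripped[0])
# prints Article about bluffing
```

If two pages still score as near-duplicates after the shared modules are removed, the warning is about the pages themselves rather than the sidebar.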
If so, the duplicate content detection is totally worthless for a site like mine. -
If you drill down into your campaign report into 'Crawl Diagnostics', you will see a dropdown menu named 'Show'. Select 'Duplicate Page Content' and you will see a graph with a table below it. To the right of each URL there is a column named 'Other URLs'. The numbers in that column are live links to a page listing the URLs with the duplicate content. At least that is how it is displayed in my campaigns.
-
You will find this information in Google Webmaster Tools and in your SEOmoz campaign.
One easy way to avoid this is to include a rel=canonical tag. On every page (pointing at the version you want to be the official one), include the following inside the head tag:

<link rel="canonical" href="http://www.example.com/index.html" />

where www.example.com/index.html is your page's address. Good luck!