Duplicate content warnings
-
I have a ton of duplicate content warnings for my site poker-coaching.net, but I can't see where the duplicate URLs are. I can't find any function that would let me check the original URL against a list of the other URLs where the duplicate content appears.
-
Thanks for the help. I am trying to cover all bases here. Duplicate content was one concern; the other is too high link density and bad incoming links.
I have downloaded a full backlinks report now using Majestic SEO (OSE only shows incoming links from 74 domains...).
I think I may have found the problem. I used to have a forum on that domain years ago which was hacked and used for a lot of spammy outgoing links for stuff like Cialis, Viagra etc. Those guys also linked from other sites to my forum pages. Example:

| From: http://www.grupoibira.com/foro/viewto... | Anchor: buy flagyl in lincolnu... | 3 | To: http://www.poker-coaching.net/phpbb3/... |
When I closed the forum and deleted the forum plugin, I redirected all forum pages to my main page, which, under the circumstances, was probably a bad mistake. Because of the redirect, all those spammy links ended up pointing to my main page, right? So as a first step, I have now removed that redirect.
But the problem remains that I still have plenty of links from spam sites pointing to URLs of my domain that do not exist any more.
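For what it's worth, pulling the spammy sources out of a backlink export like Majestic's can be done with a few lines of script. This is only a sketch: the column names ("SourceURL", "AnchorText", "TargetURL") are made up for illustration, and a real export will use different headers.

```python
import csv
import io

# Stand-in for a Majestic-style backlink export; a real file would be
# opened with open("backlinks.csv") and have different column names.
backlinks_csv = io.StringIO(
    "SourceURL,AnchorText,TargetURL\n"
    "http://www.grupoibira.com/foro/viewtopic.php,buy flagyl,http://www.poker-coaching.net/phpbb3/viewtopic.php\n"
    "http://example-blog.com/post,poker coaching,http://www.poker-coaching.net/\n"
)

# Keep only the linking pages that still point at the deleted forum paths
spam_sources = [
    row["SourceURL"]
    for row in csv.DictReader(backlinks_csv)
    if "/phpbb3/" in row["TargetURL"]
]
print(spam_sources)  # the pages linking to URLs that no longer exist
```

With the old forum URLs no longer redirected, those targets can simply return 404/410 and the filtered list becomes the set of sources worth documenting in a reconsideration request.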
Is there anything else I can do to remove those links or have Google remove/disregard them, or do you think a reconsideration request explaining the situation would help? -
Honestly, with only 235 indexed pages, it's pretty doubtful that duplicate content caused you an outright penalty (such as being hit with Panda). Given your industry, it's much more likely you've got a link-based penalty or link quality issue in play.
You do have a chunk of spammy blog comments and some low-value article marketing, for example:
http://undertheinfluence.nationaljournal.com/2010/02/summit-attendees.php
A bit of that is fine (and happens in your industry a lot), but when it's too much of your link profile too soon, you could be getting yourself into penalty territory.
-
Hey There,
Just to clarify, to see the source of those errors, you’ll need to download your Full Crawl Diagnostics CSV and open it up in something like Excel. In the first column, perform a search for the URL of the page you are looking for. When you find the correct row, look in the last column labeled referrer. This tells you the referral URL of the page where our crawlers first found the target URL. You can then visit this URL to find the source of your errors. If you need more help with that, check out this link: http://seomoz.zendesk.com/entries/20895376-crawl-diagnostics-csv
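If Excel isn't handy, the same first-column/last-column lookup can be scripted. The CSV below is a toy stand-in for the Full Crawl Diagnostics export (the real file's exact headers may differ), but the logic is the one described above: match the target URL in the first column, read the referrer from the last.

```python
import csv
import io

# Toy stand-in for the Full Crawl Diagnostics CSV export
crawl_csv = io.StringIO(
    "URL,Status,Referrer\n"
    "http://www.example.com/page-a,200,http://www.example.com/\n"
    "http://www.example.com/page-b,200,http://www.example.com/page-a\n"
)

rows = list(csv.reader(crawl_csv))[1:]  # skip the header row

def find_referrer(rows, target_url):
    """Return the referrer (last column) for the row whose first column matches."""
    for row in rows:
        if row[0] == target_url:
            return row[-1]  # the page where the crawler first found this URL
    return None

print(find_referrer(rows, "http://www.example.com/page-b"))
```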
Hope that helps! I will look at the issue on the back end to see if they are actually duplicate content.
Have a great day,
Nick
-
Thanks for looking into this. Actually I checked the whole site by doing a batch search on Copyscape and there were only minor duplicate content issues. I resolved those by editing the content parts in question (on February 24th 2012).
Since I am desperately searching for the reasons why this site was penalized (and it def is...), it would be great to know why your duplicate content checker finds errors. It could only be related to multiple versions of one page on different URLs. I do have all http://mysitedotcom redirected to www.mysitedotcom, and the trailing slash / no trailing slash URL problem was also resolved by a redirect long ago, so I do not know where the problem lies.
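For reference, redirects like the ones described (non-www to www, plus trailing-slash normalization) are commonly handled with Apache mod_rewrite rules along these lines. This is an illustrative sketch with a placeholder domain, not the site's actual configuration:

```apache
RewriteEngine On

# Send non-www requests to the www host with a permanent redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Append a trailing slash to URLs that are not real files
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```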
Thanks for the help! -
I think our system has roughly a 90-95% threshold for duplicate content. The pages I'm seeing in your campaign don't look that high, so something is up - I'm checking with support.
For now, use the "Duplicate Page Title" section - that'll tend to give you exact duplicates. The duplicate content detection also covers thin content and near duplicates.
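Moz's actual detection algorithm isn't public, but a toy shingle-based similarity check illustrates how a percentage threshold like 90-95% works: split each page's text into overlapping word n-grams and compare the sets. Everything below is a sketch for illustration, not the real implementation.

```python
def shingles(text, k=3):
    """Word k-grams used as a crude fingerprint of a page's text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "learn poker strategy with our free coaching videos and articles"
page_b = "learn poker strategy with our free coaching videos and lessons"
print(round(similarity(page_a, page_b), 2))  # → 0.78
```

Two pages that differ by only a word or two score close to 1.0, which is why exact duplicates (same title, same body) are the easiest cases to surface.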
-
Yes that is what I first thought too. If only it were that easy.
But when I do, I see a couple of URLs that definitely do not have any duplicate content. Could it be that the duplicate content check counts text in sitewide modules (like the "Poker News" and "Tips for ...." modules on www.poker-coaching.net) as duplicate content, because they appear on every page?
If that's the case, the duplicate content detection is totally worthless for a site like mine. -
If you drill down into your campaign report into 'Crawl Diagnostics', you will see a dropdown menu named "Show". Select 'Duplicate Page Content' and you will see a graph with a table below it. To the right of each URL there is a column named "Other URLs". The numbers in that column are live links to a page listing the URLs with duplicate content. At least that is how it is displayed in my campaigns.
-
You will find this information in Google Webmaster Tools and in your SEOmoz campaign. There you will find the information you need.
One easy way to avoid this is to include the rel=canonical tag. On every page (the pages you want to be the official one), include the following inside the head tag:

<link rel="canonical" href="http://www.example.com/index.html" />

where http://www.example.com/index.html is your page address. Good luck!
Related Questions
-
SEO Effect of inserting No indexed Contents in normal Pages (Nextgen Gallery)
Hello Dear Community, I'm running a photography website and have a question about the indexability of "No indexed Content" inserted on indexable pages. Background : I read everywhere that best practice is to "no index" all useless pages with few content, what I did with Yoast plugin : I no indexed all my nextgen galleries and "ngg_tags" since they create single pages for every photo, tags or slideshow. I did the same for all my porfolio-posts, price lists, testimonials and so on... Nevertheless, I inserted these galleries and portfolios on SEO optimized page for my target keywords. + Nextgen plugin automatically adds these images in the page sitemap. My idea is to have only my Seo optimized page showing in Google and not the others. Problem: I've been checking the results in Google Search Console, filtering by images : I discovered that most of the images featured in these Masonry galleries are not showing in google, and actually almost all the images indexed are the Wordpress from media gallery. I double checked with Screaming Frog, and the software doesn"t see images on these pages. My question is: Is the low indexablilty of these contents are related to the No indexation of the original contents ??? Does somebody has experienced the same issue that these contents doesn't show on Google ? in advance many thanks for your help
Reporting & Analytics | | TristanAventure0 -
Excluding Cookieless Static Content Sub-domain from GA/GTM
For the purposes of this question our ecommerce site url is www.ecommerce.com Our TLD is ecommerce.com We have, following advice from Yslow, Pagespeed and others, moved our static content to a subdomain - static.ecommerce.com We have Google Analytics and Enhance Ecommerce installed, fired from GTM. The cookieDomain setting in GTM is 'auto' At present cookies are being attached to our static resources. What changes do I need to make to to prevent this happening? Many thanks Julian
Reporting & Analytics | | jdeb0 -
I have a WP site which uses categories to display the same content in several locations. Which items should get a canonical tag to avoid a ding for duplicate content?
So...I have a Knowledge Center and press room that pretty much use the same posts. So...technically the content looks like its on several pages because the post shows up on the Category listing page. Do I add a Canonical tag to each individual post...so that it is the only one that is counted? Also...I have a LONG disclaimer that goes at the bottom of most of the posts. would this count as duplicate content? Is there a way to markup a single paragraph to tell the spiders not to crawl it?
Reporting & Analytics | | LindsayiHart0 -
Google Analytics: Dashboard to show popular content per directory
Hello, I work for a furniture business and I would like to set up a dashboard in Google Analytics to show a table for each of the 10 sections to show the most popular content, ie. /Sofas
Reporting & Analytics | | Bee159
/Sofas/black-leather-sofa | 987 PVs
/Sofas/brown-leather-sofa | 782 PVs
/Sofas/classic-material-sofa | 636 PVs
etc. /Beds
/Beds/king-size-bed | 900 PVs
etc How would I go about doing this? Thank you0 -
Someone mentioned us on facebook and brought 10k clicks to homepage.. how do I find the content?
So this morning our website started to go crazy with hits all coming from Facebook, and I am trying to figure out how I can see who was talking about us? All the traffic landed on one page specifically. When I look in analytics it just shows social, and then I can drill down to see they are coming from facebook, but that's it.
Reporting & Analytics | | DemiGR0 -
Large content snippets showing up as keywords?
I've started to notice something very strange: the search keywords report in analytics show a bunch of instances where a person copied large snippets of our site content and then pasted it into the search box. Half these searches are coming from the US and half from...India. I'm worried that this may be the sign of a competitor attempting to perform negative SEO on our site (though admittedly I don't know how). Anyone seen anything like this? Advice? Thanks!!
Reporting & Analytics | | SarahLK0 -
Duplicate page content
I'm seeing duplicate page content for tagged URLs. For example:
Reporting & Analytics | | DolbySEO
http://www.dolby.com/us/en/about-us/careers/landing.html
http://www.dolby.com/us/en/about-us/careers/landing.html?onlnk=al-sc as well as PPC campaigns. We tag certain landing pages purposefully in order to understand that traffic comes from these pages, since we use Google Analytics and don't have the abiility to see clickpaths in the package we have. Is there a way to set parameters for crawling to exclude certain pages or tagged content, such as those set up for PPC campaigns?0