Tired of looking for a solution to duplicate content.
-
My site was just scanned by SEOmoz, and it found lots of duplicate content and duplicate titles. I'm tired of looking for solutions to duplicate content on the product category pages of my shopping site. You can see the screenshot below.
http://i.imgur.com/TXPretv.png
You can see below that every link shows "items_per_page=64", "items_per_page=128", etc. This happens in every category I created. I'm already using a Canonical add-on to avoid this problem, but it's still there.
You can check my domain here - http://www.plugnbuy.com/computer-software/pc-security/antivirus-internet-security/ - and see if the add-on is working correctly.
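For anyone checking: what the Canonical add-on is supposed to do is make every "items_per_page" variant point back to the clean category URL in its <head>. A minimal sketch of what that markup should look like (the category URL is from this thread; the surrounding markup is assumed, not copied from the site):

```html
<!-- On .../antivirus-internet-security/?items_per_page=64 (and the 128, 256 variants),
     the <head> should contain a canonical pointing at the clean category URL: -->
<link rel="canonical"
      href="http://www.plugnbuy.com/computer-software/pc-security/antivirus-internet-security/" />
```

You can verify this with View Source on any of the variant URLs.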
I recently submitted my sitemap to GWT, so it's not showing me any reports about duplicate issues yet.
Please help me!
-
Thank you. I will tell my developer about this issue and see what they say.
-
You could canonical the "/portable-hard-disk" pages back up to "/hard-disk", but honestly, unless this is a widespread problem, I'd probably ignore it. If you have a lot of these sub-categories with duplicate search results, then I'd consider changing up your canonical scheme or NOINDEX'ing some sub-categories - search results just aren't high-value to Google, especially if they start all looking the same.
If this is an isolated occurrence, though, it's a lot of trouble for a relatively minor problem. It would take a pretty deep knowledge of your product inventory and site structure to know for sure, but my gut reaction is that this is a small issue.
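For the NOINDEX option mentioned above, the usual approach is a robots meta tag in the <head> of the low-value sub-category search pages; a hedged sketch (whether your cart software exposes this setting per category is an assumption):

```html
<!-- Drops the page from the index but still lets crawlers follow its links: -->
<meta name="robots" content="noindex, follow" />
```

"noindex, follow" is generally preferred over "noindex, nofollow" here, since it keeps link equity flowing to the product pages while removing the near-duplicate listing page itself from the index.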
-
So what should I do right now to solve this problem?
-
I talked to the technical team. The screen may be a bit confusing. Your "items_per_page" variations are not being flagged as a duplicate of "/hard-disk/portable-hard-disk/". All of the pages (including the items_per_page variants) are being flagged as near-duplicates (95%+) of "/hard-disk". Basically, since those pages show the exact same products and only differ by a header, we're flagging them as being too similar. Once we do that, then all of the other pages that canonical to the "/portable-hard-disk" page also look like near-duplicates of "/hard-disk".
It's not catastrophic, but if you have enough of these category/sub-category search pages that overlap on their results, you may want to reconsider whether you index all of them. At small scale, it's not a big deal. At large scale, these very similar pages could dilute your ranking ability.
-
We don't currently have a way to ignore warnings/errors, although I know that's on the wish list. Let me ping the Product Team on this one and see if they have any additional insight.
-
Then how can I keep the SEOmoz crawler away from those links?
-
As best I can tell, your canonical tags are properly implemented and Google doesn't seem to be indexing any URLs with "items_per_page" in them. Our crawler and desktop crawlers may be getting confused because there are internal paths to these variations.
Ideally, that pulldown probably shouldn't be crawlable, but I think your canonical implementation as it stands is ok. I don't see any evidence that Google is having problems with it. It may just be a false alarm on our part.
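To make a per-page pulldown like that uncrawlable, one common pattern is to replace plain links with a JavaScript-driven select, since crawlers generally follow <a href> links but not form controls; a sketch (the element markup is illustrative, not taken from the actual site):

```html
<!-- Crawlable: each option is a real link, so spiders discover every variant -->
<a href="?items_per_page=64">64 per page</a>

<!-- Less crawlable: navigation happens via JavaScript, with no followable href -->
<select onchange="window.location.search = '?items_per_page=' + this.value">
  <option value="32">32 per page</option>
  <option value="64">64 per page</option>
</select>
```

Since the canonical tags already handle the indexing side, this is a belt-and-suspenders change, not an urgent one.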
-
The SEO spider is showing the meta descriptions and is not saying that the content is duplicate, which means it isn't checking rel=canonical on these pages either. So it is not an issue.
Note that a duplicate title/description does not mean the content itself is duplicate.
Tools that look at only one signal will report this as an issue; tools built specifically for finding duplicate content will not.
-
I also checked with Xenu and Screaming Frog SEO Spider, and both are showing the same thing. Check the attachment.
-
But the pages were there before the add-on was added, right?
If they were, then Google may have crawled them, and SEOmoz may have picked them up from Google or some other engine, which resulted in the issue.
So I suggest waiting and watching, as you will get crawl error reports from SEOmoz every week.
-
I installed the add-on before the products were added.
-
As of now, your rel="canonical" implementation looks fine. So these errors may have been found when you were not yet using rel="canonical" and were not using AJAX to show different numbers of results.
You should wait for next week's results; they should come back fine.