Tired of finding a solution for duplicate content
-
My site was just scanned by SEOmoz and lots of duplicate content and duplicate titles were found. I am tired of looking for solutions to duplicate content on my shopping site's product category pages. You can see the screenshot below.
http://i.imgur.com/TXPretv.png
You can see that every link shows "items_per_page=64, 128, etc.". This happens in every category I created. I am already using a canonical add-on to avoid this problem, but it's still there.
You can check my domain here - http://www.plugnbuy.com/computer-software/pc-security/antivirus-internet-security/ - and see if the add-on is working correctly.
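One way to spot-check whether the add-on is working is to parse each "items_per_page" variant and confirm its rel="canonical" points at the base category URL. A minimal sketch using only the standard library (the example URL and HTML are hypothetical, not taken from the actual site):

```python
# Sketch: extract the rel="canonical" href from a page's HTML so you can
# verify that paginated variants all point back to the base category URL.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_of(html_text):
    parser = CanonicalFinder()
    parser.feed(html_text)
    return parser.canonical

# Hypothetical items_per_page variant that canonicals back to the base URL:
page = '<html><head><link rel="canonical" href="https://example.com/antivirus/"></head></html>'
print(canonical_of(page))  # prints: https://example.com/antivirus/
```

In practice you would fetch each variant URL (e.g. `?items_per_page=64`, `?items_per_page=128`) and assert that `canonical_of()` returns the same clean URL for all of them.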
I recently submitted my sitemap to GWT, which is why it isn't showing me any reports about duplicate issues yet.
Please help me.
-
Thank you. I will tell my developer about this issue and then see what they reply.
-
You could canonical the "/portable-hard-disk" pages back up to "/hard-disk", but honestly, unless this is a widespread problem, I'd probably ignore it. If you have a lot of these sub-categories with duplicate search results, then I'd consider changing up your canonical scheme or NOINDEX'ing some sub-categories - search results just aren't high-value to Google, especially if they start all looking the same.
If this is an isolated occurrence, though, it's a lot of trouble for a relatively minor problem. It would take a pretty deep knowledge of your product inventory and site structure to know for sure, but my gut reaction is that this is a small issue.
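For reference, the two options above would look something like this in the `<head>` of a sub-category page (illustrative only - the "/hard-disk" path comes from this discussion, and you would pick one approach per page, not both):

```html
<!-- Option 1: canonical the sub-category up to the parent category -->
<link rel="canonical" href="http://www.plugnbuy.com/hard-disk/" />

<!-- Option 2: keep the sub-category out of the index entirely,
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```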
-
So what should I do right now to solve this problem?
-
I talked to the technical team. The screen may be a bit confusing. Your "items_per_page" variations are not being flagged as a duplicate of "/hard-disk/portable-hard-disk/". All of the pages (including the items_per_page variants) are being flagged as near-duplicates (95%+) of "/hard-disk". Basically, since those pages show the exact same products and only differ by a header, we're flagging them as being too similar. Once we do that, then all of the other pages that canonical to the "/portable-hard-disk" page also look like near-duplicates of "/hard-disk".
It's not catastrophic, but if you have enough of these category/sub-category search pages that overlap on their results, you may want to reconsider whether you index all of them. At small scale, it's not a big deal. At large scale, these very similar pages could dilute your ranking ability.
-
We don't currently have a way to ignore warnings/errors, although I know that's on the wish list. Let me ping the Product Team on this one and see if they have any additional insight.
-
Then how can I keep those links away from the SEOmoz crawler?
-
As best I can tell, your canonical tags are properly implemented and Google doesn't seem to be indexing any URLs with "items_per_page" in them. Our crawler and desktop crawlers may be getting confused because there are internal paths to these variations.
Ideally, that pulldown probably shouldn't be crawlable, but I think your canonical implementation as it stands is ok. I don't see any evidence that Google is having problems with it. It may just be a false alarm on our part.
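If you did want to keep that pulldown's variations out of crawlers' paths entirely, one option is a robots.txt rule on the URL parameter. A sketch, assuming the parameter name matches the URLs in the screenshot (verify against your actual URL structure first, and note the trade-off: blocking crawling also prevents Google from seeing the canonical tag on those URLs):

```text
User-agent: *
Disallow: /*?items_per_page=
Disallow: /*&items_per_page=
```

Google supports the `*` wildcard in robots.txt patterns, so this would match the parameter wherever it appears in the query string. Given that the canonical tags already seem to be doing their job, this is optional hardening rather than a required fix.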
-
The SEO spider is showing meta descriptions and is not saying the content is duplicate. That means it is not checking rel=canonical on these pages either. So it is not an issue.
Note that a duplicate title/description does not mean the content is duplicate.
Tools that look at only one thing will report this as an issue; tools built specifically for finding duplicate content will not.
-
I also checked with Xenu and Screaming Frog Spider, and both are showing the same thing. Check the attachment.
-
But the pages were there before the add-on was added, right?
If they were, Google may have crawled them, and SEOmoz may have picked them up from Google or other engines, which resulted in the issue.
So I suggest you wait and watch, as you will get crawl errors from SEOmoz every week.
-
I installed the add-on before the products were added.
-
As of now, your rel="canonical" implementation looks fine. These errors may have been found when you were not yet using rel="canonical" and were not using AJAX for showing different numbers of results.
You should wait for next week's results; they should come back fine.