Tired of searching for a duplicate content solution
-
My site was just scanned by SEOmoz, and it found lots of duplicate content and duplicate titles. I am tired of searching for solutions to duplicate content on my shopping site's product category pages. You can see the screenshot below.
http://i.imgur.com/TXPretv.png
You can see that every link shows "items_per_page=64, 128, etc.". This happens in every category I have created. I am already using a canonical add-on to avoid this problem, but it is still there.
You can check my domain here - http://www.plugnbuy.com/computer-software/pc-security/antivirus-internet-security/ - and see whether the add-on is working correctly.
I only recently submitted my sitemap to GWT, which is why it isn't showing me any reports about duplicate issues yet.
Please help me!
-
Thank you. I will tell my developer about this issue and see what they say.
-
You could canonical the "/portable-hard-disk" pages back up to "/hard-disk", but honestly, unless this is a widespread problem, I'd probably ignore it. If you have a lot of these sub-categories with duplicate search results, then I'd consider changing up your canonical scheme or NOINDEX'ing some sub-categories - search results just aren't high-value to Google, especially if they all start looking the same.
If this is an isolated occurrence, though, it's a lot of trouble for a relatively minor problem. It would take a pretty deep knowledge of your product inventory and site structure to know for sure, but my gut reaction is that this is a small issue.
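For reference, here's a minimal sketch of what those two options look like in a sub-category page's <head> (the URL is a placeholder based on your category names, so adjust it to your actual structure):

  <!-- Option 1: canonical the sub-category up to the parent category -->
  <link rel="canonical" href="http://www.plugnbuy.com/hard-disk/" />

  <!-- Option 2: keep the page out of the index, but still let crawlers follow its links -->
  <meta name="robots" content="noindex, follow" />

You'd use one or the other on a given page, not both.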
-
So what should I do right now to solve this problem?
-
I talked to the technical team. The screen may be a bit confusing. Your "items_per_page" variations are not being flagged as a duplicate of "/hard-disk/portable-hard-disk/". All of the pages (including the items_per_page variants) are being flagged as near-duplicates (95%+) of "/hard-disk". Basically, since those pages show the exact same products and only differ by a header, we're flagging them as being too similar. Once we do that, then all of the other pages that canonical to the "/portable-hard-disk" page also look like near-duplicates of "/hard-disk".
It's not catastrophic, but if you have enough of these category/sub-category search pages that overlap on their results, you may want to reconsider whether you index all of them. At small scale, it's not a big deal. At large scale, these very similar pages could dilute your ranking ability.
-
We don't currently have a way to ignore warnings/errors, although I know that's on the wish list. Let me ping the Product Team on this one and see if they have any additional insight.
-
Then how can I keep the SEOmoz crawler away from those links?
-
As best I can tell, your canonical tags are properly implemented and Google doesn't seem to be indexing any URLs with "items_per_page" in them. Our crawler and desktop crawlers may be getting confused because there are internal paths to these variations.
Ideally, that pulldown probably shouldn't be crawlable, but I think your canonical implementation as it stands is ok. I don't see any evidence that Google is having problems with it. It may just be a false alarm on our part.
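If your developer does want to take that pulldown out of the crawl path, one common approach (just a sketch, not tested against your cart software) is to render it as a form control driven by JavaScript rather than as plain links, so there's no href for a crawler to follow:

  <!-- Hypothetical items-per-page selector: the page reloads via
       JavaScript when the visitor picks a value, so there are no
       crawlable links to the parameter variations -->
  <select onchange="window.location.search = 'items_per_page=' + this.value;">
    <option value="32">32 per page</option>
    <option value="64">64 per page</option>
    <option value="128">128 per page</option>
  </select>

A robots.txt rule like "Disallow: /*?items_per_page=" can also keep compliant crawlers off those variations (assuming the parameter always appears right after the ?), but since Google already seems to be handling your canonicals correctly, neither change is urgent.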
-
The SEO spider is flagging duplicate meta titles and descriptions; it is not saying the content itself is duplicate. It also means it is not checking rel=canonical on these pages. So it is not a real issue.
Note that a duplicate title/description does not mean the content is duplicate.
Tools that only look at one signal will report this as an issue; tools built specifically for finding duplicate content will not.
-
I also checked with Xenu and Screaming Frog SEO Spider, and both show the same thing. Check the attachment.
-
But the pages were there before the add-on was added, right?
If they were, then Google may have crawled them, and SEOmoz may have picked them up from Google or other engines, which resulted in the issue.
So I suggest you wait and watch, as you will get a crawl error report from SEOmoz every week.
-
I installed the add-on before the products were added.
-
As of now, your rel="canonical" implementation looks fine. These errors may have been found back when you were not yet using rel="canonical" or AJAX to show a different number of results per page.
You should wait for next week's results; they should come back fine.