Duplicate content issue - online retail site.
-
Hello Mozzers, I've just looked at a website and just about every product page (there are hundreds - yikes!) is duplicated by a variation at the end of each URL, like this (see below). Surely this is a serious case of duplicate content? Any idea why a web developer would do this? Thanks in advance! Luke
prod=company-081
prod=company-081&cat=2
-
Pleasure
-
Wait a minute...I'm interested in this.
How did you ninjas find the website? Must've been posted prior to the answers and removed later.
Luke, would you mind pm'ing me with the pages? I think I could benefit from a look at it.
-
Thanks Amelia
-
Thanks Gary - good to know - much appreciated, Luke
-
Thanks CommercePundit - yes, this site needs a great deal of work, that's for sure.
-
Very common.
There are a bunch of simple solutions.
First, you can add some PHP at the top of the page that checks whether "cat" has a value; if it does, output a noindex meta tag in the head section so the parameterised copy drops out of the index.
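A minimal sketch of that idea, assuming a plain PHP template (the "cat" parameter name is taken from the URLs in the question; everything else is illustrative, not the site's actual code):

```php
<?php
// Minimal sketch: if the duplicate-creating "cat" query parameter is
// present, ask search engines not to index this copy of the page,
// while still following its links. This echo belongs inside <head>.
if (!empty($_GET['cat'])) {
    echo '<meta name="robots" content="noindex, follow">' . "\n";
}
?>
```

The "follow" keeps link equity flowing through the duplicate even though the page itself is kept out of the index.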
Another option is for you to go into webmaster tools and change the "URL Parameters" in the "Crawl" menu. Select the option that best fits.
I have had to do this for many sites and seen great results once sorted out.
You can also use the "Remove URLs" tool under the "Google Index" menu to remove large sections quickly if they all fall under a specific pattern or path.
-
Thanks for useful feedback EGOL - yes I'm on to advising on content (or lack of!) next (gulp!).
-
Hi Luke,
According to Google's guidelines, these count as scraped/duplicate pages - https://support.google.com/webmasters/answer/2721312?hl=en&ref_topic=2371375
There are two ways to resolve this issue:
1. Remove these kinds of pages.
2. If they are all your own product pages, give each one some generic product information plus a unique, very specific description.
Those two options are the solution to your question.
-
Surely this is a serious case of duplicate content?
It is not too serious... because it can probably be fixed without a huge effort.
I have not done a thorough study of the problem but you might be able to solve it by changes to the CMS or with rel=canonical or with redirects.
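If redirects turn out to be the chosen route, one illustrative approach (assuming Apache with mod_rewrite and a hypothetical product.php handler; adapt to the real CMS) is to 301 any URL carrying the extra parameter back to the bare product URL:

```apache
RewriteEngine On
# If a product URL also carries a "cat" parameter, 301 it back to the
# bare "prod" version so only one URL per product gets indexed.
RewriteCond %{QUERY_STRING} ^(prod=[^&]+)&cat=[^&]+$
RewriteRule ^product\.php$ /product.php?%1 [R=301,L]
```

The %1 backreference carries the captured "prod" part of the query string into the redirect target, so the product identifier survives while the duplicate-creating parameter is dropped.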
Any idea why a web developer would do this?
This is really common. Lots of developers are developers and not SEOs.
IMO you have a much larger problem: the pages of this website have almost zero content. They are almost guaranteed to perform poorly or be hit with a Panda penalty.
-
Hi Luke,
Use the canonical tag. Some info from Google here: https://support.google.com/webmasters/answer/139394?hl=en
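For the URLs in the question, the tag would look something like this (the domain is illustrative; the "prod" parameter is taken from the example URLs above):

```html
<!-- Placed in the <head> of every parameterised variant of the page,
     pointing at the one URL you want Google to index. -->
<link rel="canonical" href="https://www.example.com/product?prod=company-081" />
```

Both the bare "prod" URL and the "prod…&cat=2" variant would carry this same tag, so Google consolidates ranking signals onto the single canonical version.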
Good luck!
Amelia
Related Questions
-
Search Causing Duplicate Content
I use OpenCart and have found that a lot of my duplicate content (mainly from products) is caused by the Search function. Is there a simple way to tell Google to ignore the Search function pathway? Or is this particular action not recommended? Here are two examples: http://thespacecollective.com/index.php?route=product/search&tag=cloth http://thespacecollective.com/index.php?route=product/search
-
Best method for blocking a subdomain with duplicated content
Hello Moz Community Hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google.
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The page is the same, so we can't add a noindex or nofollow. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update the robots.txt with a user-agent disallow for the subdomain, but the robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file) and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?
-
Ranking sites in vertical markets with 90% scraped content
Hi, Hoping to get advice about ranking sites (a vertical market search engine/portal, like a car site for example) that get their content from scraping car sites. For various reasons (mostly scale, e.g. we can't get car dealers to push their listings to us) content was scraped. The startup has received great press, TV interviews, incubator programs etc, and has also secured very significant investment. I feel if this site had been launched pre-Panda it would be ranking much better. We have invested significantly in our tech; our search tools and site innovation place us easily as market leader in this space. Anyone with experience in ranking sites with legitimate reasons for using scraped content?
-
If a website trades internationally and simply translates its online content from English to French, German, etc how can we ensure no duplicate content penalisations and still maintain SEO performance in each territory?
Most of the international sites are as below: example.com example.de example.fr But some countries are on unique domains, such as example123.rsa
-
How to Fix Duplicate Page Content?
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has de-valued our Web store. Our duplicate page content is the result of the following: 1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner / author of the description and our description as duplicate content. 2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description at our Web store would be the same for each format = more duplicate content at our Web store. Here are two examples (one abridged, one unabridged) of one title at our Web store. Kill Shot - abridged Kill Shot - unabridged How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
-
Finding Duplicate Content Spanning more than one Site?
Hi forum, SEOMoz's crawler identifies duplicate content within your own site, which is great. How can I compare my site to another site to see if they share "duplicate content?" Thanks!
-
Http and https duplicate content?
Hello, This is a quick one or two. 🙂 If I have a page accessible on both http and https, does that count as duplicate content? What about external links pointing to my website at the http or https page? Regards, Cornel
-
Hit by Penguin, Can I move the content from the old site to a new domain and start again with the same content which is high quality
I need some advice please. My website got the "unnatural links detected" message and was hit by Penguin... hard. Can I move the content from the current domain to a new domain and start again, or does the content need to be redone as well? I will obviously turn off the old domain once it's moved. The other option is to try to identify the bad links and change my anchor profile, which is a hit-and-miss task in my opinion. Would it not be easier just to identify the good links pointing to the old domain and get those changed to point to the new domain with better anchors? Thanks, Warren