How to avoid duplicate content on internal search results page?
-
Hi,
According to Webmaster Tools and Siteliner, our website has an above-average amount of duplicate content.
Most of the flagged pages are search results pages where the search finds only one result. The only differences in these cases are the TDK (title, description, keywords), the H1, and the breadcrumbs; the rest of the layout is fairly static and similar.
Here is an example of two pages flagged as "duplicate content":
https://soundbetter.com/search/Globo
https://soundbetter.com/search/Volvo
Edit: These are legitimate searches that happen to return the same result. In this case we want users to be able to find audio engineers by 'credits' (musicians they've worked with), i.e. tags, and we want to rank for people searching for 'engineers who worked with' a given artist. Searching for two different artists (credit tags) returns this one service provider under different URLs (the tag being the search parameter), hence the duplicate content.
I guess every e-commerce/directory website faces this kind of issue.
What is the best practice to avoid duplicate content on search results pages?
-
It really depends on your developers and your budget. I do both development and SEO, so this is how I would handle it. For searches that return just one result, I would put a check in place to see how many results come back; if there is only one, I would set the canonical URL in the head of the search page to point at the actual page being returned as the result.
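A minimal sketch of that check, assuming each result is a simple dict with a hypothetical "detail_url" field (this is illustrative only, not SoundBetter's or Prestashop's actual code):

```python
# Illustrative sketch only -- the data shape and field names are assumptions.

def canonical_url(search_url, results):
    """Pick the URL to use in <link rel="canonical"> for a search results page."""
    if len(results) == 1:
        # A single hit: canonicalize the search page to the result's own page,
        # so /search/Globo and /search/Volvo both point at the same profile.
        return results[0]["detail_url"]
    # Otherwise the search page canonicalizes to itself.
    return search_url


def canonical_tag(search_url, results):
    return '<link rel="canonical" href="{}">'.format(canonical_url(search_url, results))


print(canonical_tag(
    "https://soundbetter.com/search/Globo",
    [{"detail_url": "https://soundbetter.com/profiles/example-engineer"}],  # hypothetical URL
))
```

Rendering the returned tag in the template's head is then all that's left; the search page itself stays unchanged for users.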
If more than one result is returned, you can handle it in several ways. One option is to create a pseudo category out of the results page. I would use this sparingly and only for popular search terms, but you could have an extension written for your site that gives you on-page control of the text, the URL, the meta areas, and things like that. I wrote a module for a platform I use a couple of years ago that does something like this: http://blog.dh42.com/search-pages-landing-pages/ You can get the gist of the idea by reading about it there. It is one good way to get a limited number of these pages to rank better, but I would not do it with every search result; you might get a penalty.
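As a rough illustration of that pseudo-category idea (the slugs, fields, and copy below are made up for this sketch and are not the module linked above):

```python
# Hypothetical sketch: hand-curated overrides for a few popular search terms.
# Any search not in this map keeps the default, noindexed search-page treatment.

CURATED_SEARCHES = {
    "recording-studios-london": {
        "title": "Recording Studios in London",
        "h1": "Recording Studios in London",
        "intro": "Hand-written introduction unique to this landing page...",
    },
}


def page_meta(search_slug, default_title):
    """Return the metadata to render for a given search results page."""
    override = CURATED_SEARCHES.get(search_slug)
    if override:
        # Popular, curated search: treat it like a real category page with
        # unique on-page copy and let it be indexed.
        return {"robots": "index, follow", **override}
    # Every other search stays out of the index.
    return {"robots": "noindex, follow", "title": default_title, "h1": default_title, "intro": ""}
```

The point is that only a deliberately small, hand-picked set of searches gets unique copy and indexing, which is what keeps this from looking like mass-generated search pages.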
-
Sorry, I misread it. I think either approach is applicable, the robots.txt rule or the on-page tag; I think the on-page tag would make them fall out of the index faster, though.
-
"I wouldn't do a nofollow however"
I agree. My solution was to use NOINDEX, FOLLOW.
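For reference, a NOINDEX, FOLLOW resolution is normally just the standard robots meta tag in the search results template's head; shown generically here, not as this site's exact markup:

```html
<meta name="robots" content="noindex, follow">
```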
-
Thanks Prestashop for your answer.
Is there another solution other than no-indexing all our search results?
Like many sites (Yelp, TripAdvisor and others), our search results pages help drive traffic. They aggregate the answers to questions that people ask in searches, such as 'recording studios in london'.
https://soundbetter.com/search/Recording Studio - Engineer/London, UK
-
I would add it to the robots.txt file. Depending on how your CMS is set up, you can also grab the search string from the current URL and use its presence to fire a noindex. I wouldn't do a nofollow, however; there is nothing bad about following the links, it is just the indexing of the search pages that is the problem.
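A rough sketch of that "fire a noindex from the URL" idea; the /search/ prefix matches the example URLs above, but the helper itself is hypothetical rather than any particular CMS hook:

```python
# Illustrative only: decide the robots meta value from the current URL.
from urllib.parse import urlparse


def robots_directive(url):
    """Return the robots meta content for a page, based on its path."""
    path = urlparse(url).path
    if path.startswith("/search/"):
        # Keep internal search results out of the index, but still follow their links.
        return "noindex, follow"
    return "index, follow"


print(robots_directive("https://soundbetter.com/search/Globo"))  # noindex, follow
print(robots_directive("https://soundbetter.com/about"))         # index, follow (hypothetical page)
```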
-
Hey Prestashop
To add a little more clarity - would you:
a.) add /search/ to robots.txt, like so:
Disallow: /search/
or
b.) add noindex/nofollow at page level, like so:
<meta name="robots" content="noindex, nofollow">
in the search results page template?
I would opt for option b, but I would be interested to hear your thoughts too, and why.
Thanks,
-
No-index your search results. Most platforms do it by default to eliminate that error.
Related Questions
-
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being crawled. The web dev team added this code to the pages: <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer: "So as far as I can see we've added robots to prevent the issue, but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how it's seeing this content, or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
Technical SEO | rj_dale0
-
Shopify Duplicate Content in products
Hello Moz Community, I'm new to Moz and looking forward to beginning my journey towards SEO education and improving our clients' sites. Our client's website is a Shopify store. https://spiritsofthewestcoast.com/ Our first Moz reports show 686 duplicate content issues. I will show the first 4 as examples. https://spiritsofthewestcoast.com/collections/native-earrings-and-studs-in-silver-and-gold/products/haida-eagle-teardrop-earrings https://spiritsofthewestcoast.com/collections/native-earrings-and-studs-in-silver-and-gold/products/haida-orca-silver-earrings https://spiritsofthewestcoast.com/collections/native-earrings-and-studs-in-silver-and-gold/products/silver-oval-earrings https://spiritsofthewestcoast.com/collections/native-earrings-and-studs-in-silver-and-gold/products/haida-eagle-spirit-silver-earrings As you can see, the URL titles are unique, but I know that each of those products has a very similar product description, though not identical. Since they have been flagged as a site issue by Moz, I am guessing the content is 95% duplicate. So would a rel=canonical be the right solution for this type of duplicate content? Or should I be considering adding new content to each of the 686 products to drop below the 95% threshold? Or is there another solution I may not be aware of? Thanks in advance for your assistance and expertise! Sean
Technical SEO | TheUpdateCompany1
-
Car Dealership website - Duplicate Page Content Issues
Hi, I am currently working on a large car dealership website. I have just had a Moz crawl through and it's flagging a lot of duplicate page content issues; these are mostly for used car pages. How can I get round this, as the site stocks many of the same car: model, colour, age, mileage, etc.? The only unique thing about them is the reg plate. How do I get past this duplicate issue if all the info is relatively the same? Has anyone experienced this issue when working on a car dealership website? Thank you.
Technical SEO | karl621
-
Duplicate Content Issues
We have a "?src=" parameter in some URLs which is treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca0
-
Search result pages - noindex but auto follow?
Hi guys, I don't index my search pages, and currently my pages are tagged <meta name="robots" content="noindex">. Do I need to specify follow, or will it be done automatically? Thanks Cyto
Technical SEO | Bio-RadAbs0
-
Does duplicate content on WordPress work against the site rank? (not page rank)
I noticed in the crawl that there seems to be some duplicate content on my WordPress blog. I installed an SEO plugin, Yoast's WordPress SEO plugin, and set it to keep the archives from being crawled. This might solve the problem, but my main question is: can the blog drag my site down?
Technical SEO | tommr10
-
50+ duplicate content pages - Do we remove them all or 301?
We are working on a site that has 50+ pages that all have duplicate content (one for each state, pretty much). Should we 301 all 50 of the URLs to one URL, or should we just get rid of all the pages completely? Are there any steps to take when removing pages completely? (submit a sitemap to Google Webmaster Tools, etc.) Thanks!
Technical SEO | Motava0