Boat broker - issues with duplicate content and indexing search results
-
Hello,
I have read a lot about optimising product pages and not indexing search results or category pages, as ideally a person should be directed straight to a product page.
I am interested in how best to approach a site that is listing second hand products for sale - essentially a marketplace of second hand goods (in my case, www.boatshed.com - international boat brokers).
For example, we currently have 5 Colvic Sailer 26 boats for sale across the world - that is 5 boats of the same make and model but differing years, locations, sellers and prices.
My concern is with search results and 'category' pages. Unlike typical e-commerce sites, when someone searches for 'Colvic Sailer 26 for sale' I want them to land on a search-results-style page, as it is more useful for them to see a list of boats than one random boat that Google decides is most important (or possibly one it can match by location).
Currently we have 3 different URL types that show search-results-style pages (i.e. paginated lists of boats with name, image and short description):
manufacturer URLs, e.g. http://www.boatshed.com/colvic-manufacturer-145.html
category URLs, e.g. barges: http://www.boatshed.com/barges-category-55.html
and normal search results, e.g. dosearch.php?form_boattype_textbox=&...
I have noindexed the search results pages, but our category and manufacturer URLs do show up in search results, and ultimately these are the pages I want people to land on. However, I am getting duplicate content warnings in Moz: most boats sit in several categories, and every boat appears on one manufacturer page and one manufacturer-and-model page.
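For anyone facing the same setup: the usual way to keep internal search results out of the index while still letting crawlers reach the boat pages linked from them is a robots meta tag. A minimal sketch (the markup below is illustrative, not Boatshed's actual template):

```html
<!-- Placed in the <head> of dynamic search result pages such as
     dosearch.php. "noindex" keeps the page out of Google's index;
     "follow" still lets crawlers pass through its links to the
     individual boat listings. -->
<meta name="robots" content="noindex, follow">
```

This is generally preferred over blocking search URLs in robots.txt, because a robots.txt block prevents crawling entirely, so Google never sees the noindex and can still index the URL from external links.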
Both sets of URLs are, in my opinion, needed: lots of users search for exact makes/models, and lots of users just search for the type of boat, e.g. 'barge for sale', so both sets of landing pages are useful.
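One common way to keep both sets of landing pages while quieting duplicate-content warnings is a self-referencing canonical on each category and manufacturer URL, so that parameter, sort and pagination variants all consolidate to the one clean URL. A hedged sketch, reusing a URL from the question (whether Boatshed's pages already do this is an assumption):

```html
<!-- In the <head> of the barges category page. Any crawlable variant
     of this page (e.g. with tracking or sort parameters appended)
     points back to the single clean URL, so duplicate signals
     consolidate there instead of splitting across variants. -->
<link rel="canonical" href="http://www.boatshed.com/barges-category-55.html">
```

Note this does not (and should not) canonicalise the category page to the manufacturer page or vice versa; both are distinct, intentional landing pages targeting different queries.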
Any suggestions or thoughts would be greatly appreciated.
Thanks
Ben
-
I have run into this same problem in multiple industries.
John Mueller at Google has answered my questions on this subject multiple times, saying that internal duplicate content is NOT an issue, especially if you can justify its presence; a manual review would not flag it as a problem. Google will make a judgement call on which page is best to serve based on the user's query.
In your case you are doing what should be done. The least helpful page here is the actual product page, and in many cases in the past I have noindexed them.
This same problem arises in many industries: real estate, office space, used cars, etc.