ECommerce product listed in multiple places, best SEO practice?
-
We have built an eCommerce site for a customer, and products are allowed to appear in more than one product category within the site.
Now, I know this is a bad idea from a duplicate content point of view, but we are going to let the customer select which of the multiple categories the product appears in will be the default category.
This means we will have a way of defining the default URL for a product.
So am I correct in thinking that, on all the other URLs where the product appears, we should add a rel="canonical" pointing to the default URL to stop the duplicate content issue?
Is this the best way?
-
The site is still in testing, but as an example: the products are for cars, and the same product can fit multiple cars, hence it may appear in more than one place.
-
Short answer: using rel="canonical" to point to the default URL for each product will solve the product-related duplicate content issue.
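For illustration, a minimal sketch of that markup (the URLs here are made up): on each non-default URL where the product appears, the head would include a canonical link pointing at the chosen default URL, e.g.

  <!-- on https://www.example.com/category-a/sample-product, https://www.example.com/category-b/sample-product, etc. -->
  <link rel="canonical" href="https://www.example.com/products/sample-product" />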
As for the "best" way to solve this problem, I actually prefer using a robots meta tag with "noindex, follow" attributes on each duplicate product page.
Since the duplicate URLs only exist for navigational purposes, I'd prefer to keep them out of the search engine indexes altogether.
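As a rough sketch, that tag would sit in the head of each duplicate, navigation-only product URL (again, an illustrative snippet rather than anything site-specific):

  <!-- keeps the page crawlable so link equity still flows, but out of the index -->
  <meta name="robots" content="noindex, follow">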
Once you have this issue squared away, be sure to read up on other important eCommerce SEO considerations as well.
-
Another option I thought of would be simply to add a noindex meta tag to the other pages, so that only the one URL gets indexed.
The reason I say this is from experience of how SEOmoz crawls sites: I have had many pages across many sites get flagged for duplicate content, even when I know they are not duplicates, simply because there isn't enough of a difference on the page.
The product pages would be very much the same, as really all that would differ is the breadcrumb trail and possibly some other links; 95% of each page would be the same because it is the same product.
Related Questions
-
Sitemaps: Best Practice
What should and what shouldn't go in the sitemap? In particular, pages like subscribe to our newsletter/ unsubscribe to our newsletter? Is there really any benefit in highlighting those pages to the SEs? Thanks for any advice/ anecdotes 🙂
Intermediate & Advanced SEO | Fubra
-
SEO Best Practices regarding Robots.txt disallow
I cannot find hard and fast direction about the following issue: It looks like the Robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from the Google Search console that URLs are being blocked by Robots.txt. (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt. ("Sitemap contains urls which are blocked by robots.txt"). Seems that I wouldn't want that many URLs blocked? Thank you!!
Intermediate & Advanced SEO | jamiegriz
-
Best Practices for Converting PDFs to HTML
We're working with a client who gets about 80% of their organic, inbound search traffic from links to PDF files on their site. Obviously, this isn't ideal, because someone who just downloads a PDF file directly from a Google query is unlikely to interact with the site in any other way. I'm looking to develop a plan to convert those PDF files to HTML content, and try to get at least some of those visitors to convert into subscribers. What's the best way to go about this? My plan so far is: Develop HTML landing pages for each of the popular PDFs, with the content from the PDF, as well as the option to download the PDF with an email signup. Gradually implement 301 redirects for the existing PDFs, and see what that does to our inbound SEO traffic. I don't want to create a dip in traffic, although our current "direct to inbound" traffic is largely useless. Are there things I should watch out for? Will I get penalized by Google for redirecting a PDF to HTML content? Other things I should be aware of?
Intermediate & Advanced SEO | atourgates
-
Best-practice URL structures with multiple filter combinations
Hello, We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters, topics and object types. The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean: www.domain.com/project?topic=firstTopic
or
www.domain.com/project?object=typeOne The problems arise when people select multiple topics, potentially across two different filter types: www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo I've raised concerns around the structure in general, but it seems to be too late at this point so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns: A ton of near-duplicate content and hundreds of URLs being created and indexed with various filter combinations added Over-reacting to the first point above and over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole Would the best approach be to index each single topic filter individually, and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with) so any advice is greatly appreciated. Thanks!
Intermediate & Advanced SEO | digitalcrc
-
Where is the best place to put a sitemap for a site with local content?
I have a simple site that has cities as subdirectories (so URL is root/cityname). All of my content is localized for the city. My "root" page simply links to other cities. I very specifically want to rank for "topic" pages for each city and I'm trying to figure out where to put the sitemap so Google crawls everything most efficiently. I'm debating the following options, which one is better? Put the sitemap on the footer of "root" and link to all popular pages across cities. The advantage here is obviously that the links are one less click away from root. Put the sitemap on the footer of "city root" (e.g. root/cityname) and include all topics for that city. This is how Yelp does it. The advantage here is that the content is "localized" but the disadvantage is it's further away from the root. Put the sitemap on the footer of "city root" and include all topics across all cities. That way wherever Google comes into the site they'll be close to all topics I want to rank for. Thoughts? Thanks!
Intermediate & Advanced SEO | jcgoodrich
-
When removing a product page from an ecommerce site?
What is the best practice for removing a product page from an eCommerce site? If a 301 is not available and the page is already crawled by the search engine: A. block it out in the robots.txt B. let it 404
Intermediate & Advanced SEO | Bryan_Loconto
-
How to perform Local SEO for sites like Angies List/Task Rabbit or Craigslist
I have a new SEO client that has a business model similar to Craigslist, Angie's List, or Task Rabbit, where they offer local-based services nationwide. My first thought was local link building and citation building etc. But the issue is they are a purely online service company and they don't have a physical address in every city/state they will be offering their services in. What is the best course of action for providing SEO services for this type of business model? I am pretty much at a standstill on how to rank them locally for the areas they provide services in. It's a business model that involves local businesses and customers looking for services from those local businesses.
Intermediate & Advanced SEO | VITALBGS
-
SEO value in multiple backlinks from same domain and from various sub-domains.
A site has a link to my site as one of their main tabs, which means whenever a user clicks through to another page within the site, my link - being a main tab - is there. This creates thousands of links from this site. How does Google treat this? Do we have a rough formula estimate? In other words, assume it creates 1,000 backlinks: would the SEO value be around the same as if I had just 2 links total as a main tab, but on 2 different non-related sites? Or, does it actually count fully as 1,000 links? Links from various sub-domains. Several .EDU's are linking to my site. Different schools within the overall same university. Example: nursing.abc.edu links to my site, but so does business.abc.edu. For SEO does that count as much as if I had links from complete non-related universities, or would Google evaluate that these links are related (since same main domain) and that will discount any links more than 1 to some extent? If discounted, then what do we estimate the discount to be? Thank you
Intermediate & Advanced SEO | knielsen