E-commerce site: best practice for one product in multiple categories
-
Hi there,
We have an e-commerce shopping site with over 8000 products and over 100 categories.
Some subcategories belong to multiple categories - for example, a Christmas tree category can sit under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees"
The product itself (example: Scandinavian Xmas Tree) can naturally belong to both these categories as well.
Naturally these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options:
- Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path. A kind of cloaking.
- Use the same URL and display only one "main" version of breadcrumbs and menus. Possibly add the other "not main" categories as links to the category / product page.
- Use a different URL based on where we came from and do nothing (this will create essentially the same content on different URLs except for breadcrumbs and menus - there's a possibility to change the category text and page title as well)
- Use a different URL based on where we came from with different menus and breadcrumbs and use rel=canonical that points to the "main" category / product pages
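For option 4, the canonical hint is a single `<link>` element in the `<head>` of every duplicate URL, pointing at the one "main" version. A minimal sketch of how the URLs and the tag could be generated - the slugs, domain, and helper names here are hypothetical, not from the question:

```python
# Sketch for option 4: one product reachable under several category
# paths, with every duplicate URL carrying a canonical tag that points
# to the "main" path. All slugs and the domain are made up.

def product_urls(product_slug, category_paths):
    """One URL per category path the product appears under."""
    return ["/" + "/".join(path) + "/" + product_slug for path in category_paths]

def canonical_tag(product_slug, main_path, base="https://www.example.com"):
    """The <link> tag every duplicate URL should carry in its <head>."""
    url = base + "/" + "/".join(main_path) + "/" + product_slug
    return '<link rel="canonical" href="%s">' % url

paths = [
    ("gardening", "plants", "trees"),
    ("gifts", "holidays", "christmas", "trees"),
]
urls = product_urls("scandinavian-xmas-tree", paths)
tag = canonical_tag("scandinavian-xmas-tree", paths[0])
```

Both category URLs would then render the same `tag`, so the engine is told which version to index.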
This is a very interesting issue and I would love to hear what you guys think as we are finalizing plans for a new website and would like to get the most out of it.
Thank you all!
-
Hi,
This topic is quite old, but is still relevant.
I understand that the solution mentioned above is the most thorough one.
But is there something wrong with just using canonicals? In a webshop that we are managing, there are just a couple of subcategories that belong to different categories. An example:
Only these two URLs would generate duplicate content, since the categories above 'Company law' ('Economic law' and 'Companies') clearly have different content. Can't we just pick one version as the canonical one? Since we have only a couple of these categories, that would be an easier solution.
Thanks for your feedback guys!
-
Thought I'd answer my own question!! (with the help of Dr Pete, who answered this question in private Q&A)
"The multiple path issue is tough - you can't really have a path visitors can follow and then hide that from Google (or, at least, it's not a good idea). You could NOINDEX certain paths, but that's a complex consideration (it has pros and cons and depends a lot on your goals and site architecture).
If you generate the breadcrumb path via user activity and store it in a session/cookie, that's generally ok. Google's crawlers, as well as any visitor who came to the site via search, would see a default breadcrumb, but visitors would see a breadcrumb based on their own activity. That's fine, since the default is the same for humans as for spiders."
That seems to be a fairly conclusive answer IMO.
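In code, the pattern Dr Pete describes comes down to: store the path the visitor actually took in a session or cookie, and fall back to one fixed default whenever there is no stored path (crawlers and visitors arriving from search). A minimal sketch, with hypothetical category names:

```python
# Sketch of the session-based breadcrumb described above: visitors who
# browsed to the product see the path they actually took; crawlers and
# search visitors (no stored path) all see the same fixed default.

DEFAULT_PATH = ["Gardening", "Plants", "Trees"]  # the one "main" path

def remember_path(session, category_path):
    """Call this whenever the visitor navigates through a category."""
    session["breadcrumb"] = list(category_path)

def breadcrumb(session):
    """Path to render; the default is identical for humans and spiders."""
    return session.get("breadcrumb", DEFAULT_PATH)

crawler_session = {}  # Googlebot or a search visitor: no browsing history
visitor_session = {}
remember_path(visitor_session, ["Gifts", "Holidays", "Christmas", "Trees"])
```

Because the fallback is the same for everyone without a session, there is no cloaking: Googlebot and a human landing from search see identical pages.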
-
Hi Arik,
I'd really like an answer to this as well, as there seems to be no clear answer online.
My understanding is that a breadcrumb should specify a canonical crawl path (not one based on the referral path), so option 1 is out.
Option 2 seems suboptimal and not something I can recall seeing implemented on other sites.
Options 3 and 4: I don't want multiple URLs with rel=canonical when I already have one definitive URL.
This seems like a fairly common problem, but I can't find a good solution online anywhere.
Help anyone?
-
Dear All,
To restate Option 1: use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path. A kind of cloaking.
Changing content based on the referral path means that the same URL will serve different content at different times, so the search engine will probably see different content on the page than some visitors do. As far as I know, that is cloaking - please correct me if I'm wrong.
Option 4 will not necessarily achieve the desired effect, as the search engine may decide to ignore the tag. I checked a few examples where other e-commerce stores use rel=canonical this way, and both URLs still appear in the SERPs. So I doubt this is the perfect solution...
I'm still not convinced that I have a definitive answer for this. Anyone?
Thanks!
-
Option 1 is not cloaking - it is displaying content dynamically. Cloaking would be showing one page to viewers and a different version to Googlebot.
I would say it depends on how different the pages are. If all that changes is the breadcrumbs, then I would say you're fine with options 1, 2, or 4.
If the pages are significantly different (different category names, page titles, descriptive text, etc.), I would go with option 4.
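If a single default breadcrumb is shown (options 1 or 2), it can also be declared explicitly to search engines with schema.org BreadcrumbList markup, so crawlers see exactly one trail regardless of how visitors arrived. A sketch of generating that JSON-LD - the URLs here are made up for illustration:

```python
import json

# Sketch: BreadcrumbList JSON-LD for the one default trail shown to
# crawlers. The category names and URLs are hypothetical.

def breadcrumb_jsonld(crumbs):
    """crumbs: list of (name, url) pairs, top level first."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    })

markup = breadcrumb_jsonld([
    ("Gardening", "https://www.example.com/gardening"),
    ("Plants", "https://www.example.com/gardening/plants"),
    ("Trees", "https://www.example.com/gardening/plants/trees"),
])
```

The resulting string goes in a `<script type="application/ld+json">` block on the page, matching the default breadcrumb that is rendered visibly.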
-
Thanks Adam.
I very much respect your opinion and even agree that from a user's point of view option 1 is the best.
I wonder, though: is this considered cloaking?
From: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

"Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google's Webmaster Guidelines because it provides our users with different results than they expected.
Some examples of cloaking include:
[...]
Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor"
This becomes more complicated, as the path the user took to reach a specific subcategory or product page affects not only the breadcrumbs but also the category's navigation menu and possibly its descriptive text.
What's your take on this?
-
Options 1, 2, or 4 should be fine. Option 3 is not recommended.