E-commerce site: best practice for one product in multiple categories
-
Hi there,
We have an e-commerce site with over 8,000 products and over 100 categories.
Some subcategories belong to multiple parent categories. For example, Christmas trees can sit under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees".
The product itself (for example, a Scandinavian Xmas Tree) can naturally belong to both of these categories as well.
Naturally, these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options:
1. Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path. This is a kind of cloaking.
2. Use the same URL and display only one "main" version of the breadcrumbs and menus. Possibly add the other, non-main categories as links on the category/product page.
3. Use a different URL based on where the visitor came from and do nothing else. This creates essentially the same content on different URLs, differing only in breadcrumbs and menus (there is also the possibility of changing the category text and page title).
4. Use a different URL based on where the visitor came from, with different menus and breadcrumbs, and add a rel=canonical tag that points to the "main" category/product pages.
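To make option 4 concrete, here is a minimal sketch. The domain, URL paths, and mapping are hypothetical, purely for illustration: each product is assigned one "main" category path, and every URL variant of the product page renders a canonical link back to that main version.

```python
# Sketch of option 4 (domain and paths are illustrative, not a real site).
# Each product gets one "main" category; every URL variant of the product
# page emits a rel=canonical link pointing back at that main version.

MAIN_CATEGORY_PATH = {
    # product slug -> the category path chosen as the canonical home
    "scandinavian-xmas-tree": "/gardening/plants/trees/",
}

def canonical_tag(product_slug: str) -> str:
    """Build the <link rel="canonical"> tag for any variant of a product page."""
    main_path = MAIN_CATEGORY_PATH[product_slug]
    return f'<link rel="canonical" href="https://www.example.com{main_path}{product_slug}" />'

# Rendered identically on /gardening/plants/trees/scandinavian-xmas-tree
# and on /gifts/holidays/christmas/trees/scandinavian-xmas-tree:
print(canonical_tag("scandinavian-xmas-tree"))
```

The key design point is that the canonical target is a property of the product, not of the URL the visitor happened to arrive on, so every variant agrees on the same main version.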
This is a very interesting issue and I would love to hear what you guys think as we are finalizing plans for a new website and would like to get the most out of it.
Thank you all!
-
Hi,
This topic is quite old, but is still relevant.
I understand that the solution mentioned above is the most thorough one.
But is there something wrong with just using canonicals? In a webshop we manage, only a couple of subcategories belong to multiple categories. An example: the subcategory 'Company law' sits under both 'Economic law' and 'Companies'.
Only these two URLs generate duplicate content, since the parent categories ('Economic law' and 'Companies') clearly have different content. Can't you just pick one version as the canonical one? Since we have only a couple of these categories, this is an easier solution.
Thanks for your feedback guys!
-
Thought I'd answer my own question!! (with the help of Dr Pete, who answered this question in private Q&A)
"The multiple path issue is tough - you can't really have a path visitors can follow and then hide that from Google (or, at least, it's not a good idea). You could NOINDEX certain paths, but that's a complex consideration (it has pros and cons and depends a lot on your goals and site architecture).
If you generate the breadcrumb path via user activity and store it in a session/cookie, that's generally ok. Google's crawlers, as well as any visitor who came to the site via search, would see a default breadcrumb, but visitors would see a breadcrumb based on their own activity. That's fine, since the default is the same for humans as for spiders."
That seems to be a fairly conclusive answer IMO.
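To make the session/cookie approach concrete, here is a minimal sketch (the function and key names are hypothetical, not from any specific platform): the breadcrumb shown is whatever trail was recorded while the visitor browsed, and anyone without a recorded trail, crawlers and search visitors included, gets the same default.

```python
# Minimal sketch of the session-based breadcrumb Dr Pete describes
# (function and key names are illustrative, not from a specific platform).

DEFAULT_TRAIL = ["Gardening", "Plants", "Trees"]  # the product's "main" path

def record_trail(session: dict, trail: list[str]) -> None:
    """Store the category path the visitor actually navigated through."""
    session["category_trail"] = trail

def breadcrumb_for(session: dict) -> list[str]:
    """Crawlers and visitors landing from search have no recorded trail, so
    they all see the same default; browsing visitors see their own path."""
    return session.get("category_trail", DEFAULT_TRAIL)

# A visitor who clicked through the Gifts section:
visitor = {}
record_trail(visitor, ["Gifts", "Holidays", "Christmas", "Trees"])
assert breadcrumb_for(visitor) == ["Gifts", "Holidays", "Christmas", "Trees"]

# Googlebot, or a visitor arriving from search (empty session):
assert breadcrumb_for({}) == DEFAULT_TRAIL
```

The same lookup could drive the navigation menu, since the point is simply that the default rendering is identical for spiders and for any human without session state.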
-
Hi Arik,
I'd really like an answer to this as well, as there seems to be no clear answer online.
My understanding is that a breadcrumb should specify a canonical crawl path (not one based on the referral path), so option 1 is out.
Option 2 seems suboptimal and not something I can recall seeing implemented on other sites.
Options 3 and 4: I don't want multiple URLs and rel=canonical, as I already have one definitive URL.
This seems like a fairly common problem, but I can't see a good solution online anywhere.
Help, anyone?
-
Dear All,
Coming back to option 1 (use the same URL and change the page content, i.e. breadcrumbs and menus, based on the referral path), which looks like a kind of cloaking:
Changing content based on the referral path means the same URL will serve different content at different times, so the search engine will probably see content that differs from what some human visitors see. As far as I know, that is cloaking; please correct me if I'm wrong.
Option 4 will not necessarily achieve the desired effect, as the search engine might decide to ignore the tag. I checked a few examples, and this is actually what happens when other e-commerce stores use canonicals: you find both URLs in the SERPs. So I doubt this is the perfect solution.
I'm still not convinced that I have a definitive answer for this. Anyone?
Thanks!
-
Option 1 is not cloaking - it is displaying content dynamically. Cloaking would be if you showed one page to viewers and a different version to Googlebot.
I would say it depends on how different the pages are. If all that changes is the breadcrumbs, then I would say you're fine with options 1, 2, or 4.
If the pages are significantly different (different category names, page titles, descriptive text, etc.), I would go with option 4.
-
Thanks Adam.
I very much respect your opinion and even agree that from a user's point of view option 1 is the best.
I wonder, though: is this considered cloaking?
From Google's Webmaster Guidelines (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355):
"Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google's Webmaster Guidelines because it provides our users with different results than they expected.
Some examples of cloaking include:
[...]
Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor"
This becomes more complicated because the path the user took to reach the specific subcategory or product page affects not only the breadcrumbs but also the category's navigation menu and possibly its descriptive text.
What's your take on this?
-
Options 1, 2, or 4 should be fine. Option 3 is not recommended.