E-Commerce Categorization
-
I'm working on an e-commerce site that currently has about 50 root categories and growing, with no sub-categories. They are all linked from the sidebar of every page, and all the products are fairly related; they could probably be consolidated into 5 root categories.
At what point does categorization become too flat?
-
Making such radical changes is always something that shouldn't be taken lightly, that's for sure. If you do it, you'll need a spreadsheet with a column for all the page names, a column for their current URLs, and one for the new URLs, because implementing 301 redirects is critical and can be a nightmare without that mapping.
What it comes down to is evaluating the value of other SEO (on-site, link-building, and social) that can be done to improve things as compared with architectural changes. Judgment calls. Not always fun.
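As a sketch of that mapping step, a small script can turn the spreadsheet (exported as CSV) into redirect rules. The column names, URLs, and Apache-style `Redirect 301` output below are assumptions for illustration, not part of any particular platform:

```python
import csv
import io

def redirect_rules(mapping_csv: str) -> list[str]:
    """Turn a page-name / old-URL / new-URL mapping into Apache Redirect rules."""
    rules = []
    for row in csv.DictReader(io.StringIO(mapping_csv)):
        # One permanent redirect per renamed category page.
        rules.append(f"Redirect 301 {row['old_url']} {row['new_url']}")
    return rules

# Example mapping; the page names and URLs are made up for illustration.
sheet = """page,old_url,new_url
Widgets,/category/widgets,/tools/widgets
Gadgets,/category/gadgets,/tools/gadgets
"""

for rule in redirect_rules(sheet):
    print(rule)
```

Generating the rules from the spreadsheet keeps the mapping auditable in one place instead of hand-editing server config.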
-
Yeah that makes sense. Unfortunately this site is built in ProStores, which gives me no control over the URL structure. It makes me hesitant to make category changes that are going to recreate the URL structure.
ProStores is a nightmare.
-
Roger,
Categorization becomes too flat at the moment you lose high-quality visitors. Since the site started out flat, there's no way to tell where that point is. The only way to determine whether it's already too flat is to refine it. I always recommend to clients no more than eight to ten top-level categories, with sidebar navigation that links only to the sub-categories within the specific category you're in, plus at most two or three additional links below those to closely related top-level categories.
The reason for this method is that in the flat model you have no way of communicating to search engines what the real relationships between sections are. That in turn dilutes your ability to drive strength to the highest-level categories. The end result is a situation where your top-level categories don't do as well for their most important keyword phrases, the sub-category page phrases also suffer, and in turn individual product pages do as well.
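For illustration, the recommended sidebar might look like the sketch below for a visitor browsing a "Skin Care" category; the category names and URLs are hypothetical:

```html
<!-- Hypothetical sidebar: links only to the current category's own
     sub-categories, plus two or three closely related top-level
     categories below them. -->
<nav>
  <ul>
    <li><a href="/skin-care/cleansers/">Cleansers</a></li>
    <li><a href="/skin-care/moisturizers/">Moisturizers</a></li>
    <li><a href="/skin-care/serums/">Serums</a></li>
  </ul>
  <ul>
    <li><a href="/makeup/">Makeup</a></li>
    <li><a href="/hair-care/">Hair Care</a></li>
  </ul>
</nav>
```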
Related Questions
-
What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
Hi, we will have pages on the website that display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically, and I'm not sure how Google will index the B and C variations of these pages. As far as I know, the page URL won't change and won't have parameters.

Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user-agent filters and serve the default page copy to search robots, we might risk a cloaking penalty, because users get different content than search robots.

Is it better to have URL parameters for versions B and C of the content? For example:

/page for the default content
/page?id=2 for the B version
/page?id=3 for the C version

The dynamic content comes from the server side, so not all page copy variations are in the default HTML. I hope my questions make sense; I couldn't find recommendations for this kind of SEO issue.
Technical SEO | Gyorgy.B
-
Wrapping my head around an e-commerce anchor filter issue, need help
I am having a hard time understanding how Google will deal with this scenario; I would love to hear what you all think or suggest.

A category page on the site in question looks like this: http://makeupaddict.me/6-skin-care. All fine and well. But a paginated or filtered category page looks like these: http://makeupaddict.me/6-skin-care#/page-2 and http://makeupaddict.me/6-skin-care#/price-391-1217.

From my understanding, Google does not index an anchor (URL fragment) without a hashbang (#!), but that doesn't mean they don't still crawl them, correct? That is where the issue comes in: since fragments are not indexed and are dropped from the URLs, when Google crawls a filtered or paginated page it is getting different results.

To the best of my understanding (someone can correct me if I am wrong), a fragment is not passed to the server the way a query string is. So if I am using PHP and land on http://makeupaddict.me/6-skin-care or http://makeupaddict.me/6-skin-care#/price-391-1217 and use something like $_SERVER['PHP_SELF'] to get the URL, both pages will return http://makeupaddict.me/6-skin-care, since the fragment is handled client-side.

With that being the case, does Google follow that standard, or do they have a custom function that grabs the whole URL, fragment and all? And if they are crawling the page with the fragment but seeing it fragment-less, how are they handling the changing content?
Technical SEO | LesleyPaone
-
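On the fragment question above, the split happens client-side: the fragment is never part of the HTTP request. A minimal Python sketch, using the URLs from the question, shows what the server (and therefore PHP's $_SERVER) actually receives:

```python
from urllib.parse import urlparse, urlunparse

url = "http://makeupaddict.me/6-skin-care#/price-391-1217"
parts = urlparse(url)

# The fragment is split off by the client and never sent to the server...
print(parts.fragment)   # "/price-391-1217"

# ...so the request the server actually sees is just the fragment-less URL.
request_url = urlunparse(parts._replace(fragment=""))
print(request_url)      # "http://makeupaddict.me/6-skin-care"
```

This is standard URL behavior (the fragment is a client-side concept), which is why both the plain and the filtered page look identical to any server-side code.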
SEO Ramifications of migrating traditional e-commerce store to a platform based service
Hi, I'm thinking of migrating my 11-year-old store to a hosted e-commerce platform such as Shopify or an Amazon-hosted solution. I'm worried, though, that I will lose my domain's history and authority if I do so. Can anyone advise whether this is likely, or whether it will behave like a standard 301 redirect and be fine? All the best, Dan
Technical SEO | Dan-Lawrence
-
Bay Area E-Commerce SEO Firm Needed
So my e-commerce site recently got hit badly by the latest Penguin update; traffic is down by 60%. We were using a cheap Indian SEO firm that did get us great results, but it seems there was a lot more spamming than I realized. I am now looking to clean up my backlinks and build a relationship with a local business so I can be more hands-on with my SEO. Does anyone have recommendations for SEO firms with e-commerce experience, ideally in the Bay Area or even Sacramento?
Technical SEO | premierchampagne
-
Different domains vs subdomains for 6 e-shops
We have to choose whether to keep working with separate domains or move all 6 to one new domain. Right now we have 6 different brand e-shops on 6 domains, each with a Domain Authority of 30-40. So for example:

e-shop1.com
e-shop2.com
...

We are thinking about registering a brand-new domain and moving all of these domains' content and link juice to it with 301 redirects, so we would have one e-shop solution with 6 subdomains, like:

e-shop1.newdomain.com
e-shop2.newdomain.com
...

This is like gap.com does with its brands, or http://www.andotherbrands.com: each brand gets a subdomain, but they share the same e-shop solution. With either strategy we will improve the internal linking structure between these e-shops. We have also considered that with a new domain we won't have good organic search results in the first few months, so we are trying to see the bigger picture and plan for the SEO future.

SO THE QUESTION IS: leaving brand marketing aside, what would you suggest? Stay with 6 different DA 30-40 domains, or build one strong domain with 6 different brand subdomains? We can't use subcategories; all the subdomains' products will be clothes. Lately there have been many news stories and articles saying that subdomains are part of the main domain and vice versa, so we are looking for suggestions from this board. Right now we think that, since we have weak domains, it would be better to start building one strong domain.

Technical SEO | TauriUrb
-
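If the consolidation route from the question above were taken, the per-brand 301s might look like the following nginx sketch. The domain names are the question's placeholders and the config is illustrative, not a drop-in:

```nginx
# Hypothetical nginx config: permanently redirect each old brand domain
# to its new subdomain, preserving the requested path and query string.
server {
    listen 80;
    server_name e-shop1.com www.e-shop1.com;
    return 301 http://e-shop1.newdomain.com$request_uri;
}

server {
    listen 80;
    server_name e-shop2.com www.e-shop2.com;
    return 301 http://e-shop2.newdomain.com$request_uri;
}
```

Preserving `$request_uri` matters here: it keeps deep links to products redirecting page-to-page rather than dumping everything on the new homepage.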
Is this tabbed implementation of SEO copy correct (i.e. good for getting indexed, and in an OK spot in the HTML as viewed by search bots)?
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, where all tabs (only 2 right now) are viewed as one document by the search engines. I am pretty sure we have this working for the most part, but would love some quick feedback from you all, as I have never worked with this approach before and these pages are some of our most important.

Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643

Sample in use: http://www.seomoz.org/article/search-ranking-factors

Old version:
http://screencast.com/t/BWn0OgZsXt
http://seatgeek.com/boston-celtics-tickets/

New version with tabs:
http://screencast.com/t/VW6QzDaGt
http://screencast.com/t/RPvYv8sT2
http://seatgeek.com/miami-heat-tickets/

Notes:
Content is not displayed stacked in the browser when JavaScript is turned off, but it is in the source code.
Content shows up in the Google cache of the new page in the text version.
In our implementation the JS currently stops the event before the default behavior of appending #about to the URL string; this can be changed, should it be?
Relatedly, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not go to the tab with the copy, which I imagine could be considered spammy from a human-review perspective (this wasn't intentional).
This portion of the code is below the truncated view of the fetch as Googlebot, so we didn't have that resource.
Are there any issues with hidden text, or is this too far down in the HTML?

Any/all feedback appreciated. I know our copy is old; we are in the process of updating it for this season.
Technical SEO | chadburgess
-
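One common shape for the tabbed setup discussed above keeps both tabs' copy in the initial HTML, so search engines see one document, and JS only toggles which panel is visible. This is a hypothetical sketch, not SeatGeek's actual code:

```html
<!-- Both panels are present in the source; only visibility changes. -->
<div id="tickets">Ticket listings render here.</div>
<div id="about" style="display: none;">
  <p>Team history and venue copy lives here, crawlable in the source.</p>
</div>
<script>
  // Swap panel visibility on tab click without changing the URL.
  function showTab(id) {
    document.getElementById('tickets').style.display = (id === 'tickets') ? '' : 'none';
    document.getElementById('about').style.display = (id === 'about') ? '' : 'none';
  }
</script>
```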
WordPress E-Commerce Plugin Duplicate Content Problem
I am working on a WordPress website that uses the WP e-Commerce plugin. I am using the Yoast SEO plugin but am not totally familiar with it. I have noticed that WP e-Commerce creates duplicate content issues. Here's an example: http://www.domain.com/parent-category/product-url-1/ has the same content as http://www.domain.com/parent-category/child-category/product-url-1/. I was wondering which of the following options is the best solution:

1. 301 redirect the multiple instances to one page
2. noindex all but one instance
3. Use the canonical tag (I've used this tag before to tell search engines to use the www version of a page, but I'm not sure if it's appropriate here)
4. A combination of these options?

Thanks in advance!

Technical SEO | theanglemedia
-
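For option 3 in the question above, the canonical tag goes in the head of each duplicate URL and points at the one preferred URL. A sketch using the question's example paths:

```html
<!-- Placed in the <head> of the duplicate child-category URL
     (/parent-category/child-category/product-url-1/), telling search
     engines which copy to index and consolidate signals to. -->
<link rel="canonical" href="http://www.domain.com/parent-category/product-url-1/" />
```

Unlike a 301, this keeps both URLs working for visitors while consolidating indexing signals on one of them.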
What is the best method to block a sub-domain, e.g. staging.domain.com, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing a robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

for fear it might get www.domain.com blocked as well. Has anyone had success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.

Technical SEO | fthead9