E-commerce category out-of-stock items
-
Hi All,
I would like to hide all the products that are out of stock on my category pages, for example with a display:none (maybe there are better options/techniques do this....tips are welcome). The visitor has the options to reveal the out of stock items by using one of the filters, or by using a check-box "Show out of stock items".
Would this still be in line with Google's guidelines? Am I risking a penalty because I'm hiding content?
In my opinion it wouldn't be a risk, because I'm doing this to achieve a better user experience. Items are usually out of stock for a week at most, not longer.
Hope to hear from you guys.
Thanks in advance
Richard
-
Hello Richard,
That is a great question and I'm impressed by your attention to detail with regard to page-rank distribution changing as things go in and out of stock.
To answer your question, I don't think you risk being penalized for displaying in this way any more than thousands of other sites, including huge brands, risk it by using drop-down divs (e.g. "read more", "transcript") and tabbed product description areas (e.g. "sizes", "description", "technical details", "shipping costs") to break up the pertinent information into bite-sized chunks for the user. I work on a site that has checkboxes the user can uncheck to hide certain items if they don't wish to see them. This all uses similar coding to what you have described.
As long as you never specifically target Google (as in "if Googlebot, then show this content, else show this other content"), I think you'll be fine.
With that said, you may want to look into using a view-all page with rel=canonical to take care of that page-rank distribution issue you mentioned, depending on how it impacts the load time of the page and how many links you'll be sending part of your page-rank to: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html. If it were me, I'd just stick with the solution you first asked about, but there are plenty of options.
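For what it's worth, the view-all pattern just means each paginated component page points its canonical at the view-all URL. A tiny sketch (URLs hypothetical):

```javascript
// Builds the <link> tag each paginated category page (?page=1, ?page=2, ...)
// would include in its <head>, pointing at the view-all version of the category.
function canonicalTag(viewAllUrl) {
  return `<link rel="canonical" href="${viewAllUrl}">`;
}

console.log(canonicalTag("https://example.com/widgets/view-all"));
// <link rel="canonical" href="https://example.com/widgets/view-all">
```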
Also think about the UX when a visitor lands on the out-of-stock product page. All it takes is a few quality raters, or a few hundred organic visitors who land on that page while it's out of stock, to give it a bad rating or a fast back-click to the SERPs, and you could find yourself battling the effects of Panda, at least as far as I understand the process. Some options to improve that user experience include: an estimated date the product will come back; the ability to backorder; the ability to sign up for an email alert when it's back in stock; and related product links with images.
Good luck!
Everett
-
I made this change in September, and from the user's point of view the experience is much better. I read some time ago that Google takes into account different sortings of categories.
Even when you add new products to a category, some of the others get pushed back, so I hope Google knows how to handle it.
I haven't tried to hide 'out of stock' products but I'm always careful with hiding stuff on the client side since this can be interpreted the wrong way by Google.
I think doing it server side is better, but it's the same situation as with sorting.
The only reason to show an 'out of stock' product on top is if it's really popular and you either know when it's coming back into stock or let the user subscribe to a 'back in stock' email.
Hope this helps
-
Thanks for your reply Asaf,
What I don't like about your solution is that products keep moving between different pagination pages. This means Google will find them deeper in the hierarchical structure every now and then. It's hard to build a history for a specific ranking because the incoming link juice keeps changing.
Did you find out if hiding 'out of stock' items can cause a penalty?
Thanks again Asaf
-
I had the same problem,
What I finally did was sort the products, always placing the temporarily out-of-stock products at the end of the list / on the last page.
When a product is discontinued I remove it and 301 redirect to a similar product.
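Roughly, in code, those two behaviours look like this (product data and URLs are made up for illustration):

```javascript
// Made-up product data for illustration.
const products = [
  { name: "Red Widget", inStock: false },
  { name: "Blue Widget", inStock: true },
  { name: "Green Widget", inStock: true },
];

// In-stock items first; temporarily out-of-stock items sink to the end.
// Array.prototype.sort is stable in modern JS, so the existing order
// within each group is preserved.
function sortInStockFirst(products) {
  return [...products].sort((a, b) => Number(b.inStock) - Number(a.inStock));
}

// Discontinued products get a 301 to the closest similar product.
const discontinuedRedirects = {
  "/products/old-widget": "/products/blue-widget",
};

function redirectFor(path) {
  const target = discontinuedRedirects[path];
  return target ? { status: 301, location: target } : null;
}

console.log(sortInStockFirst(products).map((p) => p.name));
// [ 'Blue Widget', 'Green Widget', 'Red Widget' ]
```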
Asaf