Product discontinued
-
Hi there,
We are ranking in the top 3 for a competitive keyword; however, the product linked to that keyword has just been discontinued. Normally, if we had alternatives to this product, I would keep the page live and direct users to those alternative products.
What is best practice here? Keeping the page live, telling users that the product has been discontinued, and offering a helpline number? Or just removing the page altogether (404)?
Thanks
-
I would suggest a transitional custom-page solution. In this scenario, you keep the page itself, but instead of the product information you show a custom message such as "We're sorry, but this product has been discontinued." Below that, you might present one or more similar products for the visitor to consider, along with an invitation to browse the category that product was in.
If you go this route, I would suggest building in a time-frame for using this method, such as x months. At the end of that period, implement a 301 "moved permanently" redirect pointing the page to the next most relevant page up the hierarchy, such as the sub-category or category the product falls within. This way, you give users options and an opportunity to still buy from you, in a respectful way.
Alternatively, if you don't have any products essentially like the discontinued one, I would implement a 301 redirect right away, again to the next most relevant page up the hierarchy. If your developer has the skill to implement it, I would also suggest including a custom message at the top of that page's main content area, shown based on the referrer of the visit, communicating that the product has been discontinued.
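To make the two phases concrete, here is a minimal sketch of the decision logic described above. The product dictionary, URLs, and the 90-day grace period are illustrative assumptions, not anyone's real implementation.

```python
from datetime import date

# Hypothetical sketch of the transitional approach: during a grace period,
# serve a 200 page with a discontinuation notice and alternatives; after
# that, issue a permanent 301 to the next most relevant page.

GRACE_PERIOD_DAYS = 90  # stands in for "x months"; value is an assumption

def handle_discontinued(product, today):
    """Return (status, payload) for a request to a discontinued product page."""
    days_gone = (today - product["discontinued_on"]).days
    if days_gone <= GRACE_PERIOD_DAYS and product["alternatives"]:
        return (200, {
            "notice": "We're sorry, but this product has been discontinued.",
            "alternatives": product["alternatives"][:3],
            "browse": product["category_url"],
        })
    # No close alternatives, or grace period over: permanent redirect.
    return (301, {"location": product["category_url"]})

product = {
    "discontinued_on": date(2024, 1, 1),
    "alternatives": ["/widgets/widget-b", "/widgets/widget-c"],
    "category_url": "/widgets/",
}

status, payload = handle_discontinued(product, date(2024, 2, 1))
print(status)  # 200 -> still inside the grace period
```

The same function naturally handles the "no alternatives" case from the last paragraph: with an empty `alternatives` list it returns the 301 immediately.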
Related Questions
-
Putting rel=canonical tags on blogpost pointing to product pages
I came across an article mentioning this as a strategy for getting product pages (which are tough to get links for) some link equity. See #21, "content flipping": https://www.matthewbarby.com/customer-acquisition-strategies

Has anyone done this? It seems like this isn't what the tag is meant for, and Google may see it as deceptive. Any thoughts?

Jim
Intermediate & Advanced SEO | jim_shook
-
URL structure for categories, sub categories and products
Hi, I'm looking for some advice about URL hierarchy and the best way to structure URLs for SEO with regard to categories, sub-categories and product pages. The site currently displays URLs like this:

1. /badge-accessories/
2. /badge-accessories/plastic-wallets/
3. /badge-accessories/plastic-wallets/clear-flexible-wallets/

I am questioning whether it would be best to keep it like this (which the site developers are suggesting) or change to something like:

1. /badge-accessories/
2. /plastic-wallets/
3. /clear-plastic-flexible-wallets/

Or something like:

1. /badge-accessories/
2. /plastic-wallets/
3. /plastic-wallets/clear-flexible-wallets/

Any advice would be appreciated. Thanks
Intermediate & Advanced SEO | Kerry_Jones
-
Product descriptions & Duplicate Content: between fears and reality
Hello everybody, I've been reading quite a lot recently about this topic and I would like your opinion on the following conclusion: e-commerce websites should write their own product descriptions if they can manage it (it will benefit their SERP rankings), but the ones that cannot won't be penalized for having the same product descriptions (or parts of the same descriptions) IF that is only a "small" part of their content (alongside user reviews, similar products, etc.).

What I mean is that among the signals Google may use to decide which sites to penalize, there could be a ratio of duplicate content to total content on the page: a page whose text is 5-10% duplicate content might not be harmed, while a page with 50-75% of its content duplicated from another site might be. What do you think?

Also, can "internal" duplicate content (for example, 3 pages about the same product in 3 different colors, one page per color) be considered as "bad" as "external" duplicate content (the same product description on different sites)?

Thanks in advance for your opinions!
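The "duplicate ratio" idea in the post can be illustrated with a classic word-shingle comparison: what fraction of a page's word n-grams also appear in another text. This is only a sketch of the concept, not a claim about how Google actually scores pages.

```python
# Compare the proportion of word trigrams (shingles) one page shares with
# another text. A page that is mostly unique reviews around a copied
# manufacturer description scores much lower than a near-verbatim copy.

def shingles(text, n=3):
    """Set of word n-grams in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_ratio(page_text, other_text, n=3):
    """Fraction of this page's shingles that also appear in the other text."""
    own = shingles(page_text, n)
    if not own:
        return 0.0
    return len(own & shingles(other_text, n)) / len(own)

copied = "the quick brown fox jumps over the lazy dog"
print(duplicate_ratio(copied, copied))  # 1.0 -> fully duplicated
```

Adding unique text to a page that embeds the shared description drives the ratio down, which is exactly the "small part of the content" scenario the post describes.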
Intermediate & Advanced SEO | Kuantokusta
-
Dynamic pages - ecommerce product pages
Hi guys, before I dive into my question, let me give you some background. I manage an e-commerce site and we've got thousands of product pages. The pages contain dynamic blocks whose information is fed by another system: our product team enters the data in a software tool and, boom, the information is generated in these page blocks.

But that's not all. These pages then redirect to a duplicate version with a custom URL. That version is cached, and it is what the end user sees. This was done to speed up loading: rather than having the system generate a dynamic page on the fly, the cached page is served and the user sees it super fast. Another benefit appeared as well: after going live with the cached pages, they started getting indexed and ranking in Google.

The problem is that the redirect to the duplicate cached page isn't a permanent one; it's a meta refresh, effectively a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up 301s, but then there won't be any caching; pages will just load dynamically. Google records pages that are cached, but does it cache a dynamic page? Without a cached page, I'm wondering if I would drop in traffic; the view source might just show a list of dynamic blocks, no content!

How would you tackle this? I've already set up canonical tags on the cached pages, but removing cache... Thanks
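For what the 301 alternative being weighed here would look like at the response level, here is a minimal, framework-free sketch. The URL mapping and function names are invented for illustration; they are not the poster's actual system.

```python
# Hypothetical sketch: replace the meta-refresh hop with a server-side 301.
# A 301 signals that the cached URL is the permanent home of the content,
# so crawlers consolidate signals there instead of on the dynamic URL.

CACHED_URL_FOR = {
    "/product.php?id=123": "/products/acme-widget/",
}

def respond(path_and_query):
    """Return (status, headers) for a request to a dynamic product URL."""
    target = CACHED_URL_FOR.get(path_and_query)
    if target:
        return 301, {"Location": target}
    # No cached counterpart: serve the dynamic page directly.
    return 200, {"Content-Type": "text/html"}
```

Whether the cached page can still be served after the redirect is a separate infrastructure question; the sketch only shows the redirect semantics.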
Intermediate & Advanced SEO | Bio-RadAbs
-
Url structure for multiple search filters applied to products
We have a product catalog with several hundred similar products. Our product list lets you apply filters to hone your search, so in fact there are over 150,000 different individual searches you could come up with on this page. Some of these searches are relevant to our SEO strategy, but most are not.

Right now (for the most part) we save the state of each search in the fragment of the URL, in other words in a way that isn't indexed by the search engines. The URL (without hashes) ranks very well in Google for our one main keyword. At the moment, Google doesn't recognize the variety of content possible on this page. An example is: http://www.example.com/main-keyword.html#style=vintage&color=blue&season=spring

We're moving towards a more indexable URL structure, one that could potentially save the state of all 150,000 searches in a way that Google could read. An example would be: http://www.example.com/main-keyword/vintage/blue/spring/

I worry, though, that giving so many options in our URLs will confuse Google and create a lot of duplicate content. After all, we only have a few hundred products, and inevitably many of the searches will look pretty similar. I also worry about losing ground with the main http://www.example.com/main-keyword.html page, which is ranking so well at the moment.

So I guess the questions are: Is there such a thing as having URLs that are too specific? Should we noindex or set rel=canonical on the pages whose keywords are nested too deep? Will our main keyword's page suffer when it has to share all the inbound links with these other, more specific searches?
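One common way to reconcile "150,000 possible searches" with "only a few combinations matter for SEO" is to whitelist the filter combinations that deserve their own indexable URL and canonicalise everything else back to the unfiltered page. The sketch below is an illustration under assumed filter names; the whitelist contents are hypothetical.

```python
# Sketch: only whitelisted filter combinations get their own canonical URL;
# every other combination canonicalises to the main, unfiltered page.

INDEXABLE_FILTERS = {("style",), ("style", "color")}  # assumed whitelist
_NORMALIZED = {tuple(sorted(c)) for c in INDEXABLE_FILTERS}

def canonical_url(base, filters):
    """Pick the rel=canonical target for a filtered product list.

    `filters` is a dict like {"style": "vintage", "color": "blue"}.
    """
    combo = tuple(sorted(filters))
    if combo in _NORMALIZED:
        path = "/".join(filters[k] for k in sorted(filters))
        return f"{base}/{path}/"
    return f"{base}/"

print(canonical_url("/main-keyword", {"style": "vintage"}))
# /main-keyword/vintage/
```

A deep combination such as style + color + season falls outside the whitelist and points its canonical back to `/main-keyword/`, which addresses the worry about diluting the page that already ranks.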
Intermediate & Advanced SEO | boxcarpress
-
URL for New Product
Hi, We are creating a section on our established existing website to display our new marketplace product & associated category pages. This marketplace will be a section of the site where our users can sell online training courses that they've created. It will be branded on our site as the Marketplace. Is it important to include 'marketplace' in the URL? Or would it be better to include a relevant keyword such as 'training-courses' instead? Or both? I've assumed I shouldn't use both as that would increase the length of the URLs and number of subfolders.
Intermediate & Advanced SEO | mindflash
-
What would be the ideal method to handling auto-generated product content across network of dealership websites?
We have recently started work with a dealership group that operates ~20 separate dealerships (different locations and brands) with an individual website for each. The group also operates two umbrella websites for the group brand that show the inventory across all 20 dealerships. All websites use basically the same template, and all product listings come from the same data source (the same back-end system). All websites are currently hosted on the same IP address.

Typically we work with clients to rectify duplicate content issues and work towards having just one version of any piece of content. However, this is a unique situation in that each dealership has a legitimate brand and marketing need for its own website. It is also not realistic to ask the client to create unique content for the same product listing 22 times.

We understand there are numerous options to consider, but I would appreciate hearing any advice/feedback from individuals who have dealt with similar situations. If you know of any good resources on such a scenario, that would also help verify our thoughts.

NOTE: the duplicate content for product inventory is not across all 22 sites, but usually just 3-4 sites for each product. Often a product listing is shown on 1 or 2 dealership sites and on the 2 umbrella sites (one being the main group site and the other a used/clearance product site). Currently we can see multiple domains indexed for the same product listings.
Intermediate & Advanced SEO | BryanSmith
-
How to prevent Google from crawling our product filter?
Hi all, we have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter the available products. These filters are passed as HTTP GET arguments, and the number of possible filter URLs is virtually limitless.

To prevent duplicate content, or an insane number of pages in the search indices, our software automatically adds noindex, nofollow and noarchive directives to these filter result pages. However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it. However, it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway.

What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin
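One point worth noting: noindex only takes effect after Googlebot has already crawled the URL, so meta directives reduce indexing but not crawling. To stop the crawling itself, a common option is to disallow the filter parameters in robots.txt. A sketch (the parameter names are made-up examples, not the site's real ones) might look like:

```
User-agent: *
# Example filter parameters; substitute the site's actual GET argument names.
Disallow: /*?*brand=
Disallow: /*?*size=
Disallow: /*?*color=
```

The trade-off is that Googlebot can no longer see the noindex tags on blocked URLs, so any already-indexed filter URLs may linger as URL-only entries; one approach is to let them drop out via noindex first, then add the robots.txt rules.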
Intermediate & Advanced SEO | footsteps