Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
When removing a product page from an ecommerce site?
-
What is the best practice for removing a product page from an ecommerce site, if a 301 redirect is not an option and the page has already been crawled by the search engine?
A. Block it in robots.txt
B. Let it 404
-
Bryan,
If I were removing 100 product pages from an eCommerce site because they barely convert, I would approach it this way:
#1. Run the URLs through a tool to see which ones have external backlinks.
Often none of the pages will have any external backlinks, and those that do are usually not very strong. If that's the case - or if you really aren't able to do any 301 redirects (and if so, that's something that needs to be fixed) - skip to step #3. Otherwise...
#2. 301 redirect those with external backlinks to the most relevant page, be that a similar product or the category page directly above the product being removed. Try to avoid redirecting them all to the homepage or some other "catch-all" page, as Google may treat those like a 404. (Sketches for steps #1 and #2 follow below.)
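Here's one way to do the backlink check in step #1 - a minimal sketch, assuming you've exported a backlink report as a CSV (the "target_url"/"source_url" column names and file paths are hypothetical; adjust them to whatever your tool exports):

```python
import csv

# URLs you plan to remove, one per line
with open("pages_to_remove.txt") as f:
    candidates = {line.strip() for line in f if line.strip()}

# Backlink export from your link tool of choice
linked = set()
with open("backlinks_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["target_url"] in candidates:
            linked.add(row["target_url"])

print("Has external backlinks - 301 these (step #2):")
for url in sorted(linked):
    print("  " + url)

print("No known external backlinks - safe to 404 (step #3):")
for url in sorted(candidates - linked):
    print("  " + url)
```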
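And for step #2, a minimal sketch of a 301 redirect map using Flask (the URL pairs in REDIRECTS are hypothetical examples; a real store would wire this into its existing routing rather than a catch-all route):

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Map each removed product URL to its most relevant surviving page:
# a similar product or the parent category, not a catch-all page.
REDIRECTS = {
    "/products/widget-pro-2000": "/categories/widgets",
    "/products/blue-widget": "/products/blue-widget-v2",
}

@app.route("/<path:path>")
def removed_products(path):
    target = REDIRECTS.get("/" + path)
    if target:
        # 301 signals a permanent move and passes link equity
        return redirect(target, code=301)
    abort(404)  # anything unmapped serves a genuine 404
```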
#3. Simply remove the pages and show a custom 404 page that suggests other products, or whatever messaging you want there (e.g. "This product has been removed from our catalog. Please see these other products..."). Be sure to check the HTTP header response code (there are lots of free tools for that) to ensure these URLs actually serve a 404 response (note: the 404 should be returned on the removed URL itself, as opposed to redirecting the visitor to another page like .../404.html). A quick status-check sketch follows below.
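To verify the response codes in step #3, a quick sketch assuming the third-party requests library is installed (the URLs are hypothetical):

```python
import requests

REMOVED_URLS = [
    "https://example.com/products/widget-pro-2000",
    "https://example.com/products/blue-widget",
]

for url in REMOVED_URLS:
    # allow_redirects=False exposes any sneaky 301/302 to a
    # /404.html page, which is exactly what we want to catch
    resp = requests.get(url, allow_redirects=False, timeout=10)
    flag = "OK" if resp.status_code == 404 else "CHECK"
    print(f"{flag}  {resp.status_code}  {url}")
```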
#4. Since the now-removed URLs are no longer linked to from anywhere, either internally or externally, it could take a while for Google to recrawl them and see the 404. If you need this to happen more quickly - such as when removing pages with duplicate manufacturer descriptions to recover from Panda - it may be wise to provide some type of HTML sitemap file listing the URLs, and leave it up long enough for Google to recrawl them (see the sketch below).
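A minimal sketch of step #4 - generating a temporary HTML page that links to the removed URLs so Googlebot can find and recrawl them (file names and URLs are hypothetical):

```python
removed_urls = [
    "https://example.com/products/widget-pro-2000",
    "https://example.com/products/blue-widget",
]

items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in removed_urls)
html = f"<html><body><h1>Removed products</h1><ul>\n{items}\n</ul></body></html>"

# Upload this somewhere crawlable and link to it temporarily;
# take it down once Google has recrawled the 404s.
with open("removed-products.html", "w") as f:
    f.write(html)
```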
I would not block them in the robots.txt file, as that could result in Google never seeing the 404 and therefore never removing the URLs from the index (though it would stop showing their meta descriptions).
-
Okay, the question is regarding indexing - I should have been more specific.
If we are removing 100 product pages from an ecommerce site because they barely convert (regardless of a nice 404 page), and we cannot transfer the user to a relevant page, is it best to leave the pages live, or to remove them (404) and block them in the robots.txt file?
-
Hi Bryan,
There are various reasons to remove a product page from an eCommerce store. Before deciding to remove one, you should consider whether removing it will in fact help your SEO. If not, look into alternatives such as 301 redirects, or letting visitors know on the old product page that the product is no longer available. I'm not sure why performing 301 redirects isn't an option for you - it may be worth trying to get the access needed to set them up.
We wrote an article a while back about the different scenarios an eCommerce store faces when deciding to remove old product pages, and how to deal with each one: http://blog.referralcandy.com/2011/12/14/how-to-remove-old-products/
Hope that helps!
-
Right, the question is regarding crawlability, link juice, etc.
-
I would create a custom 404 page that gives users options of similar products or product categories.
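If it helps, here's a minimal sketch of that idea in Flask - the get_similar_products() helper is hypothetical and would query your own catalog in practice:

```python
from flask import Flask, request

app = Flask(__name__)

def get_similar_products(path):
    # Hypothetical lookup: in practice, find products in the same
    # category as the removed URL.
    return ["/products/blue-widget-v2", "/categories/widgets"]

@app.errorhandler(404)
def product_not_found(error):
    links = "".join(
        f'<li><a href="{u}">{u}</a></li>'
        for u in get_similar_products(request.path)
    )
    body = ("<h1>This product has been removed from our catalog.</h1>"
            f"<p>Please see these other products:</p><ul>{links}</ul>")
    # Keep the 404 status code so search engines drop the URL from
    # their index, while users still get a helpful page.
    return body, 404
```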