Should I implement 301 redirects or 410s when removing product pages?
-
I manage an ecommerce site and had a question about 301 redirects vs. 410s when removing product pages. What we have in place now is a 410, because we have removed products that are no longer available and the content itself no longer exists. In all my research, SEO best practice is to have a 301 in place. Should we replace our 410s with 301 redirects, or keep them?
-
hi there!
I want to add a bit more difficulty to this topic. What would you do if your products are renewed every year, so product A in 2014 becomes product B (almost the same as the previous one) in 2015, and B eventually becomes C in 2016?
How would you avoid the 301 chain?
Any suggestions?
Thanks!
rafa
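One way to avoid the chain rafa describes is to keep a single redirect map and flatten it each time a product is retired, so every old URL points straight at the newest version rather than hopping A → B → C. A minimal sketch, with hypothetical URLs:

```python
# Flatten a redirect map so each retired URL points directly at the
# newest product page, turning the chain A -> B -> C into A -> C.
# The paths below are hypothetical examples.
redirects = {
    "/product-a-2014": "/product-b-2015",
    "/product-b-2015": "/product-c-2016",
}

def flatten(redirect_map):
    flat = {}
    for src in redirect_map:
        dst = redirect_map[src]
        seen = {src}                      # guard against redirect loops
        while dst in redirect_map and dst not in seen:
            seen.add(dst)
            dst = redirect_map[dst]
        flat[src] = dst
    return flat

flat = flatten(redirects)
print(flat["/product-a-2014"])  # -> /product-c-2016
```

Running this once per release (or on deploy) keeps every old product URL one 301 hop from the live page, so no chain ever builds up.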
-
Thanks Wissam for your prompt feedback! We do have similar pages we can redirect them to, and right now, when the visitor gets the 410 page, we give them the option to search our site or go to our home page. I will check into the number of inbound links.
-
Thanks Peter for your prompt feedback! I will look into your Moz link further. Right now, when the visitor gets the 410 page, we give them the option to search our site or go to our home page.
-
If the product has a relevant substitute, then you can use a 301. But if the product doesn't have a relevant substitute, then a 404/410 is the right way of doing it.
I might rethink that if the product has too many inbound links pointing to it, but that's a case-by-case situation.
-
Hi Lori
A 410 code is technically correct in the context you are using it, but for SEO purposes you are better off doing something with the links that point to the pages currently returning 410s.
If you have a page that relates in some way to the page you have removed, then create a 301 redirect and send the requests to the similar page.
If no similar page exists, you could just redirect to the home page. In some cases, though, it can be worthwhile sending the visitor to a custom 404 page similar to the one Moz uses, as you can see for my invalid page query: http://moz.com/404. Rather than using a 301 redirect in those circumstances, this type of response can be a good opportunity to engage with visitors and guide them to the information they are looking for. The Moz 404 page is a good example of that.
I hope that helps,
Peter
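Peter's advice boils down to a simple per-URL decision rule: retired pages with a close substitute get a 301 to it, and everything else returns a 410 with a helpful custom template. A minimal sketch of that logic (the URL mapping and template path are hypothetical examples):

```python
# Decision rule for retired product URLs: 301 to a substitute page when
# one exists, otherwise a 410 served with a custom "page gone" template.
# The mapping and template path below are hypothetical.
REDIRECTS = {
    "/old-widget": "/new-widget",  # retired URL -> relevant substitute
}

def handle_retired(path):
    """Return (status_code, target) for a retired product URL."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect to substitute
    return 410, "/gone.html"          # custom 410 page with search/home links

print(handle_retired("/old-widget"))      # -> (301, '/new-widget')
print(handle_retired("/discontinued-x"))  # -> (410, '/gone.html')
```

The same table-plus-fallback pattern can be expressed directly in server config (e.g. nginx `map` plus `return 301`/`return 410`) instead of application code.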