Benefit of using 410 Gone over 404?
-
It seems like it takes Google Webmaster Tools forever to realize that some pages, well, are just gone.
Truth is, the 30k-plus pages showing 404 errors were due to a big change in the site's URL architecture.
I wonder: is there any benefit to using 410 Gone as a temporary measure to speed things up in this case?
Or, when would you use a 410 Gone?
Thanks
-
I had the (mis)fortune of trying to deindex nearly 2 million URLs across a couple of domains recently, so I had plenty of time to play with this.
Like CleverPhD, I wasn't able to measure any real difference in how long it took to remove a page that had been 410'd versus one that had been 404'd.
The biggest factor governing the removal of the URLs was getting all the pages recrawled. Don't underestimate how long that can take. We ended up creating crawlable routes back to that content to help Google keep visiting those pages and updating the results.
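In case it's useful, here's roughly the kind of thing we did to create those routes (a minimal sketch in Python; the file names are placeholders, and you'd host the output somewhere crawlable):

```python
# Sketch: build a throwaway HTML index linking to the removed URLs so
# crawlers have a route back to them and can see the 404/410 responses.
# "removed_urls.txt" is a hypothetical file with one URL per line.
with open("removed_urls.txt") as src:
    removed = [line.strip() for line in src if line.strip()]

with open("recrawl-index.html", "w") as out:
    out.write("<html><body>\n")
    for url in removed:
        out.write(f'<a href="{url}">{url}</a><br>\n')
    out.write("</body></html>\n")
```

Link the generated page from somewhere Google already crawls, then take it down once the URLs have dropped out of the index.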
-
The 410 is supposed to be more definitive:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
404 is "Not Found" vs. 410 is "Gone":
10.4.5 404 Not Found
The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. The 410 (Gone) status code SHOULD be used if the server knows, through some internally configurable mechanism, that an old resource is permanently unavailable and has no forwarding address. This status code is commonly used when the server does not wish to reveal exactly why the request has been refused, or when no other response is applicable.
10.4.11 410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.
The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.
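For what it's worth, actually returning a 410 is usually trivial at the application or server level. A minimal sketch in Python with Flask (my assumption - any stack can do the equivalent, and the /old-catalog/ prefix is just a placeholder):

```python
# Sketch: answer 410 Gone for a retired URL prefix instead of the
# default 404. Flask/werkzeug map abort(410) to a Gone response.
from flask import Flask, abort

app = Flask(__name__)

@app.route("/old-catalog/<path:slug>")
def retired(slug):
    abort(410)  # intentionally, permanently gone - no forwarding address

if __name__ == "__main__":
    app.run()
```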
That said, I had a similar issue on a site with a couple thousand pages and went with the 410; I can't say it made things disappear any faster than a 404 would have (not that I noticed, anyway).
I just found a post from John Mueller of Google:
https://productforums.google.com/forum/#!topic/webmasters/qv49s4mTwNM/discussion
"In the meantime, we do treat 410s slightly differently than 404s. In particular, when we see a 404 HTTP result code, we'll want to confirm that before dropping the URL out of our search results. Using a 410 HTTP result code can help to speed that up. In practice, the time difference is just a matter of a few days, so it's not critical to return a 410 HTTP result code for URLs that are permanently removed from your website, returning a 404 is fine for that. "
So, use the 410; even if it only saves a matter of a few days, with 30k pages you may well see the difference.
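One thing worth double-checking if you go this route: make sure the server really sends the 410 (or 404) and not a "soft 404" - a 200 with an error page - which won't get anything removed. A quick spot-check sketch in Python (the URL is just a placeholder):

```python
# Sketch: confirm what the server actually returns for removed URLs.
import requests

for url in ["https://www.example.com/old-page"]:  # placeholder URL
    resp = requests.head(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code)  # expect 410 (or 404), not 200
```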
All of that said, with a site that big, are you sure you wouldn't need to 301 some of those pages? If you have a bunch of old news items or blog posts, wouldn't you want to redirect them to the new URLs for those same assets? It seems like you should be able to recover some of them - at least your top-traffic pages.
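Roughly, that split could look like this (a sketch, again assuming Flask; the URL mapping is made up - the point is just to 301 whatever has a new home and 410 the rest):

```python
# Sketch: 301-redirect moved pages to their new URLs, return 410 for
# everything else under the old section. MOVED is a hypothetical
# old-path -> new-path mapping you'd build from your migration data.
from flask import Flask, abort, redirect

app = Flask(__name__)

MOVED = {
    "news/2013/big-launch": "/blog/big-launch",  # hypothetical entry
}

@app.route("/old-site/<path:slug>")
def legacy(slug):
    if slug in MOVED:
        return redirect(MOVED[slug], code=301)  # has a new home
    abort(410)  # permanently gone
```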
Cheers