Dealing with 410 Errors in Google Webmaster Tools
-
Hey there!
(Background) We are doing a content audit on a site with thousands of articles, some going back to the early 2000s. Some of the content was duplicated from other sites, has no external links pointing to it, and gets little or no traffic.
As we weed these out, we set them to return a 410 status to let the Goog know that this is not an error: we are getting rid of them on purpose, and the Goog should too.
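For anyone doing the same, here is a minimal sketch of how retired pages are commonly set to return 410, assuming an Apache server and using made-up paths (nginx and other stacks have equivalents):

```apache
# Tell crawlers these retired articles are gone on purpose (410),
# not temporarily missing (404). The paths below are examples only.
Redirect gone /articles/duplicated-from-elsewhere.html
Redirect gone /articles/old-thin-content.html
```

`RedirectMatch gone` with a regex works the same way if whole directories are being retired.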
As expected, we now see the 410 errors in the Crawl Errors report in Google Webmaster Tools.
(Question) I have been going through and clicking "Mark as Fixed" in GWT to clear these pages out of my console, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 sense, as I intended, but I am betting Google means "fixed" as in they now return a 200 (if that makes sense).
Any opinions on the best way to handle this?
Thx!
-
Having the errors in Webmaster Tools is not going to negatively impact your SEO in any way. It's more of a heads-up to you, the webmaster, that Google has found a page that is missing.
As long as there are no internal or external links to those pages, they should disappear automatically, although it could take months. If you don't want the errors cluttering up your report, then manually marking them as fixed is the way to go.
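Before marking a batch as fixed, it can be worth spot-checking that the retired pages really do return a 410 rather than a 404 or a 200. A rough sketch (Python; the URL at the bottom is a placeholder):

```python
from urllib.error import HTTPError
from urllib.request import urlopen

def classify(status):
    """Map an HTTP status code to what it signals to Google about the page."""
    if status == 410:
        return "gone on purpose (410)"
    if status == 404:
        return "missing, may be retried (404)"
    if 200 <= status < 300:
        return "still live (200)"
    return "other ({})".format(status)

def check_url(url):
    """Fetch a URL and classify its status; urlopen raises HTTPError on 4xx/5xx."""
    try:
        with urlopen(url) as resp:
            return classify(resp.status)
    except HTTPError as err:
        return classify(err.code)

# Example (hypothetical URL):
# print(check_url("https://www.example.com/retired-article.html"))
```

Anything that comes back "still live" was probably caught by a redirect or a soft-404 page and needs a second look before you trust the GWT report.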
Related Questions
-
Website crawl error
Hi all, When I try to crawl a website, I get the following error message: "java.lang.IllegalArgumentException: Illegal cookie name". So far I have found this explanation: the error indicates that one of the web servers within the same cookie domain as the server is setting a cookie for your domain with the name "path", as well as another cookie with the name "domain". Does anyone have experience with this problem, know what it means, and know how to solve it? Thanks in advance! Jens
Technical SEO | WeAreDigital_BE
Fetch as Google issues
Hi all, Recently (well, a couple of months back) I finally got around to switching our sites over to HTTPS. In terms of rankings all looks fine and we have not moved about much, only the usual daily fluctuations of a place or two in a competitive niche. All links have been updated, redirects are in place, the usual HTTPS domain-migration stuff. I am, however, troubled by one thing: I cannot for love nor money get Google to fetch my site in GSC. No matter what I have tried, it continues to display "Temporarily unreachable". I have checked the robots.txt, and it is on a new https:// profile in GSC. Has anyone got a clue, as I am stumped? Have I simply become blinded by looking too much? Site in Q. caravanguard co uk. Cheers and looking forward to your comments.... Tim
Technical SEO | TimHolmes
Webmaster Tools Verification Problem
Hello, Somehow a website I'm working on has lost its verification in Webmaster Tools, and I have absolutely no clue why... The Google Analytics code is still in place and the verification meta tag is there too, yet neither verification method is working. I get an error message about Google not being able to connect to the server. I asked about any possible changes to server settings, but apparently nothing has changed there. The URL in question is Bivolino.com. Does anyone have any other ideas about what I could be looking for? Thanks, Kind regards, Erik
Technical SEO | | buiserik0 -
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file, and the URL Parameters tool, I'm hoping to see some change each week. I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool: https://www.mydomain.com/blah.html?search_garbage_url_addition On the confirmation page, the URL actually shows as: http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look?

AND PART 2 OF MY QUESTION: If you see a search description in Google for a page you want removed that says the following in the SERP results, should I still go to the trouble of putting in the removal request? www.domain.com/url.html?xsearch_... A description for this result is not available because of this site's robots.txt – learn more.
Technical SEO | sparrowdog
What may be the reason a sitemap is not indexed in Webmaster Tools?
Hi, I have a problem with a client's website. I searched many related questions here about the same problem but couldn't figure out a solution. Their website is in two languages, and they submitted two sitemaps to Webmaster Tools. One got 100% indexed; from the second one, only 32 of over 800 URLs are indexed. I checked the following hypotheses for why the second sitemap may not be getting indexed:
- the sitemap is wrongly formatted - False
- the sitemap contains URLs that don't return a 200 status - False, there are no URLs that return 404, 301 or 302 status codes
- the sitemap contains URLs that are blocked by robots.txt - False
- internal duplicate content problems - False
- issues with meta canonical tags - False
For clarification, URLs from the sitemap that is not indexed completely also don't show up in Google's index. Can someone tell me what else I can check to fix this issue?
Technical SEO | SorinaDascalu
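A checklist like the one above is easiest to re-verify by pulling the sitemap yourself and testing each URL directly. A small sketch of the first step (Python; the inline sitemap below is a made-up example, not the client's real one):

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace, so tags must be matched as "{ns}loc".
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml):
    """Return the list of <loc> values from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical two-language sitemap; in practice, fetch the real one
# and then request each extracted URL to confirm its status and
# robots.txt accessibility.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/en/page1</loc></url>
  <url><loc>https://www.example.com/fr/page1</loc></url>
</urlset>"""

print(extract_urls(sample))
```

Once you have the URL list, checking status codes and canonical tags per URL in a loop is usually faster than eyeballing the sitemap by hand.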
Webmaster Tools Site Map Question
I have a TLD that has authority and a number of micro-sites built off of the primary domain. All sites relate to the same topic, as I am promoting a destination. The primary site and each micro-site have their own CMS installation, but the domains are mapped accordingly:
www.regionalsite.com/ <- primary
www.regioanlsite.com/theme1/ <- theme 1
www.regioanlsite.com/theme2/ <- theme 2
www.regionalsite.com/theme3/ <- theme 3
Question: Should my XML sitemap for Webmaster Tools feed all sites off of the primary domain's sitemap, or are there penalties for this? Thanks.
Technical SEO | VERBInteractive
WP Blog Errors
My WP blog is appending my email address to post URLs during the crawl, and I am getting 200+ errors similar to the following: http://www.cisaz.com/blog/2010/10-reasons-why-microsofts-internet-explorer-dominance-is-ending/tony@cisaz.net "tony@cisaz.net" is added to every post. Any ideas how I fix it? I am using the Yoast plugin. Thanks guys!
Technical SEO | smstv
Google penalty
Does anyone have any success stories about what they did to get out of a Google penalty?
Technical SEO | phatride