Disavow 401, 403, 410, 500, 502, 503
-
Dear people,
I am cleaning up my backlink profile and I am not sure whether I should disavow links that lead to a 401, 403, 410, 500, 502, or 503.
I understand that since the last Penguin update this shouldn't be necessary, but I would like to be sure about it. Any hints out there?
Thanks in advance
-
I am still trying to work out whether you mean links pointing to pages on your site that resolve in these codes, or that the URLs or domains linking to you return these codes when queried: 401, 403, 410, 500, 502, 503.
I would need a little more information to be sure. However, if you are cleaning house and you have backlinks that resolve to dead domains, 404s, and so on, please remember that anyone can bring those domains and URLs live again at any time. It is best to err on the side of caution and disavow spammy backlinks, whether they come from a dead page or a 500, or hit your site with an unauthorized 401. (I have listed what the codes mean below in the hope that it may be of some help to you; a small script sketch for batch-checking these codes follows the list.)
Successful 2xx
- 200 Status OK
- 201 Created
- 202 Accepted
- 203 Non-Authoritative Information
- 204 No Content
- 205 Reset Content
- 206 Partial Content
Redirection 3xx
- 300 Multiple Choices
- 301 Moved Permanently
- 302 Found
- 303 See Other
- 304 Not Modified
- 305 Use Proxy
- 306 (Unused)
- 307 Temporary Redirect
Client Error 4xx
- 400 Bad Request
- 401 Unauthorized
- 402 Payment Required
- 403 Forbidden
- 404 Not Found
- 405 Method Not Allowed
- 406 Not Acceptable
- 407 Proxy Authentication Required
- 408 Request Timeout
- 409 Conflict
- 410 Gone
- 411 Length Required
- 412 Precondition Failed
- 413 Request Entity Too Large
- 414 Request-URI Too Long
- 415 Unsupported Media Type
- 416 Requested Range Not Satisfiable
- 417 Expectation Failed
Server Error 5xx
- 500 Internal Server Error
- 501 Not Implemented
- 502 Bad Gateway
- 503 Service Unavailable
- 504 Gateway Timeout
- 505 HTTP Version Not Supported
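As promised above, here is a minimal sketch for batch-checking these codes. It assumes Python 3 with the third-party requests library installed and a hypothetical backlinks.txt file containing one URL per line; the file name and library choice are assumptions for illustration, not part of any Moz tool.

import sys
from collections import defaultdict

import requests  # third-party; install with: pip install requests


def check_urls(path):
    """Group URLs from a plain-text file (one per line) by the HTTP status they return."""
    groups = defaultdict(list)
    with open(path, encoding="utf-8") as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            response = requests.head(url, allow_redirects=True, timeout=10)
            groups[response.status_code].append(url)
        except requests.RequestException:
            # DNS failure, timeout, connection refused: often a dead domain.
            groups["unreachable"].append(url)
    return groups


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "backlinks.txt"
    for status, urls in sorted(check_urls(path).items(), key=lambda item: str(item[0])):
        print(f"\n{status} ({len(urls)} URLs)")
        for url in urls:
            print(f"  {url}")

Whatever the grouping shows, the disavow file itself is just a plain-text list with one entry per line, either a full URL or a domain:example.com entry, which you then upload through Google's disavow links tool.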
I hope this helps,
Tom
-
The only way to know for sure would be to look at the links and your domain (at the very least, to understand the type of business your domain is in).
If you're willing to post them here, that would be awesome; if not, you can send them to me by private message.
Alternatively, I can set up a URL that works for only one click, and you can add the URLs that way.
All the best,
Tom
Related Questions
-
410 or 301 after URL update?
Hi there, A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (and of course, I'm sure there are thousands more it's not showing us!). The issue is that a lot of them seem to come from a URL change. The damage has been done, the URLs have been changed and I can't stop that... but as you can imagine, I'm keen to fix as many as humanly possible. I don't want to go mad with 301s, but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls it, warning me that this page really doesn't exist? Essentially I guess I'm asking: how many 301s are too many, and will they affect our DA? And what's the best solution for dealing with mass 404 errors, many of which aren't attached or linked to from any other pages anymore? Thanks for any insights 🙂
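To make the 301-versus-410 choice above concrete, here is a minimal sketch, assuming Python 3 and a hypothetical url_changes.csv file with old_url and new_url columns (the file and column names are made up for this example): it emits an Apache Redirect 301 line whenever a replacement page exists and a Redirect gone (410) line when it does not.

import csv


def build_rules(csv_path):
    """Turn a CSV of URL changes into Apache mod_alias redirect rules."""
    rules = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            old_path = row["old_url"].strip()
            new_url = (row.get("new_url") or "").strip()
            if new_url:
                # A replacement exists: a 301 preserves equity from external links.
                rules.append(f"Redirect 301 {old_path} {new_url}")
            else:
                # No replacement: tell crawlers the page is gone for good (410).
                rules.append(f"Redirect gone {old_path}")
    return rules


if __name__ == "__main__":
    for rule in build_rules("url_changes.csv"):
        print(rule)

The split it encodes is the common-sense one: 301 the URLs that have a genuine equivalent or still attract external links, and let the rest return a 410 so crawlers can drop them cleanly.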
Intermediate & Advanced SEO | Fubra
-
Pingbacks and Trackbacks: A good source for Disavow links?
I am 100% new to the world of WordPress, more advanced CMS systems, and all the interesting things that have been developed over the last several years on the web. After installing WordPress I noticed options to allow pingbacks and trackbacks. Upon researching these, I found it interesting that, if you leave this option enabled, you can very quickly and easily identify spammy WordPress blogs that mention your article, because you literally get a notification of new nonsense happening on the web that may affect you. Is this a logical, rational thing to leave enabled and use as one of many tricks to manage new additions to a disavow file? It almost seems like a godsend because you don't have to go looking for them.
Intermediate & Advanced SEO | HLTalk
-
Which links to disavow?
I've got a new client that just fired their former SEO company, which was building spammy links like crazy! Using GSC and Majestic, I've identified 341 linking domains. I'm only a quarter of the way through the list, but it is clear that the overwhelming majority are from directories, article directories and comment spam. So far less than 20% are definitely links I want to keep. At what point do I keep directory links? I see one with a DA of 61 and a Moz spam score of 0. I realize this is a judgement call that will vary, but I'd love to hear some folks give DA and spam numbers. FWIW, the client's DA is 37.
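Since this comes down to metric thresholds, a rough triage sketch along these lines can split the 341 domains into keep, review, and disavow buckets before the manual pass. It assumes Python 3 and a hypothetical linking_domains.csv export with domain, da, and spam_score columns; the column names and thresholds are illustrative assumptions, not recommendations.

import csv

# Illustrative thresholds only; tune them to your own link profile.
KEEP_MIN_DA = 40
DISAVOW_MAX_DA = 10
SPAM_LIMIT = 5


def triage(csv_path):
    """Bucket linking domains by Domain Authority and spam score."""
    buckets = {"keep": [], "review": [], "disavow": []}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = row["domain"].strip()
            da = int(row["da"])
            spam = int(row["spam_score"])
            if da >= KEEP_MIN_DA and spam <= SPAM_LIMIT:
                buckets["keep"].append(domain)
            elif da <= DISAVOW_MAX_DA and spam > SPAM_LIMIT:
                buckets["disavow"].append(domain)
            else:
                # Everything in between still needs a human look.
                buckets["review"].append(domain)
    return buckets


if __name__ == "__main__":
    for bucket, domains in triage("linking_domains.csv").items():
        print(f"{bucket}: {len(domains)} domains")

Anything that lands in the review bucket, including directory links with a high DA and a zero spam score, is exactly the judgment call the question describes.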
Intermediate & Advanced SEO | rich.owings
-
Disavow files on m.site
Hi, I have a site, www.example.com, and have finally got the developers to add Google Webmaster verification codes for: example.com and m.example.com. I was advised this is best practice; however, I was wondering, does this mean I now need to add the disavow file for m.example.com as well? Thanks, Andy
Intermediate & Advanced SEO | Andy-Halliday
-
Can an incorrect 301 redirect or .htaccess code cause 500 errors?
Google Webmaster Tools is showing the following message: "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added to .htaccess incorrectly? Here is the 301 redirect code I am using in .htaccess:
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index.html|default.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www.example.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Could adding the following code after that in the .htaccess potentially cause any issues?
BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
END EXPIRES
(Edit) I'd like to add that there is a WordPress blog on the site too, at www.example.com/blog, with the following code in its .htaccess:
BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
END WordPress
Thanks
Intermediate & Advanced SEO | kimmiedawn
-
Best practice to disavow spammy links
Hi Forum, I'm trying to quantify the logic for removing spammy links.
I've read the article: http://www.seomoz.org/blog/how-to-check-which-links-can-harm-your-sites-rankings. Based on my pivot chart results, I see around 55% of my backlinks at zero PageRank. Q: Should I simply remove all zero-PageRank links, or carry out an assessment based on those links' DA / PA? If so, what are sensible DA and/or PA thresholds? Q: What other factors should be taken into consideration, such as anchor text, etc.?
Intermediate & Advanced SEO | Mark_Ch
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content to confirm it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the steps below:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages). When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway.
We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way… I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are, still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
Should I bother disavowing nofollow backlinks?
Hello! I am about to go through the list of backlinks in our profile and sort out what we want to disavow. A question I had is: should I bother disavowing nofollow backlinks, even if they look spammy? Or should I just focus on cleaning up the dofollows? Thanks!
Intermediate & Advanced SEO | Ryan_Phillips