Disavow Tool
-
Two questions, any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linking to my site. Is this correct?
If a spam website has, say, 100 pages all using your anchor text, do you disavow the domain or do you have to enter every one of the pages in the disavow file?
-
For the sake of argument: I have a website with some 120-150 spammy links pointing at it. Basically, I see a ton of low-quality bookmarking sites that are scraping content from each other. Very few of the links use keyword anchor text, and those few are copied from authority sites in the niche; the rest (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool even though nothing is happening in terms of penalties or ranking changes right now? I am getting a lot of opposite opinions on this matter. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
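To make the format concrete, a complete disavow file mixing single-page and whole-domain entries might look like this (spamsite.com and the specific paths are stand-ins; lines beginning with # are comments, which Google's tool ignores):

```text
# Individual pages to disavow, one URL per line
http://www.spamsite.com/spam.html
http://www.spamsite.com/more-spam.html

# Entire domains to disavow
domain:geeky.com
domain:spamsite.com
```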
-
Sorry to sound thick, but on my spreadsheet would it look like this (using an actual spam link to my site):
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop Google from reporting it as linking to your site. For that to happen, you would need to have the link itself removed.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
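As a sketch of that approach, a small script (a hypothetical helper, not part of any Moz or Google tooling) can collapse a long list of spammy URLs into one `domain:` line per host, which is the whole-domain format the disavow tool accepts:

```python
from urllib.parse import urlparse

def build_disavow(urls):
    """Collapse a list of spammy URLs into unique 'domain:' lines for a disavow file."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Strip a leading "www." so both www and bare-domain links are covered
        # by a single domain: entry.
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    # One entry per line, as the disavow file format requires.
    return "\n".join(f"domain:{d}" for d in sorted(domains))

spam_links = [
    "http://www.spamsite.com/spam.html",
    "http://spamsite.com/page2.html",
    "http://www.geeky.com/links/1.html",
]
print(build_disavow(spam_links))
# domain:geeky.com
# domain:spamsite.com
```

The output can be saved as a plain-text file and uploaded via the disavow tool; deduplicating by host is what makes the 100-pages-on-one-site case a single line instead of 100.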
Hope this helps.
Related Questions
-
406 Errors from Third-Parties websites In Google Webmaster Tools
Google Webmaster Tools is displaying 406 error pages. The source is not our site's domain. How do we fix these issues if they come from a different domain?
On-Page Optimization | SirishaNueve
-
Same titles in Webmaster Tools
Hello, in Webmaster Tools I get a message saying 505 HTML improvements are possible because 505 of my titles are used on two pages. Actually, Webmaster Tools is tripping, since the duplicates it finds are: www.example.com and www.example.com/. Anyway, I have indexation problems and I was wondering if this could be the cause and how to solve it. Thanks for your help.
On-Page Optimization | EndeR-
-
Confused by Moz page grading tool
Can anyone shed any light on why Moz grades this page an F: http://www.traditional-cleaning.co.uk/cleaning-in-tynemouth.htm for 'cleaners in tynemouth' and 'cleaning in tynemouth'? Many thanks!
On-Page Optimization | EdwardoUK
-
Google Webmaster Tools - Inappropriate Keywords
In December our website was the victim of a devastating pharma hack. Google Webmaster Tools never reported malware on our website, but our keywords within Webmaster Tools were all changed to keywords related to men's and ladies' watches. Our Google organic search rank and the resulting traffic have been dramatically impacted. Since realizing we were hacked, we rebuilt our website from the ground up with a clean, updated installation of the Joomla CMS (from 1.5 to 2.5) and moved to a more secure server. As of December 21st, 2012, our website has been clean of the pharma hack. We wrote to Google and told them what happened. They replied with a "no manual spam actions were found" message. Unfortunately, Google Webmaster Tools still reflects all of the hacked keywords for men's and ladies' watches. We are watching our Webmaster Tools every day, waiting for Google to remove the hacked keywords. It's been a little over a month, and still only about 5 of the first 100 keywords have anything to do with our website. What do you recommend we do to expedite the removal of these inappropriate keywords in Google WMT?
On-Page Optimization | TCM-SEO
-
Why is SEOMOZ Crawl Diagnostics not in sync with Webmaster Tools
Currently, my website, according to the Crawl Diagnostics summary, has 401 'Duplicate Page Title' errors. But in Google Webmaster Tools, under Optimization in the left-hand sidebar, HTML Improvements lists only 4 'Duplicate Title Tags'. I have two questions about this: A) do 'Duplicate Page Title' errors and 'Duplicate Title Tags' mean the same thing? And B) why does the former report 401 errors while the latter reports just 4?
On-Page Optimization | ABCPS
-
Tool to create a good XML sitemap
Hello lads, I need to create an XML sitemap for a website so I can submit it to Google Webmaster Tools and Bing Webmaster Tools. What do you guys recommend? Thanks in advance! PP
On-Page Optimization | PedroM
-
Discrepancy between SeoMoz vs Google Webmaster tools
SEOmoz reports over 70 4xx client errors on my site, but Google Webmaster Tools does not report any broken links. There aren't any broken links on any of the pages it is reporting. Could there be another reason for the 4xx errors besides broken links?
On-Page Optimization | AndyHawkins
-
Is there a tool out there I could use to help me compose unique meta tags in bulk?
We have a website that has hundreds of crawl errors due to duplicate meta tags. I could do with a tool to help compose unique ones in bulk, so they stay within the recommended character limit and follow any other best practices.
On-Page Optimization | WebDesignBirmingham