Disavow Tool
-
Two questions, any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linking to my site. Is this correct?
If a spam website has, say, 100 pages all using your anchor text, do you disavow the domain, or do you have to enter all the individual pages in the disavow file?
-
For the sake of argument, say I have a website where some 120-150 spammy links were created. Basically, I see a ton of low-quality bookmarking sites that are scraping content from each other. Very few of the links use keyword anchor text, and those that do are taken from authority sites in the niche as well; the others (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool even though nothing is happening in terms of penalties or ranking changes right now? I am getting a lot of conflicting opinions on this matter. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
-
Sorry to sound thick, but on my spreadsheet it will look like this (this is an actual spam link to my site):
domain:geeky.com http://www.geeky.com/ or like this
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
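Putting the two formats together, a complete disavow file might look like the sketch below (the domains are just the placeholders used in this thread, and the # comment lines are optional — Google's disavow format ignores lines starting with #):

```text
# Individual pages to disavow (one URL per line)
http://www.spamsite.com/spam.html

# Entire domains to disavow (covers every page on the site)
domain:seomoz.org
domain:geeky.com
```

You'd save that as a plain .txt file and upload it through the disavow links tool — note it's a text file, not a spreadsheet.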
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL address of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop Google from reporting it as a link to your site. For that to stop, you would need to actually have the link removed at the source.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
Hope this helps.