Tips on speeding up the disavow process while still doing it right
-
Hello,
I'm doing a 10-step disavow check using this article, this Word document, and this corresponding spreadsheet template. I'm doing a disavow for a non-manual Penguin penalty (paid links, doorway sites).
It's taking a long time. Can you give me some pointers on how to speed things up? I want to do it right, but I need to know how to do it faster. For example: tricks for spotting domains to disavow automatically, what not to bother checking, or whatever other tips you have.
I'm using Cognitive SEO and manual checking. I'd like to avoid pricey tools. I'm on a Mac.
Thanks!
-
I'm sorry to say, there really is no way to speed it up without cutting corners!
Link removal is a painfully slow process, but one that needs to be done if your link profile is terrible. I don't use exactly the same process as that guide, but mine is pretty close, and it's going to take about the same amount of time.
There are automation tools out there, and maybe an Excel trick or two that could save you seconds here and there, but if you've got a big, low-quality profile, there really is no way to save a significant amount of time without risking missing something.
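One small time-saver worth mentioning: once you've flagged the bad links in your spreadsheet, you don't need to build the disavow file by hand. Here's a minimal sketch in Python that collapses a list of flagged URLs into the unique `domain:` entries Google's disavow file expects. The CSV file name and `url` column below are assumptions about your own export, not any tool's actual format, so adjust them to match what Cognitive SEO or your spreadsheet gives you.

```python
from urllib.parse import urlparse

def urls_to_disavow_lines(urls):
    """Collapse full URLs into unique, sorted 'domain:' disavow entries."""
    domains = set()
    for url in urls:
        host = urlparse(url.strip()).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # disavow at the bare-domain level
        if host:
            domains.add(host)
    return ["domain:%s" % d for d in sorted(domains)]

# Example usage: read flagged URLs from a CSV export and write the file.
# (Uncomment and point these at your own export -- 'bad_links.csv' and
# the 'url' column are hypothetical names.)
# import csv
# with open("bad_links.csv", newline="") as f:
#     urls = [row["url"] for row in csv.DictReader(f)]
# with open("disavow.txt", "w", encoding="utf-8") as out:
#     out.write("\n".join(urls_to_disavow_lines(urls)))
```

Deduplicating at the domain level like this is what makes blanket "disavow the whole domain" decisions fast; the slow part, deciding which domains belong on the list, is still on you.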