To allow or disavow, that is the question!
-
We're in the middle of a disavow process and we're having some difficulty deciding whether or not to disavow links from justia.com and prweb.com. Justia.com alone is giving us 23,000 links pointing at just 76 pages. So, to allow or disavow? That's the question!
What do you think, guys?
Thank you.
John.
-
Hey John
If you decide to take action, then being aggressive with the links is a good approach. Both Cyrus Shepard's great Moz blog post on the disavow tool and Google's own advice say that if you suspect an entire domain is spammy, you should go ahead and disavow all of it.
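For reference, the disavow file you upload to Google is just a plain text file: one entry per line, a `domain:` prefix to disavow an entire domain, and `#` for comments. A minimal example using the domains mentioned above (purely illustrative; it isn't a recommendation to disavow them):

```text
# Disavow file - upload via Google Search Console
# Disavow an entire domain:
domain:justia.com
# Or list individual URLs:
http://www.prweb.com/releases/some-press-release.htm
```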
However, from my own perspective, I would only go through and create a disavow file if I knew for sure that I was suffering from a manual or algorithmic penalty. I have seen very little benefit in being proactive with that tool (e.g. rankings are good, you spot bad links in your link profile, and you disavow them to be safe). In fact, I have seen a number of cases where a disavow was submitted "prematurely", i.e. a site was ranking fine, disavowed some links, and then saw rankings fall.
If we want to look at it from a slightly skeptical point of view - if you're not suffering from a Google penalty, do you really want to inform Google that you have suspicious links in your profile?
That said, this is a matter of preference based on my own experience. I would certainly take note of the links you think are bad (and perhaps put together a file ready to go, just in case). It's worth noting that prweb.com has made all of its links nofollow anyway; since they pass no link equity, disavowing them doesn't seem logical, as they have no SEO benefit to begin with. Also keep in mind that if you visit a page and the link is not there (and especially if a Google search for cache:http://www.example.com shows that the cached version contains no link), there's a very good chance the link has already been discounted and so would not be flagged in a manual or algorithmic check. Given how many links you have from these domains, that may well be occurring.
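If you do decide to put a file together "ready to go", collapsing 23,000 backlink URLs into a handful of domain-level entries is easy to script. A minimal sketch using Python's standard library (the function name and sample URLs are illustrative, not from any Moz tool; requires Python 3.9+ for `removeprefix`):

```python
from urllib.parse import urlparse

def build_disavow(urls):
    """Collapse a list of suspect backlink URLs into domain-level
    disavow entries, one per unique host (www. prefix stripped)."""
    domains = sorted(
        {urlparse(u).netloc.lower().removeprefix("www.") for u in urls}
    )
    return "\n".join(f"domain:{d}" for d in domains)

# Thousands of links from one domain collapse to a single line:
entries = build_disavow([
    "http://www.justia.com/lawyers/page-1",
    "http://justia.com/lawyers/page-2",
])
print(entries)  # domain:justia.com
```

The resulting text can be saved as a .txt file and held in reserve, then uploaded through Search Console only if a penalty actually materialises.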
Hope this helps
-
Has Google notified you of the need to disavow links in Webmaster Tools? Usually, there's a message about unnatural links on the Manual Actions page.
I've never preemptively disavowed links. Maybe that's wrong. But then again, no single site is giving us 23,000 links.