Is there a tool to figure out bad backlinks
-
With the new changes to the Google algorithm, I'm trying to figure out which links Google may think are hurting my site. Any thoughts? Thanks
-
I see.
I would presume that these guys have been penalised, then. It may be that they are now trying to get their site reindexed and are still appearing in the listings, but much lower than before.
There isn't really a definitive way of knowing whether they have been banned, but as good practice, I would try to get my URL removed from any website I had doubts about.
You could try contacting the Google Webmaster team to confirm this, though I don't know how helpful they will be. Then again, it's worth a try at least.
Matt.
-
Hi Matt,
The sites I believe might have a problem are still showing up in the results like that. However, where some of the blog sites we link with might have ranked on the first page before, they are not in the top 100 results anymore. Strange...
-
Hi Morgan,
The website will show in Google if you do a site:www.domain.com search.
If you search for 'domain' (replace this with their website name, e.g. SEOmoz) and they don't show up in the top few listings, then you can be pretty sure that they have been banned.
If that's the case, I would either contact the webmaster or manually delete the link if you can.
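If you have more than a handful of suspect domains, you could script the lookup URLs rather than typing each search by hand. A minimal Python sketch (the domain list is just a placeholder; I wouldn't fetch these URLs programmatically, since Google blocks automated queries — open them in a browser instead):

```python
from urllib.parse import quote_plus

def ban_check_urls(domain):
    """Build the two Google searches worth running for a suspect domain:
    a site: query (shows whether any pages are indexed at all) and a plain
    brand-name query (a banned site usually won't rank for its own name)."""
    brand = domain.split(".")[0]  # crude guess at the brand name
    return {
        "site_search": "https://www.google.com/search?q=" + quote_plus(f"site:{domain}"),
        "brand_search": "https://www.google.com/search?q=" + quote_plus(brand),
    }

# Hypothetical domains, just to show the output:
for domain in ["example-blog.com", "another-suspect.net"]:
    urls = ban_check_urls(domain)
    print(domain, urls["site_search"], urls["brand_search"])
```

If the site: search returns nothing, or the brand search doesn't show the site near the top, that's your red flag.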
Good luck!
Matt.
-
Hi Matt,
Thanks for the help. If a site that links to you is still showing page results, does that mean it is not banned? I see a few that I have on blogrolls, but that site still shows 1,500 results with Google.
-
Hi Morgan,
Unfortunately, there isn't a quick way to do this.
What I have done is use Open Site Explorer and downloaded the .csv file of all the linking domains to my website.
Now that I have them in a spreadsheet, it is a bit easier to filter through them. I have been drilling down on links from blogs with a particularly low PA or DA, then doing the hard task of checking whether each one has individually been banned by Google. You can do this by searching for their domain to see if they appear.
This is a slow process, but better safe than sorry, eh?
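For what it's worth, the filtering step can be scripted instead of done by hand in the spreadsheet. A rough Python sketch, assuming the exported .csv has 'Root Domain' and 'Domain Authority' columns — check the actual headers in your file, as the export format may differ:

```python
import csv

def low_authority_domains(csv_path, da_threshold=20):
    """Return (domain, DA) pairs from a linking-domains CSV whose Domain
    Authority falls below the threshold -- the ones worth checking first
    for a Google ban."""
    suspects = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                da = float(row["Domain Authority"])
            except (KeyError, ValueError):
                continue  # skip rows without a usable DA value
            if da < da_threshold:
                suspects.append((row.get("Root Domain", ""), da))
    # Lowest authority first, so the most suspicious domains top the list
    return sorted(suspects, key=lambda pair: pair[1])

# Usage (filename is an example):
# for domain, da in low_authority_domains("linking_domains.csv"):
#     print(domain, da)
```

You'd still have to eyeball each suspect domain in Google, but at least the shortlist builds itself.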
Hope this helps.
Matt.