Why would a domain show up in Webmaster Tools but not in the latest links download?
-
I am going through links and trying to figure out what to disavow. I found a domain under "Who Links the Most" and wanted to see what the exact link was, but I can't find it when I download all links. Why would that be?
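As an aside, once you have the download this kind of lookup is easy to script. Below is a minimal Python sketch that searches a links export for every URL containing a given domain; the filename, single-column layout, and example.com are assumptions rather than the real export format:

```python
# Search a downloaded links export for every URL containing a given
# domain. The filename and single-column layout are assumptions --
# adjust them to match the actual export.
import csv

def find_links_from_domain(export_path, domain):
    with open(export_path, newline="", encoding="utf-8") as f:
        return [row[0] for row in csv.reader(f) if row and domain in row[0]]

for url in find_links_from_domain("latest_links.csv", "example.com"):
    print(url)
```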
-
We have never done anything blackhat or spammy, but over time we have just acquired crappy links. We had a traffic drop when the first Penguin update rolled out, so we thought cleaning up some of the old crap would be a good idea.
-
I have seen links reported in GWT that were not there when I clicked through. I like to look at the page source and search for the link, just to see if it might be hidden, but all I can guess is that the GWT data is old and the link is no longer valid. Recently I read this article by Marie Haynes (http://searchenginewatch.com/article/2374406/Penguin-Recovery-Should-You-Be-Removing-Links-or-Just-Disavowing) and asked her in the comments what she thought about a proactive disavow. Take a look, but basically she says that if you have no history of shady, spammy links, you have nothing to worry about. Please read her words, though, as she is very respected in this area.
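That manual view-source check can also be automated. Here is a minimal Python sketch with placeholder URLs; like the manual check, it only inspects the raw HTML, so a link inserted by JavaScript would still be missed:

```python
# Fetch a page reported as linking to you and list any anchors that
# actually point at your domain. Requires the requests and
# beautifulsoup4 packages. URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

def links_to_domain(page_url, my_domain):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True)
            if my_domain in a["href"]]

print(links_to_domain("http://example.com/some-page.html", "mysite.com"))
```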
-
We aren't under any sort of penalty; the site has been around for close to 14 years, and over time it has just picked up some crappy links. I figured it would be a good idea to be proactive and get rid of the stuff that I know is bad.
-
The only thing I can think of is that Google's John Mueller has said in the past that they only show a sample of links, not all of them, which has been a giant pain for many people trying to do clean-ups. However, they have also said that if a link is not in the WMT latest links report, you need not bother with it as part of your disavow.
However, if you are aware of a link that you think is a poor one and you want to disassociate yourself from it, I would add it, and make a note about it if you are also under a penalty that needs a reconsideration request. It shows you are looking further than the bare minimum.
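For reference, the disavow file itself is just a plain text file with one domain or URL per line; comment lines start with #. A hypothetical example (the domains and URLs are placeholders):

```
# Link clean-up, March 2015.
# Contacted the webmaster twice, no response -- disavowing the whole domain.
domain:spammy-directory.example.com

# One bad page; the rest of the site looks fine.
http://links.example.org/old-directory/page12.html
```

The comments are mainly for your own records, since the file is processed automatically; the notes a reviewer will actually read belong in the reconsideration request itself.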
Related Questions
-
Legacy domains
Hi all, A couple of years ago we amalgamated five separate domains into one and set up 301 redirects from all the pages on the old domains to their equivalent pages on the new site. We were a bit tardy in using the "change of address" tool in Search Console, but that was done nearly 8 months ago as well. Two years after implementing all the redirects, the old domains still have significant authority (DAs of between 20 and 35) and some strong inbound links. I expected the DA of the legacy domains to taper off during this period and (hopefully!) the DA of the new domain to increase. The latter has happened, although not as much as I'd hoped, but the DA of the legacy domains is more or less as good as it ever was. Google is still indexing a handful of links from the legacy sites, strangely even where it is picking up the redirects correctly. So, for example, a site:legacydomain1.com query will return results that show the title and snippet of the page on newdomain.com, but link to the page on legacydomain1.com. What has prompted me to finally try to resolve this is that the server which hosted the original 5 domains is due to be decommissioned, which obviously means the 301 redirects for the original pages will no longer be served. I can set up web forwarding for each of the legacy domains at the hosting level, but to maintain the page-by-page redirects I'd have to actually host the websites somewhere. I'd like to know the best way forward, both in terms of the redirect issue and in terms of the indexing of the legacy domains. Many thanks, Dan
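On the hosting point: page-by-page redirects don't require the full old sites, just a minimal host that serves rewrite rules. A hypothetical Apache .htaccess sketch for one legacy domain, with all paths and domains as placeholders:

```
# .htaccess for legacydomain1.com -- placeholders throughout
RewriteEngine On

# Page-by-page 301s to the equivalent pages on the new site
RewriteRule ^products/widgets\.html$ https://www.newdomain.com/widgets/ [R=301,L]
RewriteRule ^about\.html$ https://www.newdomain.com/about-us/ [R=301,L]

# Anything unmapped falls through to the new homepage
RewriteRule ^(.*)$ https://www.newdomain.com/ [R=301,L]
```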
Intermediate & Advanced SEO | clarkovitch
-
Google WMT Turning 1 Link into 4,000+ Links
We operate 2 ecommerce sites. The About Us page of our main site links to the homepage of our second site, and it's been this way since the second site launched about 5 years ago. The sites sell completely different products and aren't related besides both being owned by us. In Webmaster Tools for site 2, it's picking up ~4,100 links coming to the home page from site 1, but we only link to the home page once in the entire site, and that's from the About Us page. I've used Screaming Frog, IT has looked at the source, JavaScript, etc., and we're stumped. It doesn't look like WMT has a function to show you on which pages of a domain it finds the links, and we're not seeing anything by checking the site itself. Does anyone have experience with a situation like this? Anyone know an easy way to find exactly where Google sees these links coming from?
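One way to double-check the crawl tools is a small script of your own. A minimal Python sketch, assuming the requests and beautifulsoup4 packages and placeholder domains; like Screaming Frog, it only sees the raw HTML, so it won't catch links added by JavaScript:

```python
# Crawl site 1 and record every page whose HTML links to site 2's
# homepage. Domains are placeholders; a real crawl would also need
# politeness delays and robots.txt handling.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START = "http://www.site-one.example.com/"
TARGET = "http://www.site-two.example.com/"

seen, queue, linking_pages = set(), [START], []
while queue and len(seen) < 5000:   # safety cap for the sketch
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    hrefs = [urljoin(url, a["href"])
             for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)]
    if any(h.rstrip("/") == TARGET.rstrip("/") for h in hrefs):
        linking_pages.append(url)
    # stay on site 1
    queue.extend(h for h in hrefs
                 if urlparse(h).netloc == urlparse(START).netloc)

print(linking_pages)
```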
Intermediate & Advanced SEO | Kingof5
-
Links with Parameters
The links from the home page to some internal pages on my site have been coded in the following format by my tech guys: www.abc.com/tools/page.html?hpint_id=xyz If I specify within my Google Webmaster Tools that the parameter hpint_id should be ignored and that the content does not change for the user, will Google still credit me for a link from the home page, or am I losing something here? Many thanks in advance
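Whatever Google does with the parameter setting, a self-referencing rel=canonical on the target page is a common belt-and-braces step so the parameterised URL consolidates to the clean one. A hypothetical snippet (the URL is the placeholder from the question):

```html
<!-- In the <head> of /tools/page.html, so that
     /tools/page.html?hpint_id=xyz consolidates to the clean URL -->
<link rel="canonical" href="http://www.abc.com/tools/page.html" />
```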
Intermediate & Advanced SEO | harmit36
-
Cross-linking domains dominate SERP?
Hi, I have been doing some keyword research and noticed two domains linking back to each other for almost every piece of content. I thought this no longer worked, but it looks like it works for them. For many competitive keywords they rank in the top 10, and for some keywords they rank #1 and #2, with PA and DA of no more than 36-38. With 3-4 linking root domains, these pages manage to rank in the top 10. Their second strategy is to create alternative text to rank for a number of different long-tail keywords: separate pages targeting separate keywords, where the only difference between them is slightly modified text and images. Third, and possibly the best, their second domain is an exact-match domain name for most keywords in this industry. On some SERPs they have 8-10 results in the top 30. SEMrush shows 500% growth for both of these domains. So, I guess I should just sit and admire them.
Intermediate & Advanced SEO | Gamer07
-
What are your thoughts on using Dripable, VitaRank, or a similar service to build URL links to dilute a link profile?
One of my sites has a very spammy link profile; the top 20 anchors are money keywords. What are your thoughts on using Dripable, VitaRank, or a similar service to help dilute the link profile by building links with URLs, "Click Here", "More Info", etc.? I have been building URL links already, but due to the site's age (over 12 years), the number of exact-match anchor text links is just very large and would take forever to dilute.
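Before building more diluting links, it can help to quantify the problem. A minimal Python sketch that tallies the anchor-text distribution from a link export; the filename and the "anchor" column name are assumptions about the export, not a real format:

```python
# Tally anchor texts from a link-profile CSV export to see how heavily
# exact-match anchors dominate. The "anchor" column name is an assumption.
import csv
from collections import Counter

def anchor_distribution(export_path):
    counts = Counter()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["anchor"].strip().lower()] += 1
    return counts

for anchor, n in anchor_distribution("links.csv").most_common(20):
    print(f"{n:6d}  {anchor}")
```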
Intermediate & Advanced SEO | 858-SEO
-
Cross Sub Domain Canonical Links
I currently have 1 website, but am planning on dividing it into sub-domains specific to geographic locations, such as xxx.co.uk, xxx.it, xxx.es, etc. We are working on creating original content for the sub-sites; however, upon launch many pages will be duplicates. Is there a problem with cross sub-domain canonical links? Thanks!
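Google does support cross-domain rel=canonical, so pointing a duplicate on one geographic site at the original on another is legitimate. A hypothetical snippet for a duplicate page (domains and path are placeholders from the question):

```html
<!-- In the <head> of the duplicate page on xxx.it, pointing at
     the original on xxx.co.uk -->
<link rel="canonical" href="http://xxx.co.uk/some-page/" />
```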
Intermediate & Advanced SEO | theLotter
-
If Google ignores links from "spammy" link directories...
Then why does SEOmoz have this list: http://www.seomoz.org/dp/seo-directory ? Included in that list are some pretty spammy-looking sites such as:
http://www.site-sift.com/
http://www.2yi.net/
http://www.sevenseek.com/
http://greenstalk.com/
http://anthonyparsons.com/
http://www.rakcha.com/
http://www.goguides.org/
http://gosearchbusiness.com/
http://funender.com/free_link_directory/
http://www.joeant.com/
http://www.browse8.com/
http://linkopedia.com/
http://kwika.org/
http://tygo.com/
http://netzoning.com/
http://goongee.com/
http://bigall.com/
http://www.incrawler.com/
http://rubberstamped.org/
http://lookforth.com/
http://worldsiteindex.com/
http://linksgiving.com/
http://azoos.com/
http://www.uncoverthenet.com/
http://ewilla.com/
Intermediate & Advanced SEO | adriandg
-
400 errors and URL parameters in Google Webmaster Tools
On our website we do a lot of dynamic resizing of images, using a script which automatically resizes an image depending on parameters in the URL, like: www.mysite.com/images/1234.jpg?width=100&height=200&cut=false In Webmaster Tools I have noticed there are a lot of 400 errors on these images. Also, when I click the URLs listed as causing the errors, the URLs are URL-encoded and go to pages like this (which give a bad request): www.mysite.com/images/1234.jpg?%3Fwidth%3D100%26height%3D200%26cut%3Dfalse What are your thoughts on what I should do to stop this? I notice that the "URL Parameters" section of my Webmaster Tools lists parameters for height, width, and cut, which must come from the image URLs. These are currently set to "Let Google Decide", but should I change them manually to "Doesn't affect page content"? Thanks in advance
Intermediate & Advanced SEO | James77
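Those errored URLs look double-encoded: the ?, = and & of the query string have themselves been percent-encoded, so the server sees no usable parameters and returns a 400. A minimal Python sketch that makes this visible (the URL is the example from the question):

```python
# Decode one of the errored URLs to show why it returns a 400:
# the query-string delimiters were percent-encoded a second time.
from urllib.parse import parse_qs, unquote, urlparse

bad = "http://www.mysite.com/images/1234.jpg?%3Fwidth%3D100%26height%3D200%26cut%3Dfalse"

print(parse_qs(urlparse(bad).query))  # {} -- the resizer sees no width/height/cut at all
print(unquote(bad))                   # ...1234.jpg??width=100&height=200&cut=false
```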