Non-Recognition of Links
-
Hi All,
I asked about a client last month and have since done some more digging to work out what's going on with its Google rankings.
According to our link-building spreadsheet, we have up to 50 links (from 50 domains) in the process of being actioned, and a large proportion of these are already live.
There are two questions:-
1. Open Site Explorer only recognises 3 domains. As I know that other domains exist and are pointing to the site (mostly 'followed'), what could be the reason OSE doesn't recognise them?
2. What can be done to make these external links more easily discoverable by OSE and, presumably, other bots?
Other Points:-
1. I initially thought a crawl-blocking issue might be causing the ranking problems, but Bing/Yahoo rankings are slowly dragging themselves upwards.
2. Robots.txt is not blocking any part of the site.
3. The Pro on-page analysis grade for the target keyword is 'A'.
4. The website's stats per OSE are better than some competitors' in the top 20, except on the linking root domains issue, which is why the point above is important.
Link building for other clients has worked really well, without hiccups and with gradual recognition of new links, so any tips from more experienced folks out there would be greatly appreciated.
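As an aside, anyone wanting to double-check the robots.txt point programmatically can do so with Python's standard library; a minimal sketch (the rules and URLs below are illustrative only, not the client's actual file):

```python
from urllib import robotparser

# Illustrative robots.txt contents; in practice fetch the live file with
# rp.set_url("https://www.example.com/robots.txt") followed by rp.read()
rules = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages carrying inbound links should come back True here
print(rp.can_fetch("*", "https://www.example.com/some-page.html"))  # True
print(rp.can_fetch("*", "https://www.example.com/private/x.html"))  # False
```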
Many thanks,
Martin
-
Hi Martin,
You might find it useful to take a look at the Linkscape Update Schedule in case timing is a factor.
I believe Rand outlined the recent changes to the indexing rationale in the webinar Using Open Site Explorer to Uncover New Marketing Opportunities, but if you still have questions then, as Brian suggested, it may be a good idea to lodge a ticket or email the Help Team at help [at] seomoz.org.
Hope that helps,
Sha
-
Hey, if the page you got the link on was interesting enough that you wanted a link on it, then what harm is there in letting the world know about that resource via Twitter, Facebook, or whatever other service you choose? And if it's not worth talking about, or you would be embarrassed to speak of it, then how "quality" was that link anyway?
On the OSE Catch-22, gotcha... all I can think of is that perhaps the low-quality sites are not always re-crawled with each update, and thus the new links are not picked up. An SEOmoz staffer with intimate knowledge of the crawl behavior could better answer that one, though.
Brian
-
Hi guys,
Thanks for the feedback so far. I will definitely be checking GWT and maybe even tweeting out the links. I did think that seemed a little... you know, false, but I guess it's just ensuring Google takes note of the actual page? What do people think? I'm unwilling to Facebook them out, because that's even more 'in your face', and I'm unwilling to spam out 50 domains just to get them indexed. Advice welcomed on these points.
@Brian - yes, I suppose they could be coming from lower-quality domains, but equally many were pulled from competitor link data in OSE, so Catch-22?
@Theo - I will double-check.
@Ross - firmly NO to black hat. I don't do this anyway, but something is clearly hurting the SEO, so going down that route could permanently jeopardise the site, and that's not what the client's paying for.
-
Like Theo said, I would start with Webmaster Tools (Links to your site > All domains). If the links are in there, Google knows about them, and if they have any value to pass through, they are passing it.
One other quick note: if you know the pages you are getting links from are all index/follow pages, you may want to double-check that they have actually been indexed (Google search for site:www.the-exact-domain.com/and-page-url.html). If you get no results back, then you know those pages are not in the index (not found yet, or otherwise dropped).
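For the double-checking itself, here is a rough batch sketch in Python using only the standard library; it classifies a linking page's HTML as missing, nofollow, or followed with respect to your domain (the function name and sample domain are made up for illustration, and you would fetch each page's HTML yourself):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collects (href, rel) pairs for every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            d = dict(attrs)
            self.links.append((d.get("href") or "", d.get("rel") or ""))

def classify_link(page_html, target_domain):
    """Return 'missing', 'nofollow', or 'followed' for links to target_domain."""
    parser = LinkAudit()
    parser.feed(page_html)
    hits = [rel for href, rel in parser.links if target_domain in href]
    if not hits:
        return "missing"
    if all("nofollow" in rel for rel in hits):
        return "nofollow"
    return "followed"

print(classify_link('<a href="http://client-site.com/">anchor</a>', "client-site.com"))
# followed
```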
On the OSE thing, if I am remembering correctly, Rand said something about how they were focusing the crawl and pulling in fewer low-quality sites. Could it be that the domains you are getting links from are low quality?
Brian
-
Hi Martin,
Although OSE is an awesome tool, it is still in its infancy and may not yet have crawled the links you are talking about. Another way to check the links is to have a look via Majestic SEO; they have a much bigger index than OSE and tend to show a good deal more links.
I would also have a look in Google Webmaster Tools and see if the links are present there.
If you are worried about the links being crawled and indexed by Google, then take the URL and run it through Google itself with the site: command. If it does not turn up, there is a chance it is not indexed. I believe a site: query that returns no results may prompt Googlebot to visit the URL and crawl it. I can't confirm this is true, but it makes good sense.
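To spot-check a whole list of linking pages rather than one at a time, you could generate the site: search URLs in bulk; a small illustrative helper (the query-string format is just the standard Google search URL):

```python
from urllib.parse import quote, urlparse

def site_queries(link_pages):
    """Build Google 'site:' search URLs for manually spot-checking
    whether each linking page is indexed (illustrative helper)."""
    queries = []
    for url in link_pages:
        parts = urlparse(url)
        q = f"site:{parts.netloc}{parts.path}"
        queries.append("https://www.google.com/search?q=" + quote(q, safe=""))
    return queries

print(site_queries(["http://example.com/page.html"]))
# ['https://www.google.com/search?q=site%3Aexample.com%2Fpage.html']
```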
If you want to be doubly sure your links are getting crawled, you can force a crawl by Google by bookmarking the page through a bookmarking service or sharing it on a social network.
WARNING: MESSY BLACK HAT TACTICS COMING UP
And if you really want to give it a good ole kick up the jaxie, you can load up an automatic bookmarking tool and bookmark the URL carrying your link on over a couple of hundred domains. Problems with this method include:
- you need to buy spammy software like Bookmark Demon
- you are in effect creating a link wheel, which may devalue your efforts
- it sticks out like a sore thumb
- links on bookmarking sites drop off the link graph or get devalued very quickly
On the positive side, your link will be crawled and indexed, and it will have another couple of hundred links pointing at it... for a while.
If you are working with a client, I would recommend just running it through Facebook or tweeting out the link and staying away from forcing any crawls. However, if it is the middle of November and you have a Christmas shop that needs to rank quickly, get that black hat on.
Hope that helps.
-
The fact that OSE doesn't pick up a link doesn't necessarily mean a link isn't 'active' and giving your site value. Even though Linkscape captures a vast amount of URLs, it only crawls a portion of the web, most likely from the bigger pages down. If many of these links to your site are coming from smaller / less powerful domains, they might not (yet) have been picked up by Linkscape.
Try looking at Google Webmaster Central to see if the links are included there. If Google lists them as links, they are very likely being counted as well.