GWMT / Search Analytics vs. Open Site Explorer
-
Just had the experience of using OSE data to show what we call "linkrot" to a client -- only to find that GWMT / Search Analytics shows no such thing.
Fortunately, the client is an old friend and no face was lost, but it was dicey there for a bit, as I have come to rely on and reference OSE again and again and again.
OSE showed Domain Authority dropping by about a third in the last 12 months, presumably due to old links getting broken, linking sites changing their architecture, etc.
And of course, ranking is tanking, as you would expect.
But Google shows many more (and much more spammy looking!) backlinks.
Has anyone had any experience benchmarking the 2 data sets of backlinks against each other? Dr Pete?
Does one update more frequently than the other? Do you trust one more than the other? If so, why?
Thanks!
-
I know it's not always the answer people want to hear, but Matt's right -- this is basically where we're at. OSE tends to focus on higher-authority links and quality over quantity. Unfortunately, while this works well for tracking the strengths in your link profile, it doesn't always do as well at tracking the weaknesses. We're very much interested in expanding the quantity as well, but it's a balancing act and, in the interest of full transparency, there are many engineering challenges.
People have compared our index to Majestic and Ahrefs around the blogosphere. Since I can't claim to be unbiased, I'd welcome you to read those posts and make your own judgments. In fairness to Majestic and Ahrefs, all three of us are somewhat transparent about our sources and at least our general methodologies. Unfortunately, Google is not very transparent about how it samples links or chooses which data to show, so directly comparing any of the major SEO tools to Google Search Console is a lot trickier. We're also not clear on Google's update cycle for that data.
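One practical way to benchmark two backlink data sets against each other is to export both and measure the overlap directly. Here's a minimal sketch, assuming each source can export a CSV of linking pages; the file names and column headers are hypothetical stand-ins for whatever your exports actually use:

```python
import csv

def load_linking_urls(path, url_column):
    """Load the set of linking URLs from one tool's CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column].strip().lower()
                for row in csv.DictReader(f) if row.get(url_column)}

# Hypothetical file and column names -- adjust to match the real exports.
ose_links = load_linking_urls("ose_export.csv", "URL")
gsc_links = load_linking_urls("gsc_export.csv", "Linking page")

shared = ose_links & gsc_links
print(f"OSE only: {len(ose_links - gsc_links)}")
print(f"GSC only: {len(gsc_links - ose_links)}")
print(f"Shared:   {len(shared)}")
print(f"Jaccard overlap: {len(shared) / len(ose_links | gsc_links):.1%}")
```

Even a rough overlap number like this makes it much easier to talk about where two indexes disagree, rather than arguing over which total is "right."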
-
I agree with Eric. No one source is going to give you a full picture of your link profile. Generally, OSE is best for measuring the overall strength of a full link profile, as many low-authority sites aren't indexed.
Also, keep in mind that there are a _lot_ of reasons that DA can go down, many of which have nothing at all to do with your specific link profile. That's why we recommend using it to benchmark against competitors rather than as an absolute score. Rand goes into more detail about that here:
DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
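To make that concrete, here's a tiny sketch of benchmarking DA against competitors over time instead of reading it as an absolute number. All the scores below are made-up examples:

```python
# If the gap to competitors holds steady while everyone's absolute score
# drops, the cause is likely index-wide, not your specific link profile.
history = {
    "12 months ago": {"mysite.com": 42, "competitor-a.com": 51, "competitor-b.com": 38},
    "today":         {"mysite.com": 28, "competitor-a.com": 35, "competitor-b.com": 25},
}

for period, scores in history.items():
    mine = scores["mysite.com"]
    rivals = [da for site, da in scores.items() if site != "mysite.com"]
    avg = sum(rivals) / len(rivals)
    print(f"{period}: my DA {mine}, competitor avg {avg:.1f}, gap {mine - avg:+.1f}")
```

In this made-up example, every site's DA fell by roughly a third while the gap barely moved, which points away from a problem with any one site's links.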
-
Whenever we deal with links, even though I really like OSE, we typically have to compile the link data from multiple sources. We usually use OSE, Majestic, Ahrefs, and Google Search Console, among others, compile all of the links into one spreadsheet, and then analyze them there. Different tools have different crawlers, and no single source is the most accurate.
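As a rough illustration, here's a minimal Python sketch of that merge-and-dedupe step. The file names and URL column headers are hypothetical, so map them to the actual export formats of each tool:

```python
import csv

# Hypothetical export files and their linking-URL column names.
URL_COLUMNS = {
    "ose.csv": "URL",
    "majestic.csv": "SourceURL",
    "ahrefs.csv": "Referring Page URL",
    "gsc.csv": "Linking page",
}

merged = {}  # linking URL -> set of tools that reported it
for path, column in URL_COLUMNS.items():
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = (row.get(column) or "").strip().lower()
            if url:
                merged.setdefault(url, set()).add(path)

# Write one combined, deduplicated sheet noting which tools saw each link.
with open("all_links.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["linking_url", "seen_in"])
    for url, tools in sorted(merged.items()):
        writer.writerow([url, ";".join(sorted(tools))])

print(f"{len(merged)} unique linking URLs compiled from {len(URL_COLUMNS)} exports")
```

The "seen_in" column is handy on its own: links reported by only one tool are often exactly the ones worth a second look.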
Related Questions
-
Spam score is 7/17
My site has a spam score of 7/17 and I'm not really sure how this happened -- I think maybe when I moved my domain from WordPress to Bluehost? It's very hard to get my site to rank in Google and I'm getting very few hits through search engines, so I'm wondering if the spam score is why. How do I fix this? The flags are:
✓ Low MozTrust or MozRank Score: The site link profile is not trustworthy.
✓ Large Site with Few Links: We found very few sites linking to this site, considering its size.
✓ Site Link Diversity is Low: The diversity of link sources to this subdomain is low.
✓ Ratio of Followed to Nofollowed Subdomains: The ratio of followed to nofollowed subdomains linking to this subdomain is outside the normal range of others in our index.
✓ Ratio of Followed to Nofollowed Domains: The ratio of followed to nofollowed domains linking to this subdomain is outside the normal range of others in our index.
✓ Small Proportion of Branded Links: Links to this subdomain have low amounts of branded anchor text.
✓ Low Number of Pages Found: The crawl only gets a valid response from a small number of pages.
What will this be doing to my site, and how do I make this better? Thanks!
Moz Pro | Plant-Powered
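The two ratio flags are easy to sanity-check yourself if you can export your inbound links. Below is a rough sketch that tallies followed vs. nofollowed linking domains from a CSV export; the file name and column names ("Source URL", "Nofollow") are hypothetical, so adjust them to whatever your link tool actually produces:

```python
import csv

# Rough sketch: tally followed vs. nofollowed linking root domains from a
# hypothetical backlink CSV export.
followed, nofollowed = set(), set()
with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = (row.get("Source URL") or "").strip()
        if not url:
            continue
        domain = url.split("/")[2] if "//" in url else url.split("/")[0]  # crude
        if (row.get("Nofollow") or "").lower() in ("true", "yes", "1"):
            nofollowed.add(domain)
        else:
            followed.add(domain)

total = len(followed | nofollowed)
if total:
    print(f"Followed domains:   {len(followed)} ({len(followed) / total:.0%})")
    print(f"Nofollowed domains: {len(nofollowed)} ({len(nofollowed) / total:.0%})")
```
-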
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites; I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all created on the same day, set consistently to include www or not for what ranks, with refined keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit. Here is an example of what I mean:

To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I'd have a lot of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true -- I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to it? How do the competitors you choose figure into the crawl limit?)

Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site -- or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
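One way to size up sub-folders before deciding what to track is to count URLs per top-level path in the site's XML sitemap. Here's a rough sketch; the sitemap URL is a guess, and large sites often publish a sitemap index, which this doesn't follow:

```python
import urllib.request
import xml.etree.ElementTree as ET
from collections import Counter

# Guessed sitemap location -- large sites may use a sitemap index instead.
SITEMAP = "https://www.epa.gov/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

# Count URLs per top-level path segment (e.g., /air, /water).
counts = Counter()
for loc in tree.findall(".//sm:loc", NS):
    parts = loc.text.split("/", 3)  # scheme, '', host, rest
    top = parts[3].split("/")[0] if len(parts) > 3 and parts[3] else "(root)"
    counts[f"/{top}"] += 1

for folder, n in counts.most_common(15):
    print(f"{folder}: {n} pages")
```

A listing like this makes the trade-off concrete: you can see which sub-folders fit comfortably under a 50,000-page budget and which would be truncated.
-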
How to add a Google+ business page to the social analytics?
Since my Google+ account is attached to my business page, Moz is tracking the actual user page. I want to track the business page instead, but it won't let me add another user. How do I fix this? Thanks.
Moz Pro | kcampbell99
-
Finding Related Keywords and/or Phrases
I'm diving into unknown territory, looking for keywords and phrases related to a prospective new client's business. He knows some of the general terms -- for example, water removal. What tool works best for finding related search terms and the number of searches for those keywords? I usually use the Google Keyword Tool but am going to give Moz a try. I'd love to be able to get local search data as well. Thanks!
Moz Pro | DFLsports
-
Rogerbot's crawl behaviour vs. Google's spiders and other crawlers: disparate results have me confused
I'm curious as to how accurately Rogerbot replicates Google's searchbot. I currently have a site that is reporting over 200 pages of duplicate titles/content in Moz tools. The pages in question are all session IDs and were blocked in robots.txt about 3 weeks ago, yet the errors are still appearing. I've also crawled the site using Screaming Frog SEO Spider; according to Screaming Frog, the offending pages are blocked and are not being crawled. Webmaster Tools is also reporting no crawl errors. Is there something I'm missing here? Why would I receive such different results? Which ones should I trust? Does Rogerbot ignore robots.txt? Any suggestions would be appreciated.
Moz Pro | KJDMedia
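One quick sanity check for situations like this is to verify that the robots.txt rules actually block the offending URLs for each crawler's user agent; Python's standard library can do that. The domain and URL patterns below are hypothetical stand-ins for the real session-ID pages:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (hypothetical domain).
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

test_urls = [
    "https://www.example.com/products?sessionid=abc123",  # session-ID variant
    "https://www.example.com/products",                   # clean URL
]
for agent in ("rogerbot", "Googlebot"):
    for url in test_urls:
        verdict = "allowed" if rp.can_fetch(agent, url) else "blocked"
        print(f"{agent}: {url} -> {verdict}")
```

If the rules check out, the disparity is more likely a reporting lag: pages discovered before the block was added can linger in a tool's issue reports for a while.
-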
How long does Open Site Explorer take to generate exportable reports?
I tried exporting a few reports out of Open Site Explorer. They have been pending for more than 36 hours, and the progress bar appears to be stuck. Is this normal? How long does it take for the reports to be generated?
Moz Pro | ibwork
-
Does SEOmoz have a keyword visibility report / tool?
Do you know of a tool where I can show the overall success of a set of keywords, us vs. the competition? A visibility report where a #1 ranking is worth 30 points, #2 is worth 29, and so on down to #30 being worth 1 point; anything outside the top 30 is worth nothing (something like this). I'm trying to show an overall visibility scorecard, and I'm not sure if I can do it here or with some other tool. I didn't see it on Raven. Thanks!
Moz Pro | akim26
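The scoring scheme described above is simple enough to compute yourself from exported rankings. A small sketch, with made-up keywords and positions:

```python
# Rank #1 earns 30 points, #2 earns 29, down to #30 earning 1; deeper earns 0.
def visibility_points(rank, max_rank=30):
    return max(0, max_rank + 1 - rank) if rank else 0

# Hypothetical example rankings for a shared keyword set.
rankings = {
    "us":         {"blue widgets": 3, "widget repair": 12, "cheap widgets": 45},
    "competitor": {"blue widgets": 1, "widget repair": 28, "cheap widgets": 9},
}

for site, ranks in rankings.items():
    total = sum(visibility_points(r) for r in ranks.values())
    best = len(ranks) * 30  # score if every keyword ranked #1
    print(f"{site}: {total}/{best} points ({total / best:.0%} visibility)")
```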