Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
GWMT / Search Analytics VS OpenSiteExplorer
-
Just had the experience of using OSE data to show what we call "linkrot" to a client -- only to find that GWMT / Search Analytics shows no such thing.
Fortunately the client is an old friend and no face was lost, but it was dicey there for a bit, as I have come to rely on and reference OSE again and again.
OSE showed Domain Authority dropping by about 1/3 in the last 12 months, presumably due to old links getting broken, linking sites changing their architecture, etc.
And of course, ranking is tanking, as you would expect.
But Google shows many more (and much more spammy looking!) backlinks.
Has anyone had any experience benchmarking the 2 data sets of backlinks against each other? Dr Pete?
Does one update more frequently than the other? Do you trust one more than the other? If so, why?
Thanks!
-
I know it's not always the answer people want to hear, but Matt's right - this is basically where we're at. OSE tends to focus on higher-authority links and quality over quantity. Unfortunately, while this works well for tracking the strengths in your link profile, it doesn't always do as well at tracking the weaknesses. We're very much interested in expanding the quantity as well, but it's a balancing act and, in the interest of full transparency, there are many engineering challenges.
People have compared our index to Majestic and Ahrefs around the blogosphere. Since I can't claim to be unbiased, I'd welcome you to read those posts and make your own judgments. In fairness to Majestic and Ahrefs, all three of us are somewhat transparent about our sources and at least our general methodologies. Unfortunately, Google is not very transparent about how they sample links or choose which data to show, so direct comparison of any of the major SEO tools to Google Search Console is a lot trickier. We're also not clear on Google's update cycle for that data.
-
I agree with Eric. No one source is going to give you a full picture of your link profile. Generally, OSE is best for measuring the overall strength of a full link profile, as many low-authority sites aren't indexed.
Also, keep in mind that there are a _lot_ of reasons that DA can go down, many of which have nothing at all to do with your specific link profile. That's why we recommend using it to benchmark against competitors rather than as an absolute score. Rand goes into more detail about that here:
DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
-
Whenever we deal with links, even though I really like OSE, we typically have to compile link data from multiple sources. We usually pull exports from OSE, Majestic, Ahrefs, and Google Search Console, among others, combine all of the links into one spreadsheet, and review them there. Different tools use different crawlers, and no single source is the most accurate on its own.
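A minimal sketch of that merge step. The tool names, URLs, and column shapes here are made up for illustration; real exports from each tool use different layouts, so treat this as a pattern (normalize each link URL, then deduplicate and record which tools reported it), not a recipe.

```python
from urllib.parse import urlsplit

def normalize(url):
    """Normalize a link URL so the same page matches across tool exports:
    lowercase, drop the scheme, strip a leading 'www.' and a trailing slash."""
    parts = urlsplit(url.strip().lower())
    host = parts.netloc.removeprefix("www.")
    return host + parts.path.rstrip("/")

def merge_backlink_exports(exports):
    """exports: mapping of tool name -> list of backlink source URLs.
    Returns {normalized_url: set of tools that reported that link}."""
    merged = {}
    for tool, urls in exports.items():
        for url in urls:
            merged.setdefault(normalize(url), set()).add(tool)
    return merged

# Made-up sample data: the same backlink reported by two tools
exports = {
    "OSE": ["https://www.example.com/blog/post/"],
    "Ahrefs": ["http://example.com/blog/post"],
    "GSC": ["https://other.net/page"],
}
for url, tools in sorted(merge_backlink_exports(exports).items()):
    print(url, sorted(tools))
```

Links seen by only one tool are the interesting rows: they show what each crawler is missing, which is exactly the gap between OSE and Search Console described above.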