September's Mozscape Update Broke; We're Building a New Index
-
Hey gang,
I hate to write to you all again with more bad news, but such is life. Our big data team produced an index this week but, upon analysis, found that our crawlers had encountered a massive number of non-200 URLs, which meant this index was not only smaller, but also weirdly biased. PA and DA scores were way off, coverage of the right URLs went haywire, and the metrics we use to gauge quality told us this index simply was not good enough to launch. Thus, we're in the process of rebuilding an index as fast as possible, but this takes, at minimum, 19-20 days, and may take as long as 30 days.
This sucks. There's no excuse. We need to do better, and we owe all of you and all of the folks who use Mozscape better, more reliable updates. I'm embarrassed and so is the team. We all want to deliver the best product, but we continue to find problems we didn't account for, and have to go back and build systems into our software to look for them.
In the spirit of transparency (not as an excuse), the problem appears to be a large number of new subdomains that found their way into our crawl; fetching their robots.txt files timed out and stalled our crawlers. In addition, some new portions of the link graph we crawled exposed us to websites/pages that we need to find ways to exclude, as these abuse the metrics we use to prioritize crawls (aka PageRank, much like Google, though they're obviously much more sophisticated and experienced with this) and bias us toward junky stuff, which keeps us from getting to the good stuff we need.
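(For the technically curious: our crawler code isn't public, and this isn't how our system is actually written, but a minimal sketch of the general fix - putting a hard timeout on robots.txt fetches so one unresponsive subdomain can't stall a crawl worker - might look roughly like this, using Python's requests library as a stand-in.)

```python
import requests

def fetch_robots_txt(domain, timeout=5.0):
    """Fetch a domain's robots.txt, bounding how long a slow or
    unresponsive host can hold up the crawl worker."""
    url = f"https://{domain}/robots.txt"
    try:
        resp = requests.get(url, timeout=timeout)
        # 200: we have rules to parse; anything else is treated as
        # "no usable robots.txt" rather than a reason to keep waiting.
        return resp.text if resp.status_code == 200 else None
    except requests.exceptions.RequestException:
        # Timeouts and connection errors get recorded and skipped so the
        # worker moves on to the next domain instead of stalling.
        return None
```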
We have dozens of ideas to fix this, and we've managed to fix problems like this in the past (prior issues like .cn domains overwhelming our index, link wheels, webspam holes, etc. plagued us and have been addressed, but every couple of indices it seems we face a new challenge like this). Our biggest issue is one of monitoring and processing times. We don't see what's in a web index until it's finished processing, which means we don't know whether we're building a good index until it's done. It's a lot of work to rebuild the processing system so there can be visibility at checkpoints, but that appears to be necessary right now. Unfortunately, it takes time away from building the new, realtime version of our index (which is what we really want to finish and launch!). Such is the frustration of trying to tweak an old system while simultaneously working on a new, better one. Tradeoffs have to be made.
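(Again purely as an illustration, not a description of our actual pipeline: the "visibility at checkpoints" idea is to run quality checks between processing stages instead of only on the finished index, so a bad index fails fast rather than after weeks of processing - roughly along these lines.)

```python
def run_pipeline(stages, checks, data=None):
    """Run processing stages in order, validating intermediate output
    at each checkpoint instead of only inspecting the finished index."""
    for stage in stages:
        data = stage(data)
        for check in checks:
            ok, message = check(data)
            if not ok:
                # Fail fast: alert and stop before burning more processing
                # days on an index that won't pass the final quality bar.
                raise RuntimeError(f"{stage.__name__}: {message}")
    return data
```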
For now, we're prioritizing fixing the old Mozscape system, getting a new index out as soon as possible, and then working to improve visibility and our crawl rules.
I'm happy to answer any and all questions, and you have my deep, regretful apologies for once again letting you down. We will continue to do everything in our power to improve and fix these ongoing problems.
-
I hope we might actually have that 11/17 index out a little bit early. We've made a lot of fixes and optimizations, and, fingers crossed, it looks (so far) like they're making a difference in how quickly index processing completes.
-
Gotcha - makes a lot more sense now. Moz's DA/PA is clearly the gold standard that most in the industry rely on, but I didn't realize the extent of the processing required to make that happen. Even more props to Moz for the BHAG of taking on such a complex task all these years.
-
The same story here. Quite honestly, I think that last index was very much messed up, probably due to the broken index before it. So I gave up on it and am simply waiting for the next scheduled one on 11/17.
-
It's been about a month since the last update on this, and I'm just curious if there's any news on the progress. I'm still seeing the same results in OSE that I've been seeing for the last couple of months, so it appears it's not fixed yet, but is there any indication of when it might be?
-
Thanks for clarifying!
-
Sometimes yes. Sometimes, we don't know until we reach the last stages of processing whether it's going to finish or take longer. We're trying to get better at benchmarking along the way, too, and I'll talk to the team about what we can do to improve our metrics as an index run is compiling.
-
Thanks!
I have been noticing for quite some time that last-minute changes in update release dates are becoming "normal". Is there a way you guys can announce those changes earlier than on the expected update release date itself?
-
It didn't break, but it is taking longer to process than we hoped. Very frustrating, but we have a plan that, starting in a few more weeks, should get us to much more consistent index releases (and better quality ones, too).
-
Hello, Rand. I just noticed that yesterday the new update was scheduled for October 8th, and just now it says October 14th! What's going on? I hope it didn't break again...
-
Hi Lehia
Crawl reports are separate from our Mozscape indexes, and any delays with our index only impact the ability to access new data. With your crawl reports, I have a suspicion the URLs with the 404s are ones with trailing slashes, e.g. domain.com/ rather than domain.com.
If not, send us your account info and some examples at help@moz.com and we can take a look!
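(If you want to check this yourself first, a quick hypothetical sketch - using Python's requests library against placeholder URLs; substitute the pages flagged in your own crawl report - is to request both the slashed and unslashed variants and compare status codes.)

```python
import requests

# Hypothetical example URLs - replace with pages from your crawl report.
for url in ("https://example.com/page", "https://example.com/page/"):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    # One variant may return 200 while the other 404s or redirects (301/302).
    print(url, resp.status_code)
```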
-
Hey Rand, is this why my crawl reports are saying that I have some 404 client errors on pages where I can't see any issues? Or is this another issue that I'm running into?
Thanks in advance
-
Thanks, Rand, for the update. We have hired a full-time marketing manager, and he has been working hard the past month; I know he's excited to see the new results. "Putty & Paint does not a NEW Boat make" - fixing is a painstaking reality compared to building. Moz is great, so we will wait.
-
Hi Joe - fair question.
The basic story is this: what the other link indices (Ahrefs and Majestic) do is unprocessed link crawling and serving. That's hard, but not really a problem for us. We do it fairly easily inside the "Just Discovered Links" tab. The problem is really with our metrics, which are what make us unique and, IMO, uniquely useful.
But metrics like MozRank, MozTrust, Spam Score, Page Authority, Domain Authority, etc. require processing - meaning all the links need to be loaded into a series of high-powered machines and iterated on, à la the PageRank patent paper (though we obviously do this in other ways for other kinds of metrics). Therein lies the rub. It's really, really hard to do this - it takes lots of smart computer science folks, requires tons of powerful machines, and takes a LONG time (17+ days of processing at minimum to get all our metrics into API-shippable format). And when things break, what's worse is that it's very hard to stop and restart without losing work, and very hard to check our work by looking at how processing is going while it's running.
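(To make the "iterated on" part concrete: the computation is in the family of PageRank-style iterative algorithms, where every pass walks the entire link graph. Our real pipeline is distributed across many machines and does much more than this, but a toy, in-memory sketch of the basic idea looks roughly like the following.)

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy iterative PageRank. `links` maps each page to the pages it
    links to; every iteration walks the entire link graph once."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        # Dangling pages (no outlinks) are ignored here for brevity.
        rank = new_rank
    return rank

# A three-page example graph; a real index iterates over trillions of links.
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
```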
This has been the weakness and big challenge of Mozscape the last few years, and why we've been trying to build a new, realtime version of the index that can process these metrics through newer, more sophisticated, predictive systems. It's been a huge struggle for us, but we're doing our best to improve and get back to a consistent, good place while we finish that new version.
tl;dr: Moz's index isn't like the others because of our metrics, which take lots of weird/different types of work; hence, buying/partnering with other indices wouldn't make much sense at the moment.
-
Talk about reading everyone's mind... I should point out, though, that Rand mentioned above that Moz is working on a new real-time tool like the ones we have seen elsewhere. I think a little patience might solve everyone's problems.
-
Thanks for the transparency as usual. A question I've always been wondering about:
Moz seems to have much more stature, clout, and maybe funding compared to many other SEO software companies around the world. And of course you offer more of a suite of products rather than just focusing on Open Site Explorer. But to me, one of the most important SEO tools these companies offer is the backlink explorer, and it seems like OSE, although one of the first, lags behind a few others. I've read that OSE isn't looking to just grab all the links, but only the most important ones. It seems, though, that there have been lots of technical challenges, and I can't help but think that there are other companies that have already solved their indexing challenges or are a few steps ahead of OSE.
Would Moz ever go out and buy a pretty good backlink explorer company like Ahrefs or Majestic or some other upstart that's solved that piece of the puzzle? Combining that new technology that's solved the indexing part with your DA algorithm seems like a match made in heaven. I'm sure you guys considered this years ago internally, but it's a question I've always pondered...
-
Two potential solutions for you: 1) watch "Just Discovered Links" in Open Site Explorer - that tab will still show all the links we find, just without the metrics. And 2) check out Fresh Web Explorer - it will only show you links from blogs, news sites, and other things that have feeds, but it's one of the sources I pay attention to most, and you can set up good alerts, too.
-
And they would have gotten away with it, too, if it weren't for those meddling kids and their pesky subdomains.
-
I did notice that no new links were added to a number of projects in the last 2 months and was wondering what went wrong. Thanks for clearing up the issue with this post. We look forward to the resolution.
-
Yeah - the new links you see via "Just Discovered" will take longer to make it into the main index and impact metrics like MozRank, Page Authority, Domain Authority, etc. It's not that they're not picked up or not searched, but that they don't yet impact the metrics.
And yes - will check out the other question now!
-
Hi Will - that's not entirely how I'd frame it. Mozscape's metrics will slowly, over time, degrade in their ability to predict rankings, but it's not as though exactly 31 days after the last update all the metrics or data are useless. We've had delays before of 60-90+ days (embarrassing, I know) and the metrics and link data still applied in those instances, though correlations did slowly get worse.
The best way I can put it is: our index's data won't be as good as it normally is for the next 20-30 days, though it's better now than it will be in 10 days, and it was better 10 days ago than it is today. It's a gradual decline as the web's link structure changes shape and as new sites and pages come into Google's index that we don't account for.
-
Webmasters' love of sub-domains... shake fist!