
What Is Google Clamping Down On? Spring 2024 Updates — Whiteboard Friday

Tom Capper

The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.


Learn about Google updates from this Spring in this Whiteboard Friday with Tom Capper.

Click on the whiteboard image above to open a high-resolution version!

Happy Friday, Moz fans. Today, I want to talk to you about the spring 2024 Google updates: what I think Google might have been trying to achieve with these updates, and also what they say they're achieving, which can be slightly different.

So, back on March 5, Google announced two updates — a core update and what they're calling a spam update.

Now, obviously, I should caveat this: Google probably does, I don't know, half a dozen to a dozen updates every day. We've seen numbers of that order in the past when they've disclosed this.

They only announce updates that are either particularly large, so they're sort of warning us that turbulence is coming, or ones where they want us to know they're doing an update because they're trying to push us as an industry, or maybe the media industry, to behave in a certain way.

So it's always worth keeping that in mind. When I say there were two updates at the start of March, I mean there were two updates we know about at the start of March and that Google chose to comment on.


Now, the spam part of this update is, I think, the more interesting one. I'll touch briefly on the core component later, but I mainly want to talk about the spam component.

Scaled content

There were two changes they made, or two things that they chose to put a new level of emphasis on. The first is what they're calling, slightly euphemistically, scaled content.

Now, in their description of this, they do talk about AI, and I've chosen to draw a picture of a robot as well to drive it home. I think it's pretty obvious they're talking about AI-written content to a large degree here. And they have this line where they say, "Oh, it's always been our policy that AI content is fine as long as it's helpful."

There's a bunch of problems there. I mean, firstly, that wasn't always their policy. They used to say, a few years ago even, that computer-generated content was always against their guidelines. So, it's not true to say that they've had a consistent line over time.

But also, this "helpful" is doing a lot of work. I think if I were considering rolling out AI-written content on my site right now, I would be very concerned about being caught in the crossfire here, with Google not necessarily identifying particularly well what is and isn't helpful. There are lots of anecdotes about this in the wild as well.

Yeah, I think you want to be very careful and very slow if you are considering AI content right now. I could totally imagine that Google is being a bit blunt with this in practice, even if that's not what their messaging says.

I also think there's a lot of scaled content that SEOs have produced for a long time, long before the advent of GPT and LLMs, that is, in some cases, helpful, and in some cases, not. I remember working 5 to 10 years ago on lots of sites that had heavily templated local landing pages, for example, which were essentially computer-generated content, not written by AI, but still computer-generated. I would say those pages were helpful, but they were certainly scaled.

And you could imagine there's a lot of nuance here, a lot of sort of deliberate fuzziness and ambiguity in what Google is saying.

Site reputation abuse

The second thing that they pointed out or drew attention to was what they're calling site reputation abuse.

So this is a much narrower, much more specific thing. I've drawn up a fictional example here, I must say, but this is something I see a lot: a news site or some other very reputable, strong domain. The domain in my example does actually exist, but it's not doing anything like this.

My fictional example is Big News on big.com. This would be a high-DA, household name site, and they might have a subdomain, something like vouchers.big.com, which is a voucher code site that Big News does not themselves manage. They're basically renting out their subdomain space to someone else who's taking advantage of the authority of their domain and brand and, as such, is ranking well for these fairly irrelevant terms.

This is a really widespread tactic, I think, particularly in voucher codes, but not only there. And it's kind of interesting that Google is drawing attention to this and, in doing so, tacitly admitting that it has worked.

Google has said for a long time, basically, that domains don't have authority, that pages have authority or PageRank as they would call it, and that domain-level signals don't really exist.

At times, they've said various things over the years that call that hard rule into question, and this is one of them. If Google really had no domain-level signals, they would not need a special update to deal with this kind of thing.

But this also speaks potentially to the importance of the brand maybe as a domain-level signal or maybe in its own right. And again, that is something that I'll come back to in a moment.

So those are the two things that Google wants us to talk about with this update.

However, there have been some other narratives in the industry that I think are very interesting and that Google haven't themselves drawn attention to.

Independent sites are being punished

So one of them is that a lot of SEOs and a lot of website owners right now are saying that they feel independent sites have recently, and especially in these recent updates, been heavily punished. Small independent sites have done much worse than they have historically, or they're being outranked by larger sites carrying the exact same content, stolen from them, this kind of thing.

Now, this is very anecdotal, I must say. It's quite challenging to produce hard data on this kind of thing, which I'll come back to in a moment.

This is a narrative that is getting some traction, and I want to talk about two possible explanations for why we might be seeing an effect like this.

Reddit is being pushed heavily by Google

So the first one is, as a lot of people have noticed, Reddit is being pushed heavily by Google, and Quora as well to a slightly lesser extent. And I don't know if you remember, but one or two years ago, there were a lot of people complaining, "Oh, these days, in order to do anything sensible with Google, I have to put reddit.com on the end of my query in order to get a sensible result."

Well, it seems like Google really took that to heart. I pulled together some quick data: I looked at, I think, the first Sunday in April, comparing year-on-year what percentage of top 10 organic results were Reddit. This is the MozCast query set, so it's 10,000 head terms, US and UK.

And this went from 0.06% to 0.7%, so that's more than a 10x increase. And then for totals, which include non-organic results such as the discussions and forums feature, Reddit went from 0.1% to 1.3%.

And for that, I used the top 20 because, obviously, I had to cut it off somewhere; there are more than ten results on the front page once you include non-organic results. And obviously, this is pushed heavily by the discussions and forums feature, which has seen a huge gain in traction in the last year.
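If you wanted to run a rough version of this kind of share calculation on your own rank-tracking exports, a minimal sketch might look like the Python below. The data structures and example URLs are made up for illustration, and this is not MozCast's actual methodology; it just shows the general idea of counting what fraction of top-N results resolve to a given domain, year on year.

```python
from urllib.parse import urlparse

# Hypothetical SERP snapshots: {keyword: [result URLs in rank order]}.
# In practice these would come from your own rank-tracking exports.
serps_2023 = {
    "best running shoes": ["https://www.runnersworld.com/a", "https://www.reddit.com/r/running/x"],
    "fix a leaky tap": ["https://www.wikihow.com/b", "https://www.youtube.com/c"],
}
serps_2024 = {
    "best running shoes": ["https://www.reddit.com/r/running/y", "https://www.runnersworld.com/a"],
    "fix a leaky tap": ["https://www.reddit.com/r/DIY/z", "https://www.wikihow.com/b"],
}

def domain_share(serps: dict, domain: str, top_n: int = 10) -> float:
    """Return the fraction of top-N results whose host is `domain` or a subdomain of it."""
    total = 0
    matches = 0
    for urls in serps.values():
        for url in urls[:top_n]:
            total += 1
            host = urlparse(url).netloc.lower()
            if host == domain or host.endswith("." + domain):
                matches += 1
    return matches / total if total else 0.0

for label, serps in [("2023", serps_2023), ("2024", serps_2024)]:
    print(f"{label}: reddit.com holds {domain_share(serps, 'reddit.com'):.2%} of top-10 results")
```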

Now, I point out Reddit as a potential explanation for why independent sites might be feeling a little bit hard done by at the moment because, to some extent, search is a zero-sum game. If Reddit is getting these huge jumps in visibility on queries that call for personal experience and that kind of thing, then someone else has to be getting less traffic.

So that's one potential explanation, for which we do have data, of why something like this could be happening.

Google is facing existential threats

The other thing that I might speculate on here is that Google is facing three concurrent existential threats at the moment.

AI content

So AI content, the search results filling with AI content, particularly if it's bad, is an existential threat to Google.

AI competitors

AI competitors: I think just before I recorded this, Perplexity announced that they were going to produce an LLM search engine, and OpenAI have said something similar. I don't personally agree, but some people think that ChatGPT is a Google competitor.

So, if you believe any of these things, that is also an existential threat to Google.

Search quality

And then lastly, there have been these comments for many years now about how search quality is seemingly declining. That is also an existential threat to Google.

And arguably, brand signals, turning up the dial on how important brand is to ranking, would be an answer to all three of these.

So this is something that Google could be doing as maybe a bit of a panic reaction or maybe quite a sensible reaction, which, again, would be felt most keenly by independent sites.

So this is just speculation on my part; two potential reasons why that could be true.

Why the speculation?

So, the last thing I want to talk about before we wrap this up is why I am talking in speculative terms. Why am I only saying that things might be true?

Data issues include core timing, context vs. sample, and complex causation

I want to issue a little bit of a safety warning about data around some of these updates. There is good data out there. There are people trying hard and thinking about this in a very analytical and sensible way. But I want to stress that this is very difficult and comes with a very low level of certainty when you look at these updates in aggregate.

The first thing is, as I mentioned at the start, there's a core update and a spam update in this case, on the same day. In recent times, we've often seen Google do that: either it's a core update and a helpful content update, or a core update and a review update, or something like that at almost the same time.

I don't know if this is deliberate, but it makes analysis very difficult. I've talked before on a Whiteboard Friday, which may even be linked below, about how core updates are a different beast.

With core updates, most sites that are affected by multiple core updates over time see big positive jumps in some updates and big negative drops in others.

Core updates shouldn't really be thought of as, "Oh, I did something wrong." They should be thought of as Google optimizing and tweaking over time, in my view. And the fact that sites have such a choppy, inconsistent trajectory with core updates means that we don't want to read too much into anyone's movement in a single update, and it also complicates the analysis of any other update we're looking at at the same time.

There's a challenge in how you want to look at this.

So either you look at a small sample of sites, and then you have the context. I remember when I worked at an agency, I used to read winners-and-losers analyses for these updates, and sometimes I'd see one of our clients and say, "Oh, no, that site didn't tank. It just swapped domains, and the new domain is shown as a winner and the old domain as a loser." There's always this kind of context. People change their strategy, split their site, acquire a site, etc.

With a large sample analysis, you don't have this kind of context. With a small sample analysis, you have a small sample size. It's a trade-off that you can't really do much about.

Lastly, the causation picture for these kinds of analyses can be very, very complex.

I can say, "Oh, well, sites that have, for example, author profiles seem to be doing very well." But is that because they have author profiles? Are author profiles a ranking factor? Or is it that the kind of site that tends to have author profiles is, for some other reason, doing well, or there's some other complex relationship?

In some cases, it can be that ranking well causes you to adopt specific strategies. It can be completely the other way around. So, this can be very difficult to analyze.
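To make that confounding problem concrete, here is a tiny, entirely hypothetical simulation (not based on any real update data). A hidden "site quality" factor drives both whether a site has author profiles and whether it gained in an update, so the naive comparison makes author profiles look like a ranking factor even though, in this toy model, they have no effect at all.

```python
import random

random.seed(0)

# Hypothetical model: an unobserved "quality" confounder drives BOTH
# author-profile adoption AND whether the site gained in the update.
sites = []
for _ in range(10_000):
    quality = random.random()                         # unobserved confounder
    has_author_profiles = random.random() < quality   # better sites adopt them more often
    gained = random.random() < quality                 # better sites also gain more often
    sites.append((has_author_profiles, gained))

def gain_rate(with_profiles: bool) -> float:
    """Share of sites in this group that gained."""
    subset = [gained for has, gained in sites if has == with_profiles]
    return sum(subset) / len(subset)

print(f"Gain rate with author profiles:    {gain_rate(True):.1%}")
print(f"Gain rate without author profiles: {gain_rate(False):.1%}")
# The two rates differ a lot, even though author profiles have no causal
# effect here: the hidden quality factor explains the whole gap.
```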

I don't want to discourage anyone, but I do just want to say: take this kind of data analysis around these updates with a pinch of salt. It is very challenging. So that's all.

Hopefully, I've given you a lot to think about. Hopefully, if you're watching this, your site did well rather than badly. If not, good luck. As I say, with core updates, you're as likely to go up in the next one as not. And hopefully, these more questionable tactics we talked about aren't what you're doing, or you're already on top of it.

So, yeah, thank you.

Transcription by Speechpad

Tom Capper

I head up the Search Science team at Moz, working on Moz's next generation of tools, insights, and products.
