Can a page be 100% topically relevant to a search query?
-
Today's YouMoz post, Accidental SEO Tests: When On-Page Optimization Ceases to Matter, explores the theory that there is an on-page optimization saturation point, "beyond which further on-page optimization no longer improves your ability to rank" for the keywords/keyword topics you are targeting. In other words, you can optimize your page for search to the point that it is 100% topically relevant to a query and its intent.
Do you believe there exists such a thing as a page that is 100% topically relevant? What are your thoughts regarding there being an on-page optimization saturation point, beyond which further on-page optimization no longer improves your ability to rank? Let's discuss!
-
I consider a 100% match possible only in theory. In my modest opinion, the visitor determines the relevancy of the landing page, and it is Google's noble job to serve the visitor a page that fits their needs. But in that case, no page can fully satisfy everybody, because different visitors have different search intentions behind the same keyword.
When you achieve a high conversion rate on your page, you've probably written a very relevant page. So let the visitor truly find what they are looking for, and Google will notice....
-
Well said, Russ, especially for a "mathy" answer.
I am curious, though, would this "ideal document" you describe have a specific word count?
-
Warning, mathy answer follows. This is a generic description of what is going on, not exact, but hopefully understandable.
Yes, there is some theoretical page that is 100% topically relevant: a copy of the "ideal document" produced by the topical relevancy model. This would not look like a real page, though. It would look like a jumble of words in ideal relation and distance to one another. However, most topic models are built using sampling, and, more importantly, the comparative documents used to determine the confidence that your document's relevancy is non-random are also sampled. This means that there is some MoE (margin of error).
As you and your competitors approach 100% topical relevancy, that margin of error likely covers the difference. If you are 99.98% relevant and they are 99.45% relevant, but the MoE is 1%, then a topical relevancy system can't conclude with certainty that you are more relevant than they are.
At this point, the search model would need to rely on other metrics, like authority, over relevance to differentiate the two pages.
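To make the margin-of-error point concrete, here is a minimal sketch in Python. The scores, the 1% MoE, and the fallback logic are all invented for illustration; no real search engine exposes its scoring this way:

```python
# Hypothetical illustration: two pages scored against an "ideal document"
# by a topical relevancy model. All numbers are made up for the example.

def overlaps_within_moe(score_a: float, score_b: float, moe: float) -> bool:
    """True if two relevancy scores are closer than the margin of error,
    i.e. the model cannot confidently call one page more relevant."""
    return abs(score_a - score_b) <= moe

your_page = 0.9998      # 99.98% topically relevant
competitor = 0.9945     # 99.45% topically relevant
margin_of_error = 0.01  # 1% MoE from the sampled comparison documents

if overlaps_within_moe(your_page, competitor, margin_of_error):
    # The difference is smaller than the MoE, so a ranking system
    # would need other signals (e.g. authority) to break the tie.
    print("Relevancy tie within MoE - other signals decide")
else:
    print("Relevancy difference is significant")
```

The 0.53-point gap between the pages is swallowed by the 1% margin of error, which is exactly the situation described above: relevance alone can no longer separate them.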
-
With the pace at which things are changing, and with machine learning thrown into the ranking mix, I would say it's close to impossible to hold 100% topical relevancy for any meaningful period of time.
-
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal links, and myriad other elements, but because to write about a subject in that amount of detail, one would invariably need to cover sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates what is related to it. Strong pages link out to others and keep visitors within their media cycle, if not moving them further down the funnel. Good content holds information that is both detailed and general to a topic. I would say the highest saturation point that still qualifies as strong SEO and content optimization is about 85-90% when taking all page content into account, and even that's pushing it, really.
-
I would agree to a point. At its heart, Google probably uses some form of numerical score for a page as it relates to a query. If a page is a perfect match, it scores 100%. I would also suggest that attaining a perfect score is a virtual impossibility.
The scoring system, however, is dynamic. The page may be perfect for a particular query only at a particular point in time.
- Google's algorithm changes daily. What's perfect today may not be perfect tomorrow.
- Semantic search must be dynamic. If Google discovers a new Proof Term or Relevant Term related to the query, and the page in question doesn't contain that term, the page is no longer perfect.
These are only a couple of examples.
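As a toy illustration of the second point, a page's score against an evolving set of relevant terms drops the moment the engine learns a term the page lacks. The term lists and the coverage metric here are invented for the sketch; real semantic scoring is far more involved:

```python
# Hypothetical sketch: score a page by the fraction of known
# query-relevant terms it contains. Term lists are invented examples.

def term_coverage(page_text: str, relevant_terms: set[str]) -> float:
    """Fraction of the relevant terms that appear in the page text."""
    words = set(page_text.lower().split())
    return sum(term in words for term in relevant_terms) / len(relevant_terms)

page = "mortgage advice rates lenders repayment broker fees"
terms = {"mortgage", "rates", "lenders", "broker"}
print(term_coverage(page, terms))  # 1.0 - a "perfect" score today

# Tomorrow the engine discovers a new relevant term the page lacks:
terms.add("remortgage")
print(term_coverage(page, terms))  # 0.8 - no longer perfect
```

Nothing on the page changed, yet its score fell, which is why "perfect" can only ever be perfect at a particular point in time.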
For practical purposes, the testing, research, and so on needed to achieve a perfect score delivers diminishing returns at some point. The effort required to push a page from 95% to 100% simply isn't worth it, especially since Google's algorithm is a secret.
Sometimes good is good enough.