Can a page be 100% topically relevant to a search query?
-
Today's YouMoz post, Accidental SEO Tests: When On-Page Optimization Ceases to Matter, explores the theory that there is an on-page optimization saturation point, "beyond which further on-page optimization no longer improves your ability to rank" for the keywords/keyword topics you are targeting. In other words, you can optimize your page for search to the point that it is 100% topically relevant to the query and intent.
Do you believe there exists such a thing as a page that is 100% topically relevant? What are your thoughts regarding there being an on-page optimization saturation point, beyond which further on-page optimization no longer improves your ability to rank? Let's discuss!
-
I consider a 100% match to be possible only in theory. In my modest opinion, the visitor determines the relevancy of the landing page, and it is Google's noble job to serve the visitor a page that fits his needs. But in that case no page can fully satisfy everybody, because the same keyword can carry different search intents.
When your page achieves a high conversion rate, you've probably written a very relevant page. So let the visitor truly find what he is looking for, and Google will notice.
-
Well said, Russ, especially for a "mathy" answer. I am curious, though, would this "ideal document" you describe have a specific word count?
-
Warning, mathy answer follows. This is a generic description of what is going on, not exact, but hopefully understandable.
Yes, there is some theoretical page that is 100% topically relevant: a copy of the "ideal document" produced by the topical relevancy model. It would not look like a real page, though; it would look like a jumble of words in ideal relation and distance to one another. However, most topic models are built using sampling, and, more importantly, the comparative documents used to determine the confidence that your document's relevancy is non-random are also sampled. This means that there is some margin of error (MoE).
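To make "topically relevant" concrete, here is a toy sketch of scoring a page against such an "ideal document" using cosine similarity over term-frequency vectors. The terms and weights are invented for illustration; a production topic model is far more sophisticated than this.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors (dicts)."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm

# Hypothetical term weights for illustration only:
ideal = {"coffee": 3, "espresso": 2, "grind": 1}   # the model's "ideal document"
page  = {"coffee": 3, "espresso": 2, "grind": 1}   # a page that matches it exactly

print(cosine(page, ideal))  # ~1.0, i.e. "100% topically relevant"
```

The point of the jumble-of-words remark above is that a page engineered to hit that perfect vector would be optimized for the math, not for a human reader.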
As you and your competitors approach 100% topical relevancy, that margin of error likely covers the difference. If you are 99.98% relevant and they are 99.45% relevant, but the MoE is 1%, then a topical relevancy system can't conclude with certainty that you are more relevant than they are.
At this point, the search model would need to rely on other metrics, like authority, over relevance to differentiate the two pages.
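Purely as an illustration of that tie-breaking logic, with the hypothetical scores and margin from the example above (not anything Google has published):

```python
def pick_more_relevant(score_a, score_b, margin_of_error):
    """Return which page is more relevant, or None if the difference
    falls inside the sampling margin of error."""
    if abs(score_a - score_b) <= margin_of_error:
        return None  # statistically indistinguishable; fall back to other signals
    return "a" if score_a > score_b else "b"

# 99.98% vs. 99.45% relevant with a 1% margin of error:
print(pick_more_relevant(0.9998, 0.9945, 0.01))  # None -> rely on authority, etc.
```

When the function returns None, relevance alone can't separate the pages, which is exactly where metrics like authority would take over.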
-
With the pace at which things are changing, and with machine learning now factored into ranking, I would say it's close to impossible to maintain 100% topical relevancy for any meaningful period of time.
-
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal linkage, and myriad other elements, but because to write content about a subject in that amount of detail, one would invariably need to write about sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates what is related to it. Strong pages link out to others and keep visitors within their media cycle, if not churning them lower down. Good content holds information that is both detailed and general to a topic. I would say, at most, the highest saturation point that still counts as strong SEO and content optimization is about 85-90% when taking all page content into account - and even that's pushing it, really.
-
I would agree to a point. At its heart, Google probably uses some form of numerical score for a page as it relates to a query. If a page is a perfect match, it scores 100%. I would also suggest that attaining a perfect score is a virtual impossibility.
The scoring system, however, is dynamic. The page may be perfect for a particular query only at a particular point in time.
- Google's algorithm changes daily. What's perfect today may not be perfect tomorrow.
- Semantic search must be dynamic. If Google discovers a new Proof Term or Relevant Term related to the query, and the page in question doesn't contain that term, the page is no longer perfect.
These are only a couple of examples.
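As a toy illustration of the second point (invented terms; real semantic search is far more involved): a page that covers every known relevant term is "perfect" only until the engine's term set grows.

```python
def coverage(page_terms, relevant_terms):
    """Fraction of the engine's current relevant-term set the page covers."""
    return len(page_terms & relevant_terms) / len(relevant_terms)

page     = {"espresso", "grind", "tamp", "crema"}
terms_v1 = {"espresso", "grind", "tamp", "crema"}
terms_v2 = terms_v1 | {"pressure"}  # engine discovers a new relevant term

print(coverage(page, terms_v1))  # 1.0 -> "perfect" today
print(coverage(page, terms_v2))  # 0.8 -> no longer perfect tomorrow
```

The page didn't change at all; the target moved. That is what makes chasing a perfect score such an expensive game.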
For practical purposes, the testing, research, etc. needed to achieve a perfect score eventually delivers diminishing returns. The effort required to push a page from 95% to 100% isn't worth it, especially since Google's algorithm is a secret.
Sometimes good is good enough.