Can a page be 100% topically relevant to a search query?
-
Today's YouMoz post, Accidental SEO Tests: When On-Page Optimization Ceases to Matter, explores the theory that there is an on-page optimization saturation point, "beyond which further on-page optimization no longer improves your ability to rank" for the keywords/keyword topics you are targeting. In other words, you can optimize your page for search to the point that it is 100% topically relevant to the query and intent.
Do you believe there exists such a thing as a page that is 100% topically relevant? What are your thoughts regarding there being an on-page optimization saturation point, beyond which further on-page optimization no longer improves your ability to rank? Let's discuss!
-
I consider a 100% match theoretically possible at best. In my humble opinion, the visitor determines the relevancy of the landing page, and it is Google's noble job to serve the visitor a page that fits their needs. But no page can fully satisfy everybody, because different visitors have different search intentions behind the same keyword.
When your page achieves a high conversion rate, you've probably written a very relevant page. So let visitors truly find what they are looking for, and Google will notice....
-
Well said, Russ, especially for a "mathy" answer. I am curious, though, would this "ideal document" you describe have a specific word count?
-
Warning, mathy answer follows. This is a generic description of what is going on, not exact, but hopefully understandable.
Yes, there is some theoretical page that is 100% topically relevant: a copy of the "ideal document" produced by the topical relevancy model. It would not look like a real page, though; it would look like a jumble of words in ideal relation and distance to one another. However, most topic models are built using sampling and, more importantly, the comparative documents used to determine the confidence that your document's relevancy is non-random are also sampled. This means that there is some MoE (Margin of Error).
As you and your competitors approach 100% topical relevancy, that Margin of Error likely covers the difference. If you are 99.98% relevant and they are 99.45% relevant, but the MoE is 1%, then a topical relevancy system can't conclude with certainty that you are more relevant than they are.
At this point, the search model would need to rely on other metrics, like authority, over relevance to differentiate the two pages.
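To make that margin-of-error point concrete, here's a minimal sketch (not any real search engine's model; the term vectors, scores, and the 1% MoE are the hypothetical numbers from above): relevancy is reduced to a single similarity score, and two pages are only distinguishable on relevance when the gap between their scores exceeds the MoE.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two term-frequency vectors (dicts)."""
    terms = set(a) | set(b)
    dot = sum(a.get(t, 0) * b.get(t, 0) for t in terms)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def distinguishable(score_a, score_b, margin_of_error):
    """A relevancy model can only rank one page above another on
    relevance alone when the gap exceeds the margin of error."""
    return abs(score_a - score_b) > margin_of_error

# Hypothetical scores against the "ideal document"
your_score, competitor_score = 0.9998, 0.9945
print(distinguishable(your_score, competitor_score, 0.01))  # False: the MoE covers the gap
```

When the gap falls inside the MoE, the tiebreaker has to come from somewhere else, which is exactly where signals like authority would step in.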
-
With the pace at which things are changing, and with machine learning thrown into the ranking mix, I would say it's close to impossible to maintain 100% topical relevancy for any meaningful period of time.
-
100% saturation is impossible to achieve while maintaining any semblance of value. Not only because any proper page inherently has navigation, internal linkage, and myriad other elements, but because to write content about a subject in that amount of detail, one would invariably need to write about sub-topics and related topics. It's just not feasible. But, and here's the kicker, you wouldn't want 100% saturation anyway.
Rich, dynamic content incorporates what is related to it. Strong pages link out to others and keep visitors within their content ecosystem, if not funneling them further along. Good content holds information that is both detailed and general to a topic. I would say the highest saturation point that still remains within strong SEO and content optimization is about 85-90% when taking into account all page content, and even that's pushing it, really.
-
I would agree to a point. At its heart, Google probably uses some form of numerical score for a page as it relates to a query. If a page is a perfect match, it scores 100%. I would also suggest that attaining a perfect score is a virtual impossibility.
The scoring system, however, is dynamic. The page may be perfect for a particular query only at a particular point in time.
- Google's algorithm changes daily. What's perfect today may not be perfect tomorrow.
- Semantic search must be dynamic. If Google discovers a new Proof Term or Relevant Term related to the query, and the page in question doesn't contain that term, the page is no longer perfect.
These are only a couple of examples.
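The second point above can be sketched as a simple set check (the terms here are made up for illustration; this is not how Google actually models Relevant Terms): a page only stays "perfect" while it covers every term the engine currently associates with the query, so a single newly discovered term breaks the perfect score.

```python
page_terms = {"semantic", "search", "relevancy", "query"}
relevant_terms = {"semantic", "search", "relevancy"}

def still_perfect(page_terms, relevant_terms):
    """A page remains a 'perfect' match only while it covers every
    term the engine currently associates with the query."""
    return relevant_terms <= page_terms  # set inclusion

print(still_perfect(page_terms, relevant_terms))   # True
relevant_terms.add("entity")  # the engine discovers a new Relevant Term
print(still_perfect(page_terms, relevant_terms))   # False
```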
For practical purposes, the testing, research, etc. required to achieve a perfect score delivers diminishing returns at some point. The effort required to push a page from 95% to 100% simply isn't worth it, especially since Google's algorithm is a secret.
Sometimes good is good enough.