
What GPT Means for Structured Data – Whiteboard Friday

Tom Anthony

The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.


Edited by Emilie Martin

[Whiteboard image: Digital whiteboard showing what GPT means for structured data]
Click on the whiteboard image above to open a high resolution version in a new tab!

Howdy, Moz fans. We're here in Seattle for MozCon in 2023, and I wanna talk to you about the context explosion.

Context explosion

When we talk about context within the frame of reference of SEO or search, we're typically talking about either implicit context, which describes the searcher (their time of day, their location, their language), or explicit context, which is the search query itself. And it's this explicit context that I want to talk to you about today, because I think we're on the brink of a paradigm shift where there's gonna be an explosion in the amount of explicit context going into searches.
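To make the distinction concrete, here's a minimal sketch of the two kinds of context around a single search. The field names and values are purely my own illustration, not anything a search engine actually exposes:

```python
from dataclasses import dataclass, field

@dataclass
class ImplicitContext:
    """Signals the engine infers about the searcher without being asked."""
    location: str    # e.g. "Seattle, WA"
    language: str    # e.g. "en-US"
    local_time: str  # e.g. "2023-08-07T09:30"
    device: str      # e.g. "mobile"

@dataclass
class ExplicitContext:
    """What the searcher actually types: the query plus any stated constraints."""
    query: str
    constraints: list[str] = field(default_factory=list)

search_context = (
    ImplicitContext("Seattle, WA", "en-US", "2023-08-07T09:30", "mobile"),
    ExplicitContext(
        "vacation itineraries",
        ["family of five", "private pool", "near the beach", "restaurants nearby", "in Europe"],
    ),
)
```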

Comparing post-search browsing to ChatGPT

Post-search browsing

So to start, I wanna tell you about my vacation planning for last year. I wanted to take my family on a vacation, and the search that I wanted to do in Google was to show me vacation itineraries for a family of five with a private pool near the beach and some restaurants, in or near Europe. However, what I ended up doing was a much shorter search, or variations of it, because the last 20 or 25 years of using Google have taught me and most other people that it expects you to type in two to five keywords. It's not expecting a whole sentence or two of text; that interface isn't set up for it. And we also know that, typically, we won't get very good results that way.

So what normally happens is we type our two to five keywords into Google, and then, if it's a complex search query like vacation planning, we go and open lots of those links manually, looking through them to see whether they fulfill our criteria: basically, applying the rest of that explicit context manually ourselves. This behavior is known as post-search browsing. It's a well-understood phenomenon; most of the search engines have written academic papers about it. Essentially, it's doing the second phase of the same search manually rather than in the search engine. Then, typically, you refine your search in Google and repeat this process a few times until you get adequate results.
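Post-search browsing is essentially the searcher acting as their own filter over the result set. As a rough sketch (the result fields and criteria here are hypothetical, just to illustrate the manual second phase):

```python
# Hypothetical results for the short keyword query "family villa europe pool"
results = [
    {"title": "Villa Sol", "sleeps": 6, "private_pool": True,  "beach_km": 0.4},
    {"title": "Casa Mar", "sleeps": 4, "private_pool": True,  "beach_km": 2.0},
    {"title": "Les Pins", "sleeps": 8, "private_pool": False, "beach_km": 0.2},
]

# The explicit context I never typed into Google, applied by hand, tab by tab
def fits_my_trip(result):
    return result["sleeps"] >= 5 and result["private_pool"] and result["beach_km"] <= 1.0

shortlist = [r for r in results if fits_my_trip(r)]
print(shortlist)  # only "Villa Sol" survives the manual second phase
```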

Comparing to ChatGPT

So let's compare and contrast that to the new world order of ChatGPT.

One of the exciting things about ChatGPT and similar interfaces is the promise of being able to refine and filter your search right there in the search engine in the same interface. And so, you ask it something, you get some results, and then you can ask it clarifying questions or ask it to filter those results. And basically, it brings this second phase of the search right into the same interface. And that is really powerful, it's really compelling, and we want it. But at the same time, ChatGPT and other chat-based interfaces for search haven't had a huge impact. They're still absolutely tiny in terms of usage compared to regular old Google.

The other thing about ChatGPT is that it requires a different interface. (And we're gonna come back to it. I've just noticed I've got a spelling mistake, so I wanna fix that on the fly right as we're recording.) So it has a different interface, and that, I think, leads to users being resistant to using it. And so the question is, can we get the same benefit of bringing this second phase of the search into the search engine itself, but without the chat? The good news is that all of the ability to understand context and filter comes from the GPT part of ChatGPT rather than the chat part. GPT, of course, stands for generative pre-trained transformer. And whilst ChatGPT is an OpenAI product, GPTs themselves are just a type of large language model that's now out there in the world for anyone to use, and there are lots of companies already building things on top of GPTs.
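One way to picture "the GPT part without the chat part" is a backend step that turns a long natural-language query into structured facets a ranking system could use. The sketch below is purely illustrative: it assumes the official openai Python client (v1 interface) with an OPENAI_API_KEY set in the environment, and the model name, prompt, and output schema are my own assumptions, not anything Google or OpenAI has described for search:

```python
import json
from openai import OpenAI  # assumes the official openai package, v1 client

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_facets(query: str) -> dict:
    """Ask a GPT model to pull the explicit context out of a long query as JSON facets."""
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": ("Extract search facets from the user's query. "
                         "Reply with JSON only, e.g. "
                         '{"party_size": null, "amenities": [], "location": null}')},
            {"role": "user", "content": query},
        ],
    )
    return json.loads(response.choices[0].message.content)

facets = extract_facets(
    "vacation itineraries for a family of five with a private pool "
    "near the beach and some restaurants, in or near Europe"
)
# e.g. {"party_size": 5, "amenities": ["private pool", "near the beach", "restaurants"], "location": "Europe"}
```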

And so let's compare and contrast the two worlds.

GPT-4 has an estimated 100 billion neurons, compared to Schema.org, which has approximately 1,400 types of entity that it can represent

GPT-4 has an estimated 100 billion neurons, and once it's trained, it's essentially storing its knowledge in the connections between those neurons. Compare and contrast that with Schema.org, which is the best way we have right now of representing context in the form of entities. So let's add Schema.org right there: it has approximately 1,400 types of entity that it can represent. So you can see that if search were to shift from using that sort of model for representing entities and context to something like GPT, the magnitude of change would be absolutely massive. It would have a very profound effect on search and SEO.
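For contrast, here's roughly what that same vacation context looks like when expressed with Schema.org's entity vocabulary. It's built as a Python dict here for consistency with the other sketches and then serialized to JSON-LD; the type and property choices are my own illustration, not an official example:

```python
import json

# A holiday villa described with Schema.org entities: every facet has to map onto
# one of the roughly 1,400 predefined types and their properties.
listing = {
    "@context": "https://schema.org",
    "@type": "LodgingBusiness",
    "name": "Villa Sol",
    "amenityFeature": [
        {"@type": "LocationFeatureSpecification", "name": "Private pool", "value": True},
        {"@type": "LocationFeatureSpecification", "name": "Beach nearby", "value": True},
    ],
    "address": {"@type": "PostalAddress", "addressRegion": "Algarve", "addressCountry": "PT"},
}

print(json.dumps(listing, indent=2))  # the JSON-LD you'd embed in the page's markup
```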

Project Magi

And as evidence that Google is moving in this direction, let's talk about Project Magi for a second. A couple of months ago, Google gave an interview to The New York Times where they talked about Project Magi, and that interview was in direct response to Bing launching their chat interface. In the interview, Google said several interesting things, including that they were gonna be bringing AI features into their existing search engine. Google already has loads of AI features, but this was in direct response to the GPT-powered Bing launch, so it's not hard to understand that Google was talking about GPT AI features here.

They also talked about updating their search engine to anticipate users' needs. And here, I think they're talking about, basically, anticipating the explicit context that we want to add to queries. The reason they might want to anticipate that is to find ways to incentivize us as searchers to add more explicit context to our searches, when we've been trained over the last 25 years to type two to five keywords. If they're gonna have a GPT-powered backend, they need to incentivize us to add more explicit context to our search queries. And so, for my MozCon presentation, I predicted that they were gonna add some sort of faceted search functionality.

Google launches filter bubbles

And then, a couple of months ago, they launched these filter bubbles right in the main search interface.

Google's filter bubbles, enabling a user to extend their search by adding more explicit context to that query.

And so you do a search, and if you press one of the filter buttons, it basically extends your search, adding more explicit context to that query. This is basically a way to incentivize users to add more explicit context where, historically, we've been trained not to do that. So I think this is gonna lead to an absolute explosion in the length of queries that we see for complex searches.
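Mechanically, each filter press just concatenates more explicit context onto the base query, which is why query length can balloon so quickly. A toy sketch of the effect (the query and filter labels are invented for illustration):

```python
base_query = "family villa europe"
selected_filters = ["with private pool", "near the beach", "sleeps 5 or more"]

# Each tapped filter bubble appends its context to the query string
extended_query = " ".join([base_query, *selected_filters])
print(extended_query)
# "family villa europe with private pool near the beach sleeps 5 or more"
# Three taps turn a three-word head query into a long-tail, near-unique search.
```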

The existing long tail is gonna start to look small, and we're gonna see a huge increase in the number of searches that are entirely unique. What exactly this means for us as SEOs is yet to be determined, but it's a very exciting future. And so I think over the next six to 12 months, we'll see this huge shift happen. Exactly how that looks, only time will tell. Thank you very much.
