Beyond the 'Engine': Applying the Complete SEO Ontology to Master Search Experience and AI Search

Oct 30, 2025

12 min read

Aren’t you all bored of the GEO, AEO, LLMO, AIVO, AISO polemic?

This alphabet soup of "new" disciplines seems to grow by the week, with each one promising to be the real revolution. But let's focus on the chief offender, the one pushed with the most breathless fervor: GEO.

That’s "Generative Engine Optimization," for those of you still toiling away in the prehistoric mines of mere search. This brilliant acronym promises to change everything. "SEO is dead!" they cry from the conference stages. "Long live GEO!"

And long live, of course, the new six-figure consulting packages and dedicated "GEO transformation" retainers required to master it.

This revolutionary concept was born from an academic paper. A paper that, shockingly, was met with immediate and widespread… confusion. 

Academics, engineers, and even those innocent bystanders known as "non-SEOs" scratched their heads. (For those who like acronyms, there is a newer one: AIVO, for AI Visibility Optimization, which is at least more accurate.) They pointed out, rather inconveniently, that optimizing for a generative system that retrieves information from documents crawled, parsed, and indexed by a search engine looked suspiciously like... optimizing for a search engine.

But what a masterstroke of branding! They chose 'GEO', an acronym already, and quite famously, owned by the entire field of Earth Sciences. It’s a move of such breathtaking terminological hubris that even today's AI models get flustered, helpfully offering up geological surveyors or soil testers when you ask for the "Best GEO agencies." (A minor branding hiccup, surely.)

Let's be fair, though. We can't really blame the big agencies for this marvelous act of terminological innovation. Alas, the fault lies not in our stars, but in ourselves. 

We, the practitioners of SEO, are the ones who set this stage.

For decades, we happily operated in a state of un-systematized chaos, preferring "secret sauce" and "it depends" to a shared ontology. Worse, we doomed ourselves from the start by clinging to a term that was a technical impossibility: Search Engine Optimization.

Let's be honest: how can you "optimize" an engine you don't have the keys to? (We don't work at Google, after all). We were never optimizing the algorithm. We were optimizing for brands and people, fighting to make them visible and resonate within the search results pages.

We had our chance to come up with a name that actually described this. Years ago, even Matt Cutts himself handed us a life raft: Search Experience Optimization. A definition focused on the user, the experience, the human: the very things we were actually optimizing for. But we let it sink, preferring to sound technical and mysterious.

And so, by choosing the engine over the experience for twenty years, we left the door wide open for someone to simply point to a new "engine" (and a recycled acronym from geology) and sell us our own jobs back at twice the price.

So, perhaps it's time to finally do the work we've shirked. Let's actually map our domain under its proper name - Search Experience Optimization - and see how this new 'GEO' panic is not a revolution, but simply the latest, most specialized layer of the very job we've been doing all along.

While the terminology debates rage on, the practical reality is clear: modern search professionals need tools that actually track performance across these evolving surfaces—whether you call it SEO, GEO, AEO, LLMO, AIVO, or AISO.

Advanced Web Ranking helps you cut through the acronym soup by tracking your visibility across the modern search landscape: Google AI Overviews (AIOs), AI Mode, Perplexity, and other LLM-powered search experiences—alongside traditional search rankings. 

Instead of guessing whether your "optimization" (whatever you choose to call it) is working, you can actually measure it.

Try Advanced Web Ranking free and see where you actually show up in the new search reality.


The SEO Ontology: From Keywords to AI-Driven Answers

To truly master modern search, you have to do what our industry has stubbornly refused to do: fuse the art of marketing with the science of the stack.

It’s not about ceasing to think like a marketer; it's about finally connecting that marketing brain to a deep, technical understanding of linguistics, semantics, and semiotics.

The real key is to understand what a user means (the semiotics of their experience) and connect it to how a machine understands meaning (the semantics of its index). 

We need a map that does both. This "map" has a formal name: a domain ontology.

An ontology is a blueprint of a field of knowledge. It’s a formal model that defines a set of concepts (the "nouns"), their properties (the "attributes"), and the relationships between them (the "verbs").

It's the very mechanism that allows a search engine to move from matching strings (the letters "a-p-p-l-e") to understanding things (the $3 trillion tech company OR the fruit).
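The strings-versus-things distinction can be made concrete with a toy knowledge graph. Here is a minimal, illustrative sketch (the entity records and the overlap-scoring trick are stand-ins, not how any real engine disambiguates):

```python
# Toy knowledge graph: the string "apple" maps to two distinct entities,
# each with its own properties ("attributes") and relationships ("verbs").
entities = {
    "Q312": {"label": "Apple Inc.", "type": "Organization", "industry": "consumer electronics"},
    "Q89":  {"label": "apple",      "type": "Fruit",        "genus": "Malus"},
}

# An ontology also defines relationships between entities.
triples = [
    ("Q312", "founded_by", "Steve Jobs"),
    ("Q89",  "grows_on",   "apple tree"),
]

def disambiguate(query_string, context_terms):
    """Pick the entity whose properties best overlap the query's context."""
    scores = {}
    for eid, props in entities.items():
        text = " ".join(str(v).lower() for v in props.values())
        scores[eid] = sum(term.lower() in text for term in context_terms)
    return max(scores, key=scores.get)

print(disambiguate("apple", ["electronics", "stock"]))  # the company
print(disambiguate("apple", ["fruit", "pie"]))          # the fruit
```

A real engine does this with embeddings and a vast entity graph, but the shape of the problem is the same: one string, many things, and context decides which thing is meant.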

The SEO ontology isn't static; it has evolved. It started with a simple core, expanded into highly specialized verticals, and is now adding its most complex layer yet: generative AI. Let’s map this evolution.

Part 1: The Core Foundation (The Base Ontology)

At its heart, the SEO ontology is built on a few core classes, or "nouns." These are the fundamental entities that exist in the search universe:

  • Website: The main asset, a collection of webpages.

  • Webpage: The individual document that gets indexed and ranked.

  • SearchEngine: The system (e.g., Google, Bing) that does the crawling and ranking.

  • User: The person with a need.

  • SearchQuery: The string of text the User enters to express that need.

  • SERP: The Search Engine Results Page, the list of Webpages the engine returns.

  • Link: The connector that passes authority and context between Webpages.
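Expressed in code, the core "nouns" and their relationships might look like this. This is a minimal, illustrative model for thinking about the ontology, not any engine's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Webpage:
    url: str
    title: str
    outbound_links: list = field(default_factory=list)  # Links pass authority and context

@dataclass
class Website:
    domain: str
    pages: list = field(default_factory=list)  # a Website is a collection of Webpages

@dataclass
class SearchQuery:
    text: str        # the string the User types
    intent: str = "" # the underlying need it expresses

@dataclass
class SERP:
    query: SearchQuery
    results: list    # the ordered list of Webpages the SearchEngine returns

# A User expresses a need as a SearchQuery; the SearchEngine answers with a SERP.
page = Webpage(url="https://example.com/guide", title="A Guide")
serp = SERP(query=SearchQuery(text="guide", intent="informational"), results=[page])
print(serp.results[0].title)
```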

These entities are governed by three primary practices, each answering a different question for the search engine:

  1. Technical SEO: "Can I find, crawl, and index this Webpage?" This is the foundation, the plumbing and wiring of the house.

  2. On-Page SEO: "What is this Webpage about?" This is the content, its meaning, and the structure. It establishes relevance.

  3. Off-Page SEO: "Is this Webpage trustworthy and authoritative?" This is primarily about Links and brand mentions from other Websites, which act as third-party votes of confidence.

For years, this was enough. You could build a successful strategy on this foundation alone. But the User's needs became more specific, so the ontology had to specialize.

Part 2: Specialization (Expanding the Ontology for Verticals)

The core ontology assumes all searches are the same. We know they aren't. A search for "best sushi" has a different intent than "B2B lead generation software." This is where the marketing brain meets the tech. 

To handle this, the ontology expanded, creating specialized models for different industries. Here are a few examples.

Local SEO

This is built around a physical Location.

  • New Core Entity: LocalBusiness

  • New Properties: Name, Address, Phone (NAP), BusinessHours, ServiceArea, GeoCoordinates.

  • New Relationship: A User's physical Location is now a primary ranking factor, relating them to a LocalBusiness. The goal isn't just to find information, but to go to a place.
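These new properties map directly onto real schema.org markup. A minimal LocalBusiness example, with placeholder values, shows how NAP, hours, and coordinates become machine-readable:

```python
import json

# schema.org LocalBusiness markup expressing the NAP + hours + geo properties.
# All values are placeholders.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Sushi Bar",            # Name
    "address": {                            # Address
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Example City",
    },
    "telephone": "+1-555-0100",             # Phone
    "openingHours": "Tu-Su 12:00-22:00",    # BusinessHours
    "geo": {                                # GeoCoordinates
        "@type": "GeoCoordinates",
        "latitude": 40.0,
        "longitude": -73.9,
    },
}

# This JSON is embedded in the page inside <script type="application/ld+json">.
print(json.dumps(local_business, indent=2))
```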

Ecommerce SEO

This is built around a Product.

  • New Core Entities: Product, CategoryPage, ProductFilter (e.g., "size," "color").

  • New Properties: Price, Brand, Availability (In Stock/Out of Stock), RatingValue, SKU.

  • New Relationship: A Product belongs_to a CategoryPage. The User's intent is almost always transactional.

B2B & SaaS SEO

This is built around a Problem and its Solution.

  • New Core Entities: Solution (the service), BuyerPersona (the specific job role being targeted), LeadMagnet (e.g., a WhitePaper or Webinar).

  • For SaaS (Software-as-a-Service): This gets even more specific with Feature, Integration (e.g., "connects with Slack"), UseCase, and ComparisonPage (e.g., "Product vs. Competitor").

  • New Relationship: The goal is lead generation. A LeadMagnet targets a BuyerPersona at a specific FunnelStage.

News SEO

This is built around Time and Events.

  • New Core Entity: NewsArticle.

  • Key Property: Timestamp. Freshness is a dominant ranking factor.

  • New Relationship: A NewsArticle reports_on a real-world Event in near real-time. The goal is immediacy and placement in specialized SERP features like the "Top Stories" carousel.

This pattern repeats for every industry. Real Estate SEO is a hybrid of Local (the Neighborhood) and Ecommerce (the PropertyListing). Travel SEO is a complex mix of Local (the Destination), Ecommerce (the Booking), and News (seasonality and Events).

Part 3: The New Layer: SEO for AI Search

For the last 20 years, this complex, multi-vertical ontology was the map. Now, a new layer is being drawn over the entire thing. This is the evolution to SEO for AI Search.

This is not a replacement for SEO. Instead, it is a new, sophisticated layer that builds on, and completely depends on, the entire foundation we just mapped.

Traditional Search vs. the AI Search Layer:

  • Goal: rank #1 → be cited

  • Output: a list of links → a synthesized answer

  • Based on: keywords + link signals → retrieval + factual accuracy

  • Journey: a click is required → the answer sometimes ends it

  • Structured data: earns rich snippets → enables parseability for RAG

The fundamental shift is this: The goal is no longer to rank #1 on a list of links. The goal is to be cited in the AI-generated answer.

The AI is the new search engine. The synthesized text snapshot is the new SERP. This introduces new entities and fundamentally changes the value of old ones.

The AI Search Ontology

  • New Core Engine: Generative_Engine (an AI model that synthesizes answers).

  • New Core Result: AI_Answer (the block of AI-generated text).

  • New Input: Prompt (an evolution of SearchQuery, often conversational and complex).

  • The New #1 Ranking: Citation (the link or mention within the AI_Answer that credits your Webpage).

  • The New Playing Field: The SearchEngine's live index, which serves as the retrieval source.

Our new job is to ensure our Webpage is so good, so clear, and so trustworthy that the Generative_Engine chooses it as a source to construct its AI_Answer.

The 'How': Retrieval-Augmented Generation (RAG) is the Bridge

A critical piece of this new ontology is understanding the process that connects the Generative_Engine to the live web. The AI doesn't just "know" the answer from its static training data; it finds fresh information. This process is known as Retrieval-Augmented Generation (RAG).

The Source_Corpus in this model is not the AI's training data, but the live web index itself, accessed in real-time.

To make it simple, RAG is a two-step mechanism, and it's the glue that holds the old and new SEO together:

  1. Retrieval (R): When you enter a Prompt, the Generative_Engine first acts as a query agent:

    1. It sends a series of requests to the SearchEngine (e.g., the Google index) based on the query fan-out generated from interpreting the prompt/conversational query. 

    2. The SearchEngine does its traditional job: it crawls, indexes, and ranks. It retrieves a set of the most relevant, trustworthy, and factually accurate Webpages available in its live index right now.

  2. Generation (G): This retrieved set of documents becomes the temporary, ad-hoc Source_Corpus for the AI. The Generative_Engine then reads, reasons about, and synthesizes these specific Webpages to generate its final, conversational AI_Answer, adding the Citations as proof.
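The two-step mechanism above can be sketched in a few lines. Everything here is a deliberate simplification: `search_index` stands in for a live search engine, the fan-out returns a single hard-coded query, and "generation" is just string assembly, but the control flow mirrors RAG:

```python
# Minimal RAG sketch. The index, fan-out, and generation are all stand-ins.
search_index = {
    "best trail shoes": [
        ("https://example.com/shoe-review", "We tested 12 trail shoes over 300 km..."),
        ("https://example.org/buying-guide", "Key factors: grip, drop, cushioning..."),
    ],
}

def fan_out(prompt):
    """Interpret the conversational prompt into one or more search queries."""
    return ["best trail shoes"]  # a real engine derives several sub-queries

def retrieve(prompt):
    """Step 1 (R): traditional search returns ranked, indexed Webpages."""
    docs = []
    for query in fan_out(prompt):
        docs.extend(search_index.get(query, []))
    return docs  # the temporary, ad-hoc Source_Corpus

def generate(prompt, docs):
    """Step 2 (G): synthesize an answer from the retrieved docs, with citations."""
    answer = " ".join(text for _, text in docs)
    citations = [url for url, _ in docs]
    return answer, citations

answer, citations = generate("best trail shoes?", retrieve("best trail shoes?"))
print(citations)  # a page that was never retrieved can never be cited
```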

This is the key. The "Retrieval" step is traditional SEO. (Yes, all the "boring" stuff you were told was dead).

It means all the foundational elements of our ontology - Technical_SEO (for crawlability), On-Page_SEO (for relevance), and E-E-A-T (for trust and authority) - are the very factors that determine whether a Webpage even makes the cut to be considered for the "Generation" step.

To be generated, you must first be retrieved.

This is why the AI search layer doesn't replace the old ontology; it leans on it. It uses RAG to put immense pressure on these specific areas:

1. The Primacy of E-E-A-T (Trust)

E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) moves from a conceptual guideline to a critical, machine-readable set of properties. An AI model cannot risk hallucinating or providing bad information. Its RAG process will therefore be heavily biased toward Webpages that it can prove are trustworthy.

  • Author: Is this entity a recognized Expert?

  • Provenance: Is it clear who wrote this and when?

  • Experience: Does the content demonstrate first-hand use (e.g., "I tested this product...")?

  • Authority: Do other trusted Websites (Links) and entities validate this Author and Website?

2. The Primacy of Structured Content (Facts & Meaning)

In the old ontology, Structured_Data (Schema) was often dismissed as a "nice-to-have" just for getting rich snippets. In the AI ontology, its true, deeper value is revealed, though not in the way most people think.

Let's be clear: structured data never was a magic "ranking factor," and the Generative_Engine isn't just blindly reading your JSON-LD to write its answer. Its role is far more fundamental. Structured_Data, along with clean semantic HTML, provides an unambiguous structure to your content. It makes your Webpage perfectly parsable.

This is the critical, often-missed first step. Before an AI can understand or trust your content, it must first parse it correctly.

As the wise master Yoda might say: Better parsing, the path to better understanding is. Better understanding, to wider indexing of meaning it leads. And wide indexing, when with trust it aligns, the path to visibility this becomes.

This is the key:

  • Structured_Data creates a clean, parsable Webpage.

  • A parsable Webpage allows the indexer to understand the Entities and Facts on it.

  • An understood Webpage is indexed for its full semantic meaning.

  • An accurately indexed Webpage is the only kind that can be retrieved by the RAG system to be used in an AI_Answer.

So while it's not a "ranking factor," it's the very foundation of being eligible to be retrieved and cited at all. It’s how you ensure the AI understands your Product's Price or your Article's Author long before it ever decides to use that fact.
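In practice, "making a fact parsable" looks like this: an Article's Author and publication date stated unambiguously in schema.org markup, rather than buried in prose a parser has to guess at. The values below are placeholders:

```python
import json

# schema.org Article markup: the Author and date become machine-readable facts.
# All values are placeholders.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How We Tested 12 Trail Shoes",
    "datePublished": "2025-10-30",
    "author": {
        "@type": "Person",
        "name": "Jane Example",
        "url": "https://example.com/about/jane",  # provenance: who wrote this
    },
}

# A parser can now extract the fact directly, with no NLP guesswork.
parsed = json.loads(json.dumps(article))
print(parsed["author"]["name"], "|", parsed["datePublished"])
```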

3. The Primacy of Topical Authority (The End of the "One-Hit Wonder")

In traditional SEO, a single, brilliantly optimized "one-hit wonder" page could sometimes capture a top ranking, even if it was orphaned from the rest of your site's content. That strategy is fundamentally broken in the era of AI-generated answers.

Here's the new reality from an "Answer Engine Optimization" (AEO) perspective:

When an AI search engine uses a RAG (Retrieval-Augmented Generation) framework to answer a prompt, it's not a single-step process. It can be simplified into two key phases:

  1. The Search & Filtering Phase: First, the system must decide which sources on the open web to trust. It consults its massive index to find pages that are not just relevant, but also authoritative. This is where your domain's E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is non-negotiable. A lone page on a subject your site never otherwise discusses will likely be filtered out here as an unreliable, non-expert source.

  2. The Retrieval & Generation Phase: After qualifying a small set of trusted, authoritative sources, the RAG model then "reads" them to retrieve the most semantically relevant chunks of text (paragraphs, not entire pages) to synthesize its final, generative answer.
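The two phases above can be sketched as a filter followed by a chunk-level ranking. The trust scores and word-overlap similarity below are crude stand-ins for real authority signals and embedding similarity, but they show why an untrusted domain's chunks never reach the ranking step at all:

```python
# Phase 1: filter sources by domain-level trust; Phase 2: rank surviving chunks.
domain_trust = {"expert-site.com": 0.9, "random-blog.net": 0.2}  # stand-in for E-E-A-T
chunks = [
    ("expert-site.com", "Trail shoes need aggressive lugs for muddy terrain"),
    ("expert-site.com", "Our lab measured outsole grip on wet rock"),
    ("random-blog.net", "Trail shoes are shoes for trails, basically"),
]

def similarity(a, b):
    """Crude word-overlap similarity (a real system uses embeddings)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def answer_pipeline(prompt, min_trust=0.5, top_k=2):
    # Phase 1: search & filtering -- untrusted domains never reach retrieval.
    trusted = [(d, text) for d, text in chunks if domain_trust.get(d, 0) >= min_trust]
    # Phase 2: retrieval -- rank the surviving chunks (paragraphs, not pages).
    ranked = sorted(trusted, key=lambda c: similarity(prompt, c[1]), reverse=True)
    return ranked[:top_k]

for domain, text in answer_pipeline("best trail shoes for muddy terrain"):
    print(domain, "->", text)
```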

The critical insight is this: Your content cannot be "retrieved" (Phase 2) if your domain isn't "trusted" (Phase 1).

This is why topical authority has become the cornerstone of content strategy. A deep, comprehensive, and logically interlinked cluster of content - a true Topic Cluster - is no longer just good UX. It is tangible, machine-readable proof of your E-E-A-T at the domain level.

It signals to the AI's "Search & Filtering" mechanism that your website isn't just a random page; it's a "preferred retrieval source" for the entire subject. When your domain achieves this status, every page within that cluster becomes a more likely candidate for retrieval, vastly increasing the odds that your information will be the one used to form the AI answer.

Conclusion: It's All Still SEO

This new world of AI-driven answers may seem revolutionary, but it's built on an evolutionary foundation. SEO for AI Search is not a replacement; it is a capstone.

  1. You cannot be cited by an AI if your site isn't in its retrievable index.

  2. It won't be retrieved if your Technical SEO is broken and it can't be crawled and indexed. (Turns out, robots.txt still matters. Who knew?)

  3. It won't be retrieved if your On-Page SEO is a mess and the AI can't determine its relevance to the Prompt.

  4. It will never be trusted for retrieval if your Off-Page SEO and E-E-A-T signals are weak, signalling to the AI that you are not an authoritative source.

The traditional ontology gets your content found, understood, and trusted by the AI's retrieval system. The new AI-search layer ensures your content is so factually accurate, well-structured, and clearly articulated that it's chosen to be the answer in the generation step.

The map has changed, but you still need to know how to read it.

Article by

Gianluca Fiorelli

With almost 20 years of experience in web marketing, Gianluca Fiorelli is a Strategic and International SEO Consultant who helps businesses improve their visibility and performance in organic search. Gianluca has collaborated with clients from various industries and regions, such as Glassdoor, Idealista, Rastreator.com, Outsystems, Chess.com, SIXT Ride, Vegetables by Bayer, Visit California, Gamepix, James Edition and many others.

A very active member of the SEO community, Gianluca shares his insights and best practices daily on SEO, content, search marketing strategy and the evolution of search, on social media channels such as X, Bluesky and LinkedIn, and through the blog on his website, IloveSEO.net.
