A boutique hotel in Canmore has spent the last six months carefully crafting its presence on one AI platform. Its owners worked with a consultant who optimized their business listing for ChatGPT's retrieval system, seeded the right review signals for Perplexity, and made sure their website's structured data would surface in Google's AI Overviews. They feel good about their AI strategy. They're visible.
They're visible to about a third of the market. Maybe less.
A couple in Montreal planning a Canmore trip uses Claude as their travel assistant. Claude doesn't share ChatGPT's retrieval index. A family in London uses Apple Intelligence, which pulls from a different set of data sources entirely. A corporate travel manager in Toronto uses a specialized travel agent built on Mistral that queries only MCP-connected data layers. A pet owner in Vancouver asks their AI pet care coordinator — a niche agent they've been using for two years — for a dog-friendly accommodation recommendation in the Rockies. None of these agents can see what ChatGPT sees. The hotel's carefully optimized presence is invisible to every one of them.
This is the multi-agent problem, and it's about to become the most important strategic reality in local business discoverability.
The coming agent explosion
Right now, most people think of AI agents as a small category: ChatGPT, Claude, Gemini, maybe Perplexity and Copilot. Five or six major players. The temptation is to optimize for each one individually, the way businesses used to maintain separate presences on Google, Yelp, and TripAdvisor.
This is the wrong mental model. We're in the consolidation-before-expansion phase. Within two to three years, the agent landscape will look radically different:
General-purpose assistants will multiply. ChatGPT, Claude, Gemini, Apple Intelligence, Samsung's Bixby successor, Meta's AI assistant, Amazon's Alexa rebuild — every major tech company is building or has built a general-purpose AI assistant. Each one has different training data, different retrieval systems, and different integration partnerships. Being visible to one is not being visible to all.
Specialized agents will emerge in every vertical. Travel agents that know the hospitality industry deeply. Car-buying agents that understand inventory dynamics, financing, and trade-in values. Restaurant concierges that can handle reservations, dietary requirements, and group coordination. Health navigators that match patients to providers based on specialization, insurance, and availability. Pet care coordinators that know the difference between a groomer who's great with anxious dogs and one that runs a puppy mill with a nice Instagram. Each vertical agent will serve a specific audience with depth that general agents can't match.
Enterprise agents will handle business-to-business discovery. A procurement agent helping a restaurant find a wholesale supplier. A fleet manager agent finding the best maintenance deal for twenty vehicles. A corporate travel agent booking accommodations for a team of twelve across three cities. These agents have different requirements and different data sources than consumer agents.
Regional and cultural agents will serve specific markets. A Mandarin-language travel agent built for Chinese tourists visiting Canada. A French-language concierge for Quebec. An agent built specifically for the Canadian Rockies tourism market. Each will query data sources relevant to their audience.
The total number of AI agents that might recommend a local business to a potential customer is going from five today to potentially hundreds within three years. Optimizing for one or two of them is like optimizing your Yellow Pages ad in 2005 — you're investing in a slice of a fragmenting market.
The platform trap
The instinct, when faced with a multi-platform world, is to be everywhere — maintain a presence on every platform, optimize for every agent. This is the strategy most businesses have followed with the current web: a website, a Google listing, a Yelp profile, a TripAdvisor page, an Instagram account, a Facebook page, a LinkedIn profile. Maintain all of them, keep them updated (or try to), and hope that whatever platform a customer uses, you're there.
This strategy works poorly even now — most businesses have at least one stale listing with wrong hours or an old phone number. It will work even worse in the multi-agent future, because the number of agents and their data requirements will grow faster than any business can maintain individual presences.
More importantly, optimizing for individual platforms locks your data into walled gardens. Content crafted for ChatGPT's retrieval system doesn't help you in Claude's ecosystem. Structured data formatted for Google's AI Overviews doesn't reach specialized travel agents. A listing optimized for Perplexity does nothing for Apple Intelligence. Each platform's data requirements are slightly different, and maintaining separate optimization strategies for each is a full-time job that gets harder as the number of platforms grows.
The parallel to the early web is instructive. In the 1990s, businesses built platform-specific presences — an AOL page, a CompuServe listing, a Prodigy storefront. Each platform was a walled garden with its own content format and its own audience. Businesses that invested heavily in AOL-specific content lost everything when AOL declined. The businesses that built for the open web — with standard HTML, accessible to any browser, indexed by any search engine — survived every platform shift because their data wasn't locked into any single ecosystem.
The same dynamic is playing out now. Businesses that lock their structured data into one AI platform are making the AOL bet. Businesses that put their data into an open, protocol-based layer accessible to every agent are making the open web bet. History is clear about which bet wins.
What protocol-based data access looks like
The alternative to platform-specific optimization is protocol-based data access — making your business data available through a standardized protocol that any agent can query, regardless of who built it or what model it runs on.
This is what MCP (Model Context Protocol) is designed for. MCP defines a standard way for AI agents to discover and query external data sources. An agent that supports MCP can connect to any MCP server and access the tools and data it provides. The agent doesn't need custom integration code for each data source. The data source doesn't need to build separate integrations for each agent. One protocol, universal access.
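The pattern is easier to see in code. The sketch below is a simplified illustration of the two-step interaction MCP standardizes — an agent discovers what tools a server offers, then calls one — using plain Python objects rather than the actual MCP wire format or SDK. The class, tool names, and response fields are all hypothetical.

```python
import json

# Simplified sketch of the MCP pattern: a server advertises tools, and any
# agent that speaks the protocol can discover and call them. The names and
# fields here are illustrative, not the real MCP schema.

class BusinessDataServer:
    """Stands in for an MCP server exposing one business's data."""

    def list_tools(self):
        # Step 1: an agent asks what it can query -- no custom
        # integration code needed on either side.
        return [
            {"name": "get_pet_policy",
             "description": "Current pet policy, including size limits and fees"},
            {"name": "check_availability",
             "description": "Room availability for a given set of dates"},
        ]

    def call_tool(self, name, arguments):
        # Step 2: the agent calls a tool by name and gets structured data back.
        if name == "get_pet_policy":
            return {"dogs_allowed": True, "weight_limit_kg": None, "fee_per_night": 0}
        if name == "check_availability":
            return {"available": True, "dates": arguments["dates"]}
        raise ValueError(f"unknown tool: {name}")

# Any agent -- general-purpose or specialized -- runs the same two steps:
server = BusinessDataServer()
tools = server.list_tools()
policy = server.call_tool("get_pet_policy", {})
print(json.dumps(policy))
```

The point of the pattern is that the agent never needs to know who built the server, and the server never needs to know which agent is calling: both sides only need to speak the protocol.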
In practice, this means a business that structures its data through a Pawlo MCP server is simultaneously visible to:
Every general-purpose assistant that supports MCP — Claude, ChatGPT (as they add MCP support), Gemini, and others.
Every specialized agent built on any framework that supports MCP — travel agents, car-buying agents, restaurant concierges, health navigators.
Every future agent that hasn't been built yet but will support the MCP standard.
The hotel in Canmore that structures its data through an MCP-connected data layer doesn't need to optimize separately for ChatGPT, Claude, Apple Intelligence, and the fifty specialized travel agents that will launch in the next two years. It structures its data once — real pet policy, current availability, differentiators, seller intent — and every agent that supports the protocol can query it.
One data structure. Every agent. Simultaneously. This is the structural advantage of protocol-based data access over platform-specific optimization.
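"Structure once" can be made concrete too. The record below sketches what a single machine-readable listing for the Canmore hotel might look like, with fields drawn from the example above (pet policy, availability, differentiators, seller intent). The field names are hypothetical — not a Pawlo schema or an MCP data format — but the idea is the same: one record, serialized once, queryable by every protocol-aware agent.

```python
import json

# Illustrative single source of truth for one business. Field names are
# invented for this sketch, not taken from any real schema.
listing = {
    "business": "Boutique hotel, Canmore",
    "pet_policy": {"dogs_allowed": True, "weight_limit": None, "fee": 0},
    "availability": {"2026-06-02": 4, "2026-06-03": 6},  # rooms free per date
    "differentiators": ["direct trail access", "in-room dog beds"],
    "seller_intent": {"midweek_discount": True},
}

# Serialized once; a travel agent, a pet care coordinator, and a corporate
# booking agent all query the same record instead of separate platform listings.
payload = json.dumps(listing, indent=2)
print(payload)
```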
The fragmentation risk is real and growing
Some business owners might think: "ChatGPT has the most users, so I'll just optimize for ChatGPT and worry about the rest later." This is a reasonable-sounding strategy that will age poorly, for three reasons.
Market share in AI is not stable. A year ago, ChatGPT had dominant market share. Today, Claude, Gemini, and Perplexity have captured significant portions of the market. Apple Intelligence is rolling out to over a billion devices. Samsung's AI assistant reaches hundreds of millions of Android users. The market share of any single agent is declining as the total number of agents grows. Betting on one platform when the platform landscape is this volatile is high risk.
Specialized agents will capture the highest-intent queries. The person asking ChatGPT for a hotel recommendation might be casually browsing. The person using a specialized travel agent has specific intent and is closer to booking. The person using a pet care coordinator is actively looking for exactly the kind of service you provide. These specialized, high-intent agents will each have small market share but disproportionate commercial value — and they're the ones most likely to query protocol-based data layers because they need structured, specific data that general web scraping can't provide.
Wholesale shifts happen without warning. When Apple Intelligence starts handling local queries natively on every iPhone, the market share of other agents will shift overnight. When Google's agent starts booking directly from search, the discovery landscape changes in a week. Businesses that are visible only through one platform's retrieval system will experience these shifts as sudden, unexplained drops in agent-mediated traffic. Businesses that are visible through a protocol-based layer will barely notice, because the protocol works regardless of which agent is ascendant.
What businesses should do
The strategic advice is simple, even if the landscape is complex:
Don't optimize for agents. Optimize your data. The question isn't "how do I get ChatGPT to recommend me?" It's "is my business data structured, specific, and current enough that any agent — regardless of platform — can confidently recommend me?" If the answer is yes, you're visible everywhere. If the answer is "only on one platform," you're visible to a declining fraction of the market.
Choose protocol-based data layers over platform-specific ones. When deciding where to structure your business data, ask whether the data will be accessible to every agent or only to one. A data layer built on MCP makes your data available to the entire agent ecosystem. A platform-specific optimization makes it available to one ecosystem. The first is a permanent investment. The second is a platform bet.
Think about the agents that don't exist yet. The most valuable AI agents for your business might not have been built yet. The specialized travel agent that focuses on pet-friendly mountain vacations. The car-buying agent that serves first-time EV buyers. The health navigator for post-surgical rehabilitation. These agents will emerge, they'll need data, and they'll query the data layers that exist when they launch. If your data is there, you're recommended from day one. If it's not, you're invisible until you catch up.
The Canmore hotel that structured its data through an MCP-connected layer doesn't need to know which agents will exist in 2028. It knows that its trail adjacency, its no-weight-limit pet policy, its current availability, and its midweek flexibility will be queryable by whatever agents emerge — because the data is in an open, protocol-based format that any agent can access.
The multi-agent future isn't coming — it's here. The businesses that structure their data for one protocol will be visible to every agent. The businesses that optimize for one platform will watch their visibility fragment as the market splinters. The choice between these two strategies is the most important data decision a local business will make in 2026.
