Tuesday, March 24, 2026

The Missing Mechanisms of the Agentic Economy – O'Reilly

For the past two years, I've been working with economist Ilan Strauss on the AI Disclosures Project. We started out by asking what regulators would need to know to ensure the safety of AI products that touch hundreds of millions of people. We are now exploring the missing mechanisms that are needed to enable the agentic economy.

This essay traces our path from disclosures through protocols to markets and mechanism design. Rather than simply stating our conclusions, I'm sharing our thought process and some of the conversations and historical examples that have shaped it.

We will be holding a number of focused convenings to explore these ideas over the next couple of months, and my hope is that shared context will enable more productive engagement with what is very much a work in progress.

The disclosure problem

Ilan Strauss and I started the AI Disclosures Project in early 2024 with a conviction that most regulators had little idea how AI worked or where it was going. The field was so young that many of the early regulatory proposals were misguided. We thought that regulators and industry should start by agreeing on standards for disclosure, so that we could all learn together as the technology develops. You can't regulate what you don't understand.

One of our first insights was that focusing solely on model safety was a mistake, much as if regulators inspected cars at the factory but completely ignored their use on the roads. We believed (and still do) that the focus should be on AI as deployed. And we believe that disclosures shouldn't focus just on capabilities but on business models and the operating metrics that AI companies use to shape how their products operate.

Ilan and I had worked together previously with Mariana Mazzucato at University College London on what we called "algorithmic attention rents," studying how platforms like Amazon and Google control user attention to extract economic rents from their suppliers. We saw that organic search at Google and Amazon was a huge advance in market coordination, using hundreds of signals to find the best match for a user's intent. In effect, both companies had built a better "invisible hand." And yet after decades of success, they turned away from that advance. To use Cory Doctorow's coinage, they began "enshittifying" their services by substituting inferior paid results for the top organic search results in order to pad their bottom line.

We'd also watched social media start out with the promise of keeping you in touch with your friends and fostering productive conversations, but then instead begin to optimize for engagement at the expense of everything else. By the time anybody understood what was happening, the damage had been done. We can see the inflection point in their financial metrics, but neither regulators nor the public can see the changes in operating metrics that drove the financials. What if we could capture what good looks like before it gets enshittified, and identify how that changes over time?

We also saw that modern technology companies are completely different from industrial-era firms, where you could understand key elements of the business by tracing the inputs and the outputs through the financial statements. Instead, the business is largely driven by intangibles, which are lumped into one impenetrable black box.

We wanted to learn from that mistake. While the horse was already out of the barn on search and social media, we hoped to get disclosure of operating metrics into AI governance while there was still an appetite for regulation. Unfortunately, that window was very short. The failure turned out to be productive, though, because it forced us to think harder about regulation more broadly and about what other leverage points might be found.

Protocols as functional disclosures

The first turn in our thinking came when we realized that disclosures aren't just informational. The most important disclosures are functional. We came to see the parallels between disclosures and communications protocols, the agreed-upon methods by which networked systems share information. For example, the HTTP protocol that underlies the World Wide Web specifies how a web browser and web server communicate in order to display a web page.

This is a structured communication with rules that must be followed and data that must be exchanged in a particular order. An HTTP request that identifies the user agent as a command-line program such as curl rather than a graphical browser such as Chrome triggers a different response from the server. The user-agent string isn't a report filed with a regulator. It's an operational signal embedded in the protocol, and it carries a lot of information.
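As a toy sketch of how a functional disclosure steers behavior (this is hypothetical server logic written for illustration, not any particular site's code), a server might branch on the disclosed user agent:

```python
# Hypothetical server-side dispatch: the User-Agent header is a
# functional disclosure that changes what the server sends back.

def respond(headers: dict) -> str:
    """Pick a response format based on the disclosed user agent."""
    agent = headers.get("User-Agent", "").lower()
    if agent.startswith("curl"):
        # Command-line clients often get plain text or JSON...
        return "text/plain"
    # ...while graphical browsers get full HTML.
    return "text/html"

print(respond({"User-Agent": "curl/8.5.0"}))  # text/plain
print(respond({"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)"}))  # text/html
```

The disclosure is not filed anywhere; it does its regulatory work inline, at the moment of the transaction.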

Once you see protocols as a system of functional disclosures, you start noticing that every regulatory system has a kind of communications and control protocol at its heart. Generally Accepted Accounting Principles (GAAP) or IFRS, the European equivalent, are protocols for communication between companies and their accountants, auditors, banks, investors, and tax authorities. Even road markings and road signs are a communications protocol, giving information to drivers about local conditions, laws, and the proper use of the road. These are slow, analog protocols, but they are protocols nonetheless.

Protocols can be inspected. Observability is the key to governance. Police observe speeders on the road; credit card processors and banks watch for credit card fraud on their payment networks; email providers filter spam as it passes through nodes on the network. The observability points for AI are still emerging, but that's where regulators should be focused.

Even beyond being a locus for observability and regulability, protocols themselves do an enormous amount of the governing work in modern technology systems. Spanning everything from how packets get from one place to another to what gets displayed, who has permission to see it, and sometimes even what it costs, they ultimately determine who can interoperate with whom. That led us to an even bigger realization.

Protocols shape markets

Think about the early shape of the AI chatbot market. It was a winner-takes-all race to be the dominant platform for AI in the way Windows became the platform for PCs, or iOS and Android for phones. Whoever wins controls the market. Then Anthropic launched MCP, the Model Context Protocol. Suddenly, the landscape looked more like a web. There could be many winners. It didn't matter what model you were running or whose APIs you were calling as long as you followed the protocol. And as the agentic AI market unfolded, the protocol wasn't just MCP. An AI agent could be a user of the existing internet protocol stacks. Whether MCP itself survives or is superseded by other protocols, the shape of the market was transformed.

This insight reframed our entire project. Protocols are not just technical infrastructure. They are market-shaping mechanisms.

Workflows are also protocols

I talked last week with some of the folks working on the Long Now Foundation's partnership with Ethereum's Summer of Protocols project, and that widened my lens even further.

When software people hear "protocol," we think of communication protocols: TCP/IP, HTTP, MCP, or, say, Stripe's Machine Payment Protocol (MPP).

To the Long Now folks, a protocol is any standardized way of doing something. Wildfire management teams follow protocols. So do flood response teams, hospital emergency rooms, and air traffic controllers. Atul Gawande's book The Checklist Manifesto was an attempt to establish a common protocol for surgical operating theaters. This is a very different definition of protocol, and yet putting the two meanings of the word into the same frame makes a new kind of sense.

In his introduction to the Summer of Protocols' Protocol Reader, Venkatesh Rao cited Ethereum researcher Danny Ryan's definition of a protocol as a "stratum of codified behavior" enabling coordination. He pointed out that protocols tend to become invisible once adopted. Rao calls this a "Whitehead advance," after the philosopher Alfred North Whitehead's observation that civilization advances by extending what we can do without thinking.

But he also made the thought-provoking point that a protocol is an "engineered argument," in contrast with an API, which he says is an "engineered agreement" enforced by one dominant actor. There's more to it than just the power asymmetry of enforced agreement, though. In a followup conversation, Venkatesh Rao noted that protocols are "not just codified modes of information exchange, but modes of live, structured argumentation, often with an active computational element. For example, CSMA/CD (Ethernet) must detect packet collisions and compute and execute a random delay for retransmittal of packets. This is not mere structured communication. This is argumentation with what philosophers call dynamic semantics."
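Rao's Ethernet example is concrete enough to sketch. Classic CSMA/CD resolves collisions with truncated binary exponential backoff, which looks roughly like this:

```python
import random

def backoff_delay_slots(collisions: int) -> int:
    """Truncated binary exponential backoff, as in classic Ethernet:
    after the n-th consecutive collision, a station waits a random
    number of slot times drawn uniformly from [0, 2**min(n, 10) - 1]."""
    k = min(collisions, 10)
    return random.randrange(2 ** k)

# After one collision the delay is 0 or 1 slots; the range doubles with
# each further collision, so contending stations "argue" their way to a
# transmission schedule with no central coordinator.
delays = [backoff_delay_slots(n) for n in range(1, 6)]
```

Each station computes its own answer, and the argument is settled by the medium itself: whoever's delay expires first transmits.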

Rao continued: "The moment you go beyond computing protocols, real-world feedback loops from material consequences become really important. For example, container shipping is quite close architecturally to TCP/IP (the big difference being that packets can be dropped and retransmitted while lost containers are actually lost), but because it has a materially embodied feedback loop, regulatory mechanisms start to behave more like control systems than communication systems."

I love the idea of protocols as an engineered argument. The dynamism this suggests is going to be ever more true in a future of agentic protocols. But this notion also triggered another thought, which is that markets are also engineered arguments. My bridge to this reformulation was the distinction between de jure protocols that arise from a formal standards process, and de facto protocols that arise through market competition.

In the early days of the internet, the Internet Engineering Task Force (IETF) was all about engineered arguments. People had ideas about how the internet ought to work, and to prove their point they had to show up with interoperable implementations. No one had the power to enforce anything. Agreement had to evolve. As Dave Clark famously put it, "We reject: kings, presidents, and voting. We believe in: rough consensus and running code." The de facto protocols of the internet that emerged from the IETF ended up significantly outperforming the competing de jure networking protocols that emerged from telecommunications standards bodies. The IETF framed the argument; whoever showed up made their case and won or lost by way of adoption.

It also made me remember another decades-old story that I had lived through. Microsoft and Netscape were duking it out in the web server market and were building their own "engineered agreements" for what was up the stack from the base web server functionality. Everyone thought that Apache wasn't keeping up, but it had a trump card. It provided an extension layer. And that enabled all kinds of productive arguments among a market of competing developers rather than a single engineered agreement imposed by either a dominant player OR a dominant committee.

Rao also noted that protocols spread slowly but become nearly impossible to dislodge once established. For example, SMTP (the protocol for email) dates back to 1982 and has outlasted many competitors. There's a lot of path dependence. And so getting the first steps right is an important part of engineering the argument.

And in his essay "Standards Make the World" for the Summer of Protocols project, David Lang makes the point that technical standards form a third pillar of modern society, alongside private organizations and public institutions. They aren't the state and they aren't the market, but they're essential to both. When they work well, standards become enabling technologies. The internet. The shipping container. Standard time. They're civilizational infrastructure.

In short, we're not just building communication protocols for software agents. We're developing a new way to standardize the best practices and workflows that will shape the human + AI future, allowing humans and agents to cooperate across organizations, industries, and borders.

Skills can also be seen as protocols

Once the Long Now team planted in my mind the connection between workflows and protocols, it occurred to me that Agent Skills are also a "stratum of codified behavior," and perhaps even a set of competing "engineered arguments" for how to do work with AI.

At the simplest level, a Skill is a piece of structured knowledge: here's how to create a Word document; here's how to extract the text from a PDF; here's how to publish on the Hugging Face Hub. There can be many Skills that attempt to codify the same knowledge, but some may be better than others. As Skills multiply, how will we find the best ones? This is in many ways analogous to the organic web search problem, which Google solved by aggregating hundreds of useful signals.
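As a toy illustration of what signal aggregation for skill discovery might look like (the signal names, weights, and skill names here are invented for the example, not drawn from any real registry):

```python
def rank_skills(skills: list) -> list:
    """Rank candidate skills by aggregating several quality signals,
    loosely analogous to how organic search combines signals.
    The signals and weights are hypothetical."""
    def score(s):
        return (0.5 * s["adoption"]        # how widely the skill is used
                + 0.3 * s["success_rate"]  # fraction of runs that worked
                + 0.2 * s["freshness"])    # how recently it was updated
    return sorted(skills, key=score, reverse=True)

candidates = [
    {"name": "pdf-extract-a", "adoption": 0.9, "success_rate": 0.70, "freshness": 0.2},
    {"name": "pdf-extract-b", "adoption": 0.4, "success_rate": 0.95, "freshness": 0.9},
]
print([s["name"] for s in rank_skills(candidates)])  # ['pdf-extract-a', 'pdf-extract-b']
```

The hard problem, as with web search, is not the arithmetic but choosing signals that are hard to game and that actually track quality.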

And we're seeing that there's a kind of hierarchy of skills. Jesse Vincent's Superpowers framework, which has become one of the most widely adopted open source projects in AI-assisted development, doesn't just give agents individual capabilities. It encodes an entire software development methodology: brainstorm before you build, plan before you code, test before you ship, review before you merge. That's a standardized workflow. It's a lot like the kinds of protocols that the Long Now folks were talking about, expressed in a form that agents can follow.

The kinds of protocols that the protocol research community talks about, like wildfire management protocols or hospital triage protocols, encode best practices into a repeatable, teachable process for human teams. They have yet to be adapted for agents. And in fact, many of them are never going to be completely agentic. We will need to build mechanisms for workflows that include both AI agents and humans working together.

Agent skills in some (but not all) areas raise the same questions that industrial standards have always raised: Who decides what the best practice is? How do you verify quality? How do you govern updates? We may be talking about skills that encode the workflow for regulatory compliance in a particular industry, or for conducting an environmental impact assessment, or for managing a clinical trial. Are the standards de jure or de facto, the result of an engineered agreement by a committee or an engineered argument that enables a vibrant market?

At O'Reilly, this is something we think about a lot. We're a company built on codifying expert knowledge. We've published books and organized conferences and online training that taught people how to do new things. Now we're asking, "What does it look like to publish the skills that teach agents how to do things? And how do we make sure that those skills are discoverable, trustworthy, and monetizable, not just for us but for every domain expert who has knowledge worth encoding?" And how do they emerge from competition in a vibrant market rather than by decree?

We believe we'll all be better off with an engineered argument than an engineered agreement. And that brings me to mechanism design.

The missing mechanisms

Economists use the term "mechanism design" to describe the engineering of rules and incentive structures that lead self-interested actors to produce outcomes that are good for everyone. It's often called "reverse game theory." Rather than analyzing the equilibria that emerge from a given set of rules, you start with the outcome you want and work backward to design the rules that will get you there.

Mechanism design theory got its start in the 1960s when Leonid Hurwicz took up the problem of how a planner can make good decisions when the information needed to make them is scattered among many different people, each of whom has their own interests. His key insight was that people won't reliably reveal what they know unless it's in their interest to do so. So how do you design a system that aligns their incentives?
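The textbook answer to that question is an incentive-compatible mechanism, and the classic example is the Vickrey (second-price sealed-bid) auction, where telling the truth is a dominant strategy. A minimal sketch:

```python
def second_price_auction(bids: dict) -> tuple:
    """Vickrey (second-price sealed-bid) auction: the highest bidder
    wins but pays the second-highest bid, which makes truthful bidding
    a dominant strategy -- the textbook incentive-compatible mechanism."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

print(second_price_auction({"alice": 10, "bob": 7, "carol": 4}))   # ('alice', 7)
# Overbidding doesn't change what the winner pays, so there is no
# incentive to misreport your true value:
print(second_price_auction({"alice": 100, "bob": 7, "carol": 4}))  # ('alice', 7)
```

Because the price you pay never depends on your own bid, revealing your true valuation is in your interest, which is exactly the alignment Hurwicz was after.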

The field that Hurwicz founded, and that Eric Maskin and Roger Myerson developed through the 1970s and '80s, earned all three the Nobel Prize in Economics in 2007.

I first encountered the field when Jonathan Hall, at the time the Chief Economist at Uber, waved Al Roth's book Who Gets What — and Why at me and said, "This is my Bible." In it, Roth describes his own work on mechanism design, which won him the 2012 Nobel Prize in Economics along with Lloyd Shapley. Roth applied mechanism design to kidney matching markets, markets for college admissions, for law clerks and judges, and for hospitals and medical residents. When I first talked to Jonathan and then Al Roth, my layman's takeaway about mechanism design was that it was simply the application of economic theory to design better markets.

And I've since come to think much more broadly about what mechanism design might mean in a technology context. In my broader framing, packet switching was a breakthrough in mechanism design. So, for that matter, were TCP/IP, the World Wide Web, and the protocol-centric architecture of Unix/Linux, which enabled open source and the distributed, cooperative software development environment we take for granted today. PageRank and the rest of Google's organic search system also seem to me to be a kind of mechanism design. So do pay-per-click advertising and the Google ad auction. All of them are ways of aligning incentives such that self-interested actors produce outcomes that are good for others as well.
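To make the PageRank analogy concrete, here is a toy power-iteration version of the idea (a drastic simplification of Google's actual system, shown only to illustrate the principle):

```python
def pagerank(links: dict, damping: float = 0.85, iters: int = 100) -> dict:
    """Toy PageRank by power iteration: a page's score is the chance a
    'random surfer' lands on it, following links with probability
    `damping` and jumping to a random page otherwise."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            for q in outs:  # each page splits its score among its links
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "b" is linked by both "a" and "c", so it ends up with the highest score.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a", "b"]})
```

The mechanism-design reading: a link is a costly-to-fake vote of confidence, so self-interested publishers linking to what they find useful collectively produce a ranking that serves readers.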

So that brings me back to AI. Right now, there's a problem that makes the AI/human knowledge market less efficient than it could be. The disrespect for IP that has been shown by the AI labs and applications during the training stage, and even now during inference, has led to efforts by content owners to protect their content from AI. Don't crawl. Lawsuits. Reluctance to share knowledge. Even the AI labs are complaining about the theft of their IP and trying to protect their model weights from distillation.

It's an economy crying out for mechanism design.

The lesson of YouTube Content ID is worth studying. Twenty-five years ago, the music industry was in the same place that content creators are in today with AI. In response to unauthorized use of their music by creators, music publishers' demand to YouTube was "Take it down." But as Google engineer Doug Eck explained to me, YouTube came up with a better answer: "How about we help you monetize it instead?" I don't know the details of how that decision was made, but I do know the eventual outcome. Aligned incentives led to a vibrant creator economy in which YouTube's video creators, the music companies, and Google all got to share in the value that was created.

That should give us inspiration for how to solve some of the problems we face now with AI. Whether it's with Agent Skills, NotebookLM, or other emergent artifacts of the new AI/human knowledge economy, we need to align the incentives. If we can grow the pie, and in a way where no single gatekeeper captures the bulk of the benefit, there's a way to create a vibrant market. But that requires building mechanisms that don't exist yet.

What mechanisms are missing from the agentic economy? Here's a partial list:

Skills markets. There's an enormous economic opportunity for people to create and trade skills that agents can use. These are not just simple aggregations of context with tool-use instructions, but higher-level, industry-specific workflows that encode deep human expertise. At O'Reilly, we're figuring out how to turn our knowledge and that of our authors into skills, how to make them discoverable, and how to sell them. But as of yet, there's no way for a broader community of skill creators to participate.

Quality and governance for skills. Some skills will need the same kinds of governance that industrial standards have. Who certifies that a medical skills package follows current clinical guidelines? Who updates it when the guidelines change? We haven't begun to build the institutions that will govern agent skills at that level.

Registries and discovery. The MCP community has been working on a registry protocol, as has the Ethereum community.

This isn't just a technical development but a business opportunity. I still remember when Network Solutions was running the original top-level internet domain name registry under contract from the National Science Foundation. When the government said it would stop the funding, Network Solutions planned to walk away. Then they realized what they had. On the early internet, domain name registration became a surprisingly big business. Now it's just boring civilizational infrastructure. Is there something similar for AI models, applications, and agents?

Organic search for agents. Google's first great innovation on the web wasn't making pay-per-click ads really work with a data-driven ad auction. It was organic search: a way of coordinating a market with hundreds of signals that ignored price and worked independently of whether the destination content was free or paid. The New York Times (or oreilly.com) is subscription-based, but that isn't a factor in whether Google shows it to you. Google figured out signals that let it say, "This is the best result for this query." Sites behind paywalls figured out how to disclose enough for people to decide whether they wanted to take the next step and enter into a transaction. That's an engineered argument.

We're going to need the equivalent for skills and agent services. We'll start with curated marketplaces. Vercel already has one. But we're a long way from something as effective as Google at its peak in organic search. The search space will be enormous, with hundreds of millions, maybe billions, of agents seeking the best way to accomplish trillions of distinct tasks. Skills will help them save on inference costs and deliver better outcomes. The question is what signals will drive discovery of the best match.

Extension architectures. MCP's extension model (including the new Apps Extension) is promising. This is the Apache model all over again: keep the core simple, let people layer different approaches on top, and let the market sort out which ones win. It is, in essence, an engineered argument rather than an engineered agreement.

Payment layers. Stripe has been working on agentic commerce, but it seems to be focused on traditional e-commerce transactions like booking a ticket or buying a product. What about a payment layer for skills? There have been proposals for monetizing MCP calls (pay per call, pay per token), but none have caught on yet. Coinbase's x402 protocol might end up playing a role.

Progressive access and authentication. MCP Server Cards promise to let a service specify its terms: here's what we charge, here's how you authenticate. That's a functional disclosure layer that could enable commerce. It could enable progressive privileges: a free O'Reilly subscriber gets one set of tools, a paying subscriber gets a richer set, all on top of the same MCP server. Again, that's an engineered argument with the market deciding the winners.

Neutrality in agent routing. When ChatGPT decides to show you a Booking.com widget instead of an Airbnb widget, who made that choice, and on what basis? OpenAI claims commercial considerations aren't a factor. That's hard to take at face value. We need something like the original principle of organic search: surface the best result for the user, not the most profitable one for the platform.

We don't know the future, but we can set ourselves up to shape it for the better

I'm old enough to remember when UUCP was giving way to the internet, and there was a real debate over whether explicit path routing or domain routing was better. In retrospect, it's blindingly obvious that path routing wasn't going to scale. But it's worth remembering that at the time, people weren't at all clear about that!

The same is true now. Some of what I've described will turn out to be the equivalent of explicit path routing: a dead end that was only plausible for a small-scale network. Other parts will become as fundamental as DNS or HTTP. But we're not trying to pick the winners. We're trying to engineer the argument.

If we can enable better markets, we will allow a process of discovery. People try different things, most fail, some catch on. The job right now is to build the mechanisms that help the market to evolve.

We need mechanisms that no single gatekeeper can control. Modular, decentralized architectures let people experiment with business models, routing decisions, payment systems, and quality signals. And alongside those markets, we will eventually need institutions (some of which will be protocols) to maintain the standards that will become the infrastructure of the next economy.

This article recapitulates a conversation with Ilan Strauss and Ido Salomon, and a separate conversation on the broader meaning of protocols in the context of industry workflows and civilizational infrastructure with Venkatesh Rao and Timber Schroff of the Ethereum Foundation's Summer of Protocols program, and Denise Hearn and James Home of the Long Now Foundation. Rao's Protocol Reader and David Lang's "Standards Make the World," published through the Summer of Protocols project, inform the argument about protocols as civilizational infrastructure.
