Decrypted | Insights from Virtru to Unlock New Ideas

The Era of AI: Why “Metadata on Data” is Critical Infrastructure

Written by John Ackerly | Feb 25, 2026 2:30:00 PM

In 2026, the world is buzzing with a new term: the Agentic Enterprise. A recent outlook from Salesforce hit the nail on the head, describing 2026 as a massive turning point where AI agents move from being simple "chatbots" to "outcome owners." While this may seem obvious to most industry observers, one specific idea in the report deserves emphasis: metadata on data is quietly becoming critical infrastructure.

At Virtru, we’ve been shouting this from the rooftops for years. If the future of your business depends on autonomous AI agents making decisions, then the future of your business depends on the metadata that governs the very data that those agents feed upon.

Why Metadata on Data is the New "Critical Infrastructure"

In the old world of IT, infrastructure meant servers, firewalls, and clouds. In the new world of AI, infrastructure is the context surrounding your data.

Think about it: An AI agent is only as good as the data it can access. But in a complex enterprise, how does an agent know which data is sensitive? How does it know who owns a file, whether it can be shared with a partner, or if it contains PII that must stay within a specific geographic boundary?

If you try to manage these rules at the "app" level or the "network" level, you will fail. The Agentic Enterprise moves too fast for that.

To thrive, security and governance must become more granular. In other words, they must be baked into the data itself. This is precisely the definition of metadata on data — it's a digital wrapper that travels with the information wherever it goes, describing for anyone and everyone (including robots) what they can and cannot do with the information. Without this universal and standardized metadata layer, AI agents are a liability; with it, they are your greatest competitive advantage.
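To make the idea concrete, here is a minimal sketch of a policy envelope that travels with the data. The names (`PolicyEnvelope`, `can_access`) and the attribute schema are invented for illustration; this is not the OpenTDF API, just the concept of an agent consulting the data's own metadata rather than app- or network-level rules.

```python
import json
from dataclasses import dataclass, field

@dataclass
class PolicyEnvelope:
    """Hypothetical wrapper: the payload plus the metadata that governs it."""
    payload: bytes                                   # the information itself
    owner: str                                       # who owns this object
    attributes: dict = field(default_factory=dict)   # e.g. {"pii": True, "geo": "EU"}

    def manifest(self) -> str:
        """Serialize the governing metadata (not the payload) as JSON."""
        return json.dumps({"owner": self.owner, "attributes": self.attributes})

def can_access(envelope: PolicyEnvelope, requester: dict) -> bool:
    """An agent (or its access server) checks the data's metadata, not the app."""
    attrs = envelope.attributes
    if attrs.get("pii") and requester.get("region") != attrs.get("geo"):
        return False   # PII must stay inside its geographic boundary
    if attrs.get("internal_only") and requester.get("org") != envelope.owner:
        return False   # not shareable outside the owning organization
    return True

doc = PolicyEnvelope(b"quarterly forecast", owner="acme",
                     attributes={"pii": True, "geo": "EU"})
print(can_access(doc, {"org": "acme", "region": "EU"}))     # True
print(can_access(doc, {"org": "partner", "region": "US"}))  # False
```

Because the policy rides with the object, the same check works no matter which platform or agent encounters the file.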

Recommended Reading: While AI Accelerates, the Data Owner is Left in the Dust

The Open TDF Standard: The Blueprint for Agentic Governance

If metadata on data is critical infrastructure, then the Trusted Data Format (TDF) is the open standard for how that infrastructure should be built.

Created by our co-founder, Will Ackerly, and originally used by the intelligence community to protect the world’s most sensitive secrets, TDF is an open standard that wraps data in a secure, policy-enabled envelope.

Here is why TDF is the perfect example of the metadata-driven future:

  • Object-Level Governance: TDF attaches granular policies directly to the data. When an AI agent encounters a TDF-protected file, the metadata tells the agent—and the access control server—exactly what the permissions are.
  • Interoperability: Because TDF is an open standard, it prevents vendor lock-in. In a world of "multi-agent" ecosystems, your data needs to be readable and governable across different platforms (Salesforce, Google, Microsoft, AWS, etc.) without losing its security posture – and without sacrificing sovereignty.
  • Sovereignty: Metadata allows for persistent control. If a piece of data – wrapped in standard TDF metadata – is fed into an LLM or shared with an autonomous agent, TDF allows you to revoke access or change permissions in real-time, even after the data has left your perimeter.
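The sovereignty point above can be sketched in a few lines: if the payload is encrypted and the data key is released only by a key-access service that re-evaluates policy on every request, then revoking access later simply means the service stops releasing the key, even after the bytes have left your perimeter. Everything here (`KeyAccessService`, the XOR keystream cipher) is a toy invented for this sketch; real TDF uses standard authenticated encryption and the OpenTDF key-access protocol.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (XOR against a hash-derived keystream).
    Illustration only; do not use for real encryption."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

class KeyAccessService:
    """Holds data keys; grants them only while the object's policy allows it."""
    def __init__(self):
        self._keys = {}        # object_id -> data key
        self._revoked = set()

    def register(self, object_id: str, key: bytes):
        self._keys[object_id] = key

    def revoke(self, object_id: str):
        self._revoked.add(object_id)   # takes effect even after data has left

    def request_key(self, object_id: str) -> bytes:
        if object_id in self._revoked:
            raise PermissionError("access revoked by data owner")
        return self._keys[object_id]

kas = KeyAccessService()
key = secrets.token_bytes(32)
ciphertext = keystream_xor(key, b"sensitive training data")
kas.register("obj-1", key)

# An agent can decrypt only by asking the service for the key...
print(keystream_xor(kas.request_key("obj-1"), ciphertext))  # b'sensitive training data'

# ...and loses access the moment the owner revokes, wherever the bytes now live.
kas.revoke("obj-1")
```

The design choice worth noticing: control persists because decryption is mediated, not because anyone chases down every copy of the ciphertext.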

Trust is the Ultimate Product

At Virtru, we see the challenge of data security as more than a technical hurdle. It is a trust hurdle.

Your customers and partners are going to ask: "How are you protecting my data?" If your answer is "we have strong perimeter security," you’ve already lost. If your answer is "our data carries its own sovereign security and metadata via the TDF standard," you are demonstrating a level of maturity that defines a market leader.

The Agentic Enterprise isn't just about faster workflows; it’s about secure, automated trust. By investing in metadata as critical infrastructure today, you aren't just checking a GRC box—you are building the foundation for the next decade of innovation.

Recommended Reading: Why Virtru Is Critical Infrastructure for Defense's Data-Driven Future

The Ecosystem Imperative: No Single Vendor Can Own the Data Control Plane

Here's the reality that forward-thinking enterprises and governments already understand: the data security landscape is undergoing a fundamental architectural shift—from perimeter-centric walls to data-centric controls. As organizations embrace AI, zero trust, and data sovereignty, the complexity isn't decreasing—it's exploding.

And no single vendor can solve the full stack alone.

The future requires not only open standards for metadata on data, but an ecosystem of innovators integrating their specialized capabilities. Identity and access management vendors (ICAM/IGA/PAM) must interoperate with data security posture management and data loss prevention solutions (DSPM/DLP). Secure access service edge and zero trust network access providers (SASE/ZTNA) need to coordinate with cloud infrastructure platforms.

Hardware security modules (HSMs), confidential computing platforms, and cryptographic key management systems must work seamlessly together. Systems integrators need common frameworks to orchestrate these capabilities across hybrid and multi-cloud environments.

This future isn't about any one company winning. It's about an entire ecosystem coalescing around open standards for data-centric security—which is precisely why OpenTDF sits at the core of the Virtru Data Security Platform, and why our platform is built to be extensible and programmable by design.

The companies that will thrive in the age of AI aren't those trying to avoid disruption, nor are they the ones building proprietary walled gardens. They're the ones providing essential infrastructure capabilities (like metadata on data) while enabling ecosystem integration—making AI deployment not just possible, but secure, trustworthy, and sovereign.

That's the opportunity. That's the future. And that's the ecosystem we're fostering at Virtru.