Data-Centric Security Technologies: The Protection Layer

Written by Mike Morper | Dec 22, 2025 2:00:04 PM

When data leaves your infrastructure (shared with a partner, accessed remotely, stored in a cloud service), your traditional security controls stop working. Network perimeters can't follow files to partner systems. VPNs don't protect data once it's downloaded. These perimeter-focused access controls only work inside your environment.

Data-centric security changes this by embedding protection directly into data objects themselves. In this post, we're going to explore the four core technologies that make this possible: Attribute-Based Access Control (ABAC), Trusted Data Format (TDF), encryption across the data lifecycle, and Policy Enforcement Points (PEPs).

This is the third post in our series on implementing Zero Trust through Data-Centric Security. New here? You can catch up on Zero Trust foundations and the data protection ecosystem, or jump right in—this post stands on its own.

Attribute-Based Access Control (ABAC)

Attribute-Based Access Control (ABAC) enables intelligent access decisions based on who you are, what you're trying to access, where you are, and what the current operational context requires. It moves beyond the limitations of simple role-based permissions to support dynamic operations and secure collaboration across organizational boundaries.

ABAC represents a sophisticated evolution beyond traditional role-based systems, directly leveraging the rich metadata foundation established through Data Security Posture Management (DSPM) discovery, expert classification, and existing tagging systems. Where metadata can be described as the "decision engine" for data-centric security (DCS), ABAC serves as the operational implementation of that engine, transforming metadata into actionable access control decisions.

ABAC evaluates access requests by considering multiple dimensions simultaneously:

  • User attributes: Clearance level, organizational affiliation, departmental assignment, project membership
  • Resource attributes: Classification level, sensitivity tier, data category, originator restrictions
  • Environmental attributes: Network security posture, geographic location, time of access, device trust level
  • Action or entitlement attributes: Read, write, forward, print, export

This multi-dimensional analysis enables the context-aware security decisions essential for complex operational requirements across both government and commercial environments.
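To make this concrete, here is a minimal sketch of how these four dimensions might combine into a single allow/deny decision. The attribute names, clearance ordering, and rules below are illustrative assumptions, not the schema of any particular ABAC product.

```python
# Illustrative ABAC decision sketch. The attribute names, clearance
# ordering, and rules are hypothetical examples, not the schema of
# any particular ABAC product.

CLEARANCE_ORDER = ["public", "internal", "confidential", "secret"]

def clearance_rank(level: str) -> int:
    return CLEARANCE_ORDER.index(level)

def evaluate_access(user: dict, resource: dict, environment: dict, action: str) -> bool:
    """Combine user, resource, environmental, and action attributes
    into a single allow/deny decision, ABAC-style."""
    # User attribute: clearance must dominate the resource classification.
    if clearance_rank(user["clearance"]) < clearance_rank(resource["classification"]):
        return False
    # Resource attribute: originator restrictions limit which orgs may see it.
    if resource.get("allowed_orgs") and user["org"] not in resource["allowed_orgs"]:
        return False
    # Environmental attribute: risky contexts forbid high-impact actions.
    if environment["network"] != "trusted" and action in {"export", "print", "forward"}:
        return False
    # Action attribute: the user must hold an entitlement for this action.
    return action in user["entitlements"]

# Example: an analyst reading a secret report from a trusted network.
user = {"clearance": "secret", "org": "agency-a", "entitlements": {"read", "forward"}}
resource = {"classification": "secret", "allowed_orgs": {"agency-a", "agency-b"}}
env = {"network": "trusted"}
print(evaluate_access(user, resource, env, "read"))    # True
print(evaluate_access(user, resource, env, "export"))  # False: no entitlement
```

A production policy engine evaluates rules expressed in a standardized policy language rather than hardcoded functions (more on that below), but the shape of the decision, a boolean combination of user, resource, environmental, and action checks, is the same.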

Operational Examples

Government Context: An intelligence analyst accessing the same classified report will receive different permissions based on their current assignment (headquarters vs. forward deployed), operational context (routine analysis vs. crisis response), and intended action (reading vs. sharing with coalition partners). ABAC makes these distinctions automatically using the metadata established through discovery and classification processes.

Commercial Context: A financial analyst at a multinational bank accessing customer portfolio data will receive different permissions based on their current role (portfolio management vs. compliance review), regulatory context (domestic transactions vs. cross-border reporting requirements), and intended action (viewing vs. exporting for third-party analysis). Similarly, a product engineer at a manufacturing company might access design specifications with permissions that vary based on their project assignment, whether they're working from a corporate network or a supplier facility, and whether they need read-only access or modification rights.

Technical Implementation

Policy Decision Points (PDPs): ABAC systems utilize policy engines that evaluate access requests against complex rule sets, processing the attribute combinations discovered by DSPM tools and applied by expert classification processes. These engines use boolean logic and contextual evaluations to make real-time access decisions.
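To illustrate, the sketch below treats policies as data: rule sets whose conditions are AND-ed within a rule and OR-ed across rules. The rule format is invented for this example and far simpler than a standardized language like XACML, but it shows the boolean, attribute-driven evaluation a PDP performs.

```python
# Hypothetical policy-as-data sketch: a PDP evaluating boolean rule sets.
# The rule format is invented for illustration, far simpler than XACML.
from typing import Any

RULES = [
    # Each rule is a list of (attribute path, operator, expected value)
    # conditions; conditions AND within a rule, rules OR with each other.
    [("user.department", "eq", "finance"), ("env.network", "eq", "corporate")],
    [("user.role", "eq", "auditor"), ("action", "eq", "read")],
]

def lookup(request: dict, path: str) -> Any:
    """Resolve a dotted attribute path against the access request."""
    node = request
    for part in path.split("."):
        node = node[part]
    return node

def check(request: dict, condition: tuple) -> bool:
    path, op, expected = condition
    actual = lookup(request, path)
    return actual == expected if op == "eq" else actual != expected

def decide(request: dict) -> str:
    permit = any(all(check(request, c) for c in rule) for rule in RULES)
    return "Permit" if permit else "Deny"

request = {"user": {"department": "finance", "role": "analyst"},
           "env": {"network": "corporate"}, "action": "export"}
print(decide(request))  # Permit: the first rule matches
```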

Attribute Integration: ABAC implementations seamlessly integrate with the metadata ecosystem described in our last blog, consuming attributes from DSPM discovery platforms, manual classification tools like Titus and Boldon James, and existing organizational identity systems. This integration ensures that investment in discovery and classification capabilities directly enhances access control precision.

Standards-Based Policies: ABAC policies leverage standardized languages such as XACML (eXtensible Access Control Markup Language) that enable complex rule definitions while maintaining interoperability across different systems and organizations—critical for joint operations, coalition partnerships, and commercial ecosystems involving multiple business partners or third-party vendors.

Strategic Benefits

Dynamic Operational Support: ABAC policies automatically adapt to changing operational contexts, leveraging real-time metadata updates from discovery systems to ensure personnel receive appropriate access permissions without manual security administration intervention. That holds whether you're supporting military operations where mission requirements shift rapidly, M&A activities where data access needs change as deals progress, or incident response scenarios where cross-functional teams need temporary elevated access.

Cross-Boundary Collaboration: ABAC facilitates secure information sharing across organizational boundaries by implementing policies that consider organizational affiliation, partnership status, project participation, and information sharing agreements simultaneously—all derived from the comprehensive metadata foundation. For defense organizations, this means secure collaboration with allied and partner nations. For commercial enterprises, this enables secure collaboration with suppliers, customers, business partners, and contractors while maintaining appropriate data governance.

Beyond Traditional RBAC: While role-based access control assigns permissions based on predefined organizational roles, ABAC considers the specific context of each access request using the rich metadata generated through discovery and classification processes. This flexibility supports operations where personnel may temporarily require different access levels based on project assignments, emergency situations, or cross-functional collaboration needs—whether that's a military operation requiring rapid information sharing across service branches or a product launch requiring temporary access to competitive intelligence across business units.

The Trusted Data Format (TDF)

Trusted Data Format (TDF) is an open standard that binds security policies directly to data objects, so protection travels with information regardless of storage location, sharing destination, or processing environment. The result is secure collaboration with persistent protection and detailed audit trails. As a vendor-neutral standard, TDF enables interoperability while preventing technology lock-in, which is critical for long-term organizational technology independence.

TDF implements data-centric security by wrapping data objects with embedded encryption and policy controls, transforming ordinary files into self-protecting data objects that enforce access policies and maintain audit trails. TDF represents the operational implementation of the persistent protection principles described in our last blog, ensuring that metadata discovered through DSPM platforms and applied through expert classification becomes inseparably bound to the data itself.

TDF supports various data encoding formats (XML, JSON, binary) and data types (structured and unstructured, including streaming payloads), making it applicable across diverse operational environments. Because TDF is an open standard, standards bodies and coalitions have created variations of it, including IC-TDF for Intelligence Community applications and ZTDF for Zero Trust implementations.

As an open standard, TDF addresses the vendor independence and transparency requirements essential for organizations requiring long-term access to critical data. Organizations benefit from TDF's open standard foundation through:

  1. Assured long-term access to protected data regardless of vendor relationships or commercial market changes
  2. Customization capability for unique operational requirements and industry-specific compliance needs
  3. Transparency for security evaluation and approval processes, particularly important in regulated industries
  4. Community-driven innovation that accelerates capability development while reducing individual organizational costs

Organizations can implement TDF using multiple vendor solutions or open source implementations, preventing technology lock-in while ensuring long-term access to critical security capabilities.

Recommended Reading: DSPM Meets EDRM: Extending Data-Centric Security Beyond the Perimeter

Operational Context Examples

Government Scenario: When DSPM systems discover classified content and experts apply appropriate markings, TDF ensures those security policies travel with the data through all subsequent sharing, storage, and access scenarios—from secure servers to coalition partner systems to tactical edge devices.

Commercial Scenarios: When a pharmaceutical company's DSPM systems identify clinical trial data containing patient information, TDF protection ensures HIPAA compliance requirements travel with that data whether it's accessed by internal researchers, shared with FDA reviewers, or transmitted to partner research institutions. Similarly, when a financial services firm protects trading algorithms or customer portfolio data with TDF, those protections remain active whether the data resides in on-premises servers, cloud analytics platforms, or is shared with regulatory auditors.

Technical Architecture

TDF objects consist of three primary components that work together to implement the data protection ecosystem:

  1. Encrypted Payload: The actual data encrypted using symmetric encryption (typically AES-256)
  2. Encrypted Metadata: Policy information, access controls, and audit requirements encrypted separately to enable policy evaluation without exposing sensitive content
  3. Key Access Objects: Encrypted data encryption keys that can only be retrieved by authorized systems using valid authentication and policy evaluation
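As a rough illustration of how these three components fit together, the sketch below wraps a payload in a TDF-like structure using AES-256-GCM from Python's cryptography package. The field names and the kas_wrap helper are simplified assumptions for readability; the actual TDF specification defines its own manifest format and key access protocol.

```python
# Simplified TDF-style wrapper. Field names are illustrative assumptions;
# the real TDF specification defines its own manifest format and key
# access protocol.
import base64
import json
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def b64(data: bytes) -> str:
    return base64.b64encode(data).decode()

def wrap_tdf_like(payload: bytes, policy: dict, kas_wrap) -> dict:
    """Wrap a payload into a simplified TDF-like object."""
    dek = AESGCM.generate_key(bit_length=256)  # data encryption key (DEK)
    aes = AESGCM(dek)

    # 1. Encrypted payload: the data itself, under AES-256-GCM.
    p_nonce = os.urandom(12)
    ciphertext = aes.encrypt(p_nonce, payload, None)

    # 2. Encrypted metadata: the policy, sealed separately from the payload.
    #    (Sealed under the same DEK here for brevity; a real implementation
    #    keys the policy so it can be evaluated without reading the payload.)
    m_nonce = os.urandom(12)
    enc_policy = aes.encrypt(m_nonce, json.dumps(policy).encode(), None)

    # 3. Key access object: the DEK wrapped for a key access service, which
    #    releases it only after authentication and policy evaluation.
    return {
        "payload": {"nonce": b64(p_nonce), "ciphertext": b64(ciphertext)},
        "metadata": {"nonce": b64(m_nonce), "ciphertext": b64(enc_policy)},
        "keyAccess": {"wrappedKey": b64(kas_wrap(dek))},
    }

# Example with a stand-in wrapper (a real KAS would wrap asymmetrically):
obj = wrap_tdf_like(b"quarterly forecast", {"classification": "confidential"},
                    kas_wrap=lambda dek: dek[::-1])
print(sorted(obj))  # ['keyAccess', 'metadata', 'payload']
```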


Recommended Reading: The Microsecurity Revolution

Standards-Based Interoperability: TDF's open-standard architecture enables seamless integration across different systems, applications, and organizations, eliminating the need for proprietary software dependencies. This interoperability is essential for joint operations and coalition partnerships in defense contexts, as well as for commercial ecosystems where diverse technology environments—from legacy ERP systems to modern cloud platforms—must work together securely.

Policy Integration: TDF objects utilize the metadata foundation established in our last blog, incorporating access policies that reflect DSPM discovery results, expert classification decisions, and organizational security requirements in a portable and enforceable format.

Operational Benefits and Strategic Advantages

  • Persistent Protection: TDF protection remains active throughout the complete data lifecycle, implementing the "last mile" protection that complements "first mile" DSPM discovery. Files can be copied, forwarded, cached, or stored across multiple systems while maintaining their security properties and access controls. This persistence is critical whether you're protecting classified intelligence that must maintain security across coalition networks or protecting proprietary product designs that must remain secure as they move through global supply chains.
  • Vendor Neutrality: As an open standard, TDF enables organizations to avoid vendor lock-in while fostering innovation through community development. This is particularly important for government organizations requiring long-term technology independence, but equally valuable for commercial enterprises that need assurance of continued data access regardless of vendor business decisions or market consolidation.
  • Dynamic Policy Management: TDF enables data owners to update access policies even after data has been shared, supporting dynamic operational requirements where access needs change rapidly. In military contexts, this might mean revoking access to intelligence when security conditions change or personnel transfer. In commercial contexts, this enables organizations to immediately revoke contractor access when projects complete, update data sharing permissions when regulatory requirements change, or respond to security incidents by restricting access to potentially compromised data—all without attempting to recall or delete distributed copies.
  • Comprehensive Auditing: TDF objects maintain detailed audit trails that track access attempts, policy evaluations, and data usage patterns, providing operational intelligence for proactive security management. These audit capabilities support both security operations (detecting anomalous access patterns that might indicate compromise) and compliance requirements (demonstrating appropriate data handling for regulatory frameworks like CMMC, NIST, ITAR, GDPR, or HIPAA).

Encryption Across the Data Lifecycle

Comprehensive encryption ensures data protection whether stored, transmitted, or actively being processed—enabling secure operations across diverse environments from remote field locations to cloud computing platforms while maintaining performance and operational effectiveness.

Effective data-centric security requires encryption protection throughout the complete data lifecycle, from initial creation through final disposition. This comprehensive approach ensures that sensitive information remains protected regardless of processing context, storage location, or transmission method.

Implementation Components

  1. Data at Rest: Storage encryption protects information stored on servers, databases, mobile devices, and backup systems using strong encryption algorithms and secure key management, ensuring protection against physical compromise and unauthorized access. This protection is essential whether securing classified intelligence on government servers, protecting patient records in healthcare databases, or safeguarding intellectual property in corporate file repositories.
  2. Data in Transit: Network encryption protects information during transmission across networks, internet connections, and communication systems using protocols that provide authentication, integrity, and confidentiality throughout data movement. This protection operates across diverse network environments—from secure government networks to public internet connections used for remote work or partner collaboration.
  3. Data in Use: Advanced encryption techniques protect information during active processing, enabling secure computation on encrypted data without exposing sensitive content to processing systems or unauthorized users. These capabilities are increasingly important as organizations leverage cloud analytics, AI/ML processing, and third-party computing resources where the processing infrastructure itself may not be fully trusted.
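As a small sketch of the first two stages, the snippet below seals a file at rest with AES-256-GCM and opens a TLS-protected channel for data in transit, using Python's cryptography package and standard library. File names and hosts are placeholders. Data-in-use techniques, such as confidential computing enclaves or homomorphic encryption, require specialized hardware or libraries and are omitted here.

```python
# Minimal lifecycle-encryption sketch covering data at rest and in transit.
# File names, hosts, and key handling are placeholders; production systems
# would pull keys from a key management service rather than generate locally.
import os
import socket
import ssl
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# --- Data at rest: seal a file under AES-256-GCM. ---
key = AESGCM.generate_key(bit_length=256)  # in practice: fetched from a KMS
nonce = os.urandom(12)
with open("report.txt", "rb") as f:          # placeholder file
    sealed = nonce + AESGCM(key).encrypt(nonce, f.read(), None)
with open("report.txt.enc", "wb") as f:
    f.write(sealed)

# --- Data in transit: authenticated, confidential channel over TLS. ---
ctx = ssl.create_default_context()  # verifies server certificates by default
with socket.create_connection(("example.com", 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
        print("negotiated:", tls.version())  # e.g. TLSv1.3
```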

Operational Integration

Lifecycle encryption integrates with organizational key management systems (which we'll explore in detail in our next post) to provide centralized policy enforcement and audit capabilities while supporting diverse operational requirements. Integration with ABAC systems enables encryption policies that adapt to operational context, user attributes, and environmental conditions—ensuring that encryption provides security without constraining operational flexibility.

Policy Enforcement Points (PEPs) at the Data Layer

Policy Enforcement Points (PEPs) provide the technical enforcement mechanisms that translate security policies into operational controls, ensuring consistent protection regardless of the application, system, or network being accessed—critical for operations spanning diverse environments and organizational boundaries.

PEPs in data-centric architectures implement access control decisions generated by ABAC systems, utilizing the metadata foundation established through discovery and classification processes. Unlike traditional PEPs that operate at network boundaries (where they can be bypassed once an attacker gains internal access), data-layer PEPs are embedded within or tightly coupled to data objects themselves, ensuring that the rich metadata from DSPM discovery and expert classification directly drives security enforcement.

Operational Examples

Government Context: When an analyst attempts to access classified intelligence, the PEP evaluates their clearance level, current assignment, operational context, and intended action against the policies embedded within the TDF-protected document, all using the metadata established through the discovery and classification ecosystem.

Commercial Contexts: When a sales representative attempts to access customer financial data in a CRM system, the PEP evaluates their role, current location (corporate network vs. public wifi), device security posture, and intended action (viewing vs. exporting) against embedded policies. When a contractor at a manufacturing facility attempts to access product specifications, the PEP verifies their project assignment, current contract status, and whether access is being requested from approved systems—all in real-time, without requiring manual verification by security administrators.

Recommended Viewing: Virtru Data Security Platform: Email Policy Enforcement Point (PEP)

Technical Implementation

Software Integration: PEPs integrate with applications, operating systems, and data access layers through software libraries and agents that intercept data access requests, ensuring seamless operation within existing organizational technology environments. This approach enables data protection to work consistently across diverse applications, from custom-built operational systems to commercial SaaS platforms.

Standards-Based Enforcement: PEPs use open standards and APIs to integrate with diverse technology environments, supporting the vendor-neutral approach necessary for organizations that require technology independence and long-term operational flexibility. This standards-based approach is critical for government operations, but equally valuable for commercial enterprises that need security capabilities that work consistently across hybrid and multi-cloud environments.

Real-Time Evaluation: PEPs evaluate access requests in real-time using current attribute information from identity systems, environmental context, and embedded policies, enabling dynamic access control that adapts to changing operational conditions. This real-time capability ensures that access decisions reflect current rather than historical context—whether that's detecting that a user credential has been compromised, identifying that a device no longer meets security compliance requirements, or recognizing that operational conditions have changed in ways that affect data access appropriateness.
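Here is a minimal sketch of what that evaluation loop can look like at the data layer: a PEP that re-checks policy, revocation state, and context before releasing a key on every access. The stores and helper names are hypothetical in-process stand-ins for a real policy decision point and key access service.

```python
# Minimal data-layer PEP sketch. The stores and helper names below are
# hypothetical in-process stand-ins for a policy decision point and a
# key access service.
POLICY_STORE = {
    "doc-1": {"allowed_orgs": {"acme"}, "allowed_actions": {"read"}},
}
KEY_STORE = {"doc-1": b"secret-key-material"}
REVOKED = set()   # updated in real time by data owners
AUDIT_LOG = []

def enforce_access(object_id: str, user: dict, env: dict, action: str) -> bytes:
    """The PEP: re-evaluate policy, revocation, and context on every access."""
    policy = POLICY_STORE[object_id]
    # Real-time ABAC evaluation: current attributes, not cached decisions.
    allowed = (
        user["org"] in policy["allowed_orgs"]
        and action in policy["allowed_actions"]
        and env["device_trusted"]
    )
    if not allowed:
        raise PermissionError(f"policy denied {action} on {object_id}")
    # Dynamic revocation: key release fails closed once an object is revoked.
    if object_id in REVOKED:
        raise PermissionError(f"{object_id} has been revoked")
    AUDIT_LOG.append((object_id, user["id"], action))  # audit trail
    return KEY_STORE[object_id]  # key released only after every check passes

# Example: access succeeds until the owner revokes the object.
user = {"id": "alice", "org": "acme"}
env = {"device_trusted": True}
print(enforce_access("doc-1", user, env, "read"))  # returns the key
REVOKED.add("doc-1")
# enforce_access("doc-1", user, env, "read") would now raise PermissionError
```

Because the key is released per access rather than stored alongside the data, revocation takes effect immediately, even for copies that have already been distributed.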

Critical Enforcement Functions

Metadata-Driven Decisions: PEPs translate the rich metadata from discovery and classification processes into actionable access control decisions, implementing ABAC policies that consider user attributes, environmental context, and operational requirements simultaneously.

Usage Control Beyond Access: PEPs enforce sophisticated usage limitations that go beyond simple allow/deny access decisions. These controls include restrictions on copying, forwarding, printing, screenshotting, and time-limited access capabilities. In government contexts, these controls are essential for protecting intelligence sources and methods. In commercial contexts, they enable organizations to share sensitive information with partners or contractors while preventing unauthorized redistribution—allowing a supplier to view product specifications without enabling them to forward those specifications to competitors, or permitting a financial advisor to access customer data during the advisory relationship while preventing data retention after the relationship ends.

Dynamic Revocation: PEPs enable real-time access revocation by checking policy updates and key validity before each access attempt, allowing data owners to immediately respond to changing security conditions or compromised credentials. This capability means that organizations can revoke access to distributed data immediately when employees leave, when contractors complete projects, when security incidents are detected, or when regulatory requirements change—without attempting to track down and delete every copy of the data.

Technology Integration: The Complete Protection Layer

These four core technologies work together to implement the comprehensive data protection ecosystem, transforming how organizations secure and share business-critical information. The integration creates end-to-end protection that spans from metadata-driven decisions through persistent protection and intelligent enforcement.

This technology integration enables several capabilities:

Cross-Boundary Operations: Information can move securely across organizational boundaries, security domains, and partner ecosystems while maintaining consistent protection. For defense organizations, this means sharing intelligence across classification levels and with coalition partners. For commercial enterprises, this means collaborating with suppliers, customers, and business partners without creating security vulnerabilities or complex bilateral security agreements for each relationship.

Environment Independence: Protection operates consistently whether data resides in on-premises systems, cloud platforms, mobile devices, or partner infrastructures. Organizations don't need to establish equivalent security controls in every environment where data might travel—the protection travels with the data itself.

Operational Agility: Security adapts automatically to changing contexts without manual intervention. Access decisions reflect current operational conditions, user attributes, and threat environments without requiring security administrators to manually update permissions as situations evolve.

Comprehensive Auditability: Organizations maintain complete visibility into data usage patterns regardless of where access occurs, supporting both security operations and compliance requirements across diverse regulatory frameworks.

Standards-Based Assurance: The open standards foundation ensures long-term viability while preventing vendor lock-in, supporting the technology independence essential for critical systems—whether those are national security systems requiring assured long-term access or commercial systems where organizations need independence from vendor business decisions.

What's Next: The Management and Operations Layer

These four technologies (ABAC, TDF, encryption, and PEPs) create the protection envelope around your data, enabling the persistent, intelligent, context-aware security that defines data-centric approaches. But implementing these protections at enterprise scale requires sophisticated infrastructure for managing cryptographic operations, integrating with authoritative identity systems, and orchestrating policies across diverse environments.

In our next post, we'll explore the management and operations layer: key management strategies that enable organizational sovereignty and coalition collaboration, identity integration that provides authoritative attributes for access decisions, and the orchestration capabilities that make data-centric security operationally practical across complex organizational environments.

This post is the third in a series on implementing Zero Trust through Data-Centric Security. Read our previous entries on The Foundations of Zero Trust and Understanding Data-Centric Security: From Zero Trust Principles to Practice.