More than 5,000 years ago, the earliest written messages were delivered by hand on clay tablets in ancient Mesopotamia. Even then, civilizations struggled with duplicate records, slow transmission, and weak security.
Would it surprise you to learn that many government agencies still deliver information by hand, and still face the same challenges today?
Secure digital transfer has been available for quite some time, but strong forces inhibit states from adopting it as the status quo.
State governments handle a vast array of sensitive information, including personally identifiable information (PII), public safety records, health data, tax files, administrative documents, insurance details, and financial records. They must also maintain compliance with stringent regulations such as CJIS, HIPAA, FERPA, and IRS Publication 1075.
States must frequently share sensitive information internally and externally – with other government agencies, educational institutions, private contractors, and citizens. This necessary exchange of data presents state governments with a critical choice between a handful of options:
Conventional SaaS architecture requires states to trust 3rd-party companies with their data – as an extension of themselves. Despite malicious actors exfiltrating terabytes of citizen data from SaaS platforms, the convenience of SaaS products is often too strong to resist.
This leads to an alarming domino effect: 3rd-party data processors and subcontractors gain access to large amounts of citizen content, which is then copied to many other 3rd-party networks that states cannot track or control.
Due to internal conflict, budget constraints, or convenience, states often delegate data responsibilities to individual agencies, which may select their own storage and cybersecurity tools. This works against centralization efforts and creates ‘shadow IT,’ as data silos are built outside of Central IT’s purview.
Many state agencies have their own skilled IT professionals, but separate tenants and data centers bring conflicting admin rights along with duplicated emails, files, cyber tools, and IT payments.
To avoid the risks of SaaS and decentralized storage, a state will sometimes revert to hosting data centers on-site – which often runs counter to modernization efforts while increasing maintenance costs, resource demands, and service outages.
Many state agencies still burn CDs, send faxes, ship FedEx packages, and hand-deliver hard copies of sensitive information – because it’s been protocol for so long, or because they have no secure way to send it electronically.
The hidden gem of the bunch is often considered an ‘unreachable utopia.’
“Data-centric security” is an approach that emphasizes protecting the data itself rather than the networks, servers, or applications that store and transmit it.
Instead of protecting the data itself, many states initially deploy combinations of end-user training, Data Loss Prevention (DLP), Single Sign-On (SSO), and Endpoint Detection & Response (EDR), and follow national frameworks (NIST, FedRAMP, FIPS, etc.) in hopes of increasing security.
The reason this doesn’t fix the problem is simple: data still needs to be shared externally.
This leads to a core problem.
To paint the picture, consider a single PDF file generated by a state employee and sent to an external recipient. At minimum, that file is now likely stored on the recipient’s mail servers, endpoint devices, backups, and archives.
The state owns one secure copy of the PDF on its network, but multiple unsecured copies now exist on foreign devices instantaneously – all outside of its visibility and control.
…and that’s before the recipient downloads the PDF to personal file storage (e.g., Google Drive, Dropbox, or iCloud) or forwards it to someone else.
In common workflows like this, the state begins with only ‘minority ownership’ of its data.
Even if a state’s network is fully secure, the hundreds of millions of unstructured data objects shared externally meet the same fate – copied several times to different servers, endpoint devices, personal media, backups, and archives (all outside the state’s network).
This results in massive data sprawl – even in the most secure internal environments.
Some states have dared to ask the questions that defy archaic techniques: Can data remain under state control after it leaves the network? Can access be revoked once a file has been shared?
This technology not only exists today – it is readily available, and it’s called the Trusted Data Format (TDF).
Virtru’s TDF format not only removes 3rd-party access to unstructured data; it enables states to track and control files even inside external storage environments – and to revoke access to their data at any time.
TDF is a wrapper that goes around the data itself - like an ‘armored vehicle with a lasso.’ It enables data objects to travel into foreign environments without states losing control. TDF is the technology used in Virtru’s Platform.
This approach enables governments to move off ‘stone age’ workflows, reduce data sprawl, & take back control of the data objects normally lost in everyday workflows.
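To make the ‘armored vehicle with a lasso’ idea concrete, the control model can be sketched in a few lines of code: the encrypted envelope travels freely with any number of copies, but the decryption key never leaves a key-access service the data owner controls, so revoking the key makes every copy unreadable at once. This is a minimal illustration of the concept, not the actual TDF specification – the names (`KeyAccessServer`, `wrap`, `unwrap`) are invented for this sketch, and the toy keystream cipher stands in for real authenticated encryption.

```python
import base64
import hashlib
import os


def _toy_cipher(key: bytes, data: bytes) -> bytes:
    """Symmetric keystream cipher for illustration ONLY -- not real cryptography."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))


class KeyAccessServer:
    """Stays under the data owner's control; releasing a key = granting access."""

    def __init__(self):
        self._keys = {}        # object_id -> data encryption key
        self._revoked = set()  # object_ids the owner has pulled back

    def register(self, object_id: str, key: bytes) -> None:
        self._keys[object_id] = key

    def revoke(self, object_id: str) -> None:
        self._revoked.add(object_id)

    def request_key(self, object_id: str, requester: str) -> bytes:
        # A real key server would also evaluate the envelope's policy
        # against the requester's authenticated identity here.
        if object_id in self._revoked:
            raise PermissionError("access revoked by data owner")
        return self._keys[object_id]


def wrap(payload: bytes, object_id: str, kas: KeyAccessServer) -> dict:
    """Produce a self-describing envelope; only the key server can unlock it."""
    key = os.urandom(32)
    kas.register(object_id, key)
    return {
        "object_id": object_id,
        "policy": {"owner": "state-agency"},  # policy travels with the data
        "ciphertext": base64.b64encode(_toy_cipher(key, payload)).decode(),
    }


def unwrap(envelope: dict, requester: str, kas: KeyAccessServer) -> bytes:
    """Decrypting any copy requires a callback to the owner's key server."""
    key = kas.request_key(envelope["object_id"], requester)
    return _toy_cipher(key, base64.b64decode(envelope["ciphertext"]))
```

The envelope dictionary can be emailed, copied, or backed up anywhere; until `request_key` succeeds, each copy is opaque ciphertext. Calling `kas.revoke("doc-1")` is the ‘lasso’: every copy, wherever it landed, becomes unreadable from that moment on.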
With Virtru's data-centric approach using the Trusted Data Format, states can finally control content wherever it travels. To learn more about how Virtru can help your state seamlessly protect & share sensitive information, book a demo today.
Recommended Reading: Maryland Uses Virtru to Enable Distributed Teams with Easy-to-Use Data Protection for CJIS Compliance in the Cloud | State Government Migrates from MOVEit Transfer to Virtru Secure Share
Christian is the Senior Enterprise Director of Telecom and State & Local government at Virtru.