Data Protection Compliance Isn’t What You Think — A Field CTO’s Take
When it comes to data protection compliance, every regulation you’ve ever dealt with (HIPAA, CMMC, FISMA, PCI, DORA, the state privacy laws, the OMB memos and executive orders) has one thing in common.
They’re not asking you to be innovative. They’re asking you to show your work.
Remember 6th grade math? You’d get the answer right, but the teacher still flipped your paper over and said, “Show me how you got there.” That’s what regulators are doing right now. The days of saying “we have resilience in our infrastructure” and calling it done? Over. Now they want to see the process. The policy. The proof. They want to see what happens when the plan hits reality, and whether it fails gracefully instead of catastrophically.
That shift changes everything about how you approach data protection compliance.
What Data Protection Compliance Actually Requires
Strip away the acronyms, the compliance frameworks, the specific industry requirements. At the core, every data-centric regulation is asking two things:
Do you know what you have? And can you prove what you’ve done with it?
That’s visibility and assurance. Those are my two words. Not sexy words, but the right ones.
Visibility: Do I know that EJ’s data exists in my environment? Do I know that his Social Security number, his email address, his credit card number, some combination of identifiers that adds up to a regulatory obligation, is sitting somewhere in my infrastructure?
Assurance: If I have a policy that says that data gets protected a certain way, do I know the policy actually works? Not just that I wrote it. That it works.
Most organizations have gotten pretty good at the first one, at least for the obvious places. Where they fall apart is the second.
The Skip Column: Where Data Protection Compliance Breaks Down
Here’s a scenario I walk customers through all the time.
You have a policy: any file leaving your organization going to a trusted partner gets a Microsoft Information Protection sensitivity label. Restricted, authentication required, the whole thing. Beautiful policy. Solid governance. Your auditor is going to love it.
Except.
Not every file in your environment is a .docx or an .xlsx. You’ve got CSV files. Text files. PowerPoints from 2014. These file types can’t take a sensitivity label. They’re invisible to that policy.
So what does normal policy do? It skips them.
Now you’ve got a skip column. And when the auditor shows up and asks what happened to those files, “we skipped them” isn’t an answer. It’s a liability.
What if your fallback remediation, instead of skipping, automatically encrypted the file? Or redacted the sensitive content? So the policy either does this or it does that, and it never just skips?
Now when the auditor asks, you pull up your logs: here are the 100 files, here are the 75 that got labeled, here are the 25 that couldn’t take a label and got encrypted instead. Skip column? Zero.
That’s assurance. That’s what data protection compliance looks like in practice.
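The fallback logic in that scenario can be sketched in a few lines. This is an illustrative sketch only, not a real PKWARE or Microsoft API: `apply_sensitivity_label` and `encrypt_file` are hypothetical stand-ins, and the set of labelable extensions is an assumption for the example.

```python
import os

# Hypothetical stand-ins for real remediation calls.
def apply_sensitivity_label(path: str, label: str) -> None:
    pass  # e.g., a call into a labeling SDK would go here

def encrypt_file(path: str) -> None:
    pass  # e.g., a call into an encryption engine would go here

# Assumed set of formats that can hold an embedded sensitivity label.
LABELABLE = {".docx", ".xlsx", ".pptx"}

def remediate(path: str) -> str:
    """Label when the format supports it; otherwise encrypt. Never skip."""
    ext = os.path.splitext(path)[1].lower()
    if ext in LABELABLE:
        apply_sensitivity_label(path, "Restricted")
        return "labeled"
    # Fallback: file types that can't carry a label get encrypted instead,
    # so every file lands in an auditable bucket and the skip column stays empty.
    encrypt_file(path)
    return "encrypted"
```

Every file returns an action, so the audit log always has an answer other than “we skipped it.”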
Data in Motion Is the Hard Part
Protecting data at rest isn’t new technology. I said this at a Gartner event not long ago. Symantec bought Vontu in 2007. Broadcom bought Symantec. You’re talking about DLP technology that has graduated from college and is looking for its first apartment. It’s that old.
The reason every modern regulation has data in motion as a requirement isn’t because regulators suddenly got smarter. It’s because they finally stopped assuming your data stays where you put it.
They’re not assured it’s sitting in your mainframe. They’re not assured it’s staying in your AWS bucket. They know that users interact with files. They know data gets migrated. They know your third-party credit card printer has your customers’ names on it. And when that vendor gets breached, the headline doesn’t say “Billy’s Card Shop Compromised.” It says your name.
You’re culpable for the headline even if you’re not legally culpable for the breach.
So the question for data protection compliance shifts from “is our data encrypted at rest?” to “is our data protected wherever it goes?” Those are fundamentally different questions, and they require fundamentally different architecture.
The Quantum Problem Nobody’s Talking About Correctly
Everyone talks about quantum computing like the risk is that old algorithms get cracked. That’s real, but it’s half the conversation.
The other half is agility.
Keys and signatures are getting massive. What used to be measured in bytes is now kilobytes, and in some schemes far more. Can your infrastructure even support storing them? And more importantly, when the current generation of quantum-resistant algorithms eventually gets cracked (and something always does), how fast can you rotate?
Key rotation isn’t a hygiene exercise anymore. It’s an emergency response capability. The organizations that are going to be fine aren’t the ones that picked the best algorithm today. They’re the ones that built the infrastructure to change it tomorrow.
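One way to picture that agility: store the algorithm identifier alongside the wrapped key, so swapping algorithms becomes a re-wrap operation rather than a redesign. This is a minimal sketch under assumed names; `WrappedKey`, `rotate`, and the algorithm strings are illustrative, not any vendor’s interface.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class WrappedKey:
    algorithm: str     # e.g. "RSA-2048" today, "ML-KEM-768" tomorrow
    key_id: str
    wrapped_bytes: bytes

def rotate(old: WrappedKey,
           new_algorithm: str,
           rewrap: Callable[[bytes, str], bytes]) -> WrappedKey:
    """Re-wrap the data key under a new algorithm.

    The encrypted data itself is untouched; only the key envelope changes,
    which is what makes rotation an operational step instead of a migration.
    """
    return WrappedKey(
        algorithm=new_algorithm,
        key_id=old.key_id + "-r1",  # new version of the same logical key
        wrapped_bytes=rewrap(old.wrapped_bytes, new_algorithm),
    )
```

Because the algorithm is data, not an assumption baked into the code, “be ready to change” becomes a config update plus a batch re-wrap.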
Regulation is starting to understand this. It’s not there yet. Regulation never leads, it chases. But the NIST post-quantum cryptography standards, the updated FISMA requirements, the CMMC guidance, they’re all pointing in the same direction. Be ready to change.
We’re Not a Hammer. We’re a Tool Belt.
Here’s what I hear from customers all the time:
“PKWARE is a really great encryption company.”
That’s true. But it’s incomplete.
We also do redaction. We do labeling. We do deletion. We do movement, as in, don’t leave it here, put it somewhere appropriate. Having those options matters because the right answer for data protection compliance isn’t always the same answer. Encrypt everything sounds like a policy until you run into a file that needs to be deleted, not encrypted. Redact everything sounds like a policy until you’re dealing with a partner file transfer that needs to stay intact.
The best thing for your data depends on the workflow and the use case. A hammer looks for a nail. A tool belt looks at the job.
When a customer has four products in their data security stack (PKWARE on endpoints and file servers, BigID for cloud repositories, Varonis for access controls, a CASB doing its thing) and a regulator asks them to demonstrate compliance, they’re going to four products, assembling four reports, reconciling four policy frameworks. That’s not a security problem. That’s a resource problem on a team that’s already stretched thin.
Data-first architecture solves for that. If your security posture is built around the data itself, not the network, not the firewall, not the cloud bucket, your policies travel with the data regardless of where it lands. S3 bucket, Google Cloud, Azure, endpoint, mainframe. The protection isn’t tied to the location. It’s tied to the content.
What PKWARE Actually Does for Data Protection Compliance
When you boil down what regulators actually need, it’s this:
Know what you have. Take action on what matters. Prove that it worked.
Discovery. Remediation. Reporting. And flexible remediation, which is the part most vendors skip because they only have one answer.
We assume the data is going to move. That assumption is baked into how we build policy. So when it does move (and it will), you’re not scrambling to retrofit protection onto something that’s already in motion. You’re ahead of it.
And if you’re working with DLP tools already, that’s not a conflict. Signal us to the DLP. Let us scan, encrypt, and write to the metadata header that says: PKWARE scanned this. It passed policy. You don’t need to rescan it. Let the encrypted file through. You can’t read it anyway, but you know it was handled.
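The handshake above can be sketched as a provenance marker written alongside the encrypted payload, which the DLP verifies instead of rescanning. The marker format and field names here are hypothetical, purely to show the shape of the idea; a hash binding the marker to the payload keeps it from being reused on a different file.

```python
import hashlib

def make_marker(payload: bytes, policy_id: str) -> dict:
    """Marker written by the scanner: 'this payload was scanned and passed.'"""
    return {
        "scanner": "PKWARE",          # who scanned it (illustrative value)
        "policy": policy_id,          # which policy it passed
        "sha256": hashlib.sha256(payload).hexdigest(),
        "status": "passed",
    }

def dlp_should_pass(marker: dict, payload: bytes) -> bool:
    """DLP-side check: trust the marker only if it matches this exact payload."""
    return (
        marker.get("status") == "passed"
        and marker.get("sha256") == hashlib.sha256(payload).hexdigest()
    )
```

The DLP can’t read the ciphertext, and it doesn’t need to: the marker tells it the file was handled, and the hash tells it the marker belongs to this file.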
That’s not replacement. That’s augmentation. That’s what a data-first company can do for a network-first stack.
For teams looking to understand the regulatory landscape in more depth, the CISA data security guidance is a solid starting point alongside whatever framework your industry requires.
The Closet Analogy
I’ll leave you with this.
You want to clean out your closet. You don’t start by examining each shirt individually and deciding on the spot. You start by taking inventory. Dress shirts over here. Casual over there. T-shirts in the back. Now you know what you’re working with.
Once you have inventory, the decisions get fast and defensible. Keep, donate, trash. Policy applied.
That’s what every regulation is asking you to do with your data. Take inventory. Apply policy. Show the work.
If you can’t do those three things confidently right now, for every data type, in every environment, including the ones moving between them, the regulation isn’t your problem.
Your visibility is.
EJ Pappas is the Field CTO at PKWARE. He works with enterprise security, compliance, and governance teams navigating data protection compliance across regulated industries.