Sure, I can protect data with CryptProtectData, but how do I remove the ability to decrypt it?


A customer was using the CryptProtectData function to protect some information, and they used the corresponding function CryptUnprotectData to decrypt the buffer and recover the information. But they wanted to know how to render the protected information un-decryptable when their program is uninstalled.
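For those who haven't used these functions, the basic round trip looks roughly like this (a minimal sketch; error handling is pared down, and you link with crypt32.lib):

```cpp
#include <windows.h>
#include <wincrypt.h>

bool RoundTrip()
{
    char secret[] = "attack at dawn";
    DATA_BLOB input = { sizeof(secret), reinterpret_cast<BYTE*>(secret) };
    DATA_BLOB protectedBlob = {};
    DATA_BLOB recovered = {};

    // Encrypt under a key derived from the current user's credentials.
    if (!CryptProtectData(&input, L"description", nullptr /* no extra entropy */,
                          nullptr, nullptr, CRYPTPROTECT_UI_FORBIDDEN,
                          &protectedBlob)) return false;

    // ... persist protectedBlob.pbData / protectedBlob.cbData, then later ...

    // Decrypt; succeeds only when running as the same user.
    bool ok = CryptUnprotectData(&protectedBlob, nullptr, nullptr,
                                 nullptr, nullptr, CRYPTPROTECT_UI_FORBIDDEN,
                                 &recovered) != FALSE;
    if (ok) LocalFree(recovered.pbData);
    LocalFree(protectedBlob.pbData);
    return ok;
}
```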

The decryption key is tied to the user, so there is no way to revoke it. But what you can do is put something in the entropy (nonce). In order to decrypt the data, the caller must be running as the correct user, and the caller must be able to produce the entropy.

For example, you might generate a random number when the program is installed and save that random number somewhere. Whenever you need to encrypt or decrypt data, you combine that random number with whatever entropy source you would normally have used.
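In code, that scheme might look something like the following sketch. How you generate and stash installSecret at install time (say, with BCryptGenRandom) is up to you; the names here are illustrative:

```cpp
#include <windows.h>
#include <wincrypt.h>

// installSecret: random bytes generated at install time and saved somewhere.
// Destroying them at uninstall renders the protected data undecryptable
// (to the extent that no copy of them survives).
BOOL ProtectWithInstallSecret(DATA_BLOB* input, BYTE* installSecret,
                              DWORD secretSize, DATA_BLOB* output)
{
    DATA_BLOB entropy = { secretSize, installSecret };
    return CryptProtectData(input, L"description", &entropy,
                            nullptr, nullptr, CRYPTPROTECT_UI_FORBIDDEN, output);
}

// Decryption must supply the same entropy, or it fails.
BOOL UnprotectWithInstallSecret(DATA_BLOB* input, BYTE* installSecret,
                                DWORD secretSize, DATA_BLOB* output)
{
    DATA_BLOB entropy = { secretSize, installSecret };
    return CryptUnprotectData(input, nullptr, &entropy,
                              nullptr, nullptr, CRYPTPROTECT_UI_FORBIDDEN, output);
}
```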

Of course, you didn't really "protect" the data since the random number had to be saved somewhere, and the user could fish it out of wherever you saved it. Even if you saved the nonce off the machine (say, on a Web server), or if the nonce were systematically generated from other data, the user can still fish it out: They could hook the CryptUnprotectData function and see what you passed as the entropy.

Mind you, the user could also simply hook the CryptUnprotectData function and capture the unprotected data! Once that's done, it doesn't matter what you do: You can uninstall the app, intentionally corrupt the data, or delete it outright. The cat is already out of the bag: You decrypted it in a place that the user has access to.
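To make the hooking point concrete: with a detouring library such as Detours, a hypothetical hook takes only a few lines. (DumpBytes is a made-up stand-in for wherever the attacker sends the loot.)

```cpp
#include <windows.h>
#include <wincrypt.h>

void DumpBytes(const char* label, const BYTE* data, DWORD size); // made-up helper

static decltype(&CryptUnprotectData) RealCryptUnprotectData = CryptUnprotectData;

BOOL WINAPI HookedCryptUnprotectData(
    DATA_BLOB* pDataIn, LPWSTR* ppszDataDescr, DATA_BLOB* pOptionalEntropy,
    PVOID pvReserved, CRYPTPROTECT_PROMPTSTRUCT* pPromptStruct,
    DWORD dwFlags, DATA_BLOB* pDataOut)
{
    // The "secret" entropy is handed right to us.
    if (pOptionalEntropy) {
        DumpBytes("entropy", pOptionalEntropy->pbData, pOptionalEntropy->cbData);
    }
    BOOL result = RealCryptUnprotectData(pDataIn, ppszDataDescr, pOptionalEntropy,
                                         pvReserved, pPromptStruct, dwFlags,
                                         pDataOut);
    // And so is the plaintext.
    if (result && pDataOut) {
        DumpBytes("plaintext", pDataOut->pbData, pDataOut->cbData);
    }
    return result;
}

// Installed with DetourTransactionBegin(), then
// DetourAttach(&(PVOID&)RealCryptUnprotectData, HookedCryptUnprotectData),
// then DetourTransactionCommit().
```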

Mind you, you already lost even before this point, because the user could have hooked CryptProtectData and captured the unprotected data before it got encrypted in the first place.

Basically, this is a pointless effort. Even if you can make the information become unrecoverable after the program is uninstalled, the user could simply capture the data before uninstalling your program. The only way to keep this from happening is never to let the user see the unprotected data in the first place.

Comments (25)
  1. Brian_EE says:

    Why does this sound like someone trying to develop ransomware? Encrypt the data, call us to decrypt it, uninstall our malware and your data is encrypted forever!

    1. kantos says:

      Because it probably is the barely-legal equivalent: an ISV doing something really shady to “protect their intellectual property”. For obvious reasons Raymond can’t give us any sort of “why”, in order to protect the guilty party. Nor am I going to speculate further.

    2. laonianren says:

      You can have software that legitimately stores a secret required to access some networked resource, and the customer demands that the secret be rendered inaccessible when the software is uninstalled. The user is trusted; they just don’t want the secret hanging around any longer than necessary.

      Ideally it wouldn’t work like this, but in my experience it’s common for accessing legacy stuff in enterprise environments.

      1. Joker_vD says:

        Then just delete the secret during the uninstall? After that it’ll become inaccessible all right.

    3. The MAZZTer says:

      Sounds more like someone trying to develop DRM without understanding how computers work (if the user can see it on their screen or hear it on their speakers, it has to be in a form they could save and reproduce later, regardless of whether you want them to be able to or not). Or at least something along those lines.

      1. Mason Wheeler says:

        > Sounds more like someone trying to develop DRM without understanding how computers work

        Why do you repeat yourself like that? :P

      2. Yuri Khan says:

        Well, DRM *is* a form of ransomware.

      3. DWalker07 says:

        Right, that’s known as the “analog hole”. DRM-enabled HDMI signals, DRM-enabled decoder chips, etc. can “protect” data all the way to the end (where the streams supposedly can’t be captured by hooking into the sound or video card output).

        You can always set up a Webcam pointed at the screen to record visuals, and a tape recorder to record audio. The resolution will likely suffer though.

        1. Tanveer Badar says:

          What is to stop a sufficiently strong-willed electronics engineer from building something to capture the actual signals going to the screen and speakers?

          1. cheong00 says:

            I think there already exist HDMI converters/splitters that remove HDCP.

          2. DWalker07 says:

            Right; you could craft a cable that captures the signals and stores them. It would be hard for the average hobbyist to do this, but if the financial return is good enough, or the curiosity factor is high enough, anyone with resources can build something like that. Probably not without running afoul of DMCA-related laws, though, so these devices might be hard to sell.

      4. Phlip says:

        So then the answer to the original problem is the same as the one for DRM – if you can’t make it technologically impossible for the user to decrypt the data, just pass a broadly-written, poorly-thought-out law to make it illegal for them to decrypt it. Problem solved.

  2. Erik F says:

    I wonder how all this works in the brave new world of GDPR. The only thing that I can say for certain is that the only real winners will be the lawyers, as usual. :-)

  3. Brian says:

    Boy, people are suspicious. Consider the situation where an ISV captures some information at installation time, some of which is PII or other data that you don’t want to store on the disk in clear text. Some of that information is used in communications between the ISV’s web site and the installed program. Yes, at de-installation time the data will likely be deleted, but stuff tends to get silently cached. Data-cleanliness principles say “clean up as much as you can”; breaking that decryptability seems like a good thing to throw into a spec.

    Over the years I’ve found that simply relying on CryptProtect(/Unprotect)Data without putting some thought (and additional entropy) into it is usually a bad idea.

  4. Brian says:

    Let’s ignore everything Raymond said and try to solve an easier problem:
    Given a ciphertext and a key:
    1. Assume that the user will never copy or otherwise save the plaintext
    2. Assume that the user may unintentionally copy the key.
    3. Assume that the user will never intentionally bypass this system (i.e., the user *wants* the system to work).

    Is there any mechanism that will allow me to destroy the key while having confidence that the user has not made a copy of it?

    There are two answers:
    1. Ensure that the user never has access to the key at all (e.g., decryption as a service). This requires us to trust a third party, which is often exactly what we don’t want.
    2. Store the key on tamper-proof hardware(*) that supports running “decrypt” and “encrypt” but does not support copying the key. When the application is uninstalled, send a “delete the key” command to the hardware; see the sketch after the footnote.
    The best example of #2 is the combination of BitLocker and a TPM.

    (*) – With the possible exception of some sort of future quantum-based hardware, this probably doesn’t truly exist. But we can approximate it by hoping that the hardware is sufficiently tamper-resistant that it isn’t worth bypassing. Whether this is true depends on the quality of the hardware and the value of the data.
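    A rough sketch of #2 on Windows, using the TPM-backed platform crypto provider (the key name is illustrative, and error handling is reduced to pass-through):

    ```cpp
    // Sketch: create a key inside the TPM via the Microsoft Platform Crypto
    // Provider, then delete it at uninstall time. Link with ncrypt.lib.
    #include <windows.h>
    #include <ncrypt.h>

    SECURITY_STATUS CreateTpmBackedKey(NCRYPT_KEY_HANDLE* phKey)
    {
        NCRYPT_PROV_HANDLE hProv;
        SECURITY_STATUS status = NCryptOpenStorageProvider(
            &hProv, MS_PLATFORM_CRYPTO_PROVIDER, 0);
        if (status != ERROR_SUCCESS) return status;

        // The private key is generated inside the TPM and cannot be exported.
        status = NCryptCreatePersistedKey(hProv, phKey, NCRYPT_RSA_ALGORITHM,
                                          L"MyAppKey" /* illustrative name */,
                                          0, 0);
        if (status == ERROR_SUCCESS) {
            status = NCryptFinalizeKey(*phKey, 0);
        }
        NCryptFreeObject(hProv);
        return status;
    }

    SECURITY_STATUS DeleteTpmBackedKeyAtUninstall(NCRYPT_KEY_HANDLE hKey)
    {
        // Permanently deletes the key material from the TPM, so anything
        // encrypted to this key becomes unrecoverable. Also frees the handle.
        return NCryptDeleteKey(hKey, 0);
    }
    ```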

  5. Aged .Net Guy says:

    As always, you need to decide if you’re defending this secret against the NSA or against Marge in Accounting. The fact it’s logically impossible to protect it perfectly and practically impossible to stop the NSA, etc., doesn’t mean it’s pointless to lock out Marge & her ilk.

    I agree there are plenty of nefarious reasons someone might want to incorporate logic like this into their app.

    But there are also some decent reasons, mostly related to not leaving valuable secrets lying around. Whether that’s valuable user data, or valuable application internal data.

  6. Mr Chen, you’ve been holding up this “Secrets cannot be stored” mantra since at least 2014, but in fact, they can be safely stored in a secure cryptoprocessor. The processor receives the encrypted data and gives back the decrypted data, which is meant to be exposed at the moment of decryption. (So, you are welcome to hook into it.) The cryptoprocessor is a black box. It cannot be debugged or hooked into.

    1. Joshua says:

      Sigh. Time to break out the HF.

    2. Denis Silvestre says:

      And how does the unencrypted data flow from the software to the processor?
      * Via a hardware bus? Easily sniffable;
      * Via a difficult-to-sniff bus (say SATA or PCIe)? Hook the drivers, profit.

      1. Yeah. But it involves being on the other side of the airtight hatchway.

    3. war59312 says:

      Black box argument doesn’t work. Just because you can’t hack it doesn’t mean no one else can.

      If encrypted data is ever, ever!, unencrypted somewhere in the chain… it’s 100% possible to hack it and get the data. Period! It’s only a matter of time and cost!

      1. No, it isn’t. It is the matter of not properly reading one’s comment before replying! 😉 Data wasn’t the concern at all. The encryption key was. And the matter of time and cost isn’t something to dismiss as easily as you did. In fact, in cryptography, they are worshiped as the sole pillars on which the whole field relies.

        Of course, I used the Microsoft lingo, “secret”. Everywhere else, “secret” means “data subjected to encryption”; but at Microsoft, it means the cryptographic key. Well, quite frankly, Microsoft has a funny lingo, with all the wrongly defined phrases like “x86”, “system partition”, “system32”, “SAC-T” and many others.

  7. jpa says:

    Forward secrecy is not a pointless effort! It still protects the data from future users of the computer.

  8. Stuart says:

    Those damn airtight hatchways, at it again!

  9. IanBoyd says:

    It’s a defense-in-depth, and opportunistic data cleansing, measure.

    If my application is being uninstalled, it’s a perfect time to remove any sensitive information. Even if it is encrypted, the most difficult data to crack is the data that doesn’t exist.

    (it is a parallel to the axiom: the fastest code is the code that doesn’t run)

Comments are closed.
