Why is it so hard to write a program that requires UI Access privilege?


If you want your program to have the UI Access privilege, you have to jump through a few hoops. The program needs to be digitally signed, and it needs to go into the system32 directory. Why is it so hard to get UI Access?

Because UI Access bypasses User Interface Privilege Isolation (UIPI) security measures. The more valuable the target, the more substantial the security measures.

UI Access allows low-integrity programs to access and interact with the user interface of high-integrity programs. This has historically been a source of security vulnerabilities, and UIPI was created in part to prevent this type of attack. If a low-integrity program could programmatically manipulate the user interface of a high-integrity program, then all an attacker would need to do is sit and wait for the user to elevate a command prompt, and then start programmatically driving the command prompt to do whatever it wanted.

If all you had to do to obtain UI Access was simply ask for it (by setting uiaccess="true" in your manifest), then every piece of malware would just do that, and boom, the value of UIPI has effectively vanished. (This is the sort of trap that leads to "eventually, nothing is special any more.")
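For the curious, the request itself is just an attribute in the application manifest. A minimal sketch of the relevant fragment, using the element and attribute names from the SDK's manifest schema (where the attribute is spelled uiAccess):

    <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
      <security>
        <requestedPrivileges>
          <!-- asks for UI Access; the system honors it only when the signing
               and install-location requirements discussed below are also met -->
          <requestedExecutionLevel level="asInvoker" uiAccess="true" />
        </requestedPrivileges>
      </security>
    </trustInfo>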

Okay, so the digital signature requirement is there to create a barrier to entry for malware authors. It also creates some degree of accountability, since you have to identify yourself to a certificate authority (though, as we've seen in the past, this relies on certificate authorities remaining trustworthy). And it allows your application's ability to obtain UI Access to be revoked by revoking the certificate.

But why does the file have to be in the system32 directory?

As we saw some time ago, the directory is the application bundle. If programs with UI Access could be installed anywhere, then an attacker could exploit an insecure application directory to plant a rogue copy of a system DLL in that directory; the rogue DLL would then be loaded into the process with UI Access, thereby compromising it.

The thinking here is "If the application cannot create its install directory, then the application cannot create its install directory wrong."

Requiring programs with UI Access to be installed into the system32 directory is an additional secure-by-default measure. In order to compromise the application bundle, the attacker must already have compromised the system32 directory, at which point he's already on the other side of the airtight hatchway.
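You can check whether the system actually granted the privilege by querying the process token. A minimal sketch in C++ (standard Win32 calls, error handling trimmed):

    #include <windows.h>
    #include <stdio.h>

    // Reports whether the current process token was granted UI Access.
    // The system sets this flag at process creation, and only when the
    // manifest, signature, and install-location requirements are all met.
    int main()
    {
        HANDLE token;
        if (!OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token)) return 1;

        DWORD uiAccess = 0, returned = 0;
        if (GetTokenInformation(token, TokenUIAccess, &uiAccess,
                                sizeof(uiAccess), &returned)) {
            printf("UI Access: %s\n", uiAccess ? "granted" : "not granted");
        }
        CloseHandle(token);
        return 0;
    }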

Comments (23)
  1. Brian Friesen says:

    I've always wondered why Windows doesn't at least offer a setting to require digitally signed DLLs in addition to EXEs. It seems to me that would have solved a lot of problems. So even if the attacker planted a rogue system DLL into an insecure application directory, that DLL would have to be signed, which means the identity of the attacker would be known.

  2. Raphael says:

    Because, digital signatures are evil, of course. Just ask the Linuxers on Slashdot.

  3. @Brian: I thought trying to plant rogue system DLLs into insecure application directories, by itself, was already an exercise in futility, due to the Known DLLs feature?

    blogs.msdn.com/…/187752.aspx

  4. parkrrrr says:

    A small correction: uiaccess programs can also be in a subdirectory of the Program Files directory. This is also safe because, as you mentioned a few days ago, you already need to be an admin to write there, too (by default).

  5. Joshua says:

    If I ever had to write one of these, my installer would generate a key, put the public key in the key store, sign the binary, then throw the private key away. Update would repeat the process (removing the now useless old key). You can't revoke a root certificate unique to one machine.

  6. John says:

    I still don't understand why the EXE has to be signed. If you have write access to system32, you own the machine anyway. You can overwrite system DLLs (after disabling Windows File Protection, which should not be hard if you can write to system32), or, as Joshua mentioned, you can modify the key store.

    [The intended audience for UI Access is accessibility tools, and the companies who write accessibility tools already have signing certificates, so it's no extra burden for them. The signing requirement improves accountability and reduces the scope for abuse. -Raymond]
  7. Joshua says:

    @John: Because Microsoft wants to be able to lock down signing so all keys can be traced back to Microsoft, but they found out too late that wasn't gonna fly (anti-competitive) and didn't back their changes out.

    The more they tighten their grip on signing, the more systems will be left permanently in test mode, and all the gains demanded by Hollywood will be lost. I have no part nor parcel in the debate anymore, but I see where that action will end up.

  8. Matt says:

    @Brian: It does. Windows RT does exactly this. All EXEs and DLLs must be digitally signed, otherwise the load fails. You can also set this flag in earlier versions of Windows through group policy (it's part of the App Whitelisting stuff).

  9. Killer{R} says:

    So if /me is attacker code running at low IL, I should look into the system32 directory, find there any signed .exe image with the UI Access attribute set, start it, then inject myself into its address space like the Alien does, and here we are? Or have I missed something? (like a UAC prompt when launching a UI Access process even under low IL)

  10. parkrrrr says:

    @Killer: You're missing the part where you successfully inject yourself into the address space of a program with uiaccess. What that flag does is ask the OS to set your process to high IL without activating any admin token you may have, thus giving you the same protections against injection attacks that every other high IL process has. If you can inject yourself into a running UIAccess process, you can inject into regedit instead and get the benefits of the admin token too.

    (As an aside, one side-effect of UIAccess is that lots of other stuff gets more interesting. For example, you can't get the current instance of Excel from the Running Object Table anymore, because each integrity level has its own distinct ROT.)

  11. Joshua says:

    @Killer{R}: What happens is that CreateRemoteThread, *ProcessMemory, and a few more that don't come to mind immediately all fail, because the process was elevated to a higher IL on startup. It might still have a valid current-directory or PATH attack against it, though …

  12. Killer{R} says:

    @parkrrrr: if setting the UI Access flag raises the IL of the process to high without clearing the deny-only Administrators SID in the token (i.e. without activating the admin token), then I'm OK with this. But "UI Access allows low-integrity programs to access and interact with the user interface of high-integrity programs" clearly says 'low-integrity' :) and I'm too lazy to check right now how it's actually implemented :)

  13. parkrrrr says:

    @Killer: You can verify for yourself by running a UIAccess program like Narrator and checking the IL with Process Explorer, but that's definitely how it works. I've reason to be somewhat well-acquainted with it.
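    For readers who prefer code to Process Explorer, here is a minimal sketch (assuming the usual winnt.h constants) that prints the current process's integrity-level RID; run it from a UIAccess process and from an ordinary one to compare:

        #include <windows.h>
        #include <stdio.h>

        // Prints the integrity-level RID of the current process so a UIAccess
        // process can be compared against an ordinary medium-IL one.
        int main()
        {
            HANDLE token;
            if (!OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token)) return 1;

            BYTE buffer[64];
            DWORD returned = 0;
            if (GetTokenInformation(token, TokenIntegrityLevel, buffer,
                                    sizeof(buffer), &returned)) {
                TOKEN_MANDATORY_LABEL* label = (TOKEN_MANDATORY_LABEL*)buffer;
                PSID sid = label->Label.Sid;
                DWORD rid = *GetSidSubAuthority(sid, *GetSidSubAuthorityCount(sid) - 1);
                printf("Integrity RID: 0x%04lx (medium = 0x2000, high = 0x3000)\n", rid);
            }
            CloseHandle(token);
            return 0;
        }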

  14. Yuhong Bao says:

    "Because Microsoft wants to be able to lock down signing so all keys can be traced back to Microsoft, but they found out too late that wasn't gonna fly (anti-competitive) and didn't back their changes out."

    Huh?

  15. Ian Yates says:

    For those suggesting putting a certificate into the machine's store, etc.: I suspect that if you were to get your app certified by Microsoft, that trick would fail some of the Microsoft Platform Ready tests (and if it's malware with admin rights during the install, it could just do its evil at install time anyway)… I had to get our old app certified a couple of weeks ago, and the restrictions are a lot tighter than they used to be back in the Vista days: ASLR, DEP, etc. on all DLLs. This was really fun with my Delphi v7 code :)  (I flagged the DLLs as NX- and ASLR-compatible and then tested our software; happy to have them flagged to be more secure anyway.)

  16. Joshua says:

    @Ian Yates: I make no effort to follow Microsoft's certification program after they demonstrated a willingness to break programs trying to do the right thing in favor of programs making no attempt. I simply do the most expedient thing as defined by the OS behavior and the functional requirements. If the next version of Windows broke it, I can usually adapt in a release or two depending on how bad.

    For the record, we were broken for UAC for over a year because there was no way to decide at run time whether or not to use the virtual store. (We couldn't decide at compile time because the way the virtual store broke the classical filesystem model turned out to depend on the version of a third-party component we had no control over.)

  17. Axxan says:

    "If a low-integrity program could programmatically manipulate the user interface of a high-integrity program, then all an attacker needs to do is sit and wait for the user to elevate a command prompt, and then start programmatically driving the command prompt to do whatever it wanted."

    But that *exact* thing *is* possible; proof: twitter.com/…/279328380254576642

  18. Matt says:

    @Joshua "@Killer{R}: What happens is CreateRemoteThread, *ProcessMemory, and a few more that don't come to mind immediately fail as the process was elevated to a higher IL on startup. Now it might have a valid current directory or PATH attack against it though …"

    Which is why you can't set the environment variables of a process with higher integrity than yourself. Also, if there's a security bug in a high-integrity component, all bets are off (trivially so).

    For example, with a bug in a kernel-mode driver, I can jump straight from low integrity to ring 0. "If you have a 0-day in a program running as X, I can elevate to X" is a tautology.

  19. @Joshua says:

    You're not being creative enough. Any compile-time parameter can be trivially turned into a runtime-detected one:

    #include <windows.h>

    // Hypothetical runtime probe from this comment; a real implementation would
    // inspect the third-party component's behavior on this version of Windows.
    bool ShouldUseVS(const char* /*component*/) { return true; }

    int main()
    {
        if (ShouldUseVS("thirdpartycomponent.dll"))
            ShellExecuteA(NULL, "open", "app_with_vs.exe", NULL, NULL, SW_SHOWNORMAL);    // build that uses the virtual store
        else
            ShellExecuteA(NULL, "open", "app_without_vs.exe", NULL, NULL, SW_SHOWNORMAL); // build that avoids the virtual store
        return 0;
    }

  20. AndyCadley says:

    As I understand it, the signing requirement was really there because it dissuaded a lot of developers from simply enabling UiAccess because "it fixes this issue with Vista+" rather than properly re-assessing what they're doing and its security implications.

    Remember that any UiAccess app that is running becomes a potential vector for attack by malware, so a bunch of apps with it enabled unnecessarily would reduce the effectiveness of UIPI.

  21. Rick C says:

    @Axxan, can't replicate.  Would be nice if the tweeter had shown how to actually do what he says:

    command prompt:

    'win32api.SendMessage' is not recognized as an internal or external command, operable program or batch file.

    I thought, maybe this is a PowerShell thing, but apparently not:

    At line:1 char:45
    + win32api.SendMessage(win32con.HWND_BROADCAST, win32con.WM_CHAR, 0x0041)
    +                                             ~
    Missing argument in parameter list.
        + CategoryInfo          : ParserError: (:) [], ParentContainsErrorRecordException
        + FullyQualifiedErrorId : MissingArgument

  22. Rick C says:

    oh, I get it, after some googling.  He cheated a little by using Python and not saying so.

  23. Mike Dimmick says:

    @Joshua: Microsoft do not track digital signing at all. In order to sign code you must simply acquire a code signing certificate from any of the third-party certification authorities that offer the facility. Obviously you need to check that the CA is reputable and isn't going to have its root certificate blocked because they messed up.

    The CA should check that the information you've given them about who you are is correct, to avoid problems with impersonation (which, again, has failed in the past). After that, you can sign any code you like with your code-signing certificate until it expires. You should also make sure to use a time-stamping server to record when you signed the code: Windows will accept signed code even if the signing certificate has since expired, as long as the independent time stamp shows that the certificate was valid when the code was signed.

    Microsoft have recently been failing to get the independent time stamps for their own components, which will cause problems when an attempt is made to install the updates in future. See technet.microsoft.com/…/2749655 for the details.

    If you sign your code, UAC prompts will also come up in grey rather than orange and will identify you; SmartScreen is less likely to report that your package is 'not commonly downloaded'.

Comments are closed.