Why doesn’t my keyboard hook get called for keyboard messages I manually posted?

A customer was generating synthesized keyboard input by posting WM_KEYDOWN and related messages to a window. They found that a WH_KEYBOARD keyboard hook was not triggering for these keyboard messages, and they wanted to know why.

You already know enough to answer this question, though the pieces are scattered across several years of posts for you to put together.

First of all, you can't simulate keyboard input with PostMessage because you are going straight to the end product without going through the other paperwork that leads to that end product.

It's like sending a letter to a friend by putting it directly in his mailbox, and then saying, "My friend filed a change of address with the postal service, and anything sent to his address is supposed to be redirected to his new address, but my letter is still sitting in his mailbox at his old address." Well, yeah, because you didn't actually mail the letter. You bypassed the mail system and merely replicated the end product (a letter in the mailbox). You will also find that if you go to the postal service's Web site and ask for a delivery confirmation of the letter, the Web site is going to say, "Sorry, we have no record of having delivered that letter to that mailbox."

The window manager does not have a fake message detector that looks to see if you posted a fake input message, and if so, reverse-engineers the logic that led to that fake input message and internally simulates the actions of that reverse-engineered logic. If you post a WM_CHAR message, it's not going to say, "Well, let me see, to get to that message, the user needed to have pressed the Shift key, then pressed the A key, then released the A key, then released the Shift key, so I'll retroactively send out WH_KEYBOARD hook notifications for those events, and if any of the hooks blocks the event, then I created a time paradox, and I have to go back in time and kill the program that posted the WM_CHAR message." (It's also not going to insert the posted message in the processing queue in the correct order relative to true input messages.)

If you want to simulate input, you need to send it through the input system with functions like SendInput.
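As a minimal sketch of the difference (assuming a Win32 C environment; the function name and key choice are illustrative, not from the original post), synthesized input goes through SendInput rather than being posted directly to a window:

```c
#include <windows.h>

/* Sketch only: synthesize pressing and releasing the A key through
 * the input system.  Unlike PostMessage(hwnd, WM_KEYDOWN, ...), these
 * events travel through the input pipeline, so WH_KEYBOARD hooks fire
 * and the events are ordered correctly relative to real input. */
void PressAndReleaseA(void)
{
    INPUT inputs[2] = {0};

    inputs[0].type = INPUT_KEYBOARD;
    inputs[0].ki.wVk = 'A';                 /* key down */

    inputs[1].type = INPUT_KEYBOARD;
    inputs[1].ki.wVk = 'A';
    inputs[1].ki.dwFlags = KEYEVENTF_KEYUP; /* key up */

    SendInput(2, inputs, sizeof(INPUT));
}
```

Note that SendInput delivers events to whichever window has input focus rather than to a window of your choosing, which is exactly the limitation the comments below go on to discuss.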

Comments (18)
  1. skSdnW says:

    SendInput is OK for full system automation, but I think most people want to send input to specific windows, and that is why people often end up faking key up/down messages.

    A SendInputToWindow API would be nice…

  2. John Doe says:


    > You already know enough to answer this question (…)

    Actually, we may have the power to know, since you have stated in this blog more than it'll ever be possible to extract from MSDN alone.

    But we don't have an obligation to know.  It's not like the law, where you can't plead ignorance.

    [I'm still trying to figure out what mental model of the system allows for the cognitive dissonance of the question "Why isn't the input system detecting the message that I'm using to bypass the input system?" -Raymond]
  3. John Doe says:

    Sorry, I just couldn't hold this one.

    Raymond Chen facts:

    – Raymond Chen's blog is the law.

  4. Andrew says:

    @John Doe creates a nit, just so he can pick it?

  5. SilverbackNet says:

    @skSdnW SendInputToWindow? So what do you do about those low-level system hooks that you're trying to access in the first place? One or all of them may redirect input to an entirely different window (or non-window process), or you have to bypass them to get to your window, and you might as well have just posted a message.

  6. Ben Voigt says:

    @SilverbackNet: No, this hypothetical API would arrange for the message to be ordered in the queue with other input messages, carry a proper timestamp and keyboard status (making `GetKeyboardState` from within the window procedure work correctly), etc.  Still very different from "just post a message", although you do have a point that it wouldn't go through the entirety of the input-processing chain, so some surprising behavior might happen (are keyboard hooks in the "determine to which window the message goes" portion of the chain, or the "enqueue event for known target window" portion?)

  7. Nick says:

    @John Doe: You the reader of this blog are assumed to have always read all posts on this blog. Though the post makes it clear that you may not realize which pieces answer the question (then answers the question).

    I like that the conclusion is, functionally, "If you want to simulate input, you have to simulate input, not the effects of input."

  8. Kevin says:

    @Ben Voigt: Now you're in the business of lying to applications.  What happens if someone else was monitoring focus events?  Do they see the focus change to the other window?  Does the application which had focus get a blur event?  Or do you just send the input to a window which doesn't have focus?  How many apps are going to freak out when they get input in a blurred state?

  9. Mc says:

    I enjoyed and understood the mailbox analogy, thanks.

  10. Medinoc says:

    I agree with Ben Voigt, the big problem for automation here is that there is no way to send "real" keyboard input (which includes the shift state and all) while targeting a given window.

    >Or do you just send the input to a window which doesn't have focus?  How many apps are going to freak out when they get input in a blurred state?

    How many indeed? Most apps I've seen usually don't care about whether they have the focus, but can care about which of their controls does. It could generate problems for a dialog box's default button: sending the ENTER key would push the *default* default button, rather than a button made default when a given field has the focus.

    Maybe a BeginAutomateWindow() function that creates a fake focus in addition to the fake input?

  11. skSdnW says:

    The Windows Shell also uses/used? PostMessage to generate keyboard input. IIRC in the toolbar as menu in rebar implementation…

    @SilverbackNet: Yes the low-level hooks are called before the destination window is known but this new function could easily have a flags parameter that lets the caller decide if they want to bypass the low and normal hooks.

  12. John Doe says:

    @Raymond, what I said is more general than this specific topic, it's about the "You already know" part in isolation.   That's why I quoted it alone.

    Hence the joke: since we must know what you know, or have let us know, then this blog is like the law.  As @Nick said, it is implied that we know just by reading even the latest blog post.  It goes like those books about <choice id="thing"><platform/><framework/><language/></choice> that require you to have some knowledge and experience in <choice ref="thing"/>.

    Damn, now it's not funny anymore…

    You're my hero, not Chuck Norris (he may be by implication, since he taught you Win32 (blogs.msdn.com/…/10399692.aspx), and he taught you well.)

  13. David V. Corbin says:

    I find that people who end up in this conundrum haven't followed solid principles in the design of their application, or have designed an application which is not intended to be driven by keyboard automation. The entire WM_*/WH_* layer is an implementation detail of an application; one of the first things [IMPO] that should be done is to transform this raw information into state that has proper meaning for the application. Then (if keyboard-type automation is desired) create some mechanism (a service endpoint?) for passing in these meaningful messages.

    Granted, this does not help with other people's applications which have not taken such an approach. In these cases it *may* be wise to contact the vendor, see if there is an automation mechanism, and if not, perhaps switch to another vendor's offering [i.e. vote with your feet].

  14. Joshua says:

    I once did have to abuse a library component in this manner. Due to unrelated reasons, the library version was already frozen, so I could guarantee the library would handle it. That didn't stop Win32 from freaking out on one customer site, though (a certain long sequence caused focus tracking to fail for who knows what reason; the solution was a slightly different sequence). As far as I could work out, Win32 tried to emulate the mouse up/down state from the generated messages and so would miss the real state change later.

  15. 640k says:

    @David V. Corbin: No, most vendors are *not* very supportive if you try to automate their software. Microsoft is one of these vendors.

    Then your only solution is to fake GUI interaction somehow, and a local solution would be preferred in this case, instead of sending messages through the whole input pipeline of the OS, which can affect other applications and certainly prevents a real user from interacting at the same time.

    [Support for accessibility is required for any software that sells to the United States government, so go ahead and use automation to click the OK button. Just understand that in the next version, there may not be an OK button. -Raymond]
  16. boogaloo says:

    I am impressed by the optimism of people who think you can just add an extra call that magically allows them to do what they want and applications will then work exactly the way they want, even though they will now be driven differently than they were designed or tested for.

    The correct way to do this would be to have a virtualised desktop that you (and only you) can send input to.

  17. Anon says:

    @boogaloo – but then you run into the same issue, because the virtual desktop is also an application that routes the input.

  18. Sandy Nyhman says:

    In business environments it's often a requirement to send key presses to applications that the operator is using. (This is the reason that VBA includes a SendKeys function.) Often, actually emulating input won't work because of focus issues (e.g. sometimes you don't want the focus to change, or issues occur trying to synchronise the focus change with the input). Virtualised desktops are right out, since they cannot be used to send key presses to running interactive applications. Industry applications are often old or just really bad; accessibility may be a requirement for government software, but for industry applications it's just one feature on a long list of necessary features that nonetheless won't get implemented for lack of money. Of course you can forget about automation interfaces; there won't be any. So usually, the only thing you can do is send or post messages and hope the target application will buy into the charade.

Comments are closed.
