
WM_LBUTTONDOWN/UP events not coming through from touchpad when long pressing


I want to respond to a long tap on a touchpad to abort accelerated scrolling (in Chrome, for example, the user can rest their finger(s) on the touchpad at any time to stop a long scroll). Surprisingly, this is turning out to be extremely frustrating, since the touchpad isn't sending a WM_LBUTTONDOWN message on its own - it only sends the DOWN/UP pair, and only if the tap is short enough. In essence, it's reporting clicks, but not long presses.

And it's not just my app. This web-based example isn't receiving the inputs either: https://w3c.github.io/uievents/tools/mouse-event-viewer.html

What's more annoying is that, for instance, Chrome and Windows' own Settings app seem to be able to respond to a long tap.

Here's what I've tried and concluded:

- raw input doesn't discern the events either
- it's not related to gestures (WM_GESTURE)
- it's not related to touch (WM_TOUCH after registering)
- down/up events from touchpad buttons and a dedicated mouse are received just fine
- GetAsyncKeyState() for VK_LBUTTON returns false

What am I missing here? Is this palm rejection*? Do I have to get the event by communicating with the driver directly? If so, how, and is there a standardized way to do it?

 

* if the touchpad were an actual (multi)touch device, which it isn't (GetSystemMetrics(SM_DIGITIZER) reports 0), it might be possible to call RegisterTouchWindow(hwnd, TWF_WANTPALM | TWF_FINETOUCH). But this doesn't work either.
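For reference, the check described in this footnote boils down to something like the sketch below (a minimal sketch, with hwnd assumed to be your top-level window); on a regular, non-precision touchpad SM_DIGITIZER reports 0, so the registration never kicks in:

```cpp
// Sketch of the footnote's check: does Windows expose a touch digitizer at all,
// and if so, opt into raw touch messages for this window.
#include <windows.h>

void TryRegisterTouch(HWND hwnd)
{
    const int digitizer = GetSystemMetrics(SM_DIGITIZER);
    if (digitizer & NID_READY)
    {
        // TWF_WANTPALM disables palm rejection heuristics, TWF_FINETOUCH asks
        // for full-resolution touch input.
        RegisterTouchWindow(hwnd, TWF_WANTPALM | TWF_FINETOUCH);
    }
}
```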


I assume you are on Windows (because of the WM_* messages) and that you are testing your app on some kind of Surface device?

I might be wrong, but in my experience Windows simulates a right-click when the tap is held, so check whether your app is getting a WM_RBUTTONDOWN message instead.

Actually - after digging through Chromium's source code the answer is that there's a whole separate API I wasn't aware of. Direct Manipulation is both fast and powerful, as well as painfully complicated to manage, especially if your codebase is OpenGL-based (which mine is). After I got the bulk of the thing running it turned out that since the DM provides asynchronous scrolling updates, it requires access to an IDCompositionSurface-based draw object, which is based on DirectX. In practice this means building a layer between GL and DX simply to support scrolling.

In case anyone is reading this and wondering, apparently the easiest way to go about this is via ANGLE. I'm presently snooping through Chromium's codebase, trying to figure out if it's worth the headache, and I'm on the verge of giving up. I'm also annoyed, to boot, that MS doesn't allow the API to function without attaching an output device and source buffer to the compositor (that is to say, I haven't found a way to have DM simply report scroll/touch offsets/transforms/viewports as plain numbers - if anyone knows a way, I'd love to daisy-chain these into my own existing render stack).

 

8 hours ago, Shaarigan said:

I assume you are on Windows (because of the WM_* messages) and that you are testing your app on some kind of Surface device?

I might be wrong, but in my experience Windows simulates a right-click when the tap is held, so check whether your app is getting a WM_RBUTTONDOWN message instead.

I'm on a regular laptop and trying to figure out how to handle a good old touchpad/trackpad.

My guess is that DM gets this data directly from the driver.

Not sure about the Direct Manipulation API - which bits/interfaces were you looking at specifically?

I was under the impression it was built on top of the window events, but maybe I understood it wrong.

 

I am surprised raw input didn't report anything - what exactly did you register for? I am sure I recall some stuff specifically about contact points / touchpads. There is also WM_TOUCH - do you get any messages for that?

A brief follow-up.

After considerable snooping around, it turns out it's possible to use Direct Manipulation in client mode - i.e. as a standalone source of gesture (zoom, scroll) input (for example, the code that comes as part of the Windows classic samples uses DM in conjunction with D3D, and separating the two appears impossible at first). Since it took me considerable time to figure all this out, I thought I'd share how you can accomplish it (rough code sketches follow the list below):

1. CoCreateInstance an instance of IDirectManipulationManager (CLSID_DirectManipulationManager)
2. Get its update manager via GetUpdateManager() - this is what you'll be polling later
3. Use the manager to create a dummy viewport with some arbitrary size
4. Set the viewport configuration and call SetViewportOptions(DIRECTMANIPULATION_VIEWPORT_OPTIONS_MANUALUPDATE)
5. Add your event handler to the viewport (derive it from IDirectManipulationViewportEventHandler)
6. Activate the manager

7. In your message loop, listen for DM_POINTERHITTEST. When you receive this message and it comes from a touchpad, start polling the update manager by calling Update() (the frame info provider can be NULL). Respond to OnViewportStatusChanged() and OnContentUpdated() in your handler
7.1 I think polling is only necessary for touchpads, since there's no event that tells you when the user places their finger(s) on or removes them from the pad; DM handles this internally and lets you know via OnViewportStatusChanged(). To tell whether DM_POINTERHITTEST came from a touchpad, load GetPointerType() from user32.dll (it has the signature BOOL(WINAPI*)(UINT32, POINTER_INPUT_TYPE*)) and check whether the type returned by GetPointerType(GET_POINTERID_WPARAM(wparam), &type) is PT_TOUCHPAD
8. If it is, call viewport->SetContact(GET_POINTERID_WPARAM(wparam)) 
8.1 Stop polling when you receive the current status DIRECTMANIPULATION_READY in OnViewportStatusChanged()

9. In OnContentUpdated() call content->GetContentTransform() to get the floating point transform
9.1 You can make the transform relative within a gesture by calling ZoomToRect() on the viewport and resetting the coordinates in response to DIRECTMANIPULATION_READY
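For completeness, here's roughly what steps 1-6 look like in code. This is my own minimal sketch (WRL ComPtr, error handling mostly omitted, names mine), not the exact code from the thread; one detail worth noting is that in directmanipulation.h the viewport creation and Activate() live on IDirectManipulationManager itself, while the update manager is only what you poll.

```cpp
// Rough sketch of steps 1-6; names are my own and real error handling is omitted.
#include <directmanipulation.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<IDirectManipulationManager>       g_manager;
ComPtr<IDirectManipulationUpdateManager> g_updateManager;
ComPtr<IDirectManipulationViewport>      g_viewport;
DWORD                                    g_handlerCookie = 0;

bool InitDirectManipulation(HWND hwnd, IDirectManipulationViewportEventHandler* handler)
{
    // 1: create the manager.
    if (FAILED(CoCreateInstance(CLSID_DirectManipulationManager, nullptr,
                                CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&g_manager))))
        return false;

    // 2: get the update manager - this is what gets polled with Update() later.
    g_manager->GetUpdateManager(IID_PPV_ARGS(&g_updateManager));

    // 3: create a dummy viewport with some arbitrary size.
    g_manager->CreateViewport(nullptr, hwnd, IID_PPV_ARGS(&g_viewport));
    RECT rc = { 0, 0, 1000, 1000 };
    g_viewport->SetViewportRect(&rc);

    // 4: pick the gestures you care about and switch to manual updates.
    g_viewport->ActivateConfiguration(DIRECTMANIPULATION_CONFIGURATION_INTERACTION |
                                      DIRECTMANIPULATION_CONFIGURATION_TRANSLATION_X |
                                      DIRECTMANIPULATION_CONFIGURATION_TRANSLATION_Y |
                                      DIRECTMANIPULATION_CONFIGURATION_TRANSLATION_INERTIA |
                                      DIRECTMANIPULATION_CONFIGURATION_SCALING);
    g_viewport->SetViewportOptions(DIRECTMANIPULATION_VIEWPORT_OPTIONS_MANUALUPDATE);

    // 5: attach your IDirectManipulationViewportEventHandler implementation.
    g_viewport->AddEventHandler(hwnd, handler, &g_handlerCookie);

    // 6: activate DM for the window.
    g_manager->Activate(hwnd);
    return true;
}
```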
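And a sketch of steps 7-9, reusing the globals above. Again my own code, not the thread author's: the timer id, the handler class name and the 1000x1000 rect are arbitrary, and GET_POINTERID_WPARAM / PT_TOUCHPAD need a Windows 8+ SDK.

```cpp
// Rough sketch of steps 7-9, assuming the globals from the previous sketch.
// The WM_TIMER-based polling mirrors what the thread describes.
#include <directmanipulation.h>
#include <windows.h>

typedef BOOL (WINAPI* GetPointerTypeFn)(UINT32, POINTER_INPUT_TYPE*);

void HandleDirectManipulationMessage(HWND hwnd, UINT msg, WPARAM wparam)
{
    if (msg == DM_POINTERHITTEST)
    {
        // 7.1: only forward hit tests that actually come from a touchpad.
        static GetPointerTypeFn getPointerType = (GetPointerTypeFn)
            GetProcAddress(GetModuleHandleW(L"user32.dll"), "GetPointerType");

        POINTER_INPUT_TYPE type = PT_POINTER;
        const UINT32 pointerId = GET_POINTERID_WPARAM(wparam);
        if (getPointerType && getPointerType(pointerId, &type) && type == PT_TOUCHPAD)
        {
            // 8: hand the contact to the viewport, then start polling (step 7).
            g_viewport->SetContact(pointerId);
            SetTimer(hwnd, 1 /* arbitrary timer id */, USER_TIMER_MINIMUM, nullptr);
        }
    }
    else if (msg == WM_TIMER)
    {
        // 7: the frame info provider can be NULL when polling like this.
        g_updateManager->Update(nullptr);
    }
}

// 9: a minimal event handler. GetContentTransform() fills a 3x2 matrix:
// element 0 is the scale, elements 4 and 5 are the x/y translation.
class ViewportHandler : public IDirectManipulationViewportEventHandler
{
public:
    STDMETHODIMP OnViewportStatusChanged(IDirectManipulationViewport* viewport,
                                         DIRECTMANIPULATION_STATUS current,
                                         DIRECTMANIPULATION_STATUS) override
    {
        // 8.1: READY means the gesture (and its inertia) has finished - this is
        // where you would stop the WM_TIMER polling.
        if (current == DIRECTMANIPULATION_READY)
        {
            // 9.1: reset the viewport so the next gesture starts from identity.
            viewport->ZoomToRect(0.0f, 0.0f, 1000.0f, 1000.0f, FALSE);
        }
        return S_OK;
    }

    STDMETHODIMP OnViewportUpdated(IDirectManipulationViewport*) override
    {
        return S_OK;
    }

    STDMETHODIMP OnContentUpdated(IDirectManipulationViewport*,
                                  IDirectManipulationContent* content) override
    {
        float m[6] = {};
        content->GetContentTransform(m, ARRAYSIZE(m));
        // m[0] = scale, m[4]/m[5] = x/y offsets - feed these into your own
        // scroll/zoom handling.
        return S_OK;
    }

    // Minimal IUnknown plumbing (single-threaded, no aggregation).
    STDMETHODIMP QueryInterface(REFIID riid, void** obj) override
    {
        if (riid == __uuidof(IUnknown) ||
            riid == __uuidof(IDirectManipulationViewportEventHandler))
        {
            *obj = static_cast<IDirectManipulationViewportEventHandler*>(this);
            AddRef();
            return S_OK;
        }
        *obj = nullptr;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef() override  { return ++m_refs; }
    STDMETHODIMP_(ULONG) Release() override { ULONG r = --m_refs; if (!r) delete this; return r; }

private:
    ULONG m_refs = 1;
};
```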

 

A couple of notes:

  • In a naive implementation, scrolling by using DM in client mode like this seems to be less smooth than what the MS-provided compositor does. However, the latter is purely DirectX-based and handles drawing/input in some complicated multithreaded way
  • If you don't want to wrap the DM code, it should be possible to use something like ANGLE to relatively easily get your GL FBO into D3D (or I guess you could manually download->(convert?)->upload the frame buffer). However, this still requires your display surface to be DX-based. My knowledge of DX in general is poor, so I won't comment further. Have a look at the Windows classic samples mentioned above for an official example of how to do this if you're working with DX natively
  • I wrote my own input accelerator on top of the data received from DM. However, the results should get better once I transition away from WM_TIMER-based polling, optimize the draw code (framerate/smoothness really matters for a natural scrolling experience) and implement a floating point scroll accumulator that feeds directly into my accelerator, as opposed to marshalling DM input through WM_MOUSEWHEEL messages (which only support integral deltas) - a rough sketch of such an accumulator follows this list
  • If you have some time for a fantastic read on a topic that seems extremely simple and mundane, have a look at the article Scrolling With Pleasure by Pavel Fatin
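For what it's worth, the floating point accumulator mentioned in the third bullet can be as simple as the following. This is a hypothetical sketch of the idea, not the code from this thread:

```cpp
// Hypothetical sketch of a floating point scroll accumulator: DM reports
// fractional offsets per update; accumulate them and hand whole pixels
// (or the raw float, if your scroller supports it) to the scrolling code.
struct ScrollAccumulator
{
    float pendingX = 0.0f;
    float pendingY = 0.0f;

    // deltaX/deltaY would be the per-update differences taken from
    // GetContentTransform().
    void Add(float deltaX, float deltaY)
    {
        pendingX += deltaX;
        pendingY += deltaY;
    }

    // Drain whole pixels, keeping the fractional remainder for the next frame
    // so slow scrolls are not rounded away (unlike integral WM_MOUSEWHEEL deltas).
    void Drain(int& outX, int& outY)
    {
        outX = (int)pendingX;
        outY = (int)pendingY;
        pendingX -= (float)outX;
        pendingY -= (float)outY;
    }
};
```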

I hope this saves some poor soul a few days in the future.

