Closed Bug 663286 Opened 13 years ago Closed 7 years ago

Implement w3c touch events for firefox-desktop (i.e., unify and centralize old firefox-desktop touch-event code and new fennec touch-event code)

Categories

(Firefox :: General, defect)

defect
Not set
normal

Tracking


RESOLVED FIXED

People

(Reporter: cjones, Unassigned)

References

Details

(Whiteboard: [tech-p3])

From what I understood from Felipe, we're in the following state:

(1) in firefox-desktop, chrome can listen for "gesture events", but only on mac. We have no plans to expose this to web content. (Ping me for more details.)

(2) in firefox-desktop, we have an implementation of a touch-event spec that's not the current w3c one. The implementation only works on win7.

(3) in fennec, we have an implementation of (some parts?) of the w3c spec. AFAIK the implementation only works on android.

AIUI, (2) and (3) are completely separate. We should unify (2) and (3). It would also be nice to find some way to implement raw touch events for OS X, but that's probably a big enough project for a followup.
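(For context, here is a minimal sketch of what listening for W3C touch events looks like from web content once this is implemented; the element selector is hypothetical and not from this bug.)

```typescript
// Minimal sketch of a W3C touch event listener, the kind of content-facing API
// this bug is about. The "#touch-target" selector is hypothetical.
const target = document.querySelector<HTMLElement>("#touch-target");

target?.addEventListener("touchstart", (event: TouchEvent) => {
  // Each active finger is a Touch with its own identifier and coordinates.
  for (const touch of Array.from(event.touches)) {
    console.log(`finger ${touch.identifier} at ${touch.clientX}, ${touch.clientY}`);
  }
  event.preventDefault(); // keep the browser from also synthesizing mouse events
});
```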
(In reply to comment #0)
> (3) in fennec, we have an implementation of (some parts?) of the w3c spec.
> AFAIK the implementation only works on android.

The current Fennec implementation is actually cross-platform; it translates mouse events in chrome into IPC messages, which are then translated to touch events in the content process. But in bug 603008 we may replace or augment this with an Android-specific implementation that uses touch events in chrome.
(In reply to comment #0)
> From what I understood from Felipe, we're in the following state:
> (1) in firefox-desktop, chrome can listen for "gesture events", but only on
> mac. We have no plans to expose this to web content. (Ping me for more
> details.)

We support gesture events also on Win7, IIRC.

> (2) in firefox-desktop, we have an implementation of a touch-event spec
> that's not the current w3c one. The implementation only works on win7.

True. MozTouchEvents.

> It would also be nice to find some way to
> implement raw touch events for OS X, but that's probably a big enough
> project for a followup.

Does OS X have touch events? Are there any touch-enabled OS X devices?
(In reply to comment #2)
> (In reply to comment #0)
> > It would also be nice to find some way to
> > implement raw touch events for OS X, but that's probably a big enough
> > project for a followup.
> Does OS X have touch events? Are there any touch-enabled OS X devices?

Newer OS X laptops have large touchpads that understand multitouch. If I remember what Felipe said correctly, while OS X will deliver *gesture* events to us, it's hard for us to listen to the raw touch events used to compute the gestures. We may need epic hackery to do that.
Getting "touch-like" events from touchpad would be quite different comparing to real touch events.
I am working on this + an Android implementation in bug 603008. I'm hoping to have a WIP up for feedback sometime soon, but I'm trying to implement something fairly generic there, so this shouldn't be too hard to implement on other platforms. If someone starts work here, it would be good to talk.
(In reply to comment #4)
> Getting "touch-like" events from a touchpad would be quite different
> compared to real touch events.

I don't understand this. Can you explain more? (Are you referring to coordinates being in different spaces, "touchpad space" vs. "screen space"?)
Yes, I'm referring to the coordinates and stuff related to that.
Yeah, there's a pretty big difference in the event model between touch events for the actual screen (which we have on android/ipad/win7 tablets) and touch events on a touchpad (macos and other touchpads).
Oft-requested feature for b2g desktop builds.
Blocks: b2g-v-next
Not a use case for production, but nice-to-have for developers using desktop builds.
Whiteboard: [tech-p3]
Now that Macs have things like the Magic Trackpad (https://www.apple.com/magictrackpad/) and the Force Touch trackpad, both of which support multitouch, I think Firefox should provide a way for web applications to hook into those events. So not just for developers, but really for production apps. Imagine that you could pinch to zoom in or out in Google Maps.

Why would those events only interact with the whole site? They should be per DOM element, similar to how we can intercept scroll gestures and modify the behavior in part of the app based on scrolling. Pinching and other multi-touch gestures would also be great to have available in the app. Maybe they do not need to be concrete touch events (there are no reasonable coordinates for them), but something like gestures, so that I can detect whether a pinch gesture was done while the mouse was over some DOM element; that would be great.
Touch events aren't really for trackpad-type devices. Touch events are, well, for touch screens and such. Pointer events, on the other hand, could support multiple pointers even from a trackpad.
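(For illustration, a minimal sketch of multi-pointer handling with the standard Pointer Events API; whether a given trackpad actually reports more than one pointer depends on the platform and browser, and the element id is hypothetical.)

```typescript
// Minimal sketch of tracking multiple simultaneous pointers with the standard
// Pointer Events API. The "#gesture-area" id is hypothetical.
const pointers = new Map<number, { x: number; y: number }>();
const area = document.querySelector<HTMLElement>("#gesture-area");

area?.addEventListener("pointerdown", (e: PointerEvent) => {
  pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
});

area?.addEventListener("pointermove", (e: PointerEvent) => {
  if (pointers.has(e.pointerId)) {
    pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
    // With two or more entries in `pointers`, a page could compute pinch or rotate.
  }
});

for (const type of ["pointerup", "pointercancel"] as const) {
  area?.addEventListener(type, (e: PointerEvent) => {
    pointers.delete(e.pointerId);
  });
}
```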
So is there a more suitable ticket for support for multiple pointer events on multi-touch trackpads?
Yes. Feel free to file such a bug.
Opened bug 1164665.
Microsoft has a fantastic blog post describing the perfect way to handle this on Windows: https://blogs.windows.com/msedgedev/2017/03/08/scrolling-on-the-web/ Firefox should ideally handle input the way Edge does.
We already do this.
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED
Firefox can already deliver touch events on desktop? Care to provide a pointer on how to access those? Because to me it looks like I can only get information as if my trackpad were a mouse, not a multi-input/multi-finger device.
For touch events you need a touchscreen, not a trackpad. The problem with a trackpad is that you can't easily map a spot on the trackpad to a spot on the screen, so it doesn't make much sense to generate touch inputs for those; the usage patterns are very different.
Pinch zoom, rotation, and many other interesting multi-touch gestures can be done on the screen the same as on the trackpad. If native apps on desktop can do such gestures, it would be great if I could do them in web apps on desktop as well.
While that's true, the gestures would have to be implemented without W3C touch events, which is what this bug is about. In order to dispatch W3C touch events to web content, we would need to have screen coordinates for the touch events, which is the thing we can't get with trackpad events. Detecting gestures directly from trackpad input and dispatching those gestures as events (without dispatching touch events) is possible, but AFAIK not covered by any spec. It sounds like this is what you want, and I agree it would be a potentially useful API to expose to the web.
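(To make the distinction concrete, here is a minimal sketch of pinch detection built on W3C touch events, which only works where those events carry real client coordinates, e.g. a touchscreen; it is illustrative only, not the implementation discussed in this bug.)

```typescript
// Sketch of the kind of gesture detection pages can do once W3C touch events
// (with real client coordinates) are delivered, e.g. from a touchscreen. This
// is exactly what cannot be derived from trackpad contacts lacking such coordinates.
let startDistance = 0;

function fingerDistance(touches: TouchList): number {
  const a = touches.item(0)!;
  const b = touches.item(1)!;
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}

window.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) startDistance = fingerDistance(e.touches);
});

window.addEventListener("touchmove", (e: TouchEvent) => {
  if (e.touches.length === 2 && startDistance > 0) {
    const scale = fingerDistance(e.touches) / startDistance;
    console.log(`pinch scale: ${scale.toFixed(2)}`); // >1 spread apart, <1 pinched in
  }
});
```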
I am not sure that we couldn't just map the trackpad to the currently focused window and map coordinates there for touch events, instead of to the screen. It would be like scroll events, where you listen on the window. You would get coordinates of the fingers, and then you could decide what to do with them. It is not perfect, like the screen, but it would still allow you to do things on the web from desktop.

On the other hand, I do have a desktop with a touchscreen monitor, and I also do not see touches getting through various touch demos on the web. Maybe they are getting lost somewhere on the touchscreen driver -> Wayland -> Firefox path and it is not Firefox's fault.
(In reply to Mitar from comment #22)
> I am not sure that we couldn't just map the trackpad to the currently
> focused window and map coordinates there for touch events, instead of to
> the screen. It would be like scroll events, where you listen on the window.
> You would get coordinates of the fingers, and then you could decide what to
> do with them. It is not perfect, like the screen, but it would still allow
> you to do things on the web from desktop.

Again - that's certainly possible, but it's different from what the W3C touch event spec says, so it would have to go in a different type of event, which is out of scope for this bug. Feel free to file a new bug for this.

> On the other hand, I do have a desktop with a touchscreen monitor, and I
> also do not see touches getting through various touch demos on the web.
> Maybe they are getting lost somewhere on the touchscreen driver -> Wayland
> -> Firefox path and it is not Firefox's fault.

Linux is a bit of a special case: (a) I'm not sure if it works with Wayland, and (b) you might need to set MOZ_USE_XINPUT2=1 in your environment before starting Firefox. You might also need to set dom.w3c.touch_events.enabled=1 in about:config. That's waiting on bug 1207700.
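(A quick way to check from the page side whether the browser is exposing touch support at all; a hedged sketch that only reflects what the browser reports, not whether the driver/Wayland path actually delivers events.)

```typescript
// Quick page-side check of whether the browser reports touch support,
// handy when debugging the Linux/XInput2 setup mentioned above.
const touchEventsExposed = "ontouchstart" in window;
const maxTouchPoints = navigator.maxTouchPoints; // 0 usually means no touch device reported

console.log(`touch event handlers exposed: ${touchEventsExposed}`);
console.log(`navigator.maxTouchPoints: ${maxTouchPoints}`);
```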
I opened https://bugzilla.mozilla.org/show_bug.cgi?id=1431012. Thanks for pointing out the information for Linux.