Closed
Bug 663286
Opened 13 years ago
Closed 7 years ago
Implement w3c touch events for firefox-desktop (i.e., unify and centralize old firefox-desktop touch-event code and new fennec touch-event code)
Categories
(Firefox :: General, defect)
RESOLVED
FIXED
People
(Reporter: cjones, Unassigned)
Details
(Whiteboard: [tech-p3])
From what I understood from Felipe, we're in the following state:
(1) in firefox-desktop, chrome can listen for "gesture events", but only on mac. We have no plans to expose this to web content. (Ping me for more details.)
(2) in firefox-desktop, we have an implementation of a touch-event spec that's not the current w3c one. The implementation only works on win7.
(3) in fennec, we have an implementation of (some parts?) of the w3c spec. AFAIK the implementation only works on android.
AIUI, (2) and (3) are completely separate.
We should unify (2) and (3). It would also be nice to find some way to implement raw touch events for OS X, but that's probably a big enough project for a followup.
Comment 1•13 years ago
(In reply to comment #0)
> (3) in fennec, we have an implementation of (some parts?) of the w3c spec.
> AFAIK the implementation only works on android.
The current Fennec implementation is actually cross-platform; it translates mouse events in chrome into IPC messages, which are then translated to touch events in the content process. But in bug 603008 we may replace or augment this with an Android-specific implementation that uses touch events in chrome.
Comment 2•13 years ago
(In reply to comment #0)
> From what I understood from Felipe, we're in the following state
> (1) in firefox-desktop, chrome can listen for "gesture events", but only on
> mac. We have no plans to expose this to web content. (Ping me for more
> details.)
We also support gesture events on Win7, IIRC.
> (2) in firefox-desktop, we have an implementation of a touch-event spec
> that's not the current w3c one. The implementation only works on win7.
True. MozTouchEvents
> It would also be nice to find some way to
> implement raw touch events for OS X, but that's probably a big enough
> project for a followup.
Does OSX have touch events? Are there any touch-enabled OSX devices?
Reporter
Comment 3•13 years ago
(In reply to comment #2)
> (In reply to comment #0)
> > It would also be nice to find some way to
> > implement raw touch events for OS X, but that's probably a big enough
> > project for a followup.
> Does OSX have touch events? Are there any touch enabled OSX devices?
Newer OS X laptops have large touchpads that understand multitouch. If I remember what Felipe said correctly, while OS X will deliver *gesture* events to us, it's hard for us to listen to the raw touch events used to compute the gestures. We may need epic hackery to do that.
Comment 4•13 years ago
Getting "touch-like" events from touchpad would be quite different comparing
to real touch events.
Comment 5•13 years ago
I am working on this plus an Android implementation in bug 603008. I'm hoping to have a WIP up for feedback sometime soon, but I'm trying to implement something fairly generic there, so it shouldn't be too hard to implement on other platforms. If someone starts work here, it would be good to talk.
Reporter
Comment 6•13 years ago
(In reply to comment #4)
> Getting "touch-like" events from touchpad would be quite different comparing
> to real touch events.
I don't understand this. Can you explain more? (Are you referring to coordinates being in different spaces, "touchpad space" vs. "screen space"?)
Comment 7•13 years ago
Yes, I'm referring to the coordinates and stuff related to that.
Comment 8•13 years ago
Yeah, there's a pretty big difference in the event model between touch events for the actual screen (which we have on android/ipad/win7 tablets) and touch events on a touchpad (macos and other touchpads).
Reporter
Comment 10•12 years ago
Not a use case for production, but nice-to-have for developers using desktop builds.
Whiteboard: [tech-p3]
Comment 11•10 years ago
Now that Macs have things like the Magic Trackpad (https://www.apple.com/magictrackpad/) and the Force Touch trackpad, both of which support multitouch, I think Firefox should provide a way for web applications to hook into those events. So not just for developers, but really for production apps. Imagine that you could pinch to zoom in or out in Google Maps. Why would those events only interact with the whole site? They should be per DOM element, similar to how we can intercept scroll gestures and modify the behavior in part of the app based on scrolling. Pinching and other multitouch gestures would also be great to have available in the app.
Maybe it is not necessary that they be concrete touch events (there are no reasonable coordinates for them), but things like gestures, so that I could detect that a pinch gesture was performed while the mouse was over some DOM element; that would be great.
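A pinch gesture like the one described above is usually recognized from the change in distance between two contact points over time. The sketch below is purely illustrative and assumes a hypothetical event source that supplies two contact coordinates per frame; the names `Point` and `pinchScale` are invented here and are not part of any Firefox or web API.

```typescript
// Hypothetical sketch: recognizing a pinch from two tracked contact points.
// Assumes some event source supplies their coordinates at the start of the
// gesture and on each update; no real browser API is used here.
interface Point { x: number; y: number; }

function distance(a: Point, b: Point): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Ratio of current finger spread to initial spread:
// > 1 means the fingers moved apart (zoom in), < 1 means they pinched in.
function pinchScale(start: [Point, Point], current: [Point, Point]): number {
  return distance(current[0], current[1]) / distance(start[0], start[1]);
}
```

In a real implementation this ratio would be recomputed on each touch or pointer update and applied as a zoom factor to the element under the cursor.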
Comment 12•10 years ago
Touch events aren't really for trackpad-type devices. Touch events are, well, for touch screens and such.
Pointer events, on the other hand, could support multiple pointers even from a trackpad.
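The key property of pointer events mentioned above is that each contact carries a unique `pointerId`, which lets an application track several simultaneous contacts. The sketch below models that bookkeeping with a simplified stand-in event type; `SimplePointerEvent` and `PointerTracker` are invented names for illustration, not the real `PointerEvent` interface.

```typescript
// Sketch of tracking multiple simultaneous pointers, in the spirit of the
// Pointer Events spec: each contact is identified by a unique pointerId.
// The event shape is a simplified stand-in, not the real DOM PointerEvent.
interface SimplePointerEvent { pointerId: number; x: number; y: number; }

class PointerTracker {
  private active = new Map<number, SimplePointerEvent>();

  // Called on pointerdown: start tracking this contact.
  down(e: SimplePointerEvent): void { this.active.set(e.pointerId, e); }

  // Called on pointermove: update position only for known contacts.
  move(e: SimplePointerEvent): void {
    if (this.active.has(e.pointerId)) this.active.set(e.pointerId, e);
  }

  // Called on pointerup/pointercancel: stop tracking this contact.
  up(e: SimplePointerEvent): void { this.active.delete(e.pointerId); }

  // Number of contacts currently down, e.g. two during a pinch.
  count(): number { return this.active.size; }
}
```

In a browser this would be driven by `pointerdown`, `pointermove`, and `pointerup` listeners, with the tracker's state feeding a gesture recognizer.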
Comment 13•10 years ago
So is there a more suitable ticket for support for multiple pointer events on multi-touch trackpads?
Comment 14•10 years ago
Yes. Feel free to file such a bug.
Comment 15•10 years ago
Opened issue 1164665.
Comment 16•7 years ago
Microsoft has a fantastic blog post describing the perfect way to handle this on Windows: https://blogs.windows.com/msedgedev/2017/03/08/scrolling-on-the-web/
Firefox should ideally handle input the way Edge does.
Comment 17•7 years ago
We already do this.
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED
Comment 18•7 years ago
Firefox can already deliver touch events on desktop? Care to provide a pointer on how to access those? Because to me it looks like I can only get information as if my trackpad were a mouse, not a multi-input/finger device.
Comment 19•7 years ago
For touch events you need a touchscreen, not a trackpad. The problem with a trackpad is that you can't easily map a spot on the trackpad to a spot on the screen so it doesn't make much sense to generate touch inputs for those; the usage patterns are very different.
Comment 20•7 years ago
Pinch zoom, rotation, and many other interesting gestures can be done on the screen the same as on a trackpad. If native apps on desktop can do such gestures, it would be great if I could also do them in web apps on desktop.
Comment 21•7 years ago
While that's true, the gestures would have to be implemented without W3C touch events, which is what this bug is about. In order to dispatch W3C touch events to web content, we would need to have screen coordinates for the touch events, which is the thing we can't get with trackpad events. Detecting gestures directly from trackpad input and dispatching those gestures as events (without dispatching touch events) is possible, but AFAIK not covered by any spec. It sounds like this is what you want, and I agree it would be a potentially useful API to expose to the web.
Comment 22•7 years ago
I am not sure why we could not just map the trackpad to the currently focused window and map coordinates there for touch events, instead of to the screen. It would be like scroll events, where you listen on the window. You would get the coordinates of the fingers and could then decide what to do with them. It is not perfect, like a screen, but it would still allow you to do things on the web from desktop.
On the other hand, I do have a desktop with a touchscreen monitor, and I do not see touches getting through in various touch demos on the web. Maybe they are getting lost somewhere along the touchscreen driver -> Wayland -> Firefox path and it is not Firefox's fault.
Comment 23•7 years ago
(In reply to Mitar from comment #22)
> I am not sure why we could not just map the trackpad to the currently
> focused window and map coordinates there for touch events, instead of to
> the screen. It would be like scroll events, where you listen on the window.
> You would get the coordinates of the fingers and could then decide what to
> do with them. It is not perfect, like a screen, but it would still allow
> you to do things on the web from desktop.
Again - that's certainly possible, but that's different from what the W3C touch event spec says, so it would have to go in a different type of event, which is out of scope for this bug. Feel free to file a new bug for this.
> On the other hand, I do have a desktop with a touchscreen monitor, and I
> do not see touches getting through in various touch demos on the web.
> Maybe they are getting lost somewhere along the touchscreen driver ->
> Wayland -> Firefox path and it is not Firefox's fault.
Linux is a bit of a special case: (a) I'm not sure if it works with Wayland and (b) you might need to set MOZ_USE_XINPUT2=1 in your environment before starting Firefox. You might also need to set dom.w3c.touch_events.enabled=1 in about:config. That's waiting on bug 1207700.
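The Linux setup described above can be sketched as follows; the environment variable and preference names come directly from comment 23, but whether they are still required depends on the Firefox version and display server.

```shell
# Launch Firefox with XInput2 enabled so multitouch input from a
# touchscreen can reach the browser (Linux/X11; see comment 23).
MOZ_USE_XINPUT2=1 firefox

# Then, in about:config, set:
#   dom.w3c.touch_events.enabled = 1
# to enable W3C touch event dispatch to web content.
```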
Comment 24•7 years ago
I opened https://bugzilla.mozilla.org/show_bug.cgi?id=1431012
Thanks for pointing out the information for Linux.