Touch gesture support #279
emilk added the `feature` (New feature or request) and `web` (Related to running Egui on the web) labels on Apr 7, 2021.
I plan to spend some time on it on the weekend 🚧
emilk added a commit that referenced this issue on May 6, 2021:
* translate touch events from glium to egui
  Unfortunately, winit does not seem to create _Touch_ events for the touch pad on my mac. Only _TouchpadPressure_ events are sent. Found some issues (like [this](rust-windowing/winit#54)), but I am not sure what they exactly mean: sometimes, touch events are mixed with touch-to-pointer translation in the discussions.
* translate touch events from web_sys to egui
  There are a few open topics:
  - egui_web currently translates touch events into pointer events. I guess this should change, such that egui itself performs this kind of conversion.
  - `pub fn egui_web::pos_from_touch_event` is a public function, but I would like to change the return type to an `Option`. Shouldn't this function be private, anyway?
* introduce `TouchState` and `Gesture`
  InputState.touch was introduced with type `TouchState`, just as InputState.pointer is of type `Pointer`. The TouchState internally relies on a collection of `Gesture`s. This commit provides the first rudimentary implementation of a Gesture, but has no functionality, yet.
* add method `InputState::zoom()`
  So far, the method always returns `None`, but it should work as soon as the `Zoom` gesture is implemented.
* manage one `TouchState` per individual device
  Although quite unlikely, it is still possible to connect more than one touch device. (I have three touch pads connected to my MacBook in total, but unfortunately `winit` sends touch events for none of them.) We do not want to mix up the touches from different devices.
* implement control loop for gesture detection
  The basic idea is that each gesture can focus on detection logic and does not have to care (too much) about managing touch state in general.
* streamline `Gesture` trait, simplifying impls
* implement first version of Zoom gesture
* fix failing doctest
  A simple `TODO` should be enough.
* get rid of `Gesture`s
* provide a Zoom/Rotate window in the demo app
  For now, it works for two fingers only. The third finger interrupts the gesture. Bugs:
  - Pinching in the demo window also moves the window -> pointer events must be ignored when touch is active
  - Pinching also works when doing it outside the demo window -> it would be nice to return the touch info in the `Response` of the painter allocation
* fix comments and non-idiomatic code
* update touch state *each frame*
* change egui_demo to use *relative* touch data
* support more than two fingers
  This commit includes an improved demo window for egui_demo, and a complete re-write of the gesture detection. The PR should be ready for review, soon.
* cleanup code and comments for review
* minor code simplifications
* oops – forgot the changelog
* resolve comment https://github.com/emilk/egui/pull/306/files/fee8ed83dbe715b5b70433faacfe74b59c99e4a4#r623226656
* accept suggestion #306 (comment)
* fix syntax error (dough!)
* remove `dbg!` (why didn't clippy see this?)
* apply suggested diffs from review
* fix conversion of physical location to Pos2
* remove redundant type `TouchAverages`
* remove trailing space
* avoid initial translation jump in plot demo
* extend the demo so it shows off translation

Co-authored-by: Emil Ernerfeldt <emil.ernerfeldt@gmail.com>
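The per-device bookkeeping described in those commits could be sketched roughly as follows. All names here (`TouchDeviceId`, `TouchState`, `TouchInput`) are illustrative, not necessarily the types the PR ended up with:

```rust
// Illustrative sketch only: keep separate touch state per device so that
// touches from different touch pads / screens are never mixed up.
use std::collections::BTreeMap;

/// Hypothetical identifier for one physical touch device.
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord)]
struct TouchDeviceId(u64);

/// Per-device gesture bookkeeping (active touch positions, detected
/// gesture, ...) would live here.
#[derive(Default)]
struct TouchState;

#[derive(Default)]
struct TouchInput {
    /// One `TouchState` per touch device that has sent events.
    states: BTreeMap<TouchDeviceId, TouchState>,
}

impl TouchInput {
    /// Route an incoming touch event to the state of the device that produced it.
    fn state_mut(&mut self, device: TouchDeviceId) -> &mut TouchState {
        self.states.entry(device).or_default()
    }
}
```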
It would be great if `egui` could support pinch-to-zoom, multi-finger swipes and other touch gestures. For a start, the most important place to support it is the web (i.e. in `egui_web`).

(NOTE: full multi-touch support, e.g. dragging two sliders at once, is a separate issue.)
I suggest we start by adding touch events to `egui::Event`. Let's do something similar to `winit`:
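A rough sketch of what such a variant could look like, modeled loosely on winit's `Touch` event (the names `TouchId` and `TouchPhase` and the exact fields are illustrative, not egui's actual API):

```rust
// Illustrative only; not egui's actual API. A touch event variant carrying
// enough information for later gesture detection.
use egui::Pos2;

/// Hypothetical identifier for one finger on one touch device.
#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq)]
pub struct TouchId(pub u64);

/// Hypothetical phase of a touch point, mirroring winit's `TouchPhase`.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum TouchPhase {
    Started,
    Moved,
    Ended,
    Cancelled,
}

pub enum Event {
    // ... existing variants (Key, Text, PointerMoved, ...) ...
    Touch {
        id: TouchId,
        phase: TouchPhase,
        pos: Pos2,
    },
}
```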
`egui::InputState` can then keep track of the touches in a `HashMap<TouchId, Pos2>` and detect gestures. For instance, if there are exactly two touches, it could measure zoom, translation and rotation to support manipulating 2D surfaces. An egui application would then just do something like `self.zoom *= ui.input().zoom();` to update its zoom state each frame.
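As a loose illustration of that idea (the `TouchTracker` type, its fields, and the `TouchId` key are made up for this sketch, not egui's real types), a two-finger zoom factor could be derived from the change in distance between the touches:

```rust
// Minimal sketch, not egui's real implementation: track active touches and
// derive a per-frame zoom factor from the change in finger distance.
use std::collections::HashMap;
use egui::Pos2;

#[derive(Clone, Copy, Debug, Hash, PartialEq, Eq)]
struct TouchId(u64);

#[derive(Default)]
struct TouchTracker {
    /// Current position of each active touch point.
    touches: HashMap<TouchId, Pos2>,
    /// Distance between the two touches at the previous frame, if any.
    previous_distance: Option<f32>,
}

impl TouchTracker {
    /// Returns the zoom factor for this frame, or `None` unless exactly two
    /// touches are active.
    fn zoom(&mut self) -> Option<f32> {
        let positions: Vec<Pos2> = self.touches.values().copied().collect();
        if positions.len() != 2 {
            self.previous_distance = None;
            return None;
        }
        let distance = positions[0].distance(positions[1]);
        let zoom = self
            .previous_distance
            .map(|prev| if prev > 0.0 { distance / prev } else { 1.0 });
        self.previous_distance = Some(distance);
        zoom
    }
}
```

An application would then multiply its own zoom state by the returned factor each frame, in the spirit of the `self.zoom *= ui.input().zoom();` call suggested above.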