MultiTouch


A lightweight touch gesture recognition library created in C as a part of Georgia Tech's Spring-Fall 2022 Junior Design program.

See the Demo!


Contents

  • Installation
  • Usage
  • Design
  • Release Notes
  • Contributors

Installation

Prerequisites

  1. Install build-essential to have access to make and gcc:

    sudo apt update && sudo apt install build-essential
  2. Install CMake:

    sudo apt install -y cmake

ℹ️ Windows development is possible with tools like Chocolatey.

Option 1: Include Source in Your Project

  1. Clone the repository into your project.

    git clone https://github.com/Russell-Newton/MultiTouch.git <Destination>
  2. Include the source in your project.

    • If you use CMake, add the repository's gesturelibrary folder as a subdirectory in a CMakeLists.txt of your project using add_subdirectory (see the sketch below), and delete the section of gesturelibrary/CMakeLists.txt inside the SKIP_TESTS if statement.
    • If you do not use CMake, add the headers in the gesturelibrary/include folder to your include path and compile the files in the gesturelibrary/src folder into your executable.
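
If you take the CMake route, a minimal sketch of the consuming CMakeLists.txt might look like this, assuming the repository was cloned into a MultiTouch subfolder and that the library target is named GestureLibrary (inferred from the libGestureLibrary.a built in Option 2; your_executable is a placeholder):

    # CMakeLists.txt (sketch)
    add_subdirectory(MultiTouch/gesturelibrary)
    target_link_libraries(your_executable PRIVATE GestureLibrary)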

Option 2: Build Static Library and Link to Your Project

  1. Clone the repo.

    git clone https://github.com/Russell-Newton/MultiTouch.git
  2. Build the CMake project.

    cd MultiTouch
    cmake -S gesturelibrary -B build -D SKIP_TESTS=true
  3. Compile the library with make.

    cd build
    make
  4. Include the library when compiling your program:

    • Add -I...pathto/MultiTouch/gesturelibrary/include to your compile command.
    • Add ...pathto/MultiTouch/build/libGestureLibrary.a to your compile targets.
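
For example, a single gcc invocation following this pattern (paths abbreviated as above; main.c and myprogram are placeholders):

    gcc main.c -I...pathto/MultiTouch/gesturelibrary/include ...pathto/MultiTouch/build/libGestureLibrary.a -o myprogram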
    

Troubleshooting

If build errors occur, make sure make and cmake are installed and on your path, and that you have a C compiler like gcc. On Unix-like systems, make and gcc can be installed by running:

sudo apt update && sudo apt install build-essential

Other common build issues may be related to where the CMake build directory is located. Make sure you run make from within the directory created by running cmake.


Usage

  1. Include <gesturelib.h> and the header files for any gestures you are interested in, for example <tap.h> and <drag.h>.
  2. Adjust the gesture parameters in <gestureparams.h> to your desired values. These variables can also be set at runtime, but the gesture library must be reinitialized after they are modified.
  3. Call init_gesturelib().
  4. Create an adapter for your touch input device. Adapters transform device input data into touch_event_t structs.
  5. Whenever a touch is received, create a touch_event_t with your adapter and send it to process_touch_event().
    • If you want the library to determine which finger this event corresponds to, set event.group = TOUCH_GROUP_UNDEFINED.
  6. Recognized gestures can be obtained from the library synchronously or asynchronously.
  • To synchronously access recognized gestures,

    1. Call the get_[gesture] function of the gesture you are interested in. For example, get_tap and get_drag.
    2. This returns an array of gesture structs for the gesture you are interested in. For example, tap_t and drag_t.
    3. You can read the data from the array, but if a thread is currently executing process_touch_event(), the data may change as you are reading it (see the sketch after this list).
  • To asynchronously access recognized gestures,

    1. Create custom recognizers, or enable/disable the built-in ones, with the provided utility functions:
      • add_recognizer()
      • remove_recognizer()
      • enable_recognizer()
      • disable_recognizer()
    2. Listeners accept a const [gesture_t]* and can read the data from the updated gesture. The gesture data will not change until the next invocation of process_touch_event.
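
Below is a minimal sketch of steps 4–6 with synchronous polling. Only init_gesturelib(), process_touch_event(), get_tap(), and TOUCH_GROUP_UNDEFINED are named by this README; the touch_event_t field names, the TOUCH_EVENT_DOWN constant, MAX_TOUCH_GROUPS, and the pointer parameter of process_touch_event() are stand-ins for whatever your copy of the headers defines:

#include <stdio.h>
#include <gesturelib.h>
#include <tap.h>

// Adapter sketch (step 4): translate one report from your input device
// into a touch_event_t. Field and constant names are illustrative.
void on_device_touch_down(float x, float y) {
    touch_event_t event = {0};
    event.type  = TOUCH_EVENT_DOWN;      // hypothetical event-type constant
    event.x     = x;
    event.y     = y;
    event.group = TOUCH_GROUP_UNDEFINED; // let the library pick the finger
    process_touch_event(&event);         // step 5
}

// Synchronous access sketch (step 6): get_tap() is documented to return an
// array of tap_t; MAX_TOUCH_GROUPS stands in for its real length.
void poll_taps(void) {
    tap_t* taps = get_tap();
    for (int i = 0; i < MAX_TOUCH_GROUPS; i++) {
        if (taps[i].type == RECOGNIZER_STATE_COMPLETED) {
            // The data may change if another thread runs
            // process_touch_event() while we read it.
            printf("Tap completed at (%.3f, %.3f)\n", taps[i].x, taps[i].y);
        }
    }
}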

Listeners

Listeners are single functions that accept gesture-specific data and have a void return type. They are called whenever a recognizer's state machine updates its internal state. A listener should be registered after calling init_gesturelib().

Example:

// main.c
#include <stdio.h>
#include <gesturelib.h>
#include <tap.h>

void tap_listener(const tap_t* event) {
    if (event->type == RECOGNIZER_STATE_COMPLETED) {
        printf("Tap received at (%.3f, %.3f)!\n", event->x, event->y);
    }
}

int main(int argc, char* argv[]) {
    init_gesturelib();

    // register the new listener
    set_on_tap(tap_listener);

    // rest of program
    return 0;
}

Design

Touch Preprocessing

After touch data has been transformed into a touch_event_t and sent to our library, the library performs some additional preprocessing. If the event has its group set to TOUCH_GROUP_UNDEFINED, the library determines which touch group it belongs to. If the device provides a touch group, the library does not assign one.

The touch group represents the finger a touch event was made by. That is, touch group 0 corresponds to events created by the first finger pressed, 1 to the second, 2 to the third, and so on.

Touch group assignment is determined by event type (see the sketch after this list):

  • If the event is a down event, attempt to assign it to the first unused group. Track this event as the most recent event in that group, marking the group as active. If there are no unused groups, leave the event unassigned.
  • If the event is a move event, find the active group this event is closest to. Assign it to that group and track this event as the most recent in the group. If there are no active groups, leave it unassigned.
  • If the event is an up event, perform the same logic as with a move event. This time when a group is assigned, the group is marked as inactive.

ℹ️ Group assignment ensures that a finger keeps the same group for as long as it stays in contact with the touch device.
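
The assignment rules can be summarized in C. Everything below is an illustrative stand-in for the library's internals, not its actual code:

#include <float.h>
#include <stdbool.h>

#define MAX_GROUPS 10 // placeholder capacity

typedef enum { EVT_DOWN, EVT_MOVE, EVT_UP } evt_type_t;

typedef struct {
    evt_type_t type;
    float x, y;
} evt_t;

static struct {
    bool  active;
    float x, y; // position of the most recent event in this group
} groups[MAX_GROUPS];

// Returns the assigned group index, or -1 to leave the event unassigned.
static int assign_group(const evt_t* e) {
    if (e->type == EVT_DOWN) {
        // Down: claim the first unused group and mark it active.
        for (int i = 0; i < MAX_GROUPS; i++) {
            if (!groups[i].active) {
                groups[i].active = true;
                groups[i].x = e->x;
                groups[i].y = e->y;
                return i;
            }
        }
        return -1; // no unused groups
    }
    // Move/up: pick the closest active group.
    int best = -1;
    float best_d2 = FLT_MAX;
    for (int i = 0; i < MAX_GROUPS; i++) {
        if (!groups[i].active) continue;
        float dx = e->x - groups[i].x, dy = e->y - groups[i].y;
        float d2 = dx * dx + dy * dy;
        if (d2 < best_d2) { best_d2 = d2; best = i; }
    }
    if (best >= 0) {
        groups[best].x = e->x;
        groups[best].y = e->y;
        if (e->type == EVT_UP) groups[best].active = false; // finger lifted
    }
    return best;
}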

After the preprocessing has finished, a touch event is sent to every enabled recognizer in the order in which they were added to the library.

Recognizers

Gesture recognizers are built like state machines. They receive touch events and update their state. When the state is updated, they call on the registered event listener, if applicable.

Built-in single-finger gesture recognizers save data about every possible touch group that could be performing the gesture they recognize.

Built-in multi-finger recognizers are more complicated and store data about every possible group for every possible user id. The user id is set by the data adapter and could be determined by factors like which device received the touch or where on the screen the touch was received.

⚠️ All touch events with the same uid will be considered as part of the same multi-finger gesture for recognition purposes.

Gestures

Gesture recognition starts with a base gesture: stroke. All other gestures are recognized by composing strokes (and other composite gestures) and performing additional processing on them.

Stroke

Stroke is a simple gesture with a simple state machine.

The state updates are less important than the data that stroke collects. Stroke collects data on:

  • Initial down event position and time
  • Current move/up event position and time
  • Move event speed (as a moving average with configurable window size)
  • Touch group and user

When creating more complicated gestures, having access to this data can be incredibly useful.
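
As a concrete illustration, the moving-average speed can be maintained with a small ring buffer. This is a sketch with placeholder names, not the library's internal implementation; the real window size is presumably set through the gesture parameters:

#define WINDOW_SIZE 8 // placeholder for the configurable window size

typedef struct {
    float samples[WINDOW_SIZE];
    int   next;  // slot to overwrite next
    int   count; // valid samples, saturates at WINDOW_SIZE
    float sum;   // running sum of the valid samples
} moving_average_t;

static void ma_add(moving_average_t* ma, float speed) {
    if (ma->count == WINDOW_SIZE) {
        ma->sum -= ma->samples[ma->next]; // evict the oldest sample
    } else {
        ma->count++;
    }
    ma->samples[ma->next] = speed;
    ma->sum += speed;
    ma->next = (ma->next + 1) % WINDOW_SIZE;
}

static float ma_value(const moving_average_t* ma) {
    return ma->count ? ma->sum / ma->count : 0.0f;
}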

Multistroke

Multistroke is a multi-finger counterpart to stroke. All strokes with the same user id get grouped into the same multistroke. The first down event starts a multistroke, and the last up event for the user id ends the gesture. In addition to the information contained in each stroke, a multistroke also tracks:

  • Current centroid position
  • Centroid instantaneous displacement
  • Least-squares estimated rotation and zoom information (see the sketch below)
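
The README does not spell out the estimator, but a standard least-squares fit of a shared rotation and scale about the centroid reduces to three sums. This sketch shows one way to compute it, with illustrative names and not necessarily the library's exact formulation:

#include <math.h>

// Given n finger positions relative to the centroid at the gesture start
// (px, py) and now (qx, qy), fit q ≈ s·R(theta)·p in the least-squares
// sense. Assumes the fingers are not all at the centroid (norm > 0).
void fit_zoom_rotation(const float* px, const float* py,
                       const float* qx, const float* qy, int n,
                       float* zoom, float* rotation) {
    float dot = 0, cross = 0, norm = 0;
    for (int i = 0; i < n; i++) {
        dot   += px[i] * qx[i] + py[i] * qy[i];
        cross += px[i] * qy[i] - py[i] * qx[i];
        norm  += px[i] * px[i] + py[i] * py[i];
    }
    *rotation = atan2f(cross, dot);                      // radians
    *zoom     = sqrtf(dot * dot + cross * cross) / norm; // scale factor
}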

Tap

To perform a tap, press down and release within a short time and without moving too much.

Tap is a simple gesture that contains information about where and when the tap was started and released. If the time between start and release is too long or the distance too great, the tap will fail.

Double-Tap

To perform a double-tap, tap twice in close succession.

Double-tap stores the same information as a tap.

Hold

To perform a hold, press down for a longer amount of time before releasing.

Hold stores the same information as a tap.

Drag

To perform a drag, press down and move your finger across the screen.

Drag tracks starting position, current position, and current velocity. Current velocity is retrieved in the same fashion as stroke.

Hold and Drag

To perform a hold and drag, press down and hold briefly, then move your finger across the screen.

Hold and drag tracks the same information as drag.

Multidrag

Like multistroke, multidrag is a multi-finger counterpart to drag. The same logic that applies to multistroke applies to multidrag. It stores the same information as multistroke, but has a slightly different state machine and property calculations.

Multidrag is used for processing zooms and rotates.

Zoom

To perform a zoom, press down with at least two fingers and move them closer together or farther apart.

Zoom tracks how many fingers are involved in the gesture and an estimated zoom factor.

Rotate

To perform a rotation, press down with at least two fingers and revolve them around a common center point.

Rotate tracks how many fingers are involved in the gesture and an estimated rotation amount.


Release Notes

Version 1.0.0 (Latest)

Features

  • Created gesture recognizers:
    • Stroke (generic down/move/up gesture, no events generated)
    • Tap
    • Double tap
    • Hold
    • Drag
    • Hold and Drag
    • Zoom
    • Rotate
    • Multi-finger drag
    • Extensible to more gesture types
  • Recognizers generate events if listeners are supplied with set_on_<gesturetype>().
    • All saved gesture recognizer data can be accessed with get_<gesturetype>().
  • Gesture Library can auto-assign finger numbers to touch events if supplied group is TOUCH_GROUP_UNDEFINED.
  • Autogenerated documentation created with Doxygen: accessible here
  • Library demo created with Emscripten and Vite+Vue: accessible here

Future Work

  • Scale gesture parameters by DPI
  • Rebase tap, double tap to repeat tap (see branch)
  • Implement identification of multi-finger taps, repeat tap distinctly from individual taps

Bug Fixes

  • Fixed a bug where double-tapping would break with three fingers
  • Fixed a bug where zoom and rotate gestures were marked as complete when a single drag was performed

Known Issues

  • On the demo:
    • Opening the right-click context menu does not create an up event. This causes multi-finger gestures to misbehave (the library thinks a finger is down because it never received an up event)
    • Depending on the browser and OS, some gestures may try to complete as part of the browser's default handling as opposed to within our library
    • Zoom and Rotate generate a toast when each finger is released from the screen, including the last. The toast makes sense for every finger except the last one on the screen, whose release event doesn't carry any meaningful data.

Version 0.4.0

New Features

  • Zoom and Rotate split into their own gestures
  • Removed swipe gesture
  • Finished implementing gestures: tap, double tap, hold, hold and drag
  • Demo page updates:
    • Links back to home page
    • Communicates with library using new listener structure
    • GestureCanvas component now sets display text within Demo component
    • Folder structure overhauled

Bug Fixes

  • Zoom and rotate gestures work with more than 2 fingers

Known Issues

  • Zoom and rotate gesture occasionally marked as complete on the demo page when a single drag has been performed
  • Multi-finger double tap tests failing for unknown reason

Version 0.3.0

New Features

  • Functioning swipe and drag gestures
  • Minimally functioning zoom and rotate gesture
  • Gesture library compiles into .js and .wasm with emscripten
    • Functions exposed by the Module object to pack and unpack library structs without needing direct memory access to the WASM heap

Bug Fixes

  • Faulty unit tests removed

Known Issues

  • Zoom and rotate gesture only works with 2 fingers
  • 3+ finger zoom and rotate is planned for next sprint

Version 0.2.1

New Features

N/A

Known Issues

  • Some unit tests SEGFAULT. These have been commented out so the unit test workflow passes.

Version 0.2.0

New Features

  • Framework for recognizer files (header and c files) created
  • File organization updated
  • Doxygen document generator linked to library
  • Vue project environment set up
  • Demo webapp front landing page created
  • GitHub Actions workflow created to generate and deploy Doxygen documentation from doxygen-config
  • Created prebuild and predev npm scripts that compile C code with Emscripten, allowing for use in the webapp
  • Created build:run npm script that runs npm run build and live-server

Bug Fixes

N/A

Version 0.1.0

New Features

  • Sprint 1 limited to research, no features created
  • Project is buildable

Bug Fixes

N/A

Known Issues

N/A

Research Done

  • Specified input/output format for data
  • Specified library and architecture structure

Contributors

  • Russell Newton: 💻 📖 🚇 🚧
  • Wenjun Wang: 💻 📖 🚇 ⚠️
  • jrdike: 💻 📖 ⚠️
  • Iftekherul Karim: 💻 📖 ⚠️
  • deborahsrcho: 💻 🎨 🖋