$1 source code: JavaScript, C#
Dynamic time warping: C#
Rubine classifier: C#
Pseudocode: $1, Protractor
Unistroke gesture logs: XML
Paper: PDF
This software is distributed under the New BSD License agreement.
About
The $1 Unistroke Recognizer is a 2-D single-stroke recognizer designed for rapid prototyping of gesture-based
user interfaces. In machine learning terms, $1 is an instance-based nearest-neighbor classifier with a 2-D Euclidean
distance function, i.e., a geometric template matcher. $1 is a significant extension of the proportional shape matching
approach used in SHARK2, which itself is
an adaptation of Tappert's elastic matching approach
with zero look-ahead. Despite its simplicity, $1 requires very few templates to perform well and is only about
100 lines of code, making it easy to deploy.
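The pipeline can be sketched in JavaScript as follows. This is an illustrative sketch, not the published $1 code: the helper names are ours, and the golden-section search over candidate rotations that the full algorithm performs is omitted for brevity.

```javascript
const N = 64; // resampling resolution

function pathLength(pts) {
  let d = 0;
  for (let i = 1; i < pts.length; i++)
    d += Math.hypot(pts[i].x - pts[i - 1].x, pts[i].y - pts[i - 1].y);
  return d;
}

// Step 1: resample the stroke to N equidistantly spaced points.
function resample(pts, n) {
  const I = pathLength(pts) / (n - 1); // target interval length
  const src = pts.map(p => ({ ...p }));
  const out = [{ ...src[0] }];
  let D = 0;
  for (let i = 1; i < src.length; i++) {
    const d = Math.hypot(src[i].x - src[i - 1].x, src[i].y - src[i - 1].y);
    if (D + d >= I) {
      const t = (I - D) / d;
      const q = {
        x: src[i - 1].x + t * (src[i].x - src[i - 1].x),
        y: src[i - 1].y + t * (src[i].y - src[i - 1].y),
      };
      out.push(q);
      src.splice(i, 0, q); // q starts the next segment
      D = 0;
    } else {
      D += d;
    }
  }
  while (out.length < n) out.push({ ...src[src.length - 1] });
  out.length = n; // guard against a floating-point off-by-one
  return out;
}

function centroid(pts) {
  const n = pts.length;
  return {
    x: pts.reduce((s, p) => s + p.x, 0) / n,
    y: pts.reduce((s, p) => s + p.y, 0) / n,
  };
}

// Step 2: rotate so the indicative angle (first point to centroid) is zero.
function rotateToZero(pts) {
  const c = centroid(pts);
  const theta = Math.atan2(c.y - pts[0].y, c.x - pts[0].x);
  const cos = Math.cos(-theta), sin = Math.sin(-theta);
  return pts.map(p => ({
    x: (p.x - c.x) * cos - (p.y - c.y) * sin + c.x,
    y: (p.x - c.x) * sin + (p.y - c.y) * cos + c.y,
  }));
}

// Step 3: scale non-uniformly to a reference square.
function scaleToSquare(pts, size = 1) {
  const xs = pts.map(p => p.x), ys = pts.map(p => p.y);
  const w = Math.max(...xs) - Math.min(...xs) || 1; // avoid divide-by-zero
  const h = Math.max(...ys) - Math.min(...ys) || 1;
  return pts.map(p => ({ x: p.x * (size / w), y: p.y * (size / h) }));
}

// Step 4: translate the centroid to the origin.
function translateToOrigin(pts) {
  const c = centroid(pts);
  return pts.map(p => ({ x: p.x - c.x, y: p.y - c.y }));
}

const normalize = pts =>
  translateToOrigin(scaleToSquare(rotateToZero(resample(pts, N))));

// Average point-to-point Euclidean distance between two normalized strokes.
function pathDistance(a, b) {
  let d = 0;
  for (let i = 0; i < a.length; i++)
    d += Math.hypot(a[i].x - b[i].x, a[i].y - b[i].y);
  return d / a.length;
}

// Nearest-neighbor classification: the closest stored template wins.
function recognize(stroke, templates) { // templates: [{ name, points }]
  const c = normalize(stroke);
  let best = { name: null, distance: Infinity };
  for (const t of templates) {
    const d = pathDistance(c, normalize(t.points));
    if (d < best.distance) best = { name: t.name, distance: d };
  }
  return best;
}
```

Because every stroke is resampled, rotated, scaled, and translated before matching, the classifier is invariant to where, how large, and (approximately) at what angle a gesture is drawn.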
An optional enhancement called Protractor improves $1's speed.
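Protractor's speedup comes from replacing the iterative rotation search with a closed-form solution: each stroke is flattened into a unit vector, and the minimum angular distance over all rotations falls out of two dot-product-style sums. The sketch below follows that idea; the names are ours, not Li's published code, and it assumes both strokes have already been resampled to the same number of points.

```javascript
// Flatten a stroke into a unit-length vector (x1, y1, ..., xn, yn),
// translated so its centroid sits at the origin.
function vectorize(pts) {
  const cx = pts.reduce((s, p) => s + p.x, 0) / pts.length;
  const cy = pts.reduce((s, p) => s + p.y, 0) / pts.length;
  const v = [];
  for (const p of pts) v.push(p.x - cx, p.y - cy);
  const mag = Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return v.map(x => x / mag);
}

// Minimum angular distance between two gesture vectors over all rotations.
function angularDistance(v1, v2) {
  let a = 0, b = 0;
  for (let i = 0; i < v1.length; i += 2) {
    a += v1[i] * v2[i] + v1[i + 1] * v2[i + 1]; // dot-product term
    b += v1[i] * v2[i + 1] - v1[i + 1] * v2[i]; // cross-product term
  }
  // max over rotation t of (a*cos t + b*sin t) is sqrt(a^2 + b^2),
  // so the best-case cosine similarity is available in closed form.
  return Math.acos(Math.min(1, Math.hypot(a, b)));
}
```

A rotated copy of a stroke scores an angular distance of (nearly) zero, while a differently shaped stroke scores higher, which is why this single closed-form comparison can replace $1's repeated distance evaluations.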
Upon its publication at UIST 2007, $1 was invited
for a special reprise presentation at SIGGRAPH 2008. Seventeen years later, $1 won the UIST 2024 Lasting Impact Award.
The $N Multistroke Recognizer extends $1 to gestures with multiple strokes.
The $P Point-Cloud Recognizer performs unistroke and multistroke recognition without the
combinatoric overhead of $N, as it ignores stroke number, order, and direction. The $Q Super-Quick Recognizer
extends $P for use on low-powered mobiles and wearables: it is a whopping 142× faster than $P and slightly more accurate.
The $-family recognizers have been built into numerous projects and even industry prototypes,
and have had many follow-ons by others. Read about the $-family's impact.
Demo
In the demo below, only one unistroke template is loaded for each of the 16 gesture types. You can add more
unistroke examples as you wish, and even define your own custom gesture types.
Make strokes on this canvas. If a misrecognition occurs,
add the misrecognized unistroke as an example of the intended gesture.
UIST 2024 Lasting Impact Award Talk
Our Gesture Software Projects
$Q: Super-quick multistroke recognizer - optimized for low-power mobiles and wearables
$P+: Point-cloud multistroke recognizer - optimized for people with low vision
$P: Point-cloud multistroke recognizer - for recognizing multistroke gestures as point-clouds
$N: Multistroke recognizer - for recognizing simple multistroke gestures
$1: Unistroke recognizer - for recognizing unistroke gestures
AGATe: AGreement Analysis Toolkit - for calculating agreement in gesture-elicitation studies
GHoST: Gesture HeatmapS Toolkit - for visualizing variation in gesture articulation
GREAT: Gesture RElative Accuracy Toolkit - for measuring variation in gesture articulation
GECKo: GEsture Clustering toolKit - for clustering gestures and calculating agreement
Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2013).
Relative accuracy measures for stroke gestures.
Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '13).
Sydney, Australia (December 9-13, 2013).
New York: ACM Press, pp. 279-286.
Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2012).
Gestures as point clouds: A $P recognizer for user interface prototypes.
Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '12).
Santa Monica, California (October 22-26, 2012).
New York: ACM Press, pp. 273-280.
Anthony, L. and Wobbrock, J.O. (2012).
$N-Protractor: A fast and accurate multistroke recognizer.
Proceedings of Graphics Interface (GI '12).
Toronto, Ontario (May 28-30, 2012).
Toronto, Ontario: Canadian Information Processing Society, pp. 117-120.
Li, Y. (2010).
Protractor: A fast and accurate gesture recognizer.
Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '10).
Atlanta, Georgia (April 10-15, 2010).
New York: ACM Press, pp. 2169-2172.
Rubine, D. (1991).
Specifying gestures by example.
Proceedings of the ACM Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '91).
Las Vegas, Nevada (July 28 - August 2, 1991).
New York: ACM Press, pp. 329-337.