$1 Unistroke Recognizer

Jacob O. Wobbrock, University of Washington
Andrew D. Wilson, Microsoft Research
Yang Li, University of Washington (currently at Google)

Download

$1 source code: JavaScript, C#
Dynamic time warping: C#
Rubine classifier: C#
Pseudocode: $1, Protractor
Unistroke gesture logs: XML
Paper: PDF

This software is distributed under the New BSD License agreement.

About

The $1 Unistroke Recognizer is a 2-D single-stroke recognizer designed for rapid prototyping of gesture-based user interfaces. In machine learning terms, $1 is an instance-based nearest-neighbor classifier with a Euclidean scoring function, i.e., a geometric template matcher. $1 is an extension of the proportional shape matching approach used in SHARK2, which itself is an adaptation of Tappert's elastic matching approach with zero look-ahead. Despite its simplicity, $1 requires very few templates to perform well and is only about 100 lines of code, making it easy to deploy. An optional enhancement called Protractor improves $1's speed. The $N Multistroke Recognizer extends $1 to gestures with multiple strokes. The $P Point-Cloud Recognizer is the latest in the dollar family, performing unistroke and multistroke recognition without the combinatoric overhead of $N.
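The matching pipeline described above can be sketched in JavaScript. This is a simplified illustration, not the downloadable reference code: the constants (`N`, `SIZE`) and helper names are assumptions, and the golden section search the full recognizer uses to fine-tune rotation is omitted in favor of a single comparison at the indicative angle.

```javascript
// Sketch of the $1 pipeline: resample the stroke, rotate so its indicative
// angle is zero, scale to a reference square, translate to the origin, then
// pick the template with the smallest average point-to-point distance.
const N = 64;     // resampled point count (assumed)
const SIZE = 250; // reference square edge length (assumed)

const distance = (a, b) => Math.hypot(a.x - b.x, a.y - b.y);

const pathLength = (pts) =>
  pts.slice(1).reduce((d, p, i) => d + distance(pts[i], p), 0);

const centroid = (pts) => ({
  x: pts.reduce((s, p) => s + p.x, 0) / pts.length,
  y: pts.reduce((s, p) => s + p.y, 0) / pts.length,
});

// Step 1: resample into N equidistantly spaced points along the path.
function resample(points, n = N) {
  const I = pathLength(points) / (n - 1);
  const pts = points.map((p) => ({ ...p }));
  const out = [{ ...pts[0] }];
  let D = 0;
  for (let i = 1; i < pts.length; i++) {
    const d = distance(pts[i - 1], pts[i]);
    if (D + d >= I && d > 0) {
      const t = (I - D) / d;
      const q = {
        x: pts[i - 1].x + t * (pts[i].x - pts[i - 1].x),
        y: pts[i - 1].y + t * (pts[i].y - pts[i - 1].y),
      };
      out.push(q);
      pts.splice(i, 0, q); // q becomes the start of the next segment
      D = 0;
    } else {
      D += d;
    }
  }
  while (out.length < n) out.push({ ...pts[pts.length - 1] }); // rounding guard
  return out.slice(0, n);
}

// Step 2: rotate about the centroid so the angle from the centroid to the
// first point (the "indicative angle") becomes zero.
function rotateToZero(pts) {
  const c = centroid(pts);
  const theta = Math.atan2(c.y - pts[0].y, c.x - pts[0].x);
  const cos = Math.cos(-theta), sin = Math.sin(-theta);
  return pts.map((p) => ({
    x: (p.x - c.x) * cos - (p.y - c.y) * sin + c.x,
    y: (p.x - c.x) * sin + (p.y - c.y) * cos + c.y,
  }));
}

// Step 3: scale (non-uniformly) so the bounding box becomes a SIZE square.
function scaleToSquare(pts, size = SIZE) {
  const xs = pts.map((p) => p.x), ys = pts.map((p) => p.y);
  const w = Math.max(...xs) - Math.min(...xs) || 1e-6;
  const h = Math.max(...ys) - Math.min(...ys) || 1e-6;
  return pts.map((p) => ({ x: p.x * (size / w), y: p.y * (size / h) }));
}

// Step 4: translate so the centroid sits at the origin.
function translateToOrigin(pts) {
  const c = centroid(pts);
  return pts.map((p) => ({ x: p.x - c.x, y: p.y - c.y }));
}

const preprocess = (points) =>
  translateToOrigin(scaleToSquare(rotateToZero(resample(points))));

// Nearest-neighbor match: templates are assumed preprocessed the same way;
// the smallest average inter-point Euclidean distance wins.
function recognize(points, templates) {
  const candidate = preprocess(points);
  let best = { name: null, score: Infinity };
  for (const t of templates) {
    let d = 0;
    for (let i = 0; i < candidate.length; i++) {
      d += distance(candidate[i], t.points[i]);
    }
    d /= candidate.length;
    if (d < best.score) best = { name: t.name, score: d };
  }
  return best.name;
}
```

Templates would be built with the same `preprocess` call, e.g. `{ name: 'circle', points: preprocess(rawPoints) }` — which is also why adding a misrecognized stroke as a new example is cheap: it is simply preprocessed and appended to the template list.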

Demo

In the demo below, only one unistroke template is loaded for each of the 16 gesture types. You can add further unistroke examples to any gesture type, and even define your own custom gesture types.


The demo can match using either Golden Section Search (the original $1 method) or Protractor (a faster alternative).

Make strokes on this canvas. If a misrecognition occurs, add the misrecognized unistroke as an example of the intended gesture.


Our Related Publications

Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2014). Gesture heatmaps: Understanding gesture performance with colorful visualizations. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '14). Istanbul, Turkey (November 12-16, 2014). New York: ACM Press, pp. 172-179.

Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2013). Relative accuracy measures for stroke gestures. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '13). Sydney, Australia (December 9-13, 2013). New York: ACM Press, pp. 279-286.

Anthony, L., Vatavu, R.-D. and Wobbrock, J.O. (2013). Understanding the consistency of users' pen and finger stroke gesture articulation. Proceedings of Graphics Interface (GI '13). Regina, Saskatchewan (May 29-31, 2013). Toronto, Ontario: Canadian Information Processing Society, pp. 87-94.

Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2012). Gestures as point clouds: A $P recognizer for user interface prototypes. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '12). Santa Monica, California (October 22-26, 2012). New York: ACM Press, pp. 273-280.

Anthony, L. and Wobbrock, J.O. (2012). $N-Protractor: A fast and accurate multistroke recognizer. Proceedings of Graphics Interface (GI '12). Toronto, Ontario (May 28-30, 2012). Toronto, Ontario: Canadian Information Processing Society, pp. 117-120.

Anthony, L. and Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface (GI '10). Ottawa, Ontario (May 31-June 2, 2010). Toronto, Ontario: Canadian Information Processing Society, pp. 245-252.

Wobbrock, J.O., Wilson, A.D. and Li, Y. (2007). Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '07). Newport, Rhode Island (October 7-10, 2007). New York: ACM Press, pp. 159-168.


Copyright © 2007-2014 Jacob O. Wobbrock