This software is distributed under the New BSD License.
About
The $P Point-Cloud Recognizer is a 2-D stroke-gesture recognizer designed for rapid prototyping of gesture-based
user interfaces. $P is the first point-cloud matcher within the $-family of
recognizers, which includes $1 for unistrokes and $N for multistrokes.
Although about half of $P's code comes from $1, unlike both $1 and $N, $P does not represent gestures as ordered
series of points (i.e., strokes), but as unordered point-clouds. By representing gestures as point-clouds,
$P can handle unistrokes and multistrokes equivalently, and without the combinatorial overhead of $N, because stroke
number, order, and direction are all ignored. When comparing two point-clouds, $P solves the classic
assignment problem between the two clouds, viewed as a bipartite graph,
using a greedy approximation of the Hungarian algorithm.
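The greedy matching above can be sketched as follows. This is a minimal illustration, not the distributed implementation: the function and variable names are hypothetical, and both clouds are assumed to have already been resampled to the same number of points. Each candidate point is matched to its nearest still-unmatched template point, with earlier (freer) matches weighted more heavily than later (forced) ones.

```javascript
// Euclidean distance between two points {x, y}.
function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Greedy cloud-to-cloud matching sketch: starting at index `start`,
// match each point in `points` to its nearest unmatched point in
// `template`, accumulating weighted distances. Helper names are
// illustrative, not from the $P source distribution.
function cloudDistance(points, template, start) {
  const n = points.length; // both clouds assumed resampled to n points
  const matched = new Array(n).fill(false);
  let sum = 0;
  let i = start;
  do {
    // Find the nearest unmatched template point for points[i].
    let minDist = Infinity;
    let index = -1;
    for (let j = 0; j < n; j++) {
      if (!matched[j]) {
        const d = dist(points[i], template[j]);
        if (d < minDist) { minDist = d; index = j; }
      }
    }
    matched[index] = true;
    // Early matches choose freely and get high weight; late matches
    // are forced onto leftovers and count for less.
    const weight = 1 - ((i - start + n) % n) / n;
    sum += weight * minDist;
    i = (i + 1) % n;
  } while (i !== start);
  return sum;
}
```

Because the greedy result depends on which point is matched first, the full algorithm runs this matching from several starting indices (and in both directions) and keeps the minimum.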
Superseding $P are $P+, which is more accurate and optimized for people with low vision, and
$Q, a super-quick recognizer optimized for low-power mobiles and wearables, achieving
a whopping 142× speedup over $P with slightly improved accuracy. $P was also customized for the
Smart Touch project
to enable matching of point-clouds with different numbers of points.
Upon its publication at ICMI 2012, $P was given
an Outstanding Paper Award. A decade later, the same paper was recognized with the
ICMI 2022 Ten-Year Technical Impact Award.
$P and other $-family recognizers have been built into numerous projects and even industry prototypes,
and have had many follow-ons by others. Read about the $-family's impact.
Demo
In the demo below, only one point-cloud template is loaded for each of the 16 gesture types. You can add additional
templates as you wish, and even define your own custom gesture templates.
Make strokes on this canvas.
Right-click the canvas to recognize.
If a misrecognition occurs, add the misrecognized gesture
as an example of the intended gesture.
Add as example of existing type:
Add as example of custom type:
Delete all user-defined gestures:
Video
Our Gesture Software Projects
$Q: Super-quick multistroke recognizer - optimized for low-power mobiles and wearables
$P+: Point-cloud multistroke recognizer - optimized for people with low vision
$P: Point-cloud multistroke recognizer - for recognizing multistroke gestures as point-clouds
$N: Multistroke recognizer - for recognizing simple multistroke gestures
$1: Unistroke recognizer - for recognizing unistroke gestures
AGATe: AGreement Analysis Toolkit - for calculating agreement in gesture-elicitation studies
GHoST: Gesture HeatmapS Toolkit - for visualizing variation in gesture articulation
GREAT: Gesture RElative Accuracy Toolkit - for measuring variation in gesture articulation
GECKo: GEsture Clustering toolKit - for clustering gestures and calculating agreement
Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2013).
Relative accuracy measures for stroke gestures.
Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '13).
Sydney, Australia (December 9-13, 2013).
New York: ACM Press, pp. 279-286.
Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2012).
Gestures as point clouds: A $P recognizer for user interface prototypes.
Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '12).
Santa Monica, California (October 22-26, 2012).
New York: ACM Press, pp. 273-280.
Anthony, L. and Wobbrock, J.O. (2012).
$N-Protractor: A fast and accurate multistroke recognizer.
Proceedings of Graphics Interface (GI '12).
Toronto, Ontario (May 28-30, 2012).
Toronto, Ontario: Canadian Information Processing Society, pp. 117-120.