
Ability-Based Design (NSF funded)
Developing technologies, principles, and methods to support the full range of human abilities with interactive systems. Grew out of work on "ability-based user interfaces" by Gajos, Wobbrock and Weld.
Understanding, Detecting and Accommodating Situational Impairments (NSF funded)
Understanding the effects of situational impairments and devising new sensors and interfaces to help users overcome these temporary impairments.
Mobile Device Accessibility for People with Disabilities
Making mobile devices more accessible to people who are blind, have low vision, or have motor impairments. Also, using mobile devices to make the world more accessible.
MobileASL (NSF funded)
Conducting user-centered laboratory and field studies of MobileASL, which allows deaf people to communicate via video using 3G mobile phones and networks.
Design for Social Accessibility (NSF funded)
Performing foundational studies and methodology development for creating "socially accessible" assistive technologies that reduce stigma and promote self-confidence over self-consciousness.
Non-speech Voice-based Computer Access
Exploring continuous non-speech voice control of computers for people with motor impairments.
Methods and Tools for Understanding and Designing Gestures
Studying gestures and creating methods and tools that help designers and users craft gesture interactions.
The $-Family Gesture Recognizers
Inventing and studying lightweight, easily implementable stroke-gesture recognizers for use in user interface prototypes on all kinds of platforms. A sketch of the family's core idea follows the publications below.
  • Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2018). $Q: A super-quick, articulation-invariant stroke-gesture recognizer for low-resource devices. Proceedings of the ACM Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '18). Barcelona, Spain (September 3-6, 2018). New York: ACM Press. To appear.
  • Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2012). Gestures as point clouds: A $P recognizer for user interface prototypes. Proceedings of the ACM International Conference on Multimodal Interaction (ICMI '12). Santa Monica, California (October 22-26, 2012). New York: ACM Press, pp. 273-280. Outstanding Paper Winner.
  • Anthony, L. and Wobbrock, J.O. (2012). $N-Protractor: A fast and accurate multistroke recognizer. Proceedings of Graphics Interface (GI '12). Toronto, Ontario (May 28-30, 2012). Toronto, Ontario: Canadian Information Processing Society, pp. 117-120.
  • Anthony, L. and Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface (GI '10). Ottawa, Ontario, Canada (May 31-June 2, 2010). Toronto, Ontario: Canadian Information Processing Society, pp. 245-252.
  • Wobbrock, J.O., Wilson, A.D. and Li, Y. (2007). Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '07). Newport, Rhode Island (October 7-10, 2007). New York: ACM Press, pp. 159-168. Invited for Reprise at SIGGRAPH '08.
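As a rough sketch of the family's shared idea rather than any of the published algorithms above, the Python below resamples a stroke to a fixed number of points, normalizes its scale and position, and picks the stored template with the smallest average point-to-point distance. The published recognizers add rotation invariance, a search over candidate rotations, and (in $N, $P, and $Q) multistroke and point-cloud matching, all omitted here; function and template names are illustrative assumptions.

```python
import math

def resample(points, n=64):
    """Resample a stroke (a list of (x, y) tuples) into n evenly spaced points."""
    pts = list(points)
    path_len = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = path_len / (n - 1)
    resampled, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)   # q becomes the start of the next segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points, size=250.0):
    """Scale the stroke to a reference square and move its centroid to the origin."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    w, h = (max(xs) - min(xs)) or 1e-9, (max(ys) - min(ys)) or 1e-9
    scaled = [(x * size / w, y * size / h) for x, y in points]
    cx = sum(p[0] for p in scaled) / len(scaled)
    cy = sum(p[1] for p in scaled) / len(scaled)
    return [(x - cx, y - cy) for x, y in scaled]

def recognize(stroke, templates):
    """Return the name of the closest template; 'templates' maps names to raw strokes."""
    candidate = normalize(resample(stroke))
    def score(name):
        t = normalize(resample(templates[name]))
        return sum(math.dist(p, q) for p, q in zip(candidate, t)) / len(candidate)
    return min(templates, key=score)
```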
Extending Laptop and Tablet Interactions with Projectors and Cameras
Using computer vision and micro-projectors to extend interaction out onto a laptop's surrounding table surface.
Improving Student Collaboration on Interactive Tabletops
Understanding and modeling student behavior at interactive tabletops, and designing interventions to improve collaborative work in real-world classroom settings.
Making Interactive Tabletops Accessible
Making large interactive tabletops accessible to people with disabilities.
Touch-Typing on Interactive Tabletops
Studying and enabling expert touch-typing on interactive surfaces with interactive machine learning.
Text Entry Tools, Techniques and Studies
Improving text entry through new evaluation tools, innovative techniques, and formal studies.
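For context on how such evaluations are typically scored, the sketch below computes two measures standard in this literature: entry rate in words per minute and an uncorrected error rate based on the minimum string distance between presented and transcribed phrases. It is a generic illustration under the usual conventions, not the group's specific tools.

```python
def wpm(transcribed, seconds):
    """Entry rate in words per minute, using the convention of five characters
    per word and excluding the first character's entry time."""
    return ((len(transcribed) - 1) / seconds) * 60.0 / 5.0

def msd(a, b):
    """Minimum string distance (Levenshtein distance) between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def uncorrected_error_rate(presented, transcribed):
    """Fraction of characters in error, based on the minimum string distance."""
    return msd(presented, transcribed) / max(len(presented), len(transcribed))

print(wpm("the quick brown fox", seconds=12.0))       # 18.0 WPM
print(uncorrected_error_rate("the quick brown fox",
                             "the quikc brown fox"))   # ~0.105
```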
Models and Tools for Measuring Pointing Performance
Developing predictive models of pointing errors, evaluating variations of models for measuring pointing performance, and creating tools that measure pointing performance not only in the lab, but also "in the wild."
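One widely used measure in this area is Fitts' law throughput computed from "effective" target parameters. The sketch below shows that standard calculation under the usual assumptions (endpoint deviations measured along the task axis, effective width taken as 4.133 times their standard deviation); it illustrates the general methodology, not the group's pointing error model, and the values are hypothetical.

```python
import math
import statistics

def effective_throughput(amplitude, endpoint_offsets, movement_times):
    """Fitts' law throughput (bits/s) from effective target measures.

    amplitude        : nominal center-to-center target distance (pixels)
    endpoint_offsets : signed selection deviations from the target center along
                       the task axis, one per trial (pixels)
    movement_times   : movement time per trial (seconds)
    """
    we = 4.133 * statistics.stdev(endpoint_offsets)      # effective width
    ae = amplitude + statistics.mean(endpoint_offsets)   # effective amplitude
    ide = math.log2(ae / we + 1)                          # effective index of difficulty
    return ide / statistics.mean(movement_times)

# Hypothetical trials for one condition with a 512-pixel amplitude.
tp = effective_throughput(
    amplitude=512,
    endpoint_offsets=[4.1, -6.3, 2.2, -1.8, 7.5, -3.9],
    movement_times=[0.61, 0.58, 0.66, 0.57, 0.63, 0.60],
)
print(f"{tp:.2f} bits/s")
```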
High-Performance Mouse Pointing
Making pointing, the most common action we perform with desktop computers every day, faster and more accessible.
Accessible Goal Crossing (NSF funded)
Creating accessible desktop user interfaces based on goal crossing instead of pointing-and-clicking for people with motor impairments.
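The basic trigger in a crossing interface is geometric: an action fires when the cursor's path passes over a goal line rather than when a target is clicked. Below is a minimal sketch of that test; it handles only proper segment intersection and none of the refinements in the published work, and the names and example values are illustrative.

```python
def _orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p); > 0 means r is left of line pq."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def crossed(goal_a, goal_b, cursor_prev, cursor_now):
    """True if the cursor's latest movement segment properly crossed the goal segment."""
    d1 = _orient(goal_a, goal_b, cursor_prev)
    d2 = _orient(goal_a, goal_b, cursor_now)
    d3 = _orient(cursor_prev, cursor_now, goal_a)
    d4 = _orient(cursor_prev, cursor_now, goal_b)
    return d1 * d2 < 0 and d3 * d4 < 0

# Example: a vertical goal from (100, 40) to (100, 80), crossed left to right.
print(crossed((100, 40), (100, 80), (90, 60), (110, 62)))  # True
```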
Improving Web Usability with Contextual Help (NSF funded)
Understanding the challenges users face on web sites and creating systems to address those challenges. Resulted in the founding of AnswerDash, Inc.
Peer-Based Support for Mental Health
Understanding and creating tools for peer-to-peer mental health support.
Contemplative Multitasking (NSF funded)
Exploring the effects of mindfulness-based meditation training on human-computer multitasking performance.
Improving Statistical Methods in HCI
Developing tools and methods for improving statistical analysis in human-computer interaction.
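One of these tools, ARTool, implements the Aligned Rank Transform, which lets nonparametric factorial data be analyzed with familiar ANOVA procedures. The sketch below shows only the align-and-rank step for a simple two-factor design (pandas assumed; column names are hypothetical, and the published tool handles repeated measures and more). Each ranked column would then be submitted to its own factorial ANOVA, interpreting only the corresponding effect.

```python
import pandas as pd

def aligned_ranks(df, dv, f1, f2):
    """Align-and-rank step of the Aligned Rank Transform for a two-factor design.

    For each effect, strip all effects from the response (the residual), add back
    the estimate of the effect of interest, then assign average ranks.
    """
    grand = df[dv].mean()
    m1 = df.groupby(f1)[dv].transform("mean")           # marginal means of f1
    m2 = df.groupby(f2)[dv].transform("mean")           # marginal means of f2
    cell = df.groupby([f1, f2])[dv].transform("mean")   # cell means
    resid = df[dv] - cell

    out = df.copy()
    out[f"art_{f1}"] = (resid + (m1 - grand)).rank()                    # main effect of f1
    out[f"art_{f2}"] = (resid + (m2 - grand)).rank()                    # main effect of f2
    out[f"art_{f1}_x_{f2}"] = (resid + (cell - m1 - m2 + grand)).rank() # interaction
    return out

# Hypothetical usage with long-format data in columns "technique", "posture", "time":
# ranked = aligned_ranks(pd.read_csv("study.csv"), dv="time", f1="technique", f2="posture")
```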
HCI-Q: Using Q-Methodology in HCI
Promoting the application and adaptation of Q-methodology for research and design in HCI, especially for technologies for which productivity may be secondary to matters of meaning and personal significance.
  • O'Leary, K., Wobbrock, J.O. and Riskin, E.A. (2013). Q-methodology as a research and design tool for HCI. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '13). Paris, France (April 27-May 2, 2013). New York: ACM Press, pp. 1941-1950.
TapSongs: Rhythm-Based Password Entry
Devising a user authentication method that uses a single sensor tapped according to a song rhythm. Useful for logging into devices that have no space for keyboards or gestures.
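As a hedged sketch of the general idea, not the published method: store a rhythm model from several training renditions of the tapped song (per-gap means and spreads of tempo-normalized inter-tap intervals), then accept a login attempt whose gaps fall close enough to the model. The decision rule, tolerance, and names below are illustrative assumptions.

```python
import statistics

def train_rhythm(training_taps):
    """Build a rhythm model from several renditions of the same tapped song:
    the mean and standard deviation of each tempo-normalized inter-tap gap."""
    normalized = []
    for taps in training_taps:                        # taps: ascending timestamps (s)
        gaps = [b - a for a, b in zip(taps, taps[1:])]
        total = sum(gaps)
        normalized.append([g / total for g in gaps])  # remove overall tempo
    return [(statistics.mean(col), statistics.stdev(col))
            for col in zip(*normalized)]

def authenticate(taps, model, tolerance=3.0):
    """Accept the attempt if every normalized gap lies within 'tolerance'
    standard deviations of the trained mean (an illustrative decision rule)."""
    gaps = [b - a for a, b in zip(taps, taps[1:])]
    if len(gaps) != len(model):
        return False
    total = sum(gaps)
    return all(abs(g / total - mean) <= tolerance * max(sd, 1e-6)
               for g, (mean, sd) in zip(gaps, model))
```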
EyeWrite
Comparing a gestural means of writing with the eyes to traditional eye-typing on an on-screen keyboard. Significantly adapts and extends the EdgeWrite design for text input to the vagaries of eye-tracking.
EdgeWrite
Making accessible text entry possible on a variety of devices through a simple, versatile, reusable design using letter-like gestures. EdgeWrite was originally designed as an accessible technology for improving text entry on PDAs, but now has versions on trackballs, joysticks, touchpads, capacitive sensors, four keys, and more. Innovations include continuous recognition feedback, non-recognition retry, slip detection, a four-way scroll ring, and in-stroke word completion.
Visit www.edgewrite.com
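EdgeWrite recognizes characters by the sequence in which a stroke visits the four corners of a square input area, which is what makes the design so easy to reimplement across devices. Below is a minimal sketch of that corner-sequence recognition, assuming a square input area with corners numbered 1-4; the character mappings shown are illustrative placeholders, not EdgeWrite's actual alphabet.

```python
# Corners of the square input area: 1 = top-left, 2 = top-right,
# 3 = bottom-left, 4 = bottom-right.  The mappings below are illustrative only.
CORNER_SEQUENCES = {
    (3, 1, 2, 4): "a",
    (1, 3, 4, 2): "b",
    (2, 1, 3, 4): "c",
}

def corner_of(x, y, size, margin=0.3):
    """Return the corner (1-4) whose region contains (x, y), or None if none does."""
    m = margin * size
    top, bottom = y < m, y > size - m
    left, right = x < m, x > size - m
    if top and left: return 1
    if top and right: return 2
    if bottom and left: return 3
    if bottom and right: return 4
    return None

def recognize(stroke, size=100.0):
    """Map a stroke (a list of (x, y) points) to a character via its corner sequence."""
    sequence = []
    for x, y in stroke:
        c = corner_of(x, y, size)
        if c is not None and (not sequence or sequence[-1] != c):
            sequence.append(c)          # record each corner once per visit
    return CORNER_SEQUENCES.get(tuple(sequence))
```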