Event Information:

  • Tue

    [Talk] Martez Mott: Improving Touch Interaction for People with Motor Impairments through Ability-Based Design

    12:00pm–1:00pm, NQ 3100 (The Ehrlicher Room), North Quad, 105 S. State St., Ann Arbor, MI 48109

    MISC Talk: Martez Mott

    Everyone is welcome. A light lunch will be served on a first-come, first-served basis; please RSVP so that we can prepare.

    Improving Touch Interaction for People with Motor Impairments through Ability-Based Design


    Martez Mott is a fourth-year Ph.D. candidate in the Information School at the University of Washington. He is a member of the Mobile + Accessible Design Lab (MAD Lab), where he is advised by Jacob O. Wobbrock. Martez’s research takes an ability-based approach toward improving the accessibility of touch-enabled devices for people with motor impairments and for people under the effects of situational impairments. He received a Best Paper Award at the 2016 ACM Conference on Human Factors in Computing Systems (CHI) for his work on touch screen accessibility for people with motor impairments.


    Touch is one of the most dominant ways users interact with modern computing devices, such as smartphones, tablets, and public kiosks. For people with motor impairments, however, the implicit ability assumptions embedded in the design of touch screens make touch-enabled devices inoperable for many users. In this talk, I will describe how my collaborators and I took an ability-based design approach to touch screen accessibility by making touch screens more amenable to a wider range of users’ touch abilities. First, I will describe results from an exploratory study of the touch behaviors of 10 people with motor impairments. In our study, we discovered that touching with the backs or sides of the hand, with multiple fingers, or with knuckles created varied multi-point touches, even when only a single touch point was intended. Second, I will describe Smart Touch, a novel template matching technique that maps any number of arbitrary touch areas to a user’s intended (x,y) target location. In an experiment, we found that Smart Touch predicted target locations over three times closer to users’ intended targets than the de facto Land-on and Lift-off locations reported by the native sensors in the Microsoft PixelSense interactive table. Finally, I will preview some of our upcoming work on improving touch on mobile devices for people with motor impairments, and for people under the effects of situational impairments (e.g., while walking).
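
    To make the template-matching idea concrete, here is a minimal illustrative sketch, not the published Smart Touch algorithm: per-user templates pair a recorded multi-point touch with the target the user was aiming at, a new touch is matched to the most similar stored template by centroid-aligned shape distance, and that template's stored touch-to-target offset is reapplied at the new touch's centroid. All function names and the distance measure here are hypothetical simplifications.

    ```python
    import math

    def centroid(points):
        """Mean (x, y) of a touch contact point set."""
        n = len(points)
        return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

    def shape_distance(a, b):
        """Symmetric average nearest-neighbor distance between two
        centroid-aligned point sets -- a crude shape-similarity stand-in
        for a real template matcher."""
        ca, cb = centroid(a), centroid(b)
        a0 = [(x - ca[0], y - ca[1]) for x, y in a]
        b0 = [(x - cb[0], y - cb[1]) for x, y in b]
        def one_way(src, dst):
            return sum(min(math.dist(p, q) for q in dst) for p in src) / len(src)
        return (one_way(a0, b0) + one_way(b0, a0)) / 2

    def predict_target(touch, templates):
        """Match a new multi-point touch against stored templates and
        predict the intended (x, y) target.  Each template is a dict
        {"points": [...], "target": (x, y)}; the stored offset
        (target minus touch centroid) is reapplied at the new centroid."""
        best = min(templates, key=lambda t: shape_distance(touch, t["points"]))
        tc = centroid(best["points"])
        dx, dy = best["target"][0] - tc[0], best["target"][1] - tc[1]
        cx, cy = centroid(touch)
        return (cx + dx, cy + dy)
    ```

    For example, given one template recorded from a knuckle touch aimed at a point below the contact area, a later touch of the same shape elsewhere on the screen is mapped to a target carrying the same offset:

    ```python
    templates = [{"points": [(0, 0), (2, 0), (1, 2)], "target": (1, -1)}]
    predict_target([(10, 10), (12, 10), (11, 12)], templates)  # approximately (11.0, 9.0)
    ```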