My research focuses on designing, building, and evaluating interactive technology that addresses high-value social issues such as environmental sustainability, computer accessibility, and personalized health and wellness.
Affiliations
Research highlights
Real-time captioning and sound awareness support
With advances in wearable computing and machine learning, Leah Findlater and I have been investigating new opportunities for real-time captioning and sound awareness support for people who are deaf/Deaf and hard of hearing (DHH). Our work spans three primary areas: real-time captioning in augmented reality and wearables (ARCaptions), sound awareness support in the “smart home” (HomeSound), and real-time sound identification on smartwatches (SoundWatch, website forthcoming). Throughout this work, we’ve engaged with over 250 DHH participants to identify design opportunities and pain points and to solicit feedback on our designs.
Project Sidewalk
Project Sidewalk combines remote crowdsourcing and AI to identify and assess sidewalk accessibility in online imagery. Working with people who have mobility disabilities, local government partners, and NGOs, we have deployed Project Sidewalk in five cities (Washington, DC; Seattle, WA; Newberg, OR; Columbus, OH; and Mexico City, MX), collecting over 500,000 geo-tagged sidewalk accessibility labels on curb ramps, surface problems, and other obstacles.
Related news
- ARTennis attempts to help low vision players
- Off to the Park: A Geospatial Investigation of Adapted Ride-on Car Usage
- Augmented Reality to Support Accessibility
- Jon Froehlich named Outstanding Faculty Member by the UW College of Engineering
- CREATE faculty and students awarded at ASSETS 2020
- SoundWatch smartwatch app alerts d/Deaf and hard-of-hearing users to sounds
- UW CREATE leadership at ASSETS 2020
- AccessComputing shares UW CREATE's launch and work toward accessibility
- Can Project Sidewalk Use Crowdsourcing to Help Seattleites Get Around?
- Four CREATE faculty receive Google Research Awards