My work focuses on developing interactive technologies, leveraging advancements in display technology,
low-power sensing, wearables, robotics and actuation, soft electronics, interactive textiles, and human-computer interaction.
I am specifically interested in techniques that enable new interactions for the augmentation and empowerment of human
abilities. This includes augmented reality, ubiquitous computing, mobile devices, 3D user interfaces, interaction techniques,
interfaces for accessibility and health, medical imaging, multimodal systems, and software/hardware prototyping.
These projects come from collaborations during my time at different research labs, including Google Research, MIT, Columbia University, University of California, KTH (Royal Institute of Technology), and Microsoft Research. I have taught at Stanford University, the Rhode Island School of Design, and KTH.
Alex Olwal, Ph.D. Senior Research Scientist, Google olwal [at] acm.org
Mobile and Wearable Haptics
Haptics with Input introduces new opportunities for the Linear Resonant Actuator (LRA), which is ubiquitous in wearable and mobile devices. Through active and passive back-EMF sensing, it could enable new touch and pressure sensing, and allow mobile devices to sense which surfaces they are placed on. Google AI Blog: Haptics with Input
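To illustrate the idea, here is a minimal, hypothetical sketch (not the project's actual implementation) of passive back-EMF touch sensing: a finger resting on a vibrating device damps the LRA's residual oscillation, which shows up as a reduced back-EMF amplitude across its terminals. The sample rate, resonant frequency, and threshold below are illustrative assumptions.

```python
import math

# Hypothetical sketch: touch detection from passive LRA back-EMF.
# Finger contact damps the actuator's residual vibration, lowering
# the back-EMF amplitude measured across its terminals.

def backemf_amplitude(samples):
    """Peak-to-peak amplitude of a window of back-EMF ADC samples (volts)."""
    return max(samples) - min(samples)

def detect_touch(samples, threshold=0.05):
    """Return True if the damped amplitude suggests finger contact."""
    return backemf_amplitude(samples) < threshold

# Simulated windows: free oscillation vs. finger-damped oscillation
# (175 Hz resonance, 8 kHz sample rate, amplitudes chosen for illustration).
free = [0.1 * math.sin(2 * math.pi * 175 * t / 8000) for t in range(80)]
damped = [0.02 * math.sin(2 * math.pi * 175 * t / 8000) for t in range(80)]

print(detect_touch(free))    # undamped oscillation: no touch
print(detect_touch(damped))  # damped oscillation: touch
```

A real system would instead drive the LRA with short pulses and sample the back-EMF during the silent intervals, but the amplitude-thresholding principle is the same.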