I design and develop interactions and technologies that bridge digital and physical experiences. I am specifically interested in tools, techniques, and devices that enable new interaction concepts for augmenting and empowering human abilities. This includes 3D user interfaces, interaction techniques, augmented reality, mixed reality, virtual reality, ubiquitous computing, mobile devices, novel interfaces for medical imaging, multimodal systems, touch-screen interaction, and software/hardware prototyping.

These research projects grew out of inspiring collaborations at different research labs and institutions, including MIT, Columbia University, University of California, KTH (Royal Institute of Technology), and Microsoft Research. I have taught at Stanford University, Rhode Island School of Design, and KTH.

Research Projects » Publications » Google Scholar »
Alex Olwal, Ph.D.
Sr Research Scientist, Google
olwal [at] acm.org

Mobile and Wearable Haptics
Haptics with Input introduces new opportunities for the Linear Resonant Actuator (LRA), which is ubiquitous in wearable and mobile devices. By sensing back-EMF both actively and passively, the LRA can double as a sensor, enabling new touch and pressure sensing, and allowing mobile devices to detect which surfaces they are placed on.
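A minimal sketch of the sensing idea: a finger touching the device damps the resonating mass, which reduces the back-EMF amplitude measured across the actuator. The simulation, thresholds, and sample rates below are illustrative assumptions, not values from the paper:

```python
import math

def backemf_amplitude(samples):
    """Peak-to-peak amplitude of a back-EMF sample window."""
    return max(samples) - min(samples)

def detect_touch(baseline_amp, window, damping_threshold=0.7):
    """Flag a touch when amplitude drops below a fraction of the
    free-vibration baseline. Threshold is illustrative."""
    return backemf_amplitude(window) < damping_threshold * baseline_amp

def lra_window(damping, n=160, f=175.0, fs=8000.0):
    """Simulated back-EMF: 175 Hz resonance sampled at 8 kHz,
    scaled by a damping factor (a touch lowers it)."""
    return [damping * math.sin(2 * math.pi * f * i / fs) for i in range(n)]

free = lra_window(damping=1.0)     # undamped: no finger on the device
touched = lra_window(damping=0.4)  # finger damps the resonating mass

baseline = backemf_amplitude(free)
print(detect_touch(baseline, free))     # False
print(detect_touch(baseline, touched))  # True
```

In a real device, `lra_window` would be replaced by ADC readings across the LRA terminals, taken either while driving the actuator (active sensing) or while it rings freely (passive sensing).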

"Haptics with Input" on Google AI Blog »
Haptics with Input: Back-EMF in Linear Resonant Actuators to Enable Touch, Pressure and Environmental Awareness
Dementyev, A., Olwal, A., and Lyon, R.F.
Proceedings of UIST 2020 (ACM Symposium on User Interface Software and Technology), Virtual Event, Oct 20-23, 2020, pp. 420-429.

UIST 2020
PDF [3MB]