I direct research and development of interaction technologies based on advancements in display technology, low-power and high-speed sensing, wearables, actuation, electronic textiles, and human-computer interaction. I am passionate about accelerating innovation and disruption through tools, techniques, and devices that augment and empower human abilities. My research interests include augmented reality, ubiquitous computing, mobile devices, 3D user interfaces, interaction techniques, interfaces for accessibility and health, medical imaging, and software/hardware prototyping.

These projects are from collaborations during my time at different research labs, including Google Research, MIT, Columbia University, University of California, KTH (Royal Institute of Technology), and Microsoft Research. I have taught at Stanford University, Rhode Island School of Design and KTH.

Alex Olwal, Ph.D.
Staff Research Scientist, Google
olwal [at] acm.org

Zensei is an implicit sensing system that leverages bio-sensing, signal processing and machine learning to classify uninstrumented users by their body's electrical properties.
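To illustrate the general idea of recognizing users from their bioimpedance signatures, here is a minimal, hypothetical sketch (not the actual Zensei pipeline, which uses multi-electrode frequency sweeps and more sophisticated machine learning). It classifies synthetic impedance feature vectors with a simple nearest-centroid classifier; all names, dimensions, and data are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: recognize users from bioimpedance-style feature
# vectors with a nearest-centroid classifier. Features stand in for
# impedance measurements across several swept frequencies and electrode
# pairs; all data here is synthetic.

rng = np.random.default_rng(0)
N_USERS, N_FEATURES, N_SAMPLES = 3, 8, 20

# Each user has a characteristic impedance "signature"; observed samples
# are the signature plus measurement noise.
signatures = rng.normal(size=(N_USERS, N_FEATURES))
X = np.concatenate([s + 0.1 * rng.normal(size=(N_SAMPLES, N_FEATURES))
                    for s in signatures])
y = np.repeat(np.arange(N_USERS), N_SAMPLES)

# "Training": compute one centroid per user from their observed samples.
centroids = np.stack([X[y == u].mean(axis=0) for u in range(N_USERS)])

def classify(sample):
    """Return the index of the user whose centroid is closest."""
    return int(np.argmin(np.linalg.norm(centroids - sample, axis=1)))

# A new noisy measurement from user 1 should be recognized as user 1.
probe = signatures[1] + 0.1 * rng.normal(size=N_FEATURES)
recognized = classify(probe)
```

The actual system classifies uninstrumented users implicitly, i.e., without requiring an explicit enrollment gesture, which is what makes the sensing "effortless" in the paper titles below.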
Zensei: Embedded, Multi-electrode Bioimpedance Sensing for Implicit, Ubiquitous User Recognition
Sato, M., Puri, R., Olwal, A., Ushigome, Y., Franciszkiewicz, L., Chandra, D., Poupyrev, I., and Raskar, R.
Proceedings of CHI 2017 (SIGCHI Conference on Human Factors in Computing Systems), Denver, CO, May 6-11, 2017, pp. 3972-3985.

CHI 2017
PDF [15MB]
Zensei: Augmenting Objects with Effortless User Recognition Capabilities through Bioimpedance Sensing
Sato, M., Puri, R., Olwal, A., Chandra, D., Poupyrev, I., and Raskar, R.
UIST 2015 Extended Abstracts (ACM Symposium on User Interface Software and Technology), Charlotte, NC, Nov 8-11, 2015, pp. 41-42.

UIST 2015
PDF [0.7MB]