I design and develop interactions and technologies that bridge digital and physical experiences. I am interested in tools, techniques and devices that enable new interaction concepts for augmenting and empowering the human senses. These research projects grew out of exciting times and inspiring collaborations at research labs and institutions, including MIT, Columbia University, the University of California, KTH (Royal Institute of Technology), and Microsoft Research.

Alex Olwal, Ph.D.
Interaction Researcher, Google [x]
Research Affiliate, MIT Media Lab
Affiliate Faculty, KTH
olwal [at] media.mit.edu

Shape, Actuation and Deformation
Dynamic Physical User Interfaces
inFORM dynamically changes material and form to adapt the physical and virtual interface. Physical Telepresence uses shape capture and display to enhance interactions with remote people and environments.
Physical Telepresence: Shape Capture and Display for Embodied, Computer-mediated Remote Collaboration
Leithinger, D., Follmer, S., Olwal, A., and Ishii, H.
Proceedings of UIST 2014 (ACM Symposium on User Interface Software and Technology), Honolulu, HI, Oct 5-8, 2014, pp. 461-470.

inFORM: Dynamic Physical Affordances and Constraints through Shape and Object Actuation
Follmer, S., Leithinger, D., Olwal, A., Hogge, A., and Ishii, H.
Proceedings of UIST 2013 (ACM Symposium on User Interface Software and Technology), St Andrews, UK, Oct 8-11, 2013, pp. 417-426.

Switchable Physical/Virtual Rendering
Sublimate explores rapid and fluid transitions between physical and visual representations of dynamic digital content.
Sublimate: State-Changing Virtual and Physical Rendering to Augment Interaction with Shape Displays
Leithinger, D., Follmer, S., Olwal, A., Luescher, S., Hogge, A., Lee, J., and Ishii, H.
Proceedings of CHI 2013 (SIGCHI Conference on Human Factors in Computing Systems), Paris, France, Apr 27-May 2, 2013, pp. 1441-1450. Best Paper Honorable Mention Award (Top 5%).

Controlling Softness + Sensing Shape
Jamming User Interfaces enable programmable stiffness, haptic feedback and deformation for new types of flexible and shape-changing interactions.
Jamming User Interfaces: Programmable Particle Stiffness and Sensing for Malleable and Shape-Changing Devices
Follmer, S., Leithinger, D., Olwal, A., Cheng, N., and Ishii, H.
Proceedings of UIST 2012 (ACM Symposium on User Interface Software and Technology), Cambridge, MA, Oct 7-10, 2012, pp. 519-528. Best Paper Award (Top 1%).

Sensing for Embedded Devices and Tangible User Interfaces
Laser Speckle for Motion Sensing
SpeckleSense exploits laser speckle sensing for precise, high-speed, low-latency motion tracking, which can be applied to a wide range of interaction scenarios and devices (see the code sketch below).
SpeckleEye: Gestural Interaction for Embedded Electronics in Ubiquitous Computing
Olwal, A., Bardagjy, A., Zizka, J., and Raskar, R.
CHI 2012 Extended Abstracts (SIGCHI Conference on Human Factors in Computing Systems), Austin, TX, May 5-10, 2012, pp. 2237-2242.

SpeckleSense: Fast, Precise, Low-cost and Compact Motion Sensing using Laser Speckle
Zizka, J., Olwal, A., and Raskar, R.
Proceedings of UIST 2011 (ACM Symposium on User Interface Software and Technology), Santa Barbara, CA, Oct 16-19, 2011, pp. 489-498.

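A minimal sketch of the underlying idea: consecutive speckle frames are related by a translation, which can be recovered with phase correlation, and integrating successive shifts gives the relative motion that drives the interaction. This is an illustrative reconstruction, not the papers' actual pipeline; frame capture and array shapes are assumed.

    import numpy as np

    def speckle_shift(prev, curr):
        """Estimate the (dy, dx) shift of speckle frame `curr` relative to
        `prev` via phase correlation of two same-sized grayscale arrays."""
        cross = np.fft.fft2(curr) * np.conj(np.fft.fft2(prev))
        cross /= np.abs(cross) + 1e-9              # keep phase only
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        h, w = prev.shape
        # Wrap peak coordinates into signed shifts around the origin.
        dy = dy - h if dy > h // 2 else dy
        dx = dx - w if dx > w // 2 else dx
        return dy, dx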

RFID + Computer Vision for Tracking
SurfaceFusion senses tangible, physical objects on interactive surfaces. It introduces a hybrid technique that combines RFID and computer vision to avoid the need for visual markers (see the code sketch below).
SurfaceFusion: Unobtrusive Tracking of Everyday Objects in Tangible User Interfaces
Olwal, A., and Wilson, A.
Proceedings of GI 2008 (Graphics Interface), Windsor, Ontario, May 28-30, 2008, pp. 235-242.

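A toy sketch of the fusion idea, assuming hypothetical event callbacks: an RFID tag appearing on the surface is matched in time with a new, not-yet-identified visual blob, which binds identity to location without visual markers. The actual SurfaceFusion pipeline is more elaborate.

    import time

    class Correlator:
        """Bind RFID identities to camera blobs by matching the times at
        which a tag and an unidentified blob appear on the surface."""

        def __init__(self, window=0.5):
            self.window = window      # max seconds between RFID and vision events
            self.pending_tags = []    # [(tag_id, timestamp), ...]
            self.bindings = {}        # blob_id -> tag_id

        def on_rfid_appear(self, tag_id):
            self.pending_tags.append((tag_id, time.time()))

        def on_blob_appear(self, blob_id):
            now = time.time()
            for i, (tag_id, t) in enumerate(self.pending_tags):
                if now - t <= self.window:
                    self.bindings[blob_id] = tag_id   # identity + position fused
                    del self.pending_tags[i]
                    return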

Optical Sensing for Spatial Awareness
LightSense tracks mobile devices on static or dynamic displays to enable context-sensitive visuals and interaction based on spatial motion and position.
LightSense: Enabling Spatially Aware Handheld Interaction Devices
Olwal, A.
Proceedings of ISMAR 2006 (IEEE and ACM International Symposium on Mixed and Augmented Reality), Santa Barbara, CA, Oct 22-25, 2006, pp. 119-122.

The Audiator: A Device-Independent Active Marker for Spatially Aware Displays
Olwal, A.
SIGGRAPH 2007 Posters (International Conference on Computer Graphics and Interactive Techniques), San Diego, CA, Aug 5-9, 2007.

Spatial Displays, Augmented Reality and Virtual Reality
Proprioception and Gestures in 3D Space
T(ether) introduces gestural techniques that exploit proprioception to adapt the interface of a handheld Virtual Reality viewport, based on the hand's position above, behind or on its surface.
T(ether): Spatially-Aware Handhelds, Gestures and Proprioception for Multi-User 3D Modeling and Animation
Lakatos, D., Blackshaw, M., Olwal, A., Barryte, Z., Perlin, K., and Ishii, H.
Proceedings of SUI 2014 (ACM Symposium on Spatial User Interaction), Honolulu, HI, Oct 4-5, 2014, pp. 90-93.

Immaterial Pixels and Displays
Our immaterial display generates pixels that float in mid-air. The surface enables new interaction possibilities as users can reach, walk and talk through the display.
An Immaterial Pseudo-3D Display with 3D Interaction
DiVerdi, S., Olwal, A., Rakkolainen, I., and Höllerer, T.
3D TV Book (Three-Dimensional Television: Capture, Transmission, and Display), Springer, Heidelberg, 2008, ISBN 978-3-540-72531-2.

Consigalo: Multi-user, Face-to-face Interaction on an Immaterial Display
Olwal, A., DiVerdi, S., Rakkolainen, I., and Höllerer, T.
Proceedings of INTETAIN 2008 (2nd International Conference on Intelligent Technologies for Interactive Entertainment), Cancun, Mexico, Jan 8-10, 2008.

An Immaterial, Dual-sided Display System with 3D Interaction
Olwal, A., DiVerdi, S., Candussi, N., Rakkolainen, I., and Höllerer, T.
Proceedings of IEEE VR 2006 (IEEE Virtual Reality Conference 2006), Alexandria, VA, Mar 25-29, 2006, pp. 279-280.

A Novel Walk-through 3D Display
DiVerdi, S., Rakkolainen, I., Höllerer, T., and Olwal, A.
Proceedings of SPIE 2006 Electronic Imaging (Vol. 6055, Stereoscopic Displays and Virtual Reality Systems XIII), San José, CA, Jan 15-18, 2006, pp. 428-437.

Transparent + Augmenting 3D Surfaces
ASTOR is a transparent 3D window that enhances the space behind it with graphics. It preserves an optically clear view of the real environment, while superimposing dynamic 3D visuals.
Unencumbered 3D Interaction with See-through Displays
Olwal, A.
Proceedings of NordiCHI 2008 (Nordic Conference on Human Computer Interaction), Lund, Sweden, Oct 18-22, 2008, pp. 527-530.

Spatial Augmented Reality on Industrial CNC machines
Olwal, A., Gustafsson, J., and Lindfors, C.
Proceedings of SPIE 2008 Electronic Imaging (Vol. 6804, The Engineering Reality of Virtual Reality 2008), San José, CA, Jan 27-31, 2008.

ASTOR: An Autostereoscopic Optical See-through Augmented Reality System
Olwal, A., Lindfors, C., Gustafsson, J., Kjellberg, T., and Mattson, L.
Proceedings of ISMAR 2005 (IEEE and ACM International Symposium on Mixed and Augmented Reality), Vienna, Austria, Oct 5-8, 2005, pp. 24-27.

An Autostereoscopic Optical See-through Display for Augmented Reality
Olwal, A., Lindfors, C., Gustafsson, J.
SIGGRAPH 2004 Sketches (International Conference on Computer Graphics and Interactive Techniques), Los Angeles, CA, Aug 8-12, 2004.

Hybrid 2D/3D Workspaces
SpaceTop fuses 2D and 3D interactions in a desktop workspace. It allows users to simultaneously type, click and draw in 2D, and directly manipulate interface elements that float in 3D space.
SpaceTop: Integrating 2D and Spatial 3D Interactions in a See-through Desktop
Lee, J., Olwal, A., Ishii, H., and Boulanger, C.
Proceedings of CHI 2013 (SIGCHI Conference on Human Factors in Computing Systems), Paris, France, Apr 27-May 2, 2013, pp. 189-192.

Mobile AR workspace
POLAR is a lightweight mobile workspace for augmented reality. Using a foldable, optical see-through setup and hybrid user tracking, it enables annotated views of small physical objects without the need for worn technology.
POLAR: Portable, Optical see-through, Low-cost Augmented Reality
Olwal, A., and Höllerer, T.
Proceedings of VRST 2005 (ACM Symposium on Virtual Reality and Software Technology), Monterey, CA, Nov 7-9, 2005, pp. 227-230.

Interaction Techniques for Medical Imaging
Tracking gaze in radiology
We collect gaze data from radiologists who search 3D CT scans for lung nodules. Analysis and interactive 3D visualizations of the eye tracking data indicate two dominant search strategies, where "Drilling" outperforms "Scanning".
Scanners and Drillers: Characterizing Expert Visual Search through Volumetric Images
Drew, T., Vo, M., Olwal, A., Jacobson, F., Seltzer, S., and Wolfe, J.
JOV 2013 (Journal of Vision, Vol. 13, No. 10), Aug 6, 2013.

Multi-user Surgery Planning
Our multi-display groupware system for medical team meetings synchronizes multi-touch and pen interaction across various mobile and stationary displays.
Design and Evaluation of Interaction Technology for Medical Team Meetings
Olwal, A., Frykholm, O., Groth, K., and Moll, J.
Proceedings of INTERACT 2011 (IFIP TC13 Conference on Human-Computer Interaction), Lisbon, Portugal, Sep 5-9, 2011, pp. 505-522.

3D Visualization of 2D X-rays
Our system tracks and augments 2D X-ray images in image-guided surgery. It provides interactive spatiotemporal visualizations of 2D X-rays in timeline views and 3D clouds.
3D Visualization and Interaction with Spatiotemporal X-ray Data to Minimize Radiation in Image-guided Surgery
Ioakeimidou, F., Olwal, A., Nordberg, A., and von Holst, H.
Proceedings of CBMS 2011 (International Symposium on Computer-based Medical Systems), Bristol, UK, Jun 27-30, 2011.

Spatially Aware Handheld Displays, Mobile Phones and Portable Devices
Interaction with Dynamic Surfaces
We synchronize mobile devices with interaction on a large touch surface to expand expressiveness. The mobile displays provide higher pixel density, while their controls enable better precision and physical affordances.
Augmenting Surface Interaction through Context-sensitive Mobile Devices
Olwal, A.
Proceedings of INTERACT 2009 (IFIP TC13 Conference on Human-Computer Interaction), Uppsala, Sweden, Aug 24-28, 2009, pp. 336-339.

Spatially Aware Handhelds for High-Precision Tangible Interaction with Large Displays
Olwal, A., and Feiner, S.
Proceedings of TEI 2009 (International Conference on Tangible and Embedded Interaction), Cambridge, UK, Feb 16-18, 2009, pp. 181-188.

2D + 3D Mobile Augmented Reality
LUMAR combines 2D and 3D interaction with a static display and enables a three-layered information space, where the mobile phone provides an augmented reality viewport into the real world.
LUMAR: A Hybrid Spatial Display System for 2D and 3D Handheld Augmented Reality
Olwal, A., and Henrysson, A.
Proceedings of ICAT 2007 (International Conference on Artificial Reality and Teleexistence), Esbjerg, Denmark, Nov 28-30, 2007, pp. 63-70.

Realistic Interaction with 3D Models
Our mobile phantogram viewer enables realistic real-time interaction with 3D models through viewpoint-correct anamorphosis and stereoscopy.
Interaction and Rendering Techniques for Handheld Phantograms
Ericson, F., and Olwal, A.
CHI 2011 (SIGCHI Conference on Human Factors in Computing Systems), Vancouver, BC, May 7-12, 2011, pp. 1339-1343.

Tangible Interactions
This framework explores tangible interaction for handheld AR. Hardware-accelerated rendering of illumination and shadows enables real-time interaction with realistic models through spatial motion or touch-screen manipulation.
Tangible Interfaces using Handheld Augmented Reality
Rojtberg, P., and Olwal, A.
Proceedings of SIGRAD 2010 (Swedish Chapter of Eurographics Conference), Västerås, Sweden, Nov 25-26, 2010, pp. 17-26.

Customizing Mobile User Interfaces
The OldGen framework addresses accessibility on generic mobile devices by decoupling the software UI from the phone hardware. It makes the UI portable and independent of phone model or brand.
OldGen: Mobile Phone Personalization for Older Adults
Olwal, A., Lachanas, D., and Zacharouli, E.
Proceedings of CHI 2011 (SIGCHI Conference on Human Factors in Computing Systems), Vancouver, BC, May 7-12, 2011, pp. 3393-3396.

Portable, Interactive Eye Exams
Our interactive device images and visualizes the retina. Its use of indirect diffuse illumination and binocular coupling avoids the complexity of traditional devices.
Computational Retinal Imaging via Binocular Coupling and Indirect Illumination
Lawson, E., Boggess, J., Khullar, S., Olwal, A., Wetzstein, G., and Raskar, R.
SIGGRAPH 2012 (International Conference on Computer Graphics and Interactive Techniques), Los Angeles, CA, Aug 5-9, 2012.

Multimodal Interaction: Touching, Pointing, Looking and Gesturing
Minimal Gestures for Pen and Touch
Rubbing and Tapping are fast and precise interaction techniques for single-touch, multi-touch and pen-based devices. They leverage minimal gestures for quick zoom and pan actions (see the code sketch below).
Rubbing and Tapping for Precise and Rapid Selection on Touch-Screen Displays
Olwal, A., Feiner, S., and Heyman, S.
Proceedings of CHI 2008 (SIGCHI Conference on Human Factors in Computing Systems), Florence, Italy, Apr 5-10, 2008, pp. 295-304. Best Paper Honorable Mention Award (Top 5%).

Rubbing the Fisheye: Precise Touch-Screen Interaction with Gestures and Fisheye Views
Olwal, A., and Feiner, S.
UIST '03 (ACM Symposium on User Interface Software and Technology), Vancouver, BC, Nov 2-5, 2003, pp. 83-84.

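A minimal sketch of how a rubbing gesture might be detected, with illustrative thresholds rather than the papers' actual parameters: rapid direction reversals of a stroke are counted, and each back-and-forth maps to a discrete zoom step.

    def count_reversals(xs, jitter=2.0):
        """Count direction reversals in a 1D stroke (e.g., x positions
        of touch samples), ignoring sub-jitter movement as noise."""
        reversals, direction = 0, 0
        for a, b in zip(xs, xs[1:]):
            step = b - a
            if abs(step) < jitter:       # filter sensor noise
                continue
            d = 1 if step > 0 else -1
            if direction and d != direction:
                reversals += 1
            direction = d
        return reversals

    def rub_zoom_steps(xs):
        """One back-and-forth (two reversals) -> one zoom step."""
        return count_reversals(xs) // 2

    print(rub_zoom_steps([0, 10, 0, 10, 0, 10]))  # -> 2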

Analyzing Gaze and Gestures
MAVEN interprets user intention in AR/VR by fusing speech, gesture, viewpoint, pointing direction, and SenseShapes statistics to improve recognition through multimodal disambiguation (see the code sketch below).
SenseShapes: Using Statistical Geometry for Object Selection in a Multimodal Augmented Reality System
Olwal, A., Benko, H., and Feiner, S.
Proceedings of ISMAR 2003 (IEEE and ACM International Symposium on Mixed and Augmented Reality), Tokyo, Japan, Oct 7-10, 2003, pp. 300-301.

MAVEN: Mutual Disambiguation of 3D Multimodal Interaction in Augmented and Virtual Reality
Kaiser, E., Olwal, A., McGee, D., Benko, H., Corradini, A., Li, X., Feiner, S., and Cohen, P.
Proceedings of ICMI 2003 (International Conference on Multimodal Interfaces), Vancouver, BC, Nov 5-7, 2003, pp. 12-19.

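A toy illustration of mutual disambiguation, not MAVEN's actual unification algorithm: each modality scores candidate objects, and multiplying the scores lets one modality resolve ambiguity in another. Object names and scores here are invented for the example.

    def fuse(speech, gesture, eps=1e-3):
        """Combine per-object scores from two modalities by product;
        `eps` keeps a hypothesis missing from one modality from zeroing out."""
        objects = set(speech) | set(gesture)
        fused = {o: speech.get(o, eps) * gesture.get(o, eps) for o in objects}
        return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

    # "that blue one" weakly matches two objects; pointing favors the lamp.
    speech = {"blue_lamp": 0.5, "blue_book": 0.5}
    gesture = {"blue_lamp": 0.8, "red_cup": 0.2}
    print(fuse(speech, gesture)[0])    # -> ('blue_lamp', 0.4)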

A Soft, Bendable 3D Selection Widget
The Flexible Pointer can facilitate selection and interaction with fully or partially obscured objects in 3D environments, and help indicate objects of interest to collaborators in AR/VR.
The Flexible Pointer: An Interaction Technique for Selection in Augmented and Virtual Reality
Olwal, A., and Feiner, S.
UIST '03 (ACM Symposium on User Interface Software and Technology), Vancouver, BC, Nov 2-5, 2003, pp. 81-82.

Tactile Feedback for Motion Guidance
Tracked users receive motion guidance for position, direction and continuous velocity through visual, vibrotactile and pneumatic feedback.
Multimodal Motion Guidance: Techniques for Adaptive Dynamic Feedback
Schönauer, C., Fukushi, K., Olwal, A., Kaufmann, H., and Raskar, R.
Proceedings of ICMI 2012 (ACM International Conference on Multimodal Interaction), Santa Monica, CA, Oct 22-26, 2012, pp. 133-140.

Pedagogical Robotics
Cloud Rhymer is a platform that combines cloud technology and robotics. Users can text a word to the robot, which will start rhyming on that word in sync with the beat.
Cloud Rhymer: Prototype Demo and Intervention Proposal
Robert, D., Schmitt, P., and Olwal, A.
Proceedings of IDC 2013 (International Conference on Interaction Design and Children), New York, NY, Jun 24-27, 2013, pp. 507-510.

Gesture Control of Heart Simulation
The HEART project allows the public to experience and interact with a simulation of a beating heart. Blood flow and pressure are visualized in 3D and can be manipulated with gestures using both hands.
Gestural 3D Interaction with a Beating Heart: Simulation, Visualization and Interaction
Ioakeimidou, F., Ericson, E., Spühler, J., Olwal, A., Forsslund, J., Jansson, J., Sallnäs Pysander, E.-L., and Hoffman, J.
Proceedings of SIGRAD 2011 (Swedish Chapter of Eurographics Conference), Stockholm, Sweden, Nov 17-18, 2011, pp. 93-97.

Visual Data Flows of Interaction
The Unit framework is a visual dataflow programming language for highly interactive 3D environments. Interaction techniques are abstracted from devices and applications to separate application logic from behavior (see the code sketch below).
Unit: Modular Development of Distributed Interaction Techniques for Highly Interactive User Interfaces
Olwal, A., and Feiner, S.
Proceedings of GRAPHITE 2004 (International Conference on Computer Graphics and Interactive Techniques in Australasia and Southeast Asia), Singapore, Singapore, Jun 15-18, 2004, pp. 131-138.

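A minimal sketch in the spirit of Unit's dataflow model (class and unit names are illustrative, not the framework's API): interaction data flows through connected units, so a device unit can be swapped without touching application logic.

    class Unit:
        def __init__(self, fn):
            self.fn, self.targets = fn, []

        def connect(self, other):
            self.targets.append(other)
            return other                               # allow chaining

        def send(self, value):
            out = self.fn(value)
            for t in self.targets:
                t.send(out)

    # Device -> interaction technique -> application, each replaceable in isolation.
    mouse  = Unit(lambda pos: pos)                     # raw device data
    scale  = Unit(lambda p: (p[0] * 2, p[1] * 2))      # interaction technique
    cursor = Unit(lambda p: print("cursor at", p))     # application behavior
    mouse.connect(scale).connect(cursor)
    mouse.send((10, 20))                               # -> cursor at (20, 40)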

Immersive Authoring
This mixed reality system allows users to immersively reconfigure, in real time, the data flow between interaction devices and objects in a running hybrid user interface.
Immersive Mixed-Reality Configuration of Hybrid User Interfaces
Sandor, C., Olwal, A., Bell, B., and Feiner, S.
Proceedings of ISMAR 2005 (IEEE and ACM International Symposium on Mixed and Augmented Reality), Vienna, Austria, Oct 5-8, 2005, pp. 110-113.

Expression through Non-verbal Speech
Prosodic features of speech are used together with audio localization to control interactive applications. This information can also be applied to parameter control or to disambiguation in speech recognition.
Interaction Techniques using Prosodic Features of Speech and Audio Localization
Olwal, A., and Feiner, S.
Proceedings of IUI 2005 (International Conference on Intelligent User Interfaces), San Diego, CA, Jan 9-12, 2005, pp. 284-286.

Doctoral Thesis
Unobtrusive Augmented Reality
The concept of Unobtrusive Augmented Reality is introduced through various systems and techniques that enable sporadic and spontaneous interaction. Unobtrusive AR emphasizes an optically direct view of a visually unaltered physical environment, the avoidance of user-worn technology, and the preference for unencumbering techniques.
Unobtrusive Augmentation of Physical Environments: Interaction Techniques, Spatial Displays and Ubiquitous Sensing
Olwal, A.
Doctoral Thesis (Ph.D. Dissertation), Jun 5, 2009.
Opponent: Professor Mark Billinghurst (HIT Lab NZ, University of Canterbury, New Zealand).
Committee: Associate Professor Kari Pulli (University of Oulu; Research Fellow, Nokia Research Center, Palo Alto), Professor Kerstin Severinson Eklundh (KTH), Associate Professor Morten Fjeld (Chalmers University of Technology).
Advisors: Professor Steven Feiner (Columbia University), Professor Lennart Johnsson (KTH), Professor Yngve Sundblad (KTH).
