3D from 2D touch
While interaction with computers used to be dominated by mice and keyboards, new types of sensors now allow users to interact through touch, speech, or with their whole bodies in 3D space. These new interaction modalities are often referred to as “natural user interfaces” or “NUIs.” While 2D NUIs have achieved major success on the billions of mobile touch devices sold, 3D NUI systems have so far been unable to deliver a mobile form factor, mainly because they rely on cameras. Since cameras require a certain distance from the capture volume, 3D NUI systems have been unable to reach the flat form factor mobile users expect.
In this dissertation, we address this issue by sensing 3D input using flat 2D sensors. The systems we present observe input from 3D objects as 2D imprints upon physical contact. By sampling these imprints at very high resolutions, we obtain the objects’ textures. In some cases, a texture uniquely identifies a biometric feature, such as the user’s fingerprint. In other cases, an imprint stems from the user’s clothing, such as when walking on multitouch floors. By analyzing which part of the 3D object produced the 2D imprint, we reconstruct the object’s pose in 3D space.
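To make the idea concrete, the shape of a contact region already hints at the pose of the 3D object behind it. The following toy sketch (not the dissertation's actual method; all names and the moment-based heuristic are illustrative assumptions) estimates the orientation and elongation of a binary 2D imprint: a strongly elongated contact suggests a flat, low-pitch finger, while a near-circular one suggests a steep pose.

```python
import numpy as np

def imprint_moments(imprint):
    """Estimate center, orientation, and elongation of a binary 2D imprint.

    Toy proxy for pose reconstruction: fits the contact region's second
    central moments and reads off the principal axis. A large elongation
    indicates a flat (low-pitch) contact; ~1.0 indicates a steep one.
    """
    ys, xs = np.nonzero(imprint)
    cx, cy = xs.mean(), ys.mean()
    # Second central moments (covariance) of the contact pixels.
    mxx = ((xs - cx) ** 2).mean()
    myy = ((ys - cy) ** 2).mean()
    mxy = ((xs - cx) * (ys - cy)).mean()
    cov = np.array([[mxx, mxy], [mxy, myy]])
    evals, evecs = np.linalg.eigh(cov)          # eigenvalues ascending
    major, minor = evals[1], evals[0]
    # Orientation of the major axis, folded into [0, 180) degrees.
    angle = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1])) % 180.0
    elongation = np.sqrt(major / max(minor, 1e-9))
    return (cx, cy), angle, elongation

# A synthetic elongated imprint: a 5x15 filled rectangle.
imprint = np.zeros((20, 30), dtype=bool)
imprint[8:13, 5:20] = True
center, angle, elongation = imprint_moments(imprint)
```

For this horizontal rectangle, the major axis comes out at 0 degrees with an elongation of about 3, consistent with a flat contact along the x-axis.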
While our main contribution is a general approach to sensing 3D input on 2D sensors upon physical contact, we also demonstrate three applications of our approach.
- We present high-accuracy touch devices that allow users to reliably acquire targets a third of the size of those on current touch devices. We show that different users and different 3D finger poses systematically affect touch sensing, an effect that current devices perceive as random input noise. We introduce a model for touch that compensates for this systematic effect by deriving the 3D finger pose and the user’s identity from each touch imprint. We then investigate this systematic effect in detail and explore how users conceptually aim at targets. Our findings indicate that users aim by aligning visual features of their fingers with the target. We present a visual model for touch input that eliminates virtually all systematic effects on touch accuracy.
- From each touch, we identify users biometrically by analyzing their fingerprints. Our prototype Fiberio integrates fingerprint scanning and a display into the same flat surface, solving a long-standing problem in human-computer interaction: secure authentication on touchscreens. Sensing 3D input and authenticating users upon touch allows Fiberio to implement a variety of applications that traditionally require the bulky setups of current 3D NUI systems.
- To demonstrate the versatility of 3D reconstruction on larger touch surfaces, we present a high-resolution pressure-sensitive floor that resolves the texture of objects upon touch. Using the same principles as before, our system GravitySpace analyzes all imprints and identifies users based on their shoe soles, detects furniture, and enables accurate touch input using feet. By classifying all imprints, GravitySpace detects the users’ body parts that are in contact with the floor and then reconstructs their 3D body poses using inverse kinematics. GravitySpace thus enables a range of applications for future 3D NUI systems based on a flat sensor, such as smart rooms in future homes.
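The compensation idea in the first contribution, treating the systematic per-user, per-pose touch offset as a learnable quantity rather than noise, can be sketched minimally as follows. This is a hypothetical illustration, not the dissertation's model: class and method names are invented, and the correction is simplified to a mean 2D offset learned per (user, pose) pair from calibration touches.

```python
from collections import defaultdict
import numpy as np

class OffsetModel:
    """Toy per-user, per-pose touch offset compensation (illustrative only).

    Calibration touches record the 2D offset between the sensed touch
    location and the intended target; at runtime the mean offset for the
    same (user, pose) pair is subtracted from each sensed touch.
    """
    def __init__(self):
        self._sums = defaultdict(lambda: np.zeros(2))
        self._counts = defaultdict(int)

    def record(self, user, pose, touch, target):
        """Store one calibration sample: touch minus intended target."""
        key = (user, pose)
        self._sums[key] += np.asarray(touch, float) - np.asarray(target, float)
        self._counts[key] += 1

    def correct(self, user, pose, touch):
        """Subtract the learned mean offset; pass through if uncalibrated."""
        key = (user, pose)
        if self._counts[key] == 0:
            return np.asarray(touch, float)
        offset = self._sums[key] / self._counts[key]
        return np.asarray(touch, float) - offset

# Calibration: this user's steep-pose touches land 2 px right, 1 px high.
model = OffsetModel()
model.record("alice", "steep", touch=(12, 9), target=(10, 10))
model.record("alice", "steep", touch=(11, 8), target=(9, 9))
corrected = model.correct("alice", "steep", touch=(22, 19))
```

Here the learned offset is (2, -1), so a sensed touch at (22, 19) is corrected to (20, 20), the location the user was plausibly aiming at.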
We conclude this dissertation with a projection into the future of mobile devices. Focusing on the mobility aspect of our work, we explore how NUI devices may one day augment users directly in the form of implanted devices.