We believe that computer science and mechanical engineering are about to unite. In the future, users will solve mechanical problems by digitizing the involved objects with 3D scanners, solving the problem in the digital domain using the tools of computer science, and converting the result back to the mechanical domain using a 3D printer. This will allow solving mechanical problems with the effectiveness and efficiency of computer science, including the ability to scale massively.

This will not only change mechanical engineering, but also allow computing to reach its next phase: merging into matter itself, where the physical matter of objects performs the computation, rather than separate microcontrollers.

The role of our group is to drive this unification process, in particular by creating and re-purposing fabrication machines and haptic machinery.

(Previously: we worked on unifying the virtual world of the computer with the physical world of the user into a single space. We argue that this is the key to intuitive or “natural” user interfaces. Most of our research takes an engineering perspective: we create new types of mobile devices, touch devices, interactive floors, and gesture input devices. In addition, we try to gain a deeper understanding of touch input through scientific experiments.)

2015+ Keynote: The Six Challenges in Personal Fabrication

To come… to be given at Interactions 2015 in Japan

2012-2013 Keynote: Natural User Interface Hardware

"What makes an interface natural" and "why can 1-year olds use touch screens"? I discuss NUI at the example of prototype devices that unify the virtual world of the computer with the physical world of the user into a single "Euclidean" space. 
PPT 54.6MB
 given at British HCI 2012 and 3DUI 2013

2010-2011 Keynote: My new PC is a mobile phone

I argue that the computational device for the world is already here--the mobile phone, 4 billion of them. But to make them useful, we need to create new software that turns phones into stand-alone computers.
PPT+Video 116.3MB, given at Mobile HCI 2010 and USAB 2010

Mobile force-feedback (since 2013)

Impacto (full paper at UIST 2015) is a wearable device that simulates the sensation of physical impact in virtual reality. It does so by decomposing the stimulus into (1) the tactile sensation, which is rendered by tapping the user's skin with a solenoid, and (2) the impulse, which is rendered by actuating the user's body with electrical muscle stimulation.
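
This two-channel decomposition maps onto a simple control flow. Below is a minimal sketch in Python, assuming hypothetical driver objects for the solenoid and the EMS channel; none of these names come from the paper.

import time

def render_impact(solenoid, ems, intensity=0.6, duration=0.15):
    # (1) tactile sensation: a brief solenoid tap on the skin
    solenoid.tap()
    # (2) impulse: an EMS pulse that actuates the user's muscle
    ems.pulse(intensity, duration)
    time.sleep(duration)  # let the pulse complete before the next impact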

Affordance++ (best paper award, full paper at CHI 2015) allows everyday objects to communicate dynamic use: motion (a spray can shakes when touched), multi-step processes (peeling an avocado), and behaviors that change over time (don't grab the hot cup by its body).

Proprioceptive Interaction (full paper at CHI 2015) allows for eyes-free interaction based on proprioception alone. We devised a bracelet that allows input and output to happen exclusively through the user's muscles.

Level-Ups (note at CHI 2015) are computer-controlled stilts that allow virtual reality users to experience walking up and down steps. Unlike traditional solutions that are integrated with locomotion devices, Level-Ups allow users to walk around freely.

Haptic Turk (full paper at CHI 2014) offers the functionality of a motion platform--but is based on people. This allows bringing immersive haptic experiences to millions of users.

Muscle-Propelled Force-Feedback (note at CHI 2013) provides force feedback via electrical muscle stimulation, e.g., in interactive games. This eliminates the exoskeleton and motors, making the approach mobile.

Gesture Output (full paper at CHI 2013) allows mobile touch devices to send eyes-free messages to the user as a sequence of Graffiti character strokes via our force-feedback touch screen (PocketOuija).

Personal Fabrication (since 2012)

Patching Physical Objects (full paper at UIST 2015). Instead of re-printing the entire object from scratch at each design iteration, we suggest patching the existing object and replacing only the unsatisfactory parts.

LaserStacker (full paper at UIST 2015) allows users to fabricate 3D objects in a laser cutter in a single process: the cutter not only cuts the individual layers, but also welds them together, eliminating manual assembly.

Protopiper (full paper at UIST 2015) is a hand-held device for physically sketching room-sized objects at actual scale. The key idea behind Protopiper is that it forms adhesive tape into tubes as its main building material, rather than extruded plastic or photopolymer lines.

Platener (best paper nominee at CHI 2015) speeds up design iterations of 3D models by extracting straight and curved plates from the 3D model and substituting them with laser-cut parts.

Scotty (full paper at TEI 2015) is a simple self-contained appliance that allows relocating inanimate physical objects across distance.

WirePrint (full paper at UIST 2014) prints 3D objects as wireframe previews. By extruding filament directly into 3D space instead of printing layer-wise, it achieves a speed-up of up to a factor of 10, allowing designers to iterate more quickly in the early stages of design.

FaBrickation (best paper nominee at CHI 2014) reduces the fabrication time of functional 3D printed objects by integrating construction kit building blocks.

LaserOrigami (best paper award at CHI 2013) allows users to produce fully assembled 3D objects using a laser cutter--two orders of magnitude faster than with a 3D printer.

Constructable (full paper at UIST 2012) allows users to create physical objects by drawing directly on the workpiece while it is inside the laser cutter.

Gravity + multitouch = 3D tracking (since 2010)

Ergonomic Interaction on Touch Floors (full paper at CHI 2015). The main appeal of touch floors is that they are the only direct-touch form factor that scales to arbitrary size. In this paper, we argue that the price for this benefit is poor physical ergonomics, and propose addressing this issue by allowing users to operate touch floors in any pose they like.

Kickables (full paper at CHI 2014) are tangibles for feet, which provide tangible experiences for large-scale installations with interactive floors.

Fiberio (full paper at UIST 2013) is a multitouch table that senses fingerprints and identifies users biometrically during each touch.

GravitySpace (best paper nominee at CHI 2013) tracks users and their poses in smart rooms based on the latest 8 m² version of our pressure-sensing Multitoe floor.

Bootstrapper (note at CHI 2012) recognizes tabletop users by their shoes. Bootstrapper uses a Kinect attached to a Microsoft Surface table and pointed at users' shoes.

CapStones and ZebraWidgets (note at CHI 2012) are stackable building blocks, dials, and widgets for capacitive touch screens that work by handing down capacitance.

Multitoe (full paper at UIST 2010) is an interactive floor. Based on FTIR sensing, it can identify users based on their soles, track users' foot and body postures, and enable high-precision interaction using feet.

Lumino (best paper award at CHI 2010, demos at SIGGRAPH 2010 and Tabletop 2010) is a system of building blocks that allows Microsoft Surface to sense building elements arranged in three-dimensional structures.

RidgePad was our first project to explore 3D reconstruction from contact area.

Imaginary interfaces = spatial gestures (since 2010)

Imaginary Reality Games (paper at UIST 2013) are games that mimic real-world sports, such as basketball or soccer, except that there is no visible ball.

Understanding Palm-Based Imaginary Interfaces (full paper at CHI 2013) explores what allows users to interact with unfamiliar imaginary interfaces. We also revisit the fundamentals of imaginary interfaces, i.e., the role of visual and tactile cues.

BodyScape (best paper nominee at CHI 2013) is a body-centric design space for exploring how users may interact across the surfaces of their body.

Imaginary Phone (full paper at UIST 2011) allows users to operate their phone without taking it out of their pocket. Instead, users tap on their hand. Users learn the layout automatically by transferring spatial knowledge from their use of the physical device.

Data Miming (full paper at CHI 2011). Users retrieve 3D objects from a database by describing their shape through gestures in 3-space, tracked using a Kinect/PrimeSense depth camera.
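
One plausible way to frame the matching step (a sketch of the general idea, not the paper's implementation): voxelize the traced points, then score database objects by the overlap of occupied voxels.

import numpy as np

def voxelize(points, grid=32):
    # map Nx3 points, normalized to [0,1]^3, onto a boolean voxel grid
    v = np.zeros((grid, grid, grid), dtype=bool)
    idx = np.clip((np.asarray(points) * grid).astype(int), 0, grid - 1)
    v[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return v

def similarity(trace_voxels, object_voxels):
    # Jaccard similarity: shared voxels over total occupied voxels
    inter = np.logical_and(trace_voxels, object_voxels).sum()
    union = np.logical_or(trace_voxels, object_voxels).sum()
    return inter / union if union else 0.0

The best match is then the database object that maximizes similarity(voxelize(trace), voxelize(object_points)).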

Imaginary Interfaces (full paper at UIST 2010) are devices that allow users to interact spatially as they normally would with a touch screen--yet without the screen.

Understanding touch (since 2010)

Understanding touch (full paper at CHI 2011). In order for a touch device to be highly effective, it needs to match users' mental model of touch. But what is that model?

Touch on Curved Surfaces (full paper at CHI 2011). Technology is emerging that allows touch-enabling non-planar surfaces, e.g., on mobile devices. We model how users interact with such surfaces.

Generalized Perceived Input Point Model & RidgePad (full paper at CHI 2010). We find that the touch location sensed by capacitive touchpads varies across users and finger postures. We built a device based on a fingerprint scanner that exploits this.
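
The model boils down to correcting each touch with an offset that depends on who is touching and with what finger posture. A minimal sketch, with made-up calibration values:

OFFSETS = {  # (user, posture) -> (dx, dy) in mm, calibrated per user
    ("alice", "vertical"): (0.4, -1.1),
    ("alice", "flat"):     (1.9, -3.2),
    ("bob",   "vertical"): (0.1, -0.7),
}

def corrected_touch(x, y, user, posture):
    # subtract the systematic offset observed for this user and posture
    dx, dy = OFFSETS.get((user, posture), (0.0, 0.0))
    return x - dx, y - dy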

Miniature mobile devices (since 2009)

Implanted User Interfaces (full paper at CHI 2012) establish that input controls, output components, and components for wireless communication and charging work through human skin.

Rock-Paper-Fibers (note at CHI 2012) brings physical affordances to mobile touch devices.

Deformable touch devices based on time-domain reflectometry (best paper nominee at UIST 2011) allow touch-enabling stretchable, reconfigurable, and modular devices, all with a single two-wire cable: a touch changes the cable's local impedance, and timing the reflection of a pulse sent down the cable reveals where the touch occurred.
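
A minimal sketch of the localization arithmetic, assuming a sampled reflection trace and a known signal propagation velocity (the constants below are illustrative, not from the paper):

PROPAGATION_VELOCITY = 2.0e8  # m/s, roughly 2/3 the speed of light; assumed
SAMPLE_RATE = 5.0e9           # reflectometer samples per second; assumed

def locate_touch(baseline, trace, threshold=0.1):
    # compare the live trace against the untouched baseline; the first
    # significant deviation marks the reflection caused by the touch
    for i, (b, t) in enumerate(zip(baseline, trace)):
        if abs(t - b) > threshold:
            round_trip = i / SAMPLE_RATE                   # seconds
            return PROPAGATION_VELOCITY * round_trip / 2   # one-way distance in m
    return None  # no touch detected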

Nenya (note at CHI 2011). This input device is a passive magnetic ring. We obtain this tiny device form factor by offloading all electronics into a bracelet that tracks the ring using a magnetometer.

Touch Projector (full paper at CHI 2010) allows users to manipulate content located on distant displays by touch-manipulating its live video image on their mobile device.
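
Conceptually, this requires mapping a touch in the phone's camera image to display coordinates. A minimal sketch, assuming the 3x3 homography between camera image and display has already been estimated (e.g., from the tracked display outline); the names are illustrative, not the project's API:

import numpy as np

def touch_to_display(H, touch_xy):
    # apply homography H to a touch point given in camera-image pixels
    x, y = touch_xy
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]  # display-pixel coordinates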

Disappearing Mobile Devices (full paper at UIST 2009) are devices so small that they only allow certain types of gesture interactions: a thought experiment about the ultimate future of mobile miniaturization.

Nanotouch (best paper nominee at CHI 2009) is a prototype device that users operate via the device's backside. This allows for very small mobile devices.

Let's kick it (note at CHI 2014) makes the unreachable bottom of vertical displays fully interactive by expanding direct touch to the feet.

360° Panoramic Overviews (note at CHI 2012) allow users to get an overview of an augmented reality scene without losing spatial orientation.

Relaxed Selection Techniques (full paper at UIST 2009) allow users to search time-series data using a pen stroke that specifies not only the shape of the sought segment, but also the tolerances.