A 3D Printer Head as a Robotic Manipulator
Three-dimensional (3D) printers, which can turn 3D geometric data into physical objects, are becoming affordable, especially fused deposition modeling (FDM) printers. We believe that 3D printers have potential beyond printing, because an FDM 3D printer is intrinsically a three-axis Cartesian coordinate robot equipped with a filament extruder. The motivation behind our research is to redesign 3D printers, taking advantage of these three-axis robot features and the ability to generate physical objects. We introduce new ways of using the 3D printer head as a three-axis robotic manipulator to enable advanced fabrication and usage, such as breaking away support material, assembling separately printed parts, and actuating printed objects on the build plate. To achieve these manipulations, we customize a low-cost commodity FDM 3D printer so that it can attach and detach printed end-effectors that change the function of the printer head (e.g., to hook, break, or rotate printed objects). With these advanced fabrication techniques, a low-cost FDM 3D printer can print kinetic objects such as bevel gears and springs in a single pass. In addition, this technique enables actuating printed functional objects that require a power source and an actuator, such as a coffee mill, directly on the build plate.
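Since an FDM printer is driven by standard G-code motion commands, the manipulation idea can be sketched as a small G-code generator. The coordinates, feed rates, and contact height below are hypothetical example values, not the values used in the actual system, and the helper names are invented for illustration:

```python
# Hypothetical sketch: using a 3-axis FDM printer head as a manipulator by
# emitting ordinary G-code linear moves (G1). All numbers are made-up examples.

def goto(x, y, z, feed=3000):
    """Return a G-code linear move to (x, y, z) in mm at the given feed rate (mm/min)."""
    return f"G1 X{x:.2f} Y{y:.2f} Z{z:.2f} F{feed}"

def push_part(start, end, z_contact, z_safe=30.0):
    """Generate G-code that lowers an attached end-effector and slides a
    printed part from `start` to `end` across the build plate."""
    sx, sy = start
    ex, ey = end
    return [
        goto(sx, sy, z_safe),          # approach from a safe height
        goto(sx, sy, z_contact, 600),  # descend until the end-effector reaches the part
        goto(ex, ey, z_contact, 600),  # slide the part to its target position
        goto(ex, ey, z_safe),          # retract to the safe height
    ]

program = push_part(start=(50, 50), end=(120, 50), z_contact=5.0)
print("\n".join(program))
```

The same pattern (approach, engage, move, retract) would apply to other end-effector actions such as hooking or breaking support material, with different contact heights and tool paths.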
Building a Drone-Based Haptic Device
Immersion is a key aspect of creating realistic virtual worlds. After significant advances in the visual domain, realistic haptic effects appear to be the next step towards increasing immersion. A force-reflecting haptic interface generates synthetic force using mechanical actuators and delivers it to a user through physical contact or coupling with the user’s body. In order to convey the generated force to a certain body part, a haptic interface must be affixed elsewhere. Generally, haptic devices are tethered to a surface; in this case the workspace is fixed to the ground and restricted in size by the mechanical limits of the interface. If, on the other hand, the interface is tethered to the user’s body, exploiting one body part as reaction support, only relative forces among body parts can be generated. A drone-based haptic device ideally overcomes these usability issues: a drone can actively generate kinetic energy and can push and pull a user’s hand without needing to be tethered. If this physical interaction is done in a well-controlled manner, the drone can serve as a freely moving force-reflecting haptic interface. As a proof-of-concept study, this research focuses on creating haptic feedback along a single axis. To this end, an encountered-type, safe, and untethered haptic display is implemented. An overview of the system and details on how to control the force output of the drone are provided. Our current prototype generates forces of up to 1.53 N upwards and 2.97 N downwards. This concept serves as a first step towards introducing drones as mainstream haptic devices.
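The 1-D force-output idea can be illustrated with a minimal sketch: the drone hovers against its own weight, and a desired haptic force is added to (or subtracted from) the hover thrust, clamped to the feasible range. The hover thrust is an assumed example value and the function names are invented; only the ±force limits (1.53 N up, 2.97 N down) come from the reported prototype, and the paper's actual controller is not reproduced here:

```python
# Hypothetical sketch of mapping a desired 1-D haptic force to a total
# thrust command for a hovering drone. HOVER_THRUST_N is an assumed value;
# the force limits are the prototype's reported maxima.

HOVER_THRUST_N = 5.0     # assumed thrust that exactly balances the drone's weight
MAX_UP_FORCE_N = 1.53    # reported maximum upward force on the user's hand
MAX_DOWN_FORCE_N = 2.97  # reported maximum downward force on the user's hand

def thrust_for_force(desired_force_n):
    """Clamp the requested haptic force to the feasible range and return the
    total thrust (N) that produces it while the drone holds position."""
    clamped = max(-MAX_DOWN_FORCE_N, min(MAX_UP_FORCE_N, desired_force_n))
    return HOVER_THRUST_N + clamped

print(thrust_for_force(1.0))   # request within the feasible range
print(thrust_for_force(5.0))   # request clamped to the +1.53 N limit
```

In a real system the thrust command would additionally be closed around position and force measurements; this sketch only shows the feasibility clamp implied by the reported force limits.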