This augmented modelling tool enables users to touch, poke, rub or pinch geometric forms projected onto their skin to design wearable 3D-printed pieces.
Digital research studio Madlab has developed a system that combines projection mapping with depth and motion sensing technologies to create customised jewellery and other items worn around the wrist.
Called Tactum, the system generates interactive projections that wrap around the topography of the user's arm, which can be 3D-scanned in advance for added accuracy. These images respond to gestures that enable the real-time modelling of wearable objects.
The system ensures that no matter how much the form is manipulated, it will always be producible with a 3D printer and the final object will fit the user.
"Pre-scanning the body is not entirely necessary when designing a wearable in Tactum," said the studio. "However, it ensures an exact fit once the printed form is placed back on the body."
Once completed, the designs can be exported as data files compatible with a variety of 3D printers.
The first prototypes for Tactum relied on the depth and motion sensors in existing devices, such as a Microsoft Kinect, to map the geometry of the body and to track the hand gestures – pinching and pulling – used to manipulate the design.
These sensors feed the information on each gesture into a modelling program, which makes live adjustments to the design that are then projected back onto the skin, creating a closed loop of information.
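The feedback loop described above – gesture in, model update, projection out – can be sketched in a few lines. This is a hypothetical illustration, not Madlab's code: the `Gesture` class, `update_model` function and the fingertip-sized region of influence are all assumptions made for the example.

```python
# Hypothetical sketch of Tactum's feedback loop: a tracked gesture locally
# deforms the wearable's mesh, and the updated mesh is re-projected.

from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str            # e.g. "pinch", "pull", "rub" (assumed labels)
    position: tuple      # (x, y, z) point on the arm surface
    magnitude: float     # how strongly the gesture deforms the form

def update_model(mesh, gesture):
    """Apply a gesture as a local deformation of the mesh vertices."""
    for i, vertex in enumerate(mesh):
        # Only vertices near the gesture point move; distant ones stay put.
        distance = sum((a - b) ** 2 for a, b in zip(vertex, gesture.position)) ** 0.5
        if distance < 1.0:  # fingertip-sized region of influence (assumed)
            falloff = 1.0 - distance
            mesh[i] = tuple(a + gesture.magnitude * falloff for a in vertex)
    return mesh

# One pass of the loop: read a gesture, adjust the design, re-project it.
mesh = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (2.0, 0.0, 0.0)]
gesture = Gesture(kind="pull", position=(0.0, 0.0, 0.0), magnitude=0.2)
mesh = update_model(mesh, gesture)
```

In the real system this loop runs continuously, so the projection on the skin appears to respond instantly to each touch.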
Hard constraints can be programmed in to fit the projections to specific parameters – for example, the exact dimensions of an existing watch face that must fit into a strap designed using Tactum.
This means imprecise human gestures, which are only accurate to the size of a fingertip, can be combined with the precision necessary for designing around an existing object.
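One simple way to reconcile fingertip-sized gesture accuracy with exact part dimensions is to snap gestured measurements to the fixed constraint whenever they fall within the gesture's tolerance. The sketch below is an assumption about how such a constraint might work – the dimension, tolerance and function name are invented for illustration.

```python
# Hypothetical sketch: a hard constraint snaps imprecise gesture input to the
# exact dimension of a fixed part, such as an existing watch face.

WATCH_FACE_DIAMETER_MM = 46.0   # assumed fixed dimension of the existing part
FINGERTIP_TOLERANCE_MM = 10.0   # rough accuracy of a fingertip gesture

def constrain_opening(gesture_width_mm):
    """Snap a gestured strap opening to the watch face when close enough."""
    if abs(gesture_width_mm - WATCH_FACE_DIAMETER_MM) <= FINGERTIP_TOLERANCE_MM:
        return WATCH_FACE_DIAMETER_MM  # exact fit overrides the rough gesture
    return gesture_width_mm            # otherwise keep the free-form width

print(constrain_opening(42.5))  # a sloppy gesture near the face snaps to 46.0
print(constrain_opening(80.0))  # a clearly different width is left alone
```

The effect is that the designer can gesture loosely while the software quietly guarantees that the critical dimensions remain exact.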
"Between the 3D scan, the intelligent geometry, and intuitive interactions, Tactum is able to coordinate imprecise skin-based gestures to create very precise designs around very precise forms," explained the studio.
So far the team has explored the project through two prototypes. The first used a Microsoft Kinect sensor to detect and track skin gestures, and a Microsoft Surface Pro 3 tablet as an off-body display for presenting the digital geometry to the designer.
The second prototype switched to an above-mounted Leap Motion Controller to track the gestures, and projected the digital geometry directly onto the skin.
"While this sensor provided more robust hand and arm tracking, the gesture detection was more robust with the Kinect," said Madlab. "The second prototype also switched from an auxiliary display to mapping and projecting geometry directly onto the body."
The results of the team's experiments and detailed technical information on each part of the process have been published as a research document.
The technology has already been used to create a new watch strap for the Moto 360 smartwatch.
The position and orientation of the watch face on the wrist, as well as the overall form of the band, were determined by using hand gestures.
The exact measurements and tolerances for the clips to hold the watch face – and the clasp to close the band onto the arm – were pre-determined in the software programming, which was set before the light projection design process began.
A series of test artefacts has also been created to demonstrate the technology with different kinds of interactive geometries, materials, modelling modes and fabrication machines.
These include a PLA plastic print made on a standard desktop 3D printer, a nylon and rubber print created with a selective laser sintering (SLS) 3D printer, and a rubbery print produced on a stereolithography (SLA) 3D printer.
Tactum was presented at the SXSW Interactive festival in Austin, Texas, earlier this year.