Shopping for that new dress or a pair of trousers online may save you the hassle of going to the shop, but it also means you are often left wondering: “Will the outfit look good on me?” or “Have I made the right choice?”

Your dilemmas (and changing rooms) could soon become a thing of the past, thanks to new technology developed by the Max Planck Institute for Intelligent Systems (MPI-IS), which can digitally capture clothing on moving people to create a 4D virtual simulation.

The research, which is still in its early stages, focuses on avatars that come in different body shapes and sizes. It means you can see how the fabric moves on your “virtual body” before making that all-important decision to type in your credit card details. It would also allow you to “try on” outfits you’ve just spotted on a celebrity (the Duchess of Cambridge or Eddie Redmayne, for example) without having to leave the confines of your home.

[Image: The technology could allow you to virtually try on outfits you have seen Kate Middleton or Eddie Redmayne wearing without leaving your home (PA)]

Dubbed ClothCap (short for Cloth Capture) by the researchers, the technology uses motion capture to record 4D movies of people, with 66 cameras and projectors illuminating the person being scanned.

“This scanner captures every wrinkle of clothing at high resolution,” says Michael Black, director at MPI-IS. “It is like having 66 eyes looking at a person from every possible angle.”

While there isn’t anything new about motion capture technology – it’s already being used in animation, gaming and biomechanics – the researchers say this is the first time an effort has been made to capture clothing in motion, in detail, in a way that could help consumers and fashion retailers.

[Image: The captured subject is on the left, with the ClothCap synthetic animation for a new body on the right (MPI-IS)]

ClothCap works by calculating motion and body shape under the clothing while separating and tracking the garments on the body as it moves.
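In rough terms, that separation can be pictured as per-vertex arithmetic on meshes, as Dr Pons-Moll explains below. The Python sketch that follows is a toy illustration only: the real system works on dense, registered 4D scan sequences, and the function names and simple vertex arrays here are assumptions made for this example, not part of the MPI-IS software.

```python
# Toy sketch of the separate-then-retarget idea: treat the captured
# clothing as the displacement of the scanned surface from the body
# estimated underneath it, then add that layer to a different body.
# All shapes and names here are illustrative assumptions.
import numpy as np

def separate_clothing(scan_verts: np.ndarray, body_verts: np.ndarray) -> np.ndarray:
    """'Subtract' the body from the scan: per-vertex clothing offsets."""
    return scan_verts - body_verts

def dress_new_body(new_body_verts: np.ndarray, clothing_layer: np.ndarray) -> np.ndarray:
    """'Add' the captured clothing layer onto a new body shape."""
    return new_body_verts + clothing_layer

# Three vertices in 3D stand in for a full scan mesh.
scan = np.array([[0.00, 1.02, 0.00], [0.10, 1.50, 0.02], [0.00, 0.95, 0.05]])
body = np.array([[0.00, 1.00, 0.00], [0.10, 1.48, 0.00], [0.00, 0.94, 0.03]])

cloth = separate_clothing(scan, body)
new_body = body * 1.05  # a slightly larger person, same mesh topology
print(dress_new_body(new_body, cloth))
```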
“Our approach is to scan a person wearing the garment, separate the clothing from the person, and then render it on top of a new person,” said Dr Gerard Pons-Moll, research scientist at MPI-IS. “This process captures all the detail present in real clothing, including how it moves, which is hard to replicate with simulation.”

The researchers say this is a departure from traditional “marker-based” motion capture, which records only skeletal motion and transfers those movements onto clothing, often with unrealistic-looking results. They claim ClothCap is simpler because the clothing is captured in correspondence with the body.

“The algorithm literally subtracts the clothing from the recorded subject and adds it to a new body to produce a realistic result,” said Pons-Moll. “It’s like doing arithmetic with people and their clothing – it’s cool!”

So how would this work in practice? The scientists say ClothCap provides the “foundational technology” that retailers can build on to create their own digital wardrobes.

[Image: The animation on the left is a rendering based on traditional markers, while the one on the right was created with ClothCap technology (MPI-IS/YouTube screenshot)]

“First, a retailer needs to scan a professional model in a variety of poses and clothing to create a digital wardrobe of clothing items,” said Black. “Then a user can select an item and visualise how it looks on their virtual avatar.”

But the researchers admit their current technology has several limitations: wrinkles in the clothing do not yet change with body shape, and a range of body movements has still to be taken into account. They plan to address these issues in future work.

While technology may not be able to replicate the feeling of trying on an outfit for the first time, if the research takes off, motion-capture avatars might just become the next best thing.

The research is being presented at SIGGRAPH – a conference for research in computer graphics – in Los Angeles this week.
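For readers curious how the retailer workflow Black describes might look in software, here is a minimal sketch of a digital wardrobe: captured garments go in, and a user previews one on their avatar. The class and method names, and the choice to store garments as per-frame vertex offsets, are assumptions made for this illustration, not details of the actual system.

```python
# Hypothetical sketch of a digital wardrobe built on captured clothing.
# A garment is stored as per-frame clothing layers (vertex offsets)
# extracted from a 4D scan of a professional model; "trying on" an item
# adds those layers to the user's avatar, frame by frame.
from dataclasses import dataclass, field
from typing import Dict, List

import numpy as np

@dataclass
class CapturedGarment:
    name: str
    clothing_layers: List[np.ndarray]  # one offset array per captured frame

@dataclass
class DigitalWardrobe:
    items: Dict[str, CapturedGarment] = field(default_factory=dict)

    def add_item(self, garment: CapturedGarment) -> None:
        self.items[garment.name] = garment

    def try_on(self, name: str, avatar_verts: np.ndarray) -> List[np.ndarray]:
        """Drape every captured frame of a garment over the user's avatar."""
        return [avatar_verts + layer for layer in self.items[name].clothing_layers]

# Usage: one garment with two captured frames, previewed on a toy avatar.
wardrobe = DigitalWardrobe()
layers = [np.full((3, 3), 0.02), np.full((3, 3), 0.03)]
wardrobe.add_item(CapturedGarment("blue_dress", layers))

avatar = np.zeros((3, 3))
frames = wardrobe.try_on("blue_dress", avatar)
print(len(frames), frames[0][0])
```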