# iPi Mocap Studio 2 calibration trial
Then you have to 'calibrate' them, which is a real trial-and-error pain in the ass. And you cannot see whether your capture was successful until it has been 'solved' offline. There is far too much setting up, waiting around for results, and starting again.

All space requirements are listed on our website. For home usage, the dual Kinect configuration is better suited, as it does not require much space. The calibration process for dual Kinects is also very simple, taking about 10 minutes. Accuracy is about the same for 2 Kinects and 4 PS Eyes; however, due to sensor limitations, the capture area is much smaller than with PS Eyes. PS Eyes are also better at feet and head tracking, as well as at quick motions (60fps vs 30fps). Calibration for PS Eyes is a bit trickier and takes longer, but it is not much trouble once you get used to it. Of course, it is convenient to have a permanent PS Eye setup, fixing the cameras to the walls or ceiling, so you do not need to recalibrate very often.

Eevolver, an LA-based production studio, was commissioned by Industrial Color to reimagine the iconic Ralph Lauren polo bears for the 2021 fragrance campaign. The company used Unreal Engine’s new real-time fur features to bring the bears to life.

For the holiday campaign, Eevolver was tasked with taking the iconic polo bears, along with a new bear, on a journey from the Ralph Lauren flagship store in New York City, through Central Park, to the snowy streets of London, on to Paris’s Eiffel Tower, then to Shanghai before returning to NYC. Eevolver started by building on previous campaigns and integrating the newest real-time technologies to bring the furry bears to life.

“The biggest achievement on the project was using the newest in real-time fur techniques,” says Stacy Burstin, president of Eevolver. “We started with our traditional pipeline, using Maya/Houdini, and then integrated Unreal’s new fur features, which completely changed the potential of the production.
The quality of fur we were able to create using Unreal Engine’s new hair and fur rendering made it possible to have film-quality fur on multiple bears animating in one scene. This would have previously been an enormously time-consuming render undertaking.”

While creating this spot, the studio’s focus was to build upon the previous campaigns while also integrating a modern style. The Eevolver team achieved both through the story and the approach to the production, combining Unreal’s technology with motion capture and keyframe animation. Each scene was fully built in 3D so it could be seen from all angles in context with the lighting, textures, effects, and animation.

“It makes animation much more like a live-action set, where we can move the camera to any location,” explains Burstin. “Additionally, we could visualize and edit the entire spot on a single timeline in real time, which is powerful because it gives the director much more freedom to quickly explore the best way to tell the story.”

While the bears had to appear like stuffed animals, there was much more action in this year’s spot. To modernize the bears and bring them a more lifelike appearance as they traveled the world, Eevolver’s team developed a hybrid look that combined both motion capture and keyframe animation.

“In terms of tools, Autodesk Maya and Arnold were used to create the original bear reference, which we then integrated into the Unreal workflow,” says Burstin. “The team made modifications to the hair shader in Unreal to match the Maya/Arnold reference.”
Extreme closeups of the characters were rendered in Arnold to get the softest fur look. Each character had about 600,000 hair strands at the highest level of detail (LOD); the background bears had about 100,000 strands. The team set up three to four LODs for each character, allowing complex scenes to render without exhausting the 24GB of VRAM on the Nvidia RTX 3090.
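The LOD scheme described above can be sketched in a few lines. This is a hedged illustration, not Eevolver's actual pipeline: the distance thresholds, function names, and the specific per-LOD strand counts between the article's stated extremes (~600,000 strands for a hero character, ~100,000 for background bears) are all assumptions for the sake of the example.

```python
# Illustrative hair-LOD selection: pick a strand budget per character by
# camera distance, so a scene full of furry characters stays within VRAM.
# All numbers below the hero/background extremes are hypothetical.

HAIR_LODS = [600_000, 300_000, 150_000, 100_000]  # strands, nearest LOD first
LOD_DISTANCES = [5.0, 15.0, 40.0]  # metres; switch points between LODs (assumed)

def select_hair_lod(camera_distance_m: float) -> int:
    """Return the LOD index to render for a character at this distance."""
    for lod, threshold in enumerate(LOD_DISTANCES):
        if camera_distance_m < threshold:
            return lod
    return len(HAIR_LODS) - 1  # farthest characters use the coarsest LOD

def scene_strand_total(distances_m: list[float]) -> int:
    """Total hair strands rendered across all characters in a scene."""
    return sum(HAIR_LODS[select_hair_lod(d)] for d in distances_m)
```

For example, one hero bear at 3 m plus two background bears at 50 m would render 600,000 + 2 × 100,000 = 800,000 strands, rather than 1.8 million if every character used the closest LOD.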