Xsens Helps Mindride Create “Love Has No Labels” PSA

Photos by Roberto De Jesus.
Experiential design firm Mindride recently relied on an Xsens motion capture system to create the viral video “Love Has No Labels,” which has passed 85 million views on YouTube.

“Love Has No Labels” is a campaign created by The Ad Council to point out the subconscious biases people hold regarding race, age, gender, religion, sexuality and disability. Filmed live on the street, the PSA centers on an LED screen where skeletons hug, kiss and dance before stepping out and revealing themselves to the audience as real people. Variations include same-sex, elderly and interracial couples, prompting the audience to consider that underneath it all we may all be the same.

To create the “Love Has No Labels” X-Ray machine, Mindride developed a new pipeline that leveraged motion capture techniques typically used in Hollywood movies and video games. By employing Xsens MVN, a wearable motion capture system, the team could stream the actors’ movements as live mocap data in real time. That data was then fed into a rendering computer that animated the 3D skeletons.
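The article does not publish Mindride’s pipeline code, but the flow it describes, sensor data streamed live and then applied to rendered skeletons, can be sketched roughly. The snippet below is a minimal, hypothetical illustration in Python: it assumes the mocap stream arrives as UDP packets containing a JSON map of joint names to rotations, which is not the actual Xsens MVN network format, and the StubSkeleton class stands in for whatever object the rendering engine exposes.

```python
import json
import socket

# Hypothetical illustration only: this is not Mindride's code and not the real
# Xsens MVN network format. We assume each UDP packet holds one frame as a JSON
# map of joint names to quaternion rotations.
LISTEN_ADDR = ("0.0.0.0", 9763)  # placeholder port


class StubSkeleton:
    """Stand-in for the rendering engine's skeleton object."""

    def set_joint_rotation(self, joint_name, quat):
        print(f"{joint_name}: {quat}")


def mocap_frames(sock):
    """Yield one frame of joint rotations per incoming UDP packet."""
    while True:
        packet, _ = sock.recvfrom(65535)
        yield json.loads(packet.decode("utf-8"))


def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    skeleton = StubSkeleton()
    for frame in mocap_frames(sock):  # blocks until the next frame arrives
        for joint_name, quat in frame.items():
            skeleton.set_joint_rotation(joint_name, quat)


if __name__ == "__main__":
    main()
```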

Helping the characters interact with each other and believably appear to share the same space was a key priority. To facilitate this, a MIDI controller was used to reposition characters during the performance, which proved especially useful for moments that involved hugging or hand-holding.
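As an illustration of that puppeteering layer, here is a rough sketch using Python’s mido library; the CC numbers, character names and offset scaling are assumptions made for the example, not details from the production.

```python
import mido

# Rough sketch only: the article does not name the controller or its mapping.
# We assume two knobs (made-up CC numbers) nudge each skeleton along the X axis
# so a puppeteer can close the gap for hugs and hand-holding.
CC_TO_CHARACTER = {20: "skeleton_a", 21: "skeleton_b"}


def cc_to_offset(value, scale=0.5):
    """Map a 0-127 MIDI value to a signed positional offset (centered at 64)."""
    return (value - 64) / 64.0 * scale


def listen(apply_offset, port_name=None):
    """Forward control-change messages to the renderer as position nudges."""
    with mido.open_input(port_name) as port:  # None opens the default MIDI input
        for msg in port:
            if msg.type == "control_change" and msg.control in CC_TO_CHARACTER:
                apply_offset(CC_TO_CHARACTER[msg.control], cc_to_offset(msg.value))


if __name__ == "__main__":
    # Print the nudges instead of driving a real renderer.
    listen(lambda name, dx: print(f"{name}: offset x by {dx:+.3f}"))
```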

“Once we had the data mapped to the skeletons, we became puppeteers, augmenting the gestural nuances of the performers’ hands and jaws,” explained Yehuda Duenyas, chief creative officer at Mindride. “The result was a totally live, real-time, magical, X-Ray experience that highlighted the nuance and individuality of each performer.”

Preparations for the project began months in advance, with the team testing the Xsens sensors for live applications. Settling on 17 sensors placed over key areas such as the joints, arms, legs and the actors’ necks, Mindride found a configuration that gave them the motion capture data they needed to make the skeletons convincing.

“Ultimately we were trying to figure out how to make the skeletons move in the most human way possible,” said Duenyas. “After rigging the avatars in Maya, and lining them up with MVN data, it quickly became clear that we had the quality we needed.”

“The trackers really aren’t that big. You can hide them under your clothes,” said Michael Todd, technical lead at Mindride. “They are easy to take off and put on, and they don’t require any extensive setup that has to be fine-tuned and calibrated. The system is just really easy to use.”

Xsens MVN is a full-body, 3D inertial motion tracking system that delivers fast, production-ready 3D data to professional animators.

“MVN opens up a lot of potential for live applications. What we are really doing is digitally translating human behavior and human action. And if we can deliver that in a live capacity, that’s really exciting,” said Duenyas.
