Students Made A Virtual Reality Experience About Police Brutality And It's Intensely Powerful
“Because it is so immersive, virtual reality is very useful for designing for moral experiences."
Injustice's creators Atit Kothari, Elizabeth Won, Jaehee Cho, Martin Ding and Tiffa Cheng initially made the project as a response to the 2014 shooting death of Michael Brown in Ferguson. But they believe the technology they've developed could be used to drive much-needed dialogue about other social issues, too.
"Because it is so immersive, virtual reality is very useful for designing for moral experiences," Kothari, one of Injustice's programmers, says.The project aims to explore the emotional impact of VR versus traditional film or gaming.
"With this project we wanted to combine what games can do and also what film can do. Because film is really good at making people have empathy, but games are more like doing," Won, one of Injustice's 2D artists, says.
"VR is such a powerful method, in that its users have an emotional connection to what they see and they can be active in the VR space, which creates a level of experience that they've never had before."
Indeed, the Injustice experience is so powerful, one user burst into tears when she tried it for the first time.
"She broke down," Atit Kothari says. "There was a very personal emotion. She didn't know what to feel."
The project took just 15 weeks to create, with a budget of around $8,000.
To film the live footage at eye level, the team built a human-shaped tripod: 10 GoPro cameras secured around a 3D-printed rig.
"We used tripods and the half-part of the mannequin torso, and we made holes on the neck of the torso and mounted the rig so users can look around in the VR space," Kothari says.
They then stitched the footage together to enable a 360° experience.
The experience's voice recognition is powered by the Google Speech API, a cloud speech-recognition service whose accuracy improves the more it is used.
"[The voice recognition app used by Injustice] understands when you stop talking; the recorded clip is then sent to Google server, which understands what you just said. We then get the text, parse the text and match it with options that we have. Then the next clip is played depending on what you spoke," Kothari says.
"It happens in real time so it doesn't break the experience."
The team also built a custom plugin to play 360° video smoothly in game development tool Unity, which is not designed for such high-resolution video.
Injustice is currently being used in a social decision-making class at Carnegie Mellon University and has been presented at several festivals, most recently at last month's Tribeca Film Festival in New York.
As for future projects, at least three members of the group – Won, Cho and Kothari – want to continue working in the VR space after graduation this month.
For future collaborations "we're looking for funding and production houses, and nonprofits that have a story to tell," Kothari says.
Wherever they end up, one thing's clear: with the public release of the Oculus Rift, the headset Injustice runs on, and rival platform HTC Vive, the possibilities for VR are only going to expand.
"There's hardware which is mature enough and there's software that is evolving every day," Kothari says.
"I think it is here to stay. There's an interest and its getting bigger," he says.