About a year ago I found myself wandering the streets of Park City, having struck out again in the seating lottery, and wondering what could fill my time. I ended up in the New Frontiers building, poking at new technology, and overheard someone talking about the two-hour wait to try the Oculus Rift VR headset. I’d heard whispers of VR before, but hadn’t thought to pay attention – who knew a year later I’d have a friend lining up to create unique content for that very device?
Bracey Smith is a filmmaker and entrepreneur whose animated short film, Ouverture, recently made a very successful festival run. Now he hopes to find his next adventure in virtual reality. He recently acquired a headset, which he offered to let me try. We got to talking about the implications, and he mentioned that he expected production design to become even more critical to a successful project. I asked him to elaborate for us at The PDC, and he obliged. Below are excerpts from that conversation, including an introduction to the technology, its platforms and developing technologies, and how all of it might affect PDs.
SKW@PDC: So let’s start at the beginning – what is the headset I’m trying now?
BS: It’s the Samsung Gear VR, but it’s powered by Oculus. Oculus works with Samsung now because they use Samsung screens in the Oculus Rift and its demo versions. Samsung partnered with them to create a mobile version – it’s a limited experience because everything is processed on the Samsung Note 4 phone inserted in the device. The main thing it lacks is positional tracking – you can look all around, but you can’t move. The Oculus Rift Crescent Bay demo allows you to lean in to check out details because it tracks your full body position.
SKW@PDC: And there are many of these VR headsets in development right now, correct? Products their makers hope will be plug-and-play consumer devices.
BS: Yes. Sony is developing the Morpheus to work in concert with the PlayStation 4, Apple just hired programmers to work on a new project – there will be a lot of competition.
*SKW@PDC NOTE: That’s true – a short list of VR options in development also includes the Razer OSVR, Durovis Dive, 3DHead, VRTX One, Avegant Glyph, VRELIA, ANTVR Kit, VRVANA Totem, Sensics dSight, Light & Shadows NEO, FOVE, Altergaze, Panoglass, and Vrizzmo. Microsoft is developing for the Xbox One; other phone adaptations include the Archos VR and the Zeiss VR One.
SKW@PDC: You showed me the Google Cardboard a few days ago, which works with Android Phones. What are the main differences between the Gear VR and the Cardboard? The lenses in the Gear VR, and what else?
BS: For one, light leak on the Gear VR is virtually non-existent. The lenses also give the Gear VR a wider field of view, and it has positional sound. The processing power is significantly better, too, because it was developed to work with this specific phone, not across many phones without a powerful base of operations, as with the Google Cardboard. Google Cardboard was basically an experiment that caught on, and now you can get it for virtually nothing.
SKW@PDC: Which allows people to start testing the waters, if you will, sure. So you’ve gone over this with me before, but for our readers, can you tell me how the cameras and the audio work? There are what you called “balls of cameras,” correct?
BS: The way they film stereoscopic 3D video is that they have enough cameras on the unit to film two different spheres of vision, one for each eye. Each is offset by the interpupillary distance – just enough to mimic what each eye sees – and they align these two spheres so one eye is seeing one video and the other eye is seeing another, and that creates the stereoscopic view.
SKW@PDC: And each sphere has enough cameras to look in front and behind and up and over and around…
BS: Right, to create each sphere there are about seven cameras, and their fields of view are stitched together to create that one sphere of video.
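*SKW@PDC NOTE: For the curious, the two-sphere idea can be sketched in a few lines of code. This is purely illustrative – the function name and the typical interpupillary distance of roughly 64 mm are our assumptions, not anything from a real capture rig – but it shows the core trick: each eye’s viewpoint sits half the IPD to either side of the gaze direction, and each sphere of video is rendered from its own offset center.

```python
import math

IPD = 0.064  # typical interpupillary distance in meters (an assumption)

def eye_positions(head_pos, yaw_radians, ipd=IPD):
    """Offset each eye half the IPD to the side, perpendicular to gaze.

    head_pos: (x, y, z) center of the head
    yaw_radians: horizontal gaze direction, 0 = facing +z
    Returns (left_eye, right_eye) positions.
    """
    x, y, z = head_pos
    # Unit vector pointing to the viewer's right, perpendicular to gaze
    rx, rz = math.cos(yaw_radians), -math.sin(yaw_radians)
    half = ipd / 2.0
    left = (x - rx * half, y, z - rz * half)
    right = (x + rx * half, y, z + rz * half)
    return left, right

# Facing straight ahead, the eyes land 32 mm left and right of center,
# and each sphere of stitched video is built around one of these points.
left, right = eye_positions((0.0, 1.7, 0.0), 0.0)
```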
SKW@PDC: And it’s a similar thing with sound. When they do it right, there’s a 360° sound experience that they use several mics to create.
BS: Yeah, sound is interesting. I’ve just been getting into it, and it’s a heated topic in the sound world because capturing it is so hard. It’s called binaural sound, and the microphone is very interesting. If you Google it, inevitably you’ll find this image of a cube with ears on it, each one used to capture the sound coming from the front, back, right, and left. There are a couple of different ways of doing it. As with visuals, you can either get mono sound, which is one sphere, or you can do it in a way that creates stereoscopic sound – that’s why the ears are there; they direct the sound waves to create a realistic audio experience. It’s weird – when you’re listening and moving, you’re basically panning between all these ears.

So I tried the Gear VR. I checked out a few demos loaded on the phone: “Strangers with Patrick Watson” – a 360° video wherein a guy plays piano and sings to you in a small New York home recording studio. *Spoiler: It includes one of the most intense viewing experiences VR offers, when Watson looks in your eyes. A smattering of 3D movie trailers, 360° photographs, 360° videos, and a 360° space “game” that would have blown my socks off as a kid. And the Oculus Cinema – a 360° theater viewing experience, complete with seats and a big screen in front of you (or you can watch a movie on the moon if you so choose), which felt so real I thought twice about ever going out into the NYC winter to see a movie again. It was impressive. The light leak once I put the headset on was indeed very small, so the 96° viewing angle could capture your entire attention. The stitching between images to create 360° viewing is largely seamless, and when you do catch a seam it’s not so glaring that it takes you out of the experience. The Gear VR is also surprisingly comfortable. It can’t handle glasses, so wear your contacts, but otherwise, I was happy to watch 20 minutes of programming.
I could also VERY easily watch a movie, 3D or otherwise, in this environment. With earbuds in, it felt completely immersive, as if you had a theater to yourself. And the stereoscopic 3D experience was superior to being in a theater with those finger-smudged glasses that cost $12 extra. The binaural sound was also quite real – if you turned around, you could hear that a noise was behind you. Creepy, but very, very cool.
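*SKW@PDC NOTE: The “panning between all these ears” that Bracey describes can be sketched as a simple crossfade. This is a toy model of our own, not how any real binaural or ambisonic renderer works: it just weights each of four directional channels by how closely that “ear” faces the direction the listener is currently looking.

```python
import math

# Directions the four "ears" face, as yaw angles in radians (an assumption:
# front, right, back, left, evenly spaced on the horizontal plane).
EAR_YAWS = {"front": 0.0, "right": math.pi / 2,
            "back": math.pi, "left": 3 * math.pi / 2}

def pan_weights(listener_yaw):
    """Crossfade weights for four directional channels.

    Each channel is weighted by the cosine of the angle between the
    listener's gaze and the direction its "ear" faces, floored at zero,
    then normalized so the weights sum to 1.
    """
    raw = {name: max(0.0, math.cos(listener_yaw - yaw))
           for name, yaw in EAR_YAWS.items()}
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

# Facing front, essentially all signal comes from the front channel;
# facing halfway between front and right gives an even front/right blend.
weights = pan_weights(math.pi / 4)
```

As you turn your head, the weights shift smoothly from one “ear” to the next – the sensation of sounds staying put behind you while you rotate.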
*SKW@PDC NOTE: Interested in putting your toe in? Here are some options.
SKW@PDC: For me, I first saw the Oculus Rift at Sundance in 2014, and I know it was at SXSW, but the line was ages long, and I really felt it was more for gamers. And gaming + movies – an entire category of future development, to be sure – is something I wasn’t sure I was going to be a part of, honestly. I really didn’t expect the traditional movie experience in the viewer to be as powerful. When I was drawing up my original questions for you, one of them was the difference between a VR experience and a traditional movie-going experience. I expected a strong differentiation between what filmmakers do now with traditional filmmaking and viewing platforms and what they’ll want to do with VR. But it seems like VR could be just another platform. There are theaters, TVs, computers, phones, and now the VR experience. If you want to enjoy traditionally made films this way, the quality of the experience is very high. But the other question is what opportunity actively using 360° sound and video will provide – how much can you interact with these spaces? There’s no finger tracking here; you can’t reach out and see your hand, right?
BS: But it’s coming. Oculus just acquired a company called Nimble VR, whose small tracker can be mounted on top of the Oculus to track your hands. Once it has that data, it transfers it to the computer, processes your distance to objects, and can use that to render the means of interaction between you and the environment, hands or otherwise. It’s not haptic feedback, but that’s also on the way. I’m not sure it’s within the next 10 years, but I’ve seen tech from a couple of universities that uses ultrasound to create 3D objects in real space that you can feel. Your hand can go right through it, but you can “feel” it – one thing they’ve created is a teapot that you can “feel” out. There’s no real resistance to your hand, but you can feel the sound waves in space. It’s a device that uses a bunch of little speakers to project 3D objects into space via sound.
SKW@PDC: Right, I’ve seen a video that uses sound waves to disrupt a physical plane, actually affecting physical items and making shapes, and even allowing them to move.
*SKW@PDC Note: Here’s that video. And more in-depth reading about haptic technology.
BS: So it’s only a matter of time. Take the haptic technology, the fact that you can calculate body positioning, and soon hand positioning, and it’s not too difficult to see controller-less VR within 10 years – something someone can jump into seamlessly, interacting with their environment with some sense of feedback. And I think there’s a lot of crossover that will have to happen for production design because of this. We will see many possible combinations for the “way to do this.” It’s not just going to be a camera and a director, it’s not just going to be a video game or a film, it’s not just going to be real footage or computer-generated imagery – it’s going to have to be both. There are too many advantages in too many different fields that would benefit from being together.
SKW@PDC: Right, because right now you can look around in 3D, which has its own implications for the art department. But being able to move around in space, to touch things – that has a whole host of other opportunities and complications. That’s when the integration of digital development from the video game world, where characters are actively picking up things like machine guns and moving them around, will start to matter. The idea of successfully scanning and creating 3D models of key set dressing or props for the audience to interact with might become interesting. Perhaps in the future, modeling everything and allowing everything to be tactile will become what big-budget film experiences offer – that’s what Paramount will provide. Almost a choose-your-own-adventure 3D experience, which would be wild.
*SKW@PDC Note: Prior conversations between SKW@PDC and BS have included the note that this technology could create a film version of Sleep No More. If you are in NYC, the show’s still running – check it out in real physical space here. If you’re not, this is a great review.
SKW@PDC: That would mean seamlessly integrating high-quality 3D models with what could likely be physically built sets. I think part of what was so powerful about the Patrick Watson video was that it was set in a real space.
BS: Yeah, it’s kind of jarring. You’re occupying a world that is being shown for your entertainment, and your brain isn’t used to crossing into it. That fourth wall is a lot closer to home now. And what’s so successful about that demo is just sitting in that space. There’s all that shit in that apartment, and being able to look around and study it is so fascinating. You could go back there and watch the demo many times and always see something new. That’s really somewhere that set design will shine - getting a sense of authenticity that you could never get in a CGI movie.
SKW@PDC: Right, I don’t think there was a single digital thing in that space, and there was something so tactile and beautiful about that. One thing I’m thinking of in regards to merging the digital and practical is the new Star Wars trailer, which actually looks physically constructed. Not like the three prequels that were largely derided for how digital they were. I mean, they were beautiful in their own way, but it was not something you felt a part of, or could, even if you were in a VR experience. And that’s something that gets commented on a lot, this coming back to physicality.
BS: Yeah, J.J. Abrams really likes to shoot in physical locations. I think I heard that the engine room in the Star Trek movie was actually the engine room in a Budweiser brewery – he wanted to get the sense of scale and real equipment, things you can’t get digitally yet. But even moving around in a small space, as with the Watson video, will require an integration. You’ll want to capture live, but if you want the audience to actually see the space and move through it, you’ll have to model some elements. You’ll use the video initially to inform how things feel, look, and move, but then you’ll have to apply that to a 3D construction based on what the camera is picking up. And that will allow you to walk through the spaces. I don’t know how far we are from that happening successfully – there are a few examples – but I think that’s where we’re ultimately going to go as storytellers. Right now, though, something very simple, like a simple play, is going to be very successful on this platform. People who really capitalize on telling a story in a single space from a single location, with performers in that space, will see a big return. The less you move the camera, the better it might be for the audience.
SKW@PDC: That reminds me of something I’m interested in – as a designer, I’m constantly intrigued by seeing people occupy the space and move around as they have interactions. Now, people move around the space, and the camera moves to allow space for the actors and to track them, but with this you could be standing in the space watching the actors, turning to see them engage the room around you completely. It’s a bit voyeuristic, but then you get to fully understand how the characters interact with the space while it’s all available for you to understand and experience as a viewer. The immersion into a character’s space might be much more impressive.
BS: Sure, just being in a space, with or without people, can be impressive.
SKW@PDC: So what do you want to do with this technology specifically? What do you hope to accomplish as a filmmaker, do you want to cross those lines between storytelling and VR?
BS: I’m looking to start really simple. Something you can do really well, as the Watson piece demonstrated, is a simple sit-down experience. Which is something experiencing classical music can offer. Once you get the sound down, you don’t need a crazy space – you can have the orchestra play around you in a way you might never otherwise have access to, just play. I want to create a catalogue of all these musicians, just sitting there, just as you might go out to see them play in a chamber orchestra, but with a richness of music rarely experienced. I think the early days of this phenomenon could benefit from experiences that simple. Having a catalogue of experiences that aren’t movies, but are things people have to sit down to appreciate. Watching a storyteller share a real-life story, but being able to be in their space. You get to see the world they chose to surround themselves with, and that will also reflect the story they are telling. Like a fireside chat. I think there are plenty of talented people out there who can pull this off. So let’s enjoy it.
SKW@PDC: Thank you so much, Bracey. This has been really enjoyable.
BS: Yeah. Interesting times ahead. And lots of room to play. I think that’s what’s most exciting. It’s such early days, and there’s so much excitement, that we’re going to see innovations in our world we never thought possible. So many radical ideas will crash together and create something so interesting – and also, possibly, horrible things…
SKW@PDC: Haha, right, but you have hope for humanity. That's great. Again, thank you so much.
BS: You’re very welcome.
We have since followed up and talked about how VR might impact when and where people gather, or choose not to. How viewing films at home in the theater environment – with the experience but without disruptive moviegoers – would be great. We spent more time talking about collective creation in gaming situations like Minecraft (https://minecraft.net), and how this relates to and deviates from virtual worlds like Second Life (http://secondlife.com). There are a lot of places to go and develop from where we are now, months or years before something as powerful as the Oculus Rift experience becomes part of everyday life.
And I’ve been thinking about how the roles on a film set will change. When the viewer controls the camera and sound, which drive so much of the traditional movie making-and-going experience, could the actors and environment become stronger guides for the audience’s attention? I think that’s possible. But for now, I’m happy to see people getting excited about just seeing rooms around the world. Some links:
An article on Oculus Rift Crescent Bay – but check out the second video. It profiles the Gear VR, and in it the commentators discuss how they would like to “go to cool spaces and just wander around” – a hint that Production Design could be a strong part of VR’s future. http://www.ign.com/articles/2015/01/09/ces-2015-oculus-rift-crescent-bay-is-the-most-impressive-vr-demo-ive-ever-experienced
And another “it’s so cool to be in actual spaces” article:
There are more out there, coming every day. This is all just getting started. Interesting times ahead, indeed.
-SKW@PDC