Vizor at the 55th New York Film Festival
We were invited to present an interactive VR piece at this year’s New York Film Festival, more precisely at the Convergence sub-event, which showcases new forms of digital storytelling. The event was held at the Lincoln Center for the Performing Arts on the final weekend of September 2017.
Our project for the festival was called Virtual Jockeys (or Reality Jockeys): we manipulate the VR environment in real time while the user is immersed in it. The idea was to show that virtual space can be a dynamic canvas to improvise on, much like a DJ selecting music or a VJ selecting visuals. This is made possible by the multi-user editing feature that we already have in Patches. It’s not a well-documented feature, so it was a nice challenge for us to put it to use in a real live festival setting.
How the experience works
The idea was to use two computers (in this case laptops) for the experience: one for the VR viewing, and another for real-time editing using the multi-user editor in Patches. This works by making a duplicate of the project and then sending that URL to the other computer, where it can be edited at the same time as it is viewed in VR.
The experience starts in a room with a few objects and a clickable light switch. The viewer can turn on the lights and get adjusted to the roomscale VR space by walking around a bit. The viewer is asked what colors they would like for the armchair, walls and shelves, and these are then changed in real time from the other computer. The settings can be found inside the ROOM patch in the project.
After the custom colors, we add a few more items to the scene by toggling on the visibility parameter in each object’s material patch. Because it was a film festival, these items were movie references, and we asked whether viewers recognized them. Adding new items by drag and drop always caused severe frame rate drops and tracking issues, so the visibility toggle was the safest way to add items to the scene without making the VR view lag.
Once the TV and Mask are visible and the viewer has interacted with them, the lights are turned off and the radio is toggled on. The radio can be clicked to start the music. The final object is the red button, which opens the room wall and window for the second half of the experience.
The second phase of the experience opens onto a large mountain landscape under a space sky. The control panel on the rails has three clickable buttons. The yellow button hides the room for a better view of the landscape, the red button tilts the landscape up slowly, and the blue button resets the tilt. After the buttons, the editor can start adding art visuals little by little.
The landscape has several different art visuals to be toggled on. The galaxy is made visible first by increasing its opacity, which makes it look like it fades into the landscape. Next the mountains are turned off and the planet turned on, so that the viewer can look down and see the Earth below. Then come the giant exploding flowers, and finally, before the music ends, the flying manta rays. The viewer has time to look around and enjoy the relaxing visuals.
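The galaxy’s fade-in amounts to ramping a material’s opacity from 0 to 1 over a fixed duration. A small sketch of that idea in plain JavaScript (the duration and names are illustrative, not values from the project):

```javascript
// Ramp an opacity value from 0 to 1 over durationMs, driven by elapsed
// time each frame. Clamping keeps it pinned at fully visible afterwards.
function makeFade(durationMs) {
  return { durationMs, opacity: 0 };
}

function updateFade(fade, elapsedMs) {
  fade.opacity = Math.min(1, Math.max(0, elapsedMs / fade.durationMs));
  return fade.opacity;
}

const galaxyFade = makeFade(3000); // e.g. fade the galaxy in over 3 s
console.log(updateFade(galaxyFade, 1500)); // 0.5 — halfway faded in
console.log(updateFade(galaxyFade, 4000)); // 1 — fully visible
```

Driving visibility through opacity rather than a hard toggle is what gives the "fades into the landscape" effect described above.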
Well, how did it go?
Thanks for asking! It was a very positive experience for us. We ran the demo for about 90 people over the course of 15 hours across three days. For the great majority of them this was their first time trying roomscale VR, and this particular piece turned out to be a good place to start. We had designed several things into the piece to make sure the viewer gets a good sense of presence, and happily it seemed to work.
The key moment in the piece is the transition from a closed indoor space to the open outdoor balcony: when this happened, we got a “wow” reaction from everyone. Since the first part of the experience already gives you a feeling of presence in the room, by the time the second part is revealed the viewers had already suspended their disbelief and were happy to just relax and chill out to the 3D space visuals for a couple of minutes more.
After the experience ended, we published the piece individually for each visitor, so that they had a personal URL they could take home and share with their friends. We were happy to notice that these published URLs have been viewed quite a lot, meaning people did actually share and re-watch them.
We’d like to thank everyone who came to check out the piece, NYFF/Convergence for inviting us, and a special shout-out to our support team: Anna Rosa for coordinating the whole thing, Kschzt for the music, and the Sketchfab users whose 3D objects we used (CC Attribution can be found in the piece). A full list of credits can be found here.