THE ENEMY
Making Of
By Fabien Barati – Part 4/7
Free Roaming
The Enemy’s main technological challenge was to allow visitors wearing VR headsets to move freely around more than 200 m². Walking around in the real world translates into the same movements inside the virtual world displayed in the headset. This is called free roaming. In 2014, there was no VR installation capable of providing such a free-roaming experience to the public, let alone in multi-user settings.
To achieve this illusion, I had the idea of combining a VR headset with a motion-capture system. Having previously worked on VR CAVEs, I had experience with motion capture adapted to VR: in a CAVE, the user’s position is tracked and the images projected on the walls around them are recalculated in real time from that position. This is the very concept that had to be adapted to VR headsets, and across a large room.
A CAVE is an immersive room in which you can experience virtual reality
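To make the principle concrete, here is a minimal sketch in Python, not the project’s actual code: each frame, the tracked head pose drives the virtual camera, so the image in the headset is always rendered from wherever the user really stands. The mocap.latest_head_pose() accessor in the commented loop is hypothetical.

```python
import numpy as np

def view_matrix_from_tracked_pose(position, rotation):
    """Build a world-to-camera (view) matrix from a tracked head pose.

    position: (3,) head position in the room, in metres
    rotation: (3, 3) head orientation as a rotation matrix
    Both are assumed to come from the motion-capture system each frame.
    """
    view = np.eye(4)
    view[:3, :3] = rotation.T                 # inverse of a rotation is its transpose
    view[:3, 3] = -rotation.T @ position
    return view

# Illustrative render loop: the scene is drawn from the user's real position,
# which is what makes free roaming work.
# while running:
#     position, rotation = mocap.latest_head_pose()   # hypothetical accessor
#     renderer.draw(scene, view_matrix_from_tracked_pose(position, rotation))
```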
I knew that these systems were very responsive and offered low latency. This means the time required to capture and calculate the user’s position and to send out an image is very short (less than 10 milliseconds). Too high a latency is prohibitive in VR, as it can lead to motion sickness. The user must see images that perfectly match the accelerations they experience when moving.
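As a rough illustration of what “less than 10 milliseconds” means in practice, here is a back-of-the-envelope budget; the per-stage figures are assumptions for the sake of the example, not measurements from The Enemy.

```python
# Illustrative tracking-to-image budget (figures are assumptions, not measurements).
budget_ms = {
    "camera capture and marker detection": 4.0,
    "pose solve and network transfer": 2.0,
    "rendering the new frame": 3.0,
}
total = sum(budget_ms.values())
print(f"tracking-to-image latency: {total:.1f} ms")  # 9.0 ms, under the 10 ms target
```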
Another advantage of this system is that the capture area can be extended indefinitely by simply adding tracking cameras.
Among several motion-capture systems, I chose Optitrack for its simplicity and efficiency. These systems have become increasingly prevalent in virtual reality in recent years. The Optitrack teams were very helpful, especially François Asseman, who did a lot for the project.
Each user’s VR headset is equipped with markers that reflect or emit infrared light. Optitrack cameras detect the infrared light from these markers. The cameras are installed around the operating area so that any given point is covered by at least three of them. The data captured by the cameras are then transmitted to the Motive software, which determines the exact position of all participants, to the nearest millimetre.
That is the system we managed to implement to achieve our first prototype of The Enemy.
Optitrack cameras detect “markers” placed on the VR headset
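Covering every point with at least three cameras is what makes millimetre-level triangulation possible. The sketch below shows the basic least-squares principle; Motive’s actual solver is of course far more sophisticated, and this is only an illustration.

```python
import numpy as np

def triangulate_marker(origins, directions):
    """Least-squares intersection of camera rays seeing the same marker.

    origins:    (N, 3) camera positions (N >= 3, as in the setup described above)
    directions: (N, 3) rays from each camera towards the detected marker
    Returns the 3D point minimising its distance to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, dtype=float), np.asarray(directions, dtype=float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projects onto the plane orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)
```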
First Prototype
The first usable version of The Enemy, the first prototype, was intended to showcase the project in a simplified form. The goal was to communicate about it and, above all, to raise funds to produce the project in its entirety.
On a technical level, users were equipped with an Oculus DK2 VR headset and an audio headset. These two devices were connected by three long cables (USB, HDMI, audio) to a computer that performed all the calculations, including rendering the images displayed in the headset. This allowed users to move freely within an area of about 50 m². The computer also served as a relay for the Optitrack system and as a control centre to monitor the experience. Six cameras were set up around the space to detect users.
I was looking for the most effective way for users to be detected by these cameras. I came up with the idea of fitting the VR headset with antennas that look, let’s say, rather quirky. The reflective balls at the end of the antennas created a pattern that the cameras could translate into a position and an orientation. It turned out to be a reliable way to track participants.
Markers installed at the end of antennas, on top of the headset
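Turning a set of triangulated marker positions into a single position and orientation for the headset is a classic rigid-body fit. A minimal sketch of that idea follows, using the Kabsch algorithm; the marker layout is simply whatever pattern the antennas define, and none of this is Motive’s actual implementation.

```python
import numpy as np

def rigid_body_pose(model_points, measured_points):
    """Recover the headset's rotation and translation from its marker pattern.

    model_points:    (N, 3) marker positions in the headset's own frame
                     (the known antenna layout)
    measured_points: (N, 3) the same markers as triangulated by the cameras
    Returns (R, t) such that measured ≈ R @ model + t (Kabsch algorithm).
    """
    mc = model_points.mean(axis=0)
    sc = measured_points.mean(axis=0)
    H = (model_points - mc).T @ (measured_points - sc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t
```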
This single-user version simply put the user face to face with Gilad and Abu Khaled. It was a fifteen-minute experience that was completely unique on a technical level.
We presented this prototype many times: at France Télévisions, the French Embassy in New York, Sunnyside, MIT, Harvard, Chicago, the World Bank, the Futur en Scène festival, the Forum des images, IDFA and the Israel Film Fest. The highlight was definitely the presentation at the Tribeca Film Festival in April 2015.
This first prototype laid the foundations of the narrative we would use, and allowed us to test both the free-roaming experience and our ability to model the combatants in a perfectly realistic way. We were ready to push on and create the rest of The Enemy!
In the first prototype, the user is connected to a computer by a long cable.
Adapting the Virtual to the Real World
Operating warehouse-scale projects (large spaces in which one can move freely in VR) generally requires the ability to adapt to the venue. We want to be able to walk through the largest possible area, wearing a VR headset, without bumping into the walls. Depending on the venue, this area often differs architecturally: smaller, larger, longer or wider, completely maze-like or, as is the case at the MIT Museum, dotted with pillars.
The MIT Museum room is dotted with pillars
Our ability to adapt to the location is not only important for the experience itself, but also for all the tests and presentations. The Enemy set-up has been installed around thirty times since 2014!
At that time, we designed a calibration system for the prototypes that automatically adapted virtual rooms to the real world. This allowed us to conduct presentations and tests very quickly in new locations.
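The basic principle of such a calibration can be sketched as follows: measure a couple of reference points of the real room with the tracking system, then compute the rigid transform that lays the virtual floor plan over them. This is an illustrative simplification of what our system did; it assumes identical dimensions (no scaling), and the measurement procedure mentioned in the comments is hypothetical.

```python
import numpy as np

def room_alignment(virtual_a, virtual_b, real_a, real_b):
    """2D rigid transform mapping the virtual floor plan onto the real room.

    virtual_a, virtual_b: two reference points (x, z) in the virtual layout
    real_a, real_b:       the same two points measured in the real room,
                          e.g. by placing a tracked headset on each one
                          (the measurement procedure here is illustrative).
    Returns (angle, translation) such that real ≈ Rot(angle) @ virtual + translation.
    """
    v = np.asarray(virtual_b, dtype=float) - np.asarray(virtual_a, dtype=float)
    r = np.asarray(real_b, dtype=float) - np.asarray(real_a, dtype=float)
    angle = np.arctan2(r[1], r[0]) - np.arctan2(v[1], v[0])
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    translation = np.asarray(real_a, dtype=float) - rot @ np.asarray(virtual_a, dtype=float)
    return angle, translation
```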
The final version is more rigid because the lighting is partly pre-computed, but it is of course still possible to adapt the virtual environment to the real world manually.