Crystal Cove sim

7 Mar 2014

One of the features of the Oculus ‘Crystal Cove’ prototype revealed at CES was the long-promised positional tracking, via an external camera and LEDs on the headset. It’s important for reducing simulator sickness, as it allows near-1:1 tracking of your motion. But more critically, such tracking allows new gameplay and experience opportunities: peering around corners, inspecting objects and the world as naturally as you do in real life. Want to look at something a bit more closely? Move your head.

Continuing my theme of trying to understand what’s coming by making things, and building with that future in mind, I’ve produced a first pass at a demo of positional tracking, à la Crystal Cove.

The demo uses your webcam to track your head, allowing four degrees of motion tracking, shy (currently) of the six needed to fully track head motion. This is reflected in a scene of the Crystal Cove prototype (thanks Chris!) and a view of what the IR-filtered camera sees.
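The core idea can be sketched as follows. This is a minimal illustration, not the demo’s actual code: it assumes a face detector giving a normalized bounding box, and the constants (reference face width, lateral scale) are made-up calibration values.

```javascript
// Sketch: turning a webcam face detection into a head position estimate.
// face: { x, y, w } — centre of the detected face box and its width,
// all normalized to [0, 1] in camera-image coordinates.
const REF_FACE_WIDTH = 0.25; // assumed face width at the calibration distance
const LATERAL_SCALE = 0.6;   // assumed head travel (in scene units) across the frame

function headPositionFromFace(face) {
  // A bigger face box means the head is closer to the camera.
  const depth = REF_FACE_WIDTH / face.w;
  return {
    x: (face.x - 0.5) * LATERAL_SCALE * depth, // left/right
    y: (0.5 - face.y) * LATERAL_SCALE * depth, // up/down (image y grows downward)
    z: depth                                   // towards/away from the camera
  };
}
```

Feeding this position into the virtual camera each frame is what lets you lean and peer around the scene; rotation still comes from the headset’s own orientation sensors.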

What is it good for, specifically? If extended, you could use the simulated IR camera data to develop and test tracking algorithms, comparing them against ‘true’ data, such as the webcam head tracking. So, engineering.
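For that kind of comparison you need an error metric between the estimated track and the reference track. A common, simple choice is root-mean-square positional error; this sketch is illustrative, not part of the demo:

```javascript
// Sketch: score an estimated head track against a reference track using
// root-mean-square error over 3D positions. Both arguments are arrays of
// { x, y, z } samples of equal length, assumed to be time-aligned.
function rmsError(estimated, reference) {
  let sum = 0;
  for (let i = 0; i < reference.length; i++) {
    const dx = estimated[i].x - reference[i].x;
    const dy = estimated[i].y - reference[i].y;
    const dz = estimated[i].z - reference[i].z;
    sum += dx * dx + dy * dy + dz * dz;
  }
  return Math.sqrt(sum / reference.length);
}
```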

Another possibility is for marketing. While an in-person demo might be the ideal way to showcase the Rift to consumers there are other contexts such as the web in which it could be valuable to demonstrate the various functional aspects of the technology, interactively.

But of course the main goal would be actual in-game usage of positional tracking. A very basic demo, Fifa VR, demonstrates being able to lean in to get a better view, and to move one's head up and down to change one's perspective. Positional tracking opens up a ton of interesting possibilities.

More to come on a fuller native (Unity) tracking solution, no DK2 needed.