I read about the Oculus Rift, a stereo-view, motion-tracking gaming headset, a while ago, but it was only at the recent Unity3D developer conference that I got to try it in person. The Oculus people were using an updated version of the developer edition for the demo, so I was a little apprehensive about getting a standard one for myself. But it’s a gadget, and I’ve been trying to build my own versions for years now, so I couldn’t resist. (The nearest I got with my own experiments was a pair of Geordi La Forge-esque video goggles with an iPhone duct-taped onto the side for motion tracking, but that lacked stereo viewing and looked quite ridiculous.) After a long couple of weeks, my own Rift arrived and I was able to give it a go myself. I wasn’t disappointed.

Even the provided storage box is cool.

The development kit consists of the headset itself (which feels surprisingly well made for a piece of dev kit), three pairs of eye-cups for perfect, near, and very near sight correction, and a black box with USB and video inputs. USB and HDMI cables (and a DVI adaptor) are provided, but if your PC doesn’t have HDMI or DVI output, you’ll need to get another adaptor. You then hook the black box up to your PC, put the headset on, and proceed to have your mind blown.

The black box. Don’t forget, as I did, that there is an on/off switch. If you forget the switch, the view stays black no matter how much you swear at it.

My eyesight is pretty bad (around -4.25), so I had the option of keeping my glasses on or trying the included eye-cups rated for very-near-sightedness. Unfortunately, the near-sighted cup is not quite strong enough to make things clear, but it’s better than squeezing my glasses into the headset. A different pair of specs might help, as would an eye-cup custom-made for my prescription, but if you wear contacts or have decent eyesight, you’ll be fine.

A view of the two eye-cups.

The display inside the headset is a single LCD panel, and the lenses force each eye to see a particular half. This means your computer renders the same scene from two viewpoints, side by side. The individual pixels are definitely noticeable because you are staring at an LCD screen that is centimeters from your eyes (this is when a Retina-class screen is really needed!), but it doesn’t really matter, because you are immediately sucked into the scene. The reason you are sucked in is that the view isn’t just stereo: it moves as you move your head. Some very nice tracking hardware and software detects your head movements, so when you look around (up, down, left, and right) the scene changes, and it changes quickly enough that you believe what you are seeing is almost real. It’s one of those things you need to see for yourself to get the full impact.
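To make the side-by-side idea concrete, here is a minimal sketch in Python (illustrative only, with made-up numbers; the actual Rift SDK handles all of this for you, plus lens-distortion correction). It splits the dev kit’s 1280×800 panel into two per-eye viewports and offsets each eye’s camera by half a typical interpupillary distance:

```python
# Illustrative sketch of side-by-side stereo setup. Not the Oculus SDK:
# the real SDK also applies per-eye lens-distortion correction.

SCREEN_W, SCREEN_H = 1280, 800   # dev-kit panel resolution
IPD = 0.064                      # typical interpupillary distance, meters

def eye_viewports(width, height):
    """Split the panel into left and right halves, one per eye."""
    half = width // 2
    left = (0, 0, half, height)      # (x, y, w, h)
    right = (half, 0, half, height)
    return left, right

def eye_positions(head_pos, ipd=IPD):
    """Offset each eye's camera by half the IPD along the head's x axis."""
    x, y, z = head_pos
    return (x - ipd / 2, y, z), (x + ipd / 2, y, z)

left_vp, right_vp = eye_viewports(SCREEN_W, SCREEN_H)
left_eye, right_eye = eye_positions((0.0, 1.7, 0.0))  # standing head height
```

Render the scene once per eye, each from its own position into its own viewport, and the lenses do the rest.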

You too can look like an open-mouthed idiot.

The demo app I played with is called Tuscany, and in it you simply walk around a virtual house on a cliff on a beautiful Italian day. After a few seconds, your brain is convinced you really are walking around the garden. Put on a pair of headphones for an even more immersive effect.

If you watch someone wearing the headset and virtually steer them towards the cliff, they’ll yell and lean back or put up their hands to avoid falling or bumping into something. My brother tried to regain his bearings by leaning on a wall that didn’t exist. The biggest shock comes when you look around to find your computer, and you can’t see it. Or your hands. Or your feet. It’s an odd feeling.

Scenes from the demo, showing how the view is split vertically.

Some people complain of motion sickness, a very odd kind of sickness that occurs because your brain is convinced your body is moving, but it’s not. So far, I’ve not had this feeling – and I am very, very prone to motion sickness – which I think is due to using demos in which you DO move your head around to get a view, rather than being pushed around to new viewpoints by the software.

Programming and beyond

The great thing about the Oculus Rift is that it’s a dev kit, and if you can write code, you can use it in your own apps. Any DirectX app can use it, of course, because it’s just a side-by-side view, but the simplest route is to author your 3D worlds in Unity3D: drop in a few lines of code and you’re supporting the Rift.
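Under the hood, the key piece those few lines give you is head tracking: each frame, the tracker’s orientation is applied to your camera. Here is a toy Python sketch of that idea (my own simplification: the real tracker reports full orientation as a quaternion, including roll; here I just turn yaw and pitch angles into a forward-looking direction vector):

```python
# Toy head-tracking math, not the Oculus SDK. Converts a head yaw/pitch
# reading into the unit vector the camera should look along.
import math

def look_direction(yaw, pitch):
    """Forward vector for a head rotated yaw radians about the vertical
    axis and pitched up/down by pitch radians. (0, 0) looks down +z."""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

ahead = look_direction(0.0, 0.0)          # looking straight ahead
right = look_direction(math.pi / 2, 0.0)  # head turned 90° to the right
```

Feed the tracker’s readings into something like this every frame, render from the resulting direction, and the scene follows your head.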

Now, as I mentioned, it’s odd not seeing your hands in your virtual world, and this can make creating a user interface a little tricky. Some of the demos I’ve tried have you move your head around to select options from a floating menu, which is pretty cool. Another uses a Razer Hydra, which tracks your hands’ positions in real space and lets you move around and interact with virtual objects. The Leap Motion controller could also be used for input. It’s all really getting rather TRON-meets-Inception-meets-Iron Man!

I’m still playing with the demos, but I hope to start writing some apps of my own soon. I’ve plenty of ideas, and if you do too, I suggest you order your own Oculus dev kit. It’s cutting-edge stuff.



