
SAN JOSE, Calif.—As enjoyable as untethered virtual reality is on devices like Samsung's Gear VR and Google Daydream, these portable experiences are held back by their lack of full positional head-tracking. Without it, you're pretty much stuck sitting in one place and swiveling your head—you can't even shift side-to-side or lean down to look at a virtual object without the entire virtual world moving along with you.

As Oculus announced Thursday, the company is actively looking to fix this problem with "a completely new category" of virtual reality headset featuring "inside-out" tracking technology. By using cameras built into the device itself, combined with computer vision algorithms, inside-out tracking lets a headset calculate your position and head angle as you walk freely about a room, with no external cameras or PC/console tether required.
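To get a feel for the core idea, here's a deliberately simplified sketch—not Oculus' actual pipeline, which involves multiple cameras and far more sophisticated computer vision. Once the headset has identified fixed features in the room, estimating its own position boils down to solving for the one spot consistent with all its measurements. This toy example recovers a 2D position from distances to three known landmarks (trilateration); the landmark coordinates and standing position are made up for illustration:

```python
import math

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Solve for (x, y) given three landmark positions and distances to each."""
    # Subtracting the circle equations pairwise cancels the quadratic
    # terms, leaving a 2x2 linear system in x and y:
    # 2(x2-x1)x + 2(y2-y1)y = r1^2 - r2^2 + x2^2 - x1^2 + y2^2 - y1^2
    ax, ay = 2 * (p2[0] - p1[0]), 2 * (p2[1] - p1[1])
    b1 = r1**2 - r2**2 + p2[0]**2 - p1[0]**2 + p2[1]**2 - p1[1]**2
    cx, cy = 2 * (p3[0] - p2[0]), 2 * (p3[1] - p2[1])
    b2 = r2**2 - r3**2 + p3[0]**2 - p2[0]**2 + p3[1]**2 - p2[1]**2
    det = ax * cy - ay * cx  # solve by Cramer's rule
    x = (b1 * cy - b2 * ay) / det
    y = (ax * b2 - cx * b1) / det
    return x, y

# Three hypothetical "features" locked onto, in room coordinates (meters).
landmarks = [(0.0, 0.0), (3.0, 0.0), (0.0, 6.0)]
true_pos = (1.2, 2.5)  # where the wearer is actually standing
dists = [math.dist(true_pos, lm) for lm in landmarks]
est = trilaterate(*landmarks, *dists)  # recovers (1.2, 2.5)
```

A real headset instead observes 2D feature positions in four camera images and solves for a full 6-degree-of-freedom pose (position plus orientation), fusing the result with inertial sensors at high frame rates—but the principle is the same: known references in, your own pose out.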

I got a chance to try out an early prototype of Oculus' effort, dubbed "Santa Cruz," at the company's Oculus Connect conference this week. The prototype isn't perfect, but it already shows a lot of promise and a distinct vision for what portable VR will look like in the future.

Oculus wouldn't let me take pictures of the Santa Cruz headset (or even record audio of my demo time), but there isn't much to see. The prototype looks like a retail Oculus Rift (sans wires) with the addition of a semi-exposed motherboard, roughly the size of a deck of playing cards, driving things from the rear of the headstrap. The glass apertures of four outward-facing cameras (about a centimeter in diameter each), embedded in the four corners on the front of the display housing, are barely noticeable.

I got to put on the headset in a specially designed demo room that measured about 10 x 20 feet. Most of the room was bare, save for a large rug, but the sides were cluttered with all sorts of random knickknacks—a guitar hanging on the wall, a set of chairs and a desk in a corner, etc. The point, it seemed, was to prove that the system's computer vision algorithms could handle complex surfaces just as well as flat floors and ceilings.

That point was well made. I was only in the headset for five minutes or so, but in that time I was able to walk around the virtual space unencumbered by any sort of wires. I crouched down to look at a happy, bouncing flower. I craned my neck to see inside a window where someone was talking on a phone. I stepped forward to stand next to a fire truck pouring water through a window into an apparently raging inferno.

The tracking wasn't quite as robust as with a retail headset like the Oculus Rift or HTC Vive. At times, the system took a few noticeable frames to "catch up" to my new position, especially if I moved my head quickly to one side. Once the system caught up, though, the virtual world around me felt like a solid, "real" physical space. The overall effect was much more convincing than other inside-out AR and VR solutions I've tried at trade shows, all of which have had a much more noticeable "judder" effect and significant lag.

The larger issue with the headset, as it always is for mobile VR, comes down to raw processing power. While the Santa Cruz prototype runs at the same resolution (and looks just as sharp) as the retail Oculus Rift, the scene I was shown was incredibly sparse. Characters and objects in the virtual world were made of large, blocky polygons with simple, single-color shadings. The environment reminded me of a smoother, stereoscopic version of an early game on the original PlayStation (or worse, the SNES' Super FX chip).

The engineers on hand dodged my questions about the precise processing power available in the prototype, but I suspect the extremely simple demo scene I was shown was near the limit of what the prototype can handle smoothly at this point. That would be understandable for new technology running on a tiny motherboard with an extremely small and light physical footprint—hardware that surely has to devote significant power to those complex computer vision algorithms, to boot.

That said, untethered mobile headsets are already at a processing disadvantage when compared to their tethered cousins. The large, heavy, heat-prone CPUs and GPUs that drive a top-of-the-line PC would never work in a totally portable VR solution (backpack laptops notwithstanding). Add the extra overhead of crunching four camera views through a complex computer vision algorithm and the processing gap is only going to become more apparent (wireless transmission of scenes generated on a PC tower could solve this problem, but that currently seems like a pipe dream at the latency and frame rates needed for good VR).

Those are issues for another day, though. For now, the Santa Cruz experience convinced me that, with a little more work, wireless head-tracking is a problem that will be solved in the near future. As mobile processors continue to improve, we'll soon be at the point where a completely wireless headset can provide "good enough" virtual reality complete with convincing tracking driven by internal cameras. Just don't expect it to ever fully match the raw visual quality of a high-end, tethered VR experience.