Here's Google's New Push To Make Virtual Reality Better And More Accessible

The company spent much of its 2017 developer conference talking about a technology that seems to be perpetually looming on the horizon.

At Google I/O on Thursday, the company announced a series of efforts aimed at making the immersive digital worlds of virtual reality (VR) and augmented reality (AR) more practical and universally accessible.

Standalone Daydream (Google's VR brand) headsets are coming, along with a reference design that other companies can follow; the first partners are HTC and Lenovo. Previously, Google's efforts have required a Daydream-ready phone inserted into a viewer. Currently, only eight such phones are on the market, although a few more, including Samsung's Galaxy S8, are on the way.

The standalone headset experience is closer to what you currently get in an Oculus or HTC Vive headset, in that it responds not just to turning your head but to physical movement as well. You can not only look all around, you can get up and walk, at least short distances. You can peer around corners and see the parallax movement of objects in the scene as you explore VR worlds.

But the major difference is that Google's solution doesn't require any additional hardware to be set up: you don't need to connect it to a computer or place motion-tracking towers around the room. Instead, the headset tracks motion entirely on its own, using a technology called WorldSense, which builds on the company's Tango indoor-mapping and spatial-awareness technology.
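To make that distinction concrete, here is a minimal, purely illustrative sketch in Python. The names (Pose, read_imu_rotation, estimate_translation_from_cameras) are hypothetical stand-ins, not Google's WorldSense API; the sketch only shows the general idea of inside-out tracking, contrasting a rotation-only (3DoF) phone-in-viewer headset with a standalone (6DoF) headset that also estimates its own position from onboard sensors.

# Conceptual sketch only: hypothetical names, not Google's WorldSense API.
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float = 0.0    # rotation: tracked by both phone-in-viewer and standalone headsets
    pitch: float = 0.0
    roll: float = 0.0
    x: float = 0.0      # position: tracked only with inside-out (6DoF) tracking
    y: float = 0.0
    z: float = 0.0

def read_imu_rotation():
    # Stand-in for gyroscope/accelerometer readings on the headset itself.
    return 0.01, 0.0, 0.0

def estimate_translation_from_cameras():
    # Stand-in for position estimated from the headset's own onboard cameras,
    # with no external towers and no connected PC.
    return 0.001, 0.0, 0.0

def update(pose: Pose, six_dof: bool = True) -> Pose:
    d_yaw, d_pitch, d_roll = read_imu_rotation()
    pose.yaw += d_yaw
    pose.pitch += d_pitch
    pose.roll += d_roll
    if six_dof:  # the standalone-headset case: add positional movement
        dx, dy, dz = estimate_translation_from_cameras()
        pose.x += dx
        pose.y += dy
        pose.z += dz
    return pose

if __name__ == "__main__":
    pose = Pose()
    for _ in range(100):  # one loop iteration per rendered frame
        update(pose, six_dof=True)
    print(pose)

In a rotation-only headset the x, y, and z fields would simply never change, which is why you can look around but not lean or walk; tracking position on the device itself is what the standalone Daydream headsets add.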

The company also showed off Seurat, named for the painter, a new developer tool meant to help create vivid, high-resolution graphics and video that run in real time. The idea is that this will allow developers to create ever more realistic worlds, even on a mobile headset that doesn't have the power of a connected desktop machine.

There were other developer-oriented announcements as well, designed to make building for Google's platform easier and more attractive, along with features meant to make VR a less solitary experience: new sharing tools, ways to project what you see in a headset onto a TV, and ways to watch VR video on YouTube with other people.

Yet the announcements themselves were almost less interesting than the weight Google is putting behind its push. For the second year in a row, the company devoted much of the time and resources of its annual developer conference to pushing VR and AR (or, as Clay Bavor, who heads the company's efforts, calls the combination of the two and the spectrum they lie on, "immersive computing").

There is still a long way to go before these immersive computing experiences are mainstream. But, especially when taken alongside Facebook’s similar push, we are clearly entering an era of new interfaces and inputs — away from the keyboard and touchscreen and into an era that’s more guided by what we see around us, the things we hear and say, and the way we physically move through the world.

The current devices meant to achieve almost all of this are, well, clunky. But a picture is starting to emerge of where all of this is going, and it's a place where the devices themselves fade away and we begin to interact with technology in more natural, human ways. We will see and hear, talk and gesture. And sometimes even type.
