Share AR worlds and play multiplayer games, no matter what platform you're on.

MOUNTAIN VIEW, CALIF.—Google is launching a new version of its augmented reality platform for Android, ARCore 1.2. Version 1.2 adds support for wall detection, launching an AR experience via image recognition, a new "Sceneform" framework, and a "Cloud Anchors" feature that enables shared experiences not just across Android devices—it works on top of iOS' ARKit, too.

Google launched ARCore version 1.0 in February as its big reboot of the Project Tango augmented reality project. Where Tango was focused on special hardware with extra sensors and cameras, ARCore tries to replicate some of that functionality on a normal smartphone. ARCore doesn't work on every single Android phone; instead, it works on a model-by-model basis (mostly on flagships) and requires some work from the device manufacturer. Most of the major Android OEMs, including Samsung, LG, and OnePlus, have signed on, though, and today ARCore has a supported install base of more than 100 million devices.

Any developer who wants to can make an ARCore app, which will usually involve 3D objects overlaid on a video feed. The goal of ARCore (and any AR app, really) is to do enough detection of the world to make the 3D object seem like it exists as a real object inside the video feed. On the other side of the aisle, Apple's AR framework is called ARKit. While Google got into AR first with Project Tango all the way back in 2014, Apple was the first to bring AR to mainstream devices when it rolled ARKit out to several years' worth of existing iPhones in 2017.

Unifying AR with cloud anchors

The big news is the addition of "cloud anchors," which, for the first time, can enable multiplayer AR experiences. Cloud anchors will sync the location of a virtual item in 3D space (the anchor) up to the cloud, and then send it back down to other devices. So if you and a friend have the same AR app open on two different devices, you could both see the same virtual object in the same location. This could enable things like a multiplayer game using a virtual tic-tac-toe board or whatever else a developer wants to cook up.

The real kicker is that cloud anchors don't just work across different Android devices; Google is also building a library for iOS devices. ARCore is still an Android-only augmented reality framework, but a special cloud anchor library can run on top of iOS' ARKit. On iOS, the system would use all the built-in ARKit functionality, like motion tracking and environmental understanding, and just sync that data to the cloud. Assuming a developer builds the same app for both platforms, you can sync AR data between Android and iOS to share an experience no matter which platform you're on.

ARCore's motion tracking works by identifying distinctive feature points in the camera image and building a sparse point cloud. Cloud Anchors store that information in Google's cloud and give the developer an ID for it. That ID can be shared with other users, whose devices compare their own point clouds against the stored one and place the AR object in the same spot.
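
To make that concrete, here's a rough sketch of the hosting side using the ARCore Java API; the `session` and `localAnchor` objects stand in for an app's existing ARCore setup, and the once-per-frame polling is simplified:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Cloud Anchors have to be switched on in the session config first.
Config config = new Config(session);
config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
session.configure(config);

// Host an existing local anchor to Google's cloud. The call returns
// a new Anchor immediately; the upload itself happens asynchronously.
Anchor cloudAnchor = session.hostCloudAnchor(localAnchor);

// Poll the cloud state once per frame until hosting completes.
if (cloudAnchor.getCloudAnchorState() == Anchor.CloudAnchorState.SUCCESS) {
    // This ID is what gets shared with other users so their devices
    // can resolve the same anchor.
    String cloudAnchorId = cloudAnchor.getCloudAnchorId();
}
```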

Cloud Anchors are only meant to initialize an AR world: they sync the location of AR objects across devices and not much else. From there, a developer can use their own multiplayer networking system over local Wi-Fi or the Internet. That normal multiplayer networking stack would handle updating an object's location, status changes, progression, and anything else you would normally need in a multiplayer experience. The Cloud Anchor system handles the initial AR world sync in a few seconds, and from there the latency is just whatever the latency of a normal multiplayer network would be.
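
The receiving side is the mirror image of the hosting sketch above. Here, `cloudAnchorId` is assumed to have arrived through the app's own networking layer:

```java
// Turn the shared ID back into a local anchor on the second device.
Anchor resolved = session.resolveCloudAnchor(cloudAnchorId);

// As with hosting, resolution is asynchronous; check the state each frame.
if (resolved.getCloudAnchorState() == Anchor.CloudAnchorState.SUCCESS) {
    // resolved.getPose() now describes the same physical spot the host
    // device anchored to. Attach your scene content here, then let your
    // own multiplayer stack handle every update after this initial sync.
}
```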

Wall support, image detection, and an easier SDK

ARCore 1.0 could only detect horizontal planes, but with version 1.2 it can now detect walls and other vertical surfaces. This will be great for things like placing virtual furniture against a wall in a house, hanging a picture on a wall, or opening a portal to another dimension in your doorway. It also keeps pace with ARKit, which announced vertical plane detection back in January. Full wall occlusion support is still missing, but there is enough information to detect when an object is on the wrong side of a wall and display something appropriate to the user.
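
For developers, opting in is a one-line config change; this sketch assumes an existing ARCore `session`:

```java
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Horizontal-only detection is still the default; walls are opt-in.
Config config = new Config(session);
config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);
session.configure(config);
```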

"Augmented Images" is a new feature that can launch an AR experience by pointing the phone camera at an object. This is more than the usual QR codes and AR markers—it is also a general image-recognition system. Developers can specify up to 1,000 2D images for their app, allowing ARCore to detect things like a product box or movie poster without the clunky square AR markers. Aiming the phone at one of these objects could do something like bring the movie poster to life or show assembly instructions for a product.

Google is also making the Android developer side of the AR equation easier with the release of the Sceneform SDK. Because augmented reality involves rendering 3D objects, it effectively asks developers to become game developers, and jumping into the world of the Android NDK, OpenGL, and game engines like Unity can be daunting for Android's 2D Java developers. The Sceneform SDK is an Android Studio plugin and runtime API that lets developers create an AR experience without dealing with the whole 3D development stack. Sceneform includes lots of common 3D user-interface widgets, so users can select an object or move things around without the developer having to reinvent the wheel.
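
As a sketch of what that looks like in practice, assuming an `ArFragment` already in the layout, a tap-derived `anchor`, and a placeholder "chair.sfb" asset produced by the plugin's import step:

```java
import android.net.Uri;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.TransformableNode;

// Asynchronously load an asset the Sceneform Android Studio plugin
// converted to its .sfb runtime format; no OpenGL code involved.
ModelRenderable.builder()
    .setSource(context, Uri.parse("chair.sfb"))
    .build()
    .thenAccept(renderable -> {
        // Hang the model off the tapped plane. TransformableNode gives
        // the user built-in select/move/rotate/scale gestures for free.
        AnchorNode anchorNode = new AnchorNode(anchor);
        anchorNode.setParent(arFragment.getArSceneView().getScene());
        TransformableNode node =
            new TransformableNode(arFragment.getTransformationSystem());
        node.setRenderable(renderable);
        node.setParent(anchorNode);
        node.select();
    });
```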