Google is taking yet another step to commercialise its augmented reality platform and bring AR to millions of Android smartphones. After launching the Tango AR project back in 2014, the Mountain View-based tech giant has now introduced a new SDK called ARCore, which will make adoption of AR easier on Android devices by eliminating the need for additional hardware.
The key difference between Google Tango and the new ARCore platform is that Tango relies on dedicated hardware: its computer vision-based motion tracking, area learning, and depth perception all depend on extra sensors built into the phone. Only two smartphones support the Tango platform as of now – the ASUS ZenFone AR and the Lenovo Phab 2 Pro. Given the limitation of having to house additional sensors in devices, Google Tango is yet to see an enthusiastic response from smartphone makers.
With the ARCore SDK, Google aims to ease the adoption of augmented reality on Android devices. The new SDK will help developers create augmented reality applications for Android without the need for additional sensors or hardware upgrades.
According to TechCrunch, Google is even re-branding its existing Tango-enabled devices as ARCore devices, while the depth-sensing technology and camera hardware created for Tango will remain very much a part of the company's AR portfolio. Essentially, Tango is not dying; it is just being pulled into the ARCore family.
ARCore will use three main technologies – motion tracking; environmental understanding, which detects the size and location of flat horizontal surfaces like the ground; and light estimation, which lets the phone gauge the environment's current lighting conditions. ARCore is designed to work on devices running Android Nougat or later, and the SDK preview currently supports the Google Pixel, Pixel XL, and Samsung Galaxy S8.
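To make this concrete, here is a minimal sketch in Java of how those three capabilities are switched on, written against the ARCore Java API as it later shipped (class names in the 2017 preview SDK may differ); the helper class and method names are illustrative, not part of the SDK:

```java
import android.content.Context;

import com.google.ar.core.Config;
import com.google.ar.core.Session;

public class ArSetup {
    // Creates an ARCore session with the three core capabilities enabled.
    public static Session createSession(Context context) throws Exception {
        Session session = new Session(context); // motion tracking is always on

        Config config = new Config(session);
        // Detect flat horizontal surfaces such as floors and table tops.
        config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL);
        // Estimate the average lighting of the surrounding scene.
        config.setLightEstimationMode(Config.LightEstimationMode.AMBIENT_INTENSITY);
        session.configure(config);

        return session;
    }
}
```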
As the phone moves around, ARCore tracks its position and, as per Google, "builds its own understanding of the real world." Its motion tracking works through the phone's camera, which identifies interesting feature points in the scene. "With a combination of the movement of these points and readings from the phone's inertial sensors, ARCore determines both the position and orientation of the phone as it moves through space," Google writes in a blog post. It can also detect flat surfaces like table tops and the floor, and can estimate the average lighting in the surrounding area.
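A rough sketch of that per-frame flow, under the same assumption that the shipped Java API is representative of the preview: once per rendered frame the app polls the session and reads back the camera pose, any detected planes, and the light estimate.

```java
import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;
import com.google.ar.core.Plane;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;

public class ArFrameReader {
    // Called once per rendered frame, e.g. from a GL render callback.
    public static void onDrawFrame(Session session) throws Exception {
        Frame frame = session.update(); // advances ARCore's world understanding
        Camera camera = frame.getCamera();

        if (camera.getTrackingState() == TrackingState.TRACKING) {
            // Position and orientation of the phone in world space, derived
            // from camera feature points plus the inertial sensor readings.
            Pose pose = camera.getPose();
            float x = pose.tx(), y = pose.ty(), z = pose.tz();
        }

        // Flat surfaces ARCore has detected so far (table tops, the floor, ...).
        for (Plane plane : session.getAllTrackables(Plane.class)) {
            if (plane.getTrackingState() == TrackingState.TRACKING) {
                float widthMetres = plane.getExtentX();
            }
        }

        // Average lighting of the current camera image.
        LightEstimate light = frame.getLightEstimate();
        if (light.getState() == LightEstimate.State.VALID) {
            float intensity = light.getPixelIntensity();
        }
    }
}
```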
Illustrating how ARCore works, Google writes, "You can place a napping kitten on the corner of your coffee table, or annotate a painting with biographical information about the artist. Motion tracking means that you can move around and view these objects from any angle, and even if you turn around and leave the room, when you come back, the kitten or annotation will be right where you left it."
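That persistence comes from anchors: a hit test against a detected plane yields a real-world pose, and an anchor created there is what ARCore keeps pinned in place as its tracking refines. A hypothetical tap handler, again sketched against the shipped Java API, might look like this:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.core.Trackable;

public class ArPlacement {
    // Places virtual content (the napping kitten, say) where the user tapped.
    // Returns null if the tap did not land on a detected plane.
    public static Anchor placeOnTap(Frame frame, float tapX, float tapY) {
        for (HitResult hit : frame.hitTest(tapX, tapY)) {
            Trackable trackable = hit.getTrackable();
            if (trackable instanceof Plane
                    && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                // The anchor, not a raw coordinate, is what survives the user
                // leaving the room: ARCore keeps updating it as tracking improves.
                return hit.createAnchor();
            }
        }
        return null;
    }
}
```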
Google's new augmented reality push comes as Apple readies itself to officially roll out its ARKit platform. ARKit will be available on iPhones and iPads with the official iOS 11 release. It uses Visual Inertial Odometry (VIO) to track motion and objects by combining camera sensor data with CoreMotion data, enabling iPhones and iPads with A9 and A10 processors to accurately track depth and movement without any need for calibration. Like Google's ARCore, Apple's ARKit can detect horizontal plane surfaces, allowing users to place objects on them, and its light estimation system works in much the same way as ARCore's.
Multiple developers have already showcased some exciting new applications for Apple's ARKit. Google has just opened up early preview access to ARCore so that developers can start building apps and the company can gather feedback on this early version of the API. "This preview is the first step in a journey to enabling AR capabilities across the Android ecosystem," says Google.