Niantic Labs, the San Francisco-based game developer responsible for creating the massively successful augmented reality game Pokémon Go, plans to open up the underlying AR platform behind its products to third-party developers.
In a meeting with reporters yesterday at its headquarters, CEO John Hanke gave a detailed overview of that technology — what Niantic calls its Real World Platform. It’s the engine behind the AR experiences in Ingress, Pokémon Go, and the upcoming game the company is developing alongside Warner Bros. And Niantic says it’s getting better all the time.
Niantic also said it has acquired Matrix Mill, a London-based computer vision company. That follows its acquisition in February of a company called Escher Reality that is now helping Niantic develop cross-platform shared AR experiences that can involve multiple people in the same interactive digital space.
In a series of demos, Niantic showed off some of the new and experimental capabilities of its Real World Platform, some made possible by Matrix Mill’s sophisticated machine learning work and others drawing on the shared AR expertise Escher provides. One involved a new AR visual technique Niantic calls occlusion, which allows virtual creatures like a 3D Pikachu to blend more realistically into real-world environments. It works by using machine learning to train a neural network that can reliably, and in real time, parse a live scene with moving parts, so that people and objects in the foreground obscure virtual creatures and hide them from sight when necessary.
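The occlusion idea described above boils down to a per-pixel depth test: a virtual pixel is drawn only where it is estimated to be closer to the camera than the real scene behind it. Below is a minimal sketch in Python with NumPy, not Niantic’s actual pipeline; the `composite_with_occlusion` function and its inputs are hypothetical, with the scene depth map standing in for what a trained neural network would predict from the live camera feed.

```python
import numpy as np

def composite_with_occlusion(camera_frame, virtual_layer, scene_depth, virtual_depth):
    """Blend a rendered virtual layer into a camera frame, hiding virtual
    pixels wherever the real scene is estimated to be nearer to the camera.

    camera_frame, virtual_layer: (H, W, 3) uint8 images
    scene_depth:   (H, W) estimated depth of the real scene per pixel
                   (in practice, predicted by a neural network)
    virtual_depth: (H, W) depth of the rendered virtual object;
                   np.inf where no virtual pixel exists
    """
    # A virtual pixel is visible only where it sits in front of the real scene.
    visible = virtual_depth < scene_depth
    out = camera_frame.copy()
    out[visible] = virtual_layer[visible]
    return out
```

In this toy formulation, a person walking in front of a virtual Pikachu simply produces a region where `scene_depth` is smaller than `virtual_depth`, so those pixels keep the camera image and the creature appears to pass behind them.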
“Imagine, for example, that if our platform is able to identify and contextualize the presence of flowers, then it will know to make the tiny bee pokémon, Combee, appear. Or, if the AR can see and contextualize a lake, it will know to make the duck pokémon, Psyduck, appear,” Hanke explained in a blog post. “Recognizing objects isn’t limited to understanding what they are, but also where they are. One of the key limitations of AR currently is that AR objects cannot interact meaningfully in a 3D space. Ideally, AR objects should be able to blend into our reality, seamlessly moving behind and around real world objects.”
Another demo Niantic showed off illustrated how, with the talent and technology it acquired from Escher, it can develop apps that let multiple people interact in a shared AR environment, regardless of what type of device they’re using. To do so, Niantic says it developed a low-latency AR networking technique that removes the need for a smartphone to communicate with a server before establishing a connection to a nearby user. Instead, each device in the network communicates directly with the others over the cellular connection, allowing for lower-latency connections and more immediate interactions with other players. The company built a demo game it calls Neon to show off the technique:
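The core of the approach described above is that devices exchange state with each other directly rather than routing every update through a game server. As a rough, hypothetical illustration only (Niantic has not published its protocol), the sketch below has two “devices” on one machine swap a pose update over UDP with no server in between; the `p2p_exchange` function and the message format are invented for the example.

```python
import socket

def p2p_exchange():
    """Two peers exchange an AR pose update directly over UDP, with no
    coordinating server in the middle (a toy stand-in for low-latency
    peer-to-peer AR networking)."""
    a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    a.bind(("127.0.0.1", 0))
    b.bind(("127.0.0.1", 0))
    # Device A sends its pose update straight to device B's address.
    a.sendto(b"pose:x=1.0,y=2.0", b.getsockname())
    data, _addr = b.recvfrom(1024)
    a.close()
    b.close()
    return data.decode()
```

Cutting out the server round-trip is what makes the latency win possible: each update travels one hop between players instead of two.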
In the future, Hanke says he wants the Niantic Real World Platform to operate much like Amazon Web Services does for cloud computing. In other words, app makers will be able to tap into the power of its platform from anywhere in the world to develop their own experiences and services that utilize AR technology and tools. Niantic has yet to release any concrete information about revenue sharing, or whether the company would take a cut of apps developed using its technology. But the developer plans on releasing more information about the Real World Platform and any API capabilities in the coming months.
Niantic has put up a website where developers can get more information about its Real World Platform and how to apply to gain access to it. “Because we are so excited about the opportunity in advanced AR, we want other people to be able to make use of the Niantic Real World Platform to build innovative experiences that connect the physical and the digital in ways that we haven’t yet imagined,” Hanke said. “We will be selecting a handful of third-party developers to begin working with these tools later this year.”