
Updates to ARCore Help You Build More Interactive & Realistic AR Experiences

Posted by Anuj Gosalia

A little over a year ago, we introduced ARCore: a platform for building augmented reality (AR) experiences. Developers have been using it to create thousands of ARCore apps that help people with everything from fixing their dishwashers, to shopping for sunglasses, to mapping the night sky. Since last I/O, we've quadrupled the number of ARCore-enabled devices to an estimated 400 million.

Today at I/O, we introduced updates to Augmented Images and Light Estimation, features that let you build more interactive and realistic experiences. And to make it easier for people to experience AR, we introduced Scene Viewer, a new tool that lets users view 3D objects in AR right from your website.

Augmented Images

To make experiences appear realistic, we need to account for the fact that things in the real world don’t always stay still. That’s why we’re updating Augmented Images — our API that lets people point their camera at 2D images, like posters or packaging, to bring them to life. The updates enable you to track moving images and multiple images simultaneously. This unlocks the ability to create dynamic and interactive experiences like animated playing cards where multiple images move at the same time.

An example of how the Augmented Images API can be used with moving targets, by JD.com
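
On Android, this behavior is exposed through the Augmented Images API in the ARCore SDK. The following Java sketch (a minimal illustration, assuming an existing Session and a reference Bitmap loaded elsewhere; the image name "playing_card" and the class wrapper are illustrative, not from the post) shows the general shape of registering images and reacting to a tracked, moving image each frame:

import android.graphics.Bitmap;

import com.google.ar.core.AugmentedImage;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Frame;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;
import com.google.ar.core.exceptions.CameraNotAvailableException;

public class AugmentedImagesSketch {

  /** Registers a reference image and enables Augmented Images on the session. */
  public static void configureImageTracking(Session session, Bitmap cardBitmap) {
    AugmentedImageDatabase database = new AugmentedImageDatabase(session);
    database.addImage("playing_card", cardBitmap); // arbitrary name, used to identify the image later

    Config config = new Config(session);
    config.setAugmentedImageDatabase(database);
    session.configure(config);
  }

  /** Call once per frame from the render loop to follow moving images. */
  public static void onDrawFrame(Session session) throws CameraNotAvailableException {
    Frame frame = session.update();
    for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
      if (image.getTrackingState() == TrackingState.TRACKING
          && image.getTrackingMethod() == AugmentedImage.TrackingMethod.FULL_TRACKING) {
        // FULL_TRACKING means the image is in view and its pose is refreshed every
        // frame, so content placed at the center pose follows the image as it moves.
        Pose pose = image.getCenterPose();
        // ... draw or anchor your virtual content at `pose` ...
      }
    }
  }
}

Because getUpdatedTrackables() returns every reference image updated in the frame, the same loop handles multiple images moving at the same time.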

Light Estimation

Last year, we introduced the concept of light estimation, which provides a single ambient light intensity to extend real world lighting into a digital scene. In order to provide even more realistic lighting, we’ve added a new mode, Environmental HDR, to our Light Estimation API.

Before and after Environmental HDR is applied to the digital mannequin on the left, featuring 3D-printed designs from Julia Koerner

Environmental HDR uses machine learning with a single camera frame to understand high dynamic range illumination in 360°. It takes in available light data, and extends the light into a scene with accurate shadows, highlights, reflections and more. When Environmental HDR is activated, digital objects are lit just like physical objects, so the two blend seamlessly, even when light sources are moving.

Digital mannequin on the left and physical mannequin on the right, under light diffusing from left to right

Environmental HDR provides developers with three APIs to replicate real world lighting:

  • Main Directional Light: helps with placing shadows in the right direction
  • Ambient Spherical Harmonics: helps model ambient illumination from all directions
  • HDR Cubemap: provides specular highlights and reflections
Main Directional Light plus Ambient Spherical Harmonics plus HDR Cubemap equals Environmental HDR
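
In the Android Java API, these three components are read from the LightEstimate that ARCore returns each frame once Environmental HDR mode is enabled on the session configuration. A minimal Java sketch, assuming an existing Session and a renderer that can consume the values (the class wrapper is illustrative):

import android.media.Image;

import com.google.ar.core.Config;
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;
import com.google.ar.core.Session;
import com.google.ar.core.exceptions.CameraNotAvailableException;

public class EnvironmentalHdrSketch {

  /** Switches light estimation from single ambient intensity to Environmental HDR. */
  public static void enableEnvironmentalHdr(Session session) {
    Config config = new Config(session);
    config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
    session.configure(config);
  }

  /** Reads the three Environmental HDR components for the current frame. */
  public static void readLighting(Session session) throws CameraNotAvailableException {
    Frame frame = session.update();
    LightEstimate estimate = frame.getLightEstimate();
    if (estimate.getState() != LightEstimate.State.VALID) {
      return; // no usable estimate this frame
    }

    // Main Directional Light: direction and intensity for the key light and shadows.
    float[] mainLightDirection = estimate.getEnvironmentalHdrMainLightDirection(); // x, y, z
    float[] mainLightIntensity = estimate.getEnvironmentalHdrMainLightIntensity(); // linear RGB

    // Ambient Spherical Harmonics: 9 RGB coefficients (27 floats) modeling
    // ambient illumination arriving from all directions.
    float[] sphericalHarmonics = estimate.getEnvironmentalHdrAmbientSphericalHarmonics();

    // HDR Cubemap: feed into the renderer for specular highlights and reflections.
    Image[] cubeMapFaces = estimate.acquireEnvironmentalHdrCubeMap();
    try {
      // ... upload the faces to a GPU cubemap texture ...
    } finally {
      for (Image face : cubeMapFaces) {
        face.close(); // release each image once it has been copied
      }
    }
  }
}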

Scene Viewer

We want to make it easier for people to jump into AR, so today we're introducing Scene Viewer, which lets AR experiences launch right from your website without the need to download a separate app.

To make your assets accessible via Scene Viewer, first add a glTF 3D asset to your website with the <model-viewer> web component, and then add the “ar” attribute to the <model-viewer> markup. Later this year, experiences in Scene Viewer will begin to surface in your Search results.

<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.js"></script>
<script nomodule src="https://unpkg.com/@google/model-viewer/dist/model-viewer-legacy.js"></script>

<model-viewer ar src="examples/assets/YOURMODEL.gltf" auto-rotate camera-controls
    alt="TEXT ABOUT YOUR MODEL" background-color="#455A64"></model-viewer>
NASA.gov enables users to view the Curiosity Rover in their space

These are a few ways that improving real world understanding in ARCore can make AR experiences more interactive, realistic, and easier to access. Look for these features to roll out over the next two releases. To learn more and get started, check out the ARCore developer website.

Open Sourcing Resonance Audio

Posted by Eric Mauskopf, Product Manager

Spatial audio adds to your sense of presence when you're in VR or AR, making it feel and sound like you're surrounded by a virtual or augmented world. And regardless of the display hardware you're using, spatial audio makes it possible to hear sounds coming from all around you.

Resonance Audio, our spatial audio SDK launched last year, enables developers to create more realistic VR and AR experiences on mobile and desktop. We've seen a number of exciting experiences emerge across a variety of platforms using our SDK. Recent examples include apps like Pixar's Coco VR for Gear VR, Disney's Star Wars™: Jedi Challenges AR app for Android and iOS, and Runaway's Flutter VR for Daydream, all of which use Resonance Audio technology.

To accelerate adoption of immersive audio technology and strengthen the developer community around it, we’re opening Resonance Audio to a community-driven development model. By creating an open source spatial audio project optimized for mobile and desktop computing, any platform or software development tool provider can easily integrate with Resonance Audio. More cross-platform and tooling support means more distribution opportunities for content creators, without the worry of investing in costly porting projects.

What's Included in the Open Source Project

As part of our open source project, we're providing a reference implementation of YouTube's Ambisonic-based spatial audio decoder, compatible with the same Ambisonics format (Ambix ACN/SN3D) used by others in the industry. Using our reference implementation, developers can easily render Ambisonic content in their VR media and other applications, while benefiting from Ambisonics' open source, royalty-free model. The project also includes encoding, sound field manipulation and decoding techniques, as well as head-related transfer functions (HRTFs) that we've used to achieve rich spatial audio that scales across a wide spectrum of device types and platforms. Lastly, we're making our entire library of highly optimized DSP classes and functions open to all. This includes resamplers, convolvers, filters, delay lines and other DSP capabilities. Additionally, developers can now use Resonance Audio's brand new Spectral Reverb, an efficient, high-quality, constant-complexity reverb effect, in their own projects.

We've open sourced Resonance Audio as a standalone library and associated engine plugins, VST plugin, tutorials, and examples with the Apache 2.0 license. Most importantly, this means Resonance Audio is yours, so you're free to use Resonance Audio in your projects, no matter where you work. And if you see something you'd like to improve, submit a GitHub pull request to be reviewed by the Resonance Audio project committers. While the engine plugins for Unity, Unreal, FMOD, and Wwise will remain open source, going forward they will be maintained by project committers from our partners, Unity, Epic, Firelight Technologies, and Audiokinetic, respectively.

If you're interested in learning more about Resonance Audio, check out the documentation on our developer site. If you want to get more involved, visit our GitHub to access the source code, build the project, download the latest release, or even start contributing. We're looking forward to building the future of immersive audio with all of you.