Tag Archives: Virtual Reality

YouTube VR: A whole new way to watch… and create

We're inspired every day by the videos people share on YouTube, and by the idea that anybody, from any part of the globe, can create something that all of us can watch, laugh at, cry to, or relate to. Not only has this turned YouTube into a home for just about any idea imaginable, it's made it the place where new video formats, new ways to connect, and new ways to tell stories are born.

We want to continue to provide you with new ways to engage with the world and with your community, and we believe virtual reality will play an important role in the future of storytelling. More than just an amazing new technology, VR allows us to make deep, human connections with people, places and stories. That’s why we’re committed to giving creators the space and resources they need to learn about, experiment with, and create virtual reality video. In fact, we’ve already started working with some awesome creators, recording artists, and partners who are producing VR videos across a wide variety of genres and interest areas on YouTube.

Want to spend some time with beauty vlogger Meredith Foster? Check out an immersive tour of her apartment. More of a foodie? Tastemade’s VR cooking videos breathe new life into learning new recipes. Rooster Teeth reimagines their gaming comedy “Red vs. Blue” for some fresh laughs. You can even watch breaking news in VR from HuffPost RYOT.

We’ve also been working to allow you to have experiences or visit places you might not be able to (or might not dare to!) in real life. Go swimming with sharks thanks to Curiscope, get a first-hand look at a living, breathing dinosaur at the Natural History Museum in London, travel to Belize with StyleHaul, hike a trail a thousand miles away with Daniel and Kelli at Fitness Blender, or watch Tritonal in concert no matter where you are.


And today you can experience all of this amazing content in a more immersive way with the brand new YouTube VR app, available first on Daydream. This new standalone app was built from the ground up and optimized for VR. You just need a Daydream-ready phone like Pixel and the new Daydream View headset and controller to get started. Every single video on the platform becomes an immersive VR experience, from 360-degree videos that let you step inside the content to standard videos shown on a virtual movie screen in the new theater mode. The app even includes familiar features like voice search and a signed-in experience, so you can follow the channels you subscribe to, check out your playlists, and more.


[Image: the spherical video player in the YouTube VR app]

We can’t tell you how inspired we all are after watching the amazing VR videos made by creators on YouTube. We can't wait to see what you dream up next!

Erin Teague, Product Manager, and Jamie Byrne, Director of YouTube Creators, recently watched "Tour of Man At Arms: Reforged Shop in 360° - The Forging Room!"

Source: YouTube Blog


Omnitone: Spatial audio on the web


Spatial audio is a key element of an immersive virtual reality (VR) experience. By bringing spatial audio to the web, the browser can be transformed into a complete VR media player with incredible reach and engagement. That's why the Chrome WebAudio team has created and is releasing the Omnitone project, an open source spatial audio renderer with cross-browser support.

Our challenge was to introduce the audio spatialization technique called ambisonics so users can hear full-sphere surround sound in the browser. In order to achieve this, we implemented ambisonic decoding with binaural rendering using web technology. There are several paths for introducing a new feature into the web platform, but we chose to use only the Web Audio API. In doing so, we can reach a larger audience with this cross-browser technology, and we can also avoid the lengthy standardization process of introducing a new Web Audio component. This is possible because the Web Audio API provides all the necessary building blocks for this audio spatialization technique.



Omnitone Audio Processing Diagram

The AmbiX format recording, which is the target of the Omnitone decoder, contains 4 channels of audio encoded using ambisonics, and that encoding can be decoded into an arbitrary speaker setup. Instead of an actual speaker array, Omnitone uses 8 virtual speakers based on head-related transfer function (HRTF) convolution to render the final audio stream binaurally. This binaurally-rendered audio can convey a sense of space when heard through headphones.
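
To make this concrete, here is a minimal sketch of that virtual-speaker approach using only stock Web Audio API nodes. The cube speaker layout, the simple projection decode gains, and the HRTF asset URLs are illustrative placeholders, not Omnitone's actual coefficients or files:

```typescript
// Minimal sketch of first-order ambisonic (FOA) binaural rendering with
// stock Web Audio nodes. Decode gains and HRTF asset URLs are illustrative
// placeholders, not Omnitone's actual coefficients or files.

const context = new AudioContext();

async function loadBuffer(url: string): Promise<AudioBuffer> {
  const response = await fetch(url);
  return context.decodeAudioData(await response.arrayBuffer());
}

async function renderFoaBinaurally(foaUrl: string): Promise<void> {
  const source = context.createBufferSource();
  source.buffer = await loadBuffer(foaUrl); // 4-channel AmbiX (W, Y, Z, X)

  // Split the B-format stream into its four component channels.
  const splitter = context.createChannelSplitter(4);
  source.connect(splitter);

  // 8 virtual speakers at the corners of a cube around the listener.
  const directions: Array<[number, number, number]> = [];
  for (const x of [-1, 1]) for (const y of [-1, 1]) for (const z of [-1, 1])
    directions.push([x / Math.sqrt(3), y / Math.sqrt(3), z / Math.sqrt(3)]);

  for (let i = 0; i < directions.length; i++) {
    const [x, y, z] = directions[i];
    // Simple projection decoder: feed = (W + sqrt(3)*(Y*y + Z*z + X*x)) / 8.
    const k = Math.sqrt(3) / 8;
    const weights = [1 / 8, k * y, k * z, k * x];

    const feed = context.createGain(); // this virtual speaker's feed
    weights.forEach((weight, ch) => {
      const g = context.createGain();
      g.gain.value = weight;
      splitter.connect(g, ch); // tap B-format channel `ch`
      g.connect(feed);
    });

    // Convolve the feed with the stereo HRTF measured for this speaker's
    // direction, producing the binaural (left/right) image for headphones.
    const convolver = context.createConvolver();
    convolver.buffer = await loadBuffer(`hrtf/speaker-${i}.wav`); // placeholder
    feed.connect(convolver);
    convolver.connect(context.destination);
  }

  source.start();
}
```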

The beauty of this mechanism lies in the sound-field rotation applied to the incoming spatial audio stream. The orientation sensor of a VR headset or a smartphone can be linked to Omnitone’s decoder to seamlessly rotate the entire sound field. The rest of the spatialization process will be handled automatically by Omnitone. A live demo can be found at the project landing page.
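
As a sketch of how such a rotation can be wired up with the same Web Audio building blocks: the W channel is omnidirectional and passes through untouched, while the three directional channels are multiplied by a 3×3 rotation matrix implemented as GainNodes. The channel ordering and the yaw-only angle convention below are illustrative simplifications, not Omnitone's exact node graph:

```typescript
// Sketch: rotating the directional ambisonic channels with a 3x3 matrix of
// GainNodes driven by the device's orientation sensor. W (channel 0) is
// omnidirectional and bypasses the rotation entirely.

function createRotator(context: AudioContext) {
  const splitter = context.createChannelSplitter(4);
  const merger = context.createChannelMerger(4);
  splitter.connect(merger, 0, 0); // W passes through unrotated

  // gains[row][col] scales input directional channel `col` into output `row`.
  const gains: GainNode[][] = [];
  for (let row = 0; row < 3; row++) {
    gains.push([]);
    for (let col = 0; col < 3; col++) {
      const g = context.createGain();
      splitter.connect(g, col + 1); // tap directional channel `col`
      g.connect(merger, 0, row + 1);
      gains[row].push(g);
    }
  }

  // Load a head-yaw rotation (about the vertical axis) into the matrix.
  function setYaw(yawRadians: number): void {
    const c = Math.cos(yawRadians);
    const s = Math.sin(yawRadians);
    const m = [
      [c, 0, s],
      [0, 1, 0],
      [-s, 0, c],
    ];
    for (let row = 0; row < 3; row++)
      for (let col = 0; col < 3; col++)
        gains[row][col].gain.value = m[row][col];
  }

  // Wire a 4-channel source into `input`, and `output` into the decoder.
  return { input: splitter as AudioNode, output: merger as AudioNode, setYaw };
}

// Drive the sound field from a phone's orientation sensor:
const rotator = createRotator(new AudioContext());
window.addEventListener('deviceorientation', (event) => {
  if (event.alpha !== null) rotator.setYaw((event.alpha * Math.PI) / 180);
});
```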

Throughout the project, we worked closely with the Google VR team, drawing on their VR audio expertise. Not only was their knowledge of spatial audio a tremendous help, but the collaboration also ensured identical audio spatialization across all of Google's VR applications, both on the web and on Android (e.g. the Google VR SDK and the YouTube Android app). The Spatial Media Specification and HRTF sets are great examples of the Google VR team's efforts, and Omnitone is built on top of both.

With emerging web-based VR projects like WebVR, Omnitone's audio spatialization can play a critical role in making VR experiences on the web more immersive. Web-based VR applications will also benefit from high-quality streaming spatial audio, as the Chrome Media team has recently added first-order ambisonics (FOA) compression to the open source audio codec Opus. More exciting things are coming soon to Omnitone, including VR View integration, higher-order ambisonics, and mobile web support.

We look forward to seeing what people do with Omnitone now that it's open source. Feel free to reach out to us or leave a comment with your thoughts and feedback on the issue tracker on GitHub.

By Hongchan Choi and Raymond Toy, Chrome Team

Note: due to incomplete implementations of multichannel audio decoding in various browsers, Omnitone does not support the mobile web at the time of writing.

Daydream Labs: exploring and sharing VR’s possibilities

Posted by Andrey Doronichev, Group Product Manager, Google VR

In Daydream Labs, the Google VR team explores virtual reality’s possibilities and shares what we learn with the world. While it’s still early days, the VR community has already come a long way in understanding what works well in VR across hardware, software, video, and much more. But, part of what makes developing for VR so exciting is that there’s still so much more to discover.

Apps are a big focus for Daydream Labs. In the past year, we’ve built more than 60 app experiments that test different use cases and interaction designs. To learn fast, we build two new app prototypes each week. Not all of our experiments are successful, but we learn something new with each one.

For example, in one week we built a virtual drum kit that used HTC Vive controllers as drumsticks. The following week, when we were debating how to make typing in VR more natural and playful, we thought: what if we made a keyboard out of tiny drums?

We were initially skeptical that drumsticks could be more efficient than direct hand interaction, but the result surprised us. Not only was typing with drumsticks faster than with a laser pointer, it was really fun! We even built a game that lets you track your words per minute (mine was 50 wpm!).

Daydream Labs is just getting started. This post is the first in an ongoing series sharing what we've learned through our experiments, so stay tuned for more! You can also see more of what we've learned about VR interactions, immersion, and social design by watching our Google I/O talks on the live stream.

VR at Google – Jump, Expeditions, and Daydream

Posted by Nathan Martz, Product Manager, Daydream

Two years ago at Google I/O, we introduced Google Cardboard, a simple and fun way to experience virtual reality on your smartphone. Since then, we've grown the Google VR family with Expeditions and Jump, and this week at Google I/O, we announced Daydream, a platform for high quality mobile virtual reality.

Jump—in the hands of creators and more cameras on the way

We announced Jump, cameras and software to make producing beautiful VR video simple, at I/O last year. Jump cameras are now in the hands of media partners such as Paramount Pictures, The New York Times, and Discovery Communications. Virtual reality production companies including WEVR, Vrse, The Secret Location, Surreal, Specular Theory, Panograma, and RYOT also have cameras in hand. We can't wait to see the wide variety of immersive videos these creators will share with a growing VR audience.

We want to enable cameras in a range of shapes, sizes, and price points, so today the Jump ecosystem expands with two partnerships to build Jump cameras. First, we're working with Yi Technology on a rig based around their new 4K Action Cam, coming later this year.

With Jump, we've also seen incredible interest from filmmakers. Of course when you're creating the best content you want the absolute highest quality, cinema-grade camera available. To help create this, we're collaborating with IMAX to develop a very high-end cinema-grade Jump camera.

Expeditions—One year, one million students

More than one million students from over 11 countries have taken an Expedition since we introduced the Google Expeditions Pioneer Program last May. The program lets students take virtual reality trips to over 200 places including Buckingham Palace, underwater in the Great Barrier Reef—and in seventh grader Lance Teeselink’s case—Dubai’s Burj Khalifa, the tallest building in the world.

And soon, students will have even more places to visit, virtually, thanks to new partnerships with the Associated Press and Getty Images. These partners will provide the Expeditions program with high-resolution VR imagery for current events to help students better understand what’s happening around the world.

Daydream—high quality VR on your Android smartphone

Daydream is our new platform for high quality mobile virtual reality, coming this fall. Over time, Daydream will encompass VR devices in many shapes and sizes, starting with high quality VR on Android smartphones.

We are working with a number of smartphone manufacturers to create a specification for Daydream-ready phones. These smartphones enable VR experiences with high-performance sensors for smooth, accurate head tracking, fast-response displays to minimize blur, and powerful mobile processors. Daydream-ready phones take advantage of VR mode in Android N, a set of powerful optimizations for virtual reality built right into Android.

With Daydream, we've also created a reference design for a comfortable headset and an intuitive controller. And, yes, we're building one too. The headset and controller work in tandem to provide rich, immersive experiences. Take a look at how the controller lets you interact in VR:

Build for Daydream

The most important part of virtual reality is what you experience. Some of the world's best content creators and game studios are bringing their content to Daydream. You'll also have your favorite Google apps, including Play Movies, Street View, Google Photos, and YouTube.

You can start building for Daydream today. The Google VR SDK now includes a C++ NDK, and if you develop with Unreal or Unity, Daydream will be natively supported by both engines. Visit the Daydream developer site, where you can access the tools. Plus, with Android N Developer Preview 3, you can use the Nexus 6P as a Daydream developer kit.

This is just the beginning for Daydream. We’ll be sharing much more on this blog over the coming months. We’re excited to build the next chapter of VR with you.

What’s new in Android: the N-Release, Virtual Reality, Android Studio 2.2 and more

Posted by Dave Burke, VP of Engineering

In the past year, Android users around the globe have installed apps, built by developers like you, over 65 billion times on Google Play. To help developers continue to build amazing experiences on top of Android, today at Google I/O we announced a number of new things we're doing with the platform, including the next Developer Preview of Android N, an extension of Android into virtual reality, an update to Android Studio, and much more!

Android N Developer Preview is available to try on a range of devices
Android N: Performance, Productivity and Security
With Android N, we want to achieve a new level of product excellence for Android, so we've carried out some pretty deep surgery on the platform, rewriting and redesigning some fundamental aspects of how the system works. For Android N, we are focused on three key themes: performance, productivity, and security. The first Developer Preview introduced a brand new JIT compiler that improves software performance, makes app installs faster, and reduces the storage apps take up. The second N Developer Preview included Vulkan, a new 3D rendering API that helps game developers deliver high performance graphics on mobile devices. Both previews also brought useful productivity improvements to Android, including Multi-Window support and Direct Reply.

Multi-Window mode on Android N
Android N also adds some important new features to help keep users safer and more secure. Inspired by how Chromebooks apply updates, we're introducing seamless updates, so that new Android devices built on N can install system updates in the background. This means that the next time a user powers up their device, it can automatically and seamlessly switch into the new updated system image.

Today's release of Android N Developer Preview 3 is our first beta-quality candidate, available to test on your primary phone or tablet. You can opt in to the Android Beta Program at android.com/beta and run Android N on your Nexus 6, 9, 5X, 6P, Nexus Player, Pixel C, or Android One (General Mobile 4G). Because we're inviting more people to try this beta release, you can expect to see an uptick in usage of your apps on N; if you've got an Android app, you should be testing how it works on N and watching for feedback from users.

VR Mode in Android  
Android was built for today's multi-screen world; in fact, Android powers your phone, your tablet, and the watch on your wrist, and it even works in your car and in your living room, all the while helping you move seamlessly between each device. As we look to what's next, we believe your phone can be a really powerful new way to see the world and experience new content virtually, in a more immersive way; but, until this point, high quality mobile VR wasn't possible across the Android ecosystem. That's why we've worked at all levels of the Android stack in N, from how the operating system reads sensor data to how it sends pixels to the display, to optimize it for high quality mobile VR experiences with VR Mode in Android. There are a number of performance enhancements designed for developers, including single buffer rendering and access to an exclusive CPU core for VR apps. Within your apps, you can take advantage of smooth head-tracking and stereo notifications that work for VR. Most importantly, Android N provides for very low latency graphics; in fact, motion-to-photon latency on a Nexus 6P running Developer Preview 3 is under 20 ms, the speed necessary to establish immersion so the user feels like they are actually in another place. We'll be covering all of the new VR updates tomorrow at 9 AM PT in the VR at Google session, livestreamed from Google I/O.

Android Instant Apps: real apps, without the installation 
We want to make it easier for users to discover and use your apps. So what if your app was just a tap away? What if users didn't have to install it at all? Today, we're introducing Android Instant Apps as part of our effort to evolve the way we think about apps. Whether someone discovers your app from search, social media, messaging, or other deep links, they'll be able to experience a fast and powerful native Android app without needing to stop and install it first or reauthenticate. Best of all, Android Instant Apps is compatible with all Android devices running Jelly Bean or higher (4.1+) with Google Play services. Android Instant Apps functionality is an upgrade to your existing Android app, not a new, separate app; you can sign up to request early access to the documentation.

Android Wear 2.0: UI changes and standalone apps  
This morning at Google I/O, we also announced the most significant Android Wear update since its launch two years ago: Android Wear 2.0. Based on what we’ve learned from users and developers, we're evolving the platform to improve key watch experiences: watch faces, messaging, and fitness. We’re also making a number of UI changes and updating our design guidelines to make your apps more consistent, intuitive, and beautiful.  With Android Wear 2.0, apps can be standalone and have direct network access to the cloud via a Bluetooth, Wi-Fi, or cellular connection.  Since your app won’t have to rely on the Data Layer APIs, it can continue to offer full functionality even if the paired phone is far away or turned off. You can read about all of the new features available in today’s preview here.


Android Studio 2.2 Preview: a new layout designer, constraint layout, and much more
Android Studio is the quickest way to get up and running with Android N and all our new platform features. Today at Google I/O, we previewed Android Studio 2.2, another big update to the IDE designed to help you code faster, with smart new tooling features built in. One of the headline features is our rewritten layout designer with the new constraint layout. In addition to helping you get out of XML to do your layouts visually, the new tools help you easily design for Android's many great devices. Once you're happy with a layout, we do all the hard work to automatically calculate constraints for you, so your UIs will resize automatically on different screen sizes. Here's an overview of more of what's new in the 2.2 Preview (we'll dive into more detail on this update at 10 AM PT tomorrow in "What's new in Android Development Tools", livestreamed from Google I/O):

  • Speed: New layout designer and constraint layout, Espresso test recording and even faster builds
  • Smarts: APK analyzer, Layout inspector, expanded Android code analysis and IntelliJ 2016.1
  • Platform Support: Enhanced Jack compiler / Java 8 support, Expanded C++ support with CMake and NDK-Build, Firebase support and enhanced accessibility

New Layout Editor and Constraint Layout in Android Studio 2.2 Preview

This is just a small taste of some of the new updates for Android, announced today at Google I/O. There are more than 50 Android-related sessions over the next three days; if you’re not able to join us in person, many of them will be livestreamed, and all of them will be posted to YouTube after we’re done. We can’t wait to see what you build!

New YouTube live features: live 360, 1440p, embedded captions, and VP9 ingestion

Yesterday at NAB 2016, we announced exciting new live and virtual reality features for YouTube. We're working to get you one step closer to actually being in the moments that matter while they're happening. Let's dive into the new features and capabilities we're introducing to make this possible:

Live 360: About a year ago, we announced the launch of 360-degree videos on YouTube, giving creators a new way to connect with their audience and share their experiences. This week, we took the next step by introducing support for 360-degree live streaming on YouTube for all creators and viewers around the globe.

To make sure creators can tell awesome stories with virtual reality, we've been working with several camera and software vendors, such as ALLie and VideoStitch, to support this new feature. Manufacturers interested in supporting 360 can use our YouTube Live Streaming API to send 360-degree live streams to YouTube.
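
As a rough sketch of what that integration can look like, the snippet below schedules a broadcast through the Live Streaming API's REST interface and marks it as 360-degree. The `contentDetails.projection` field is our assumption for how the 360 flag is expressed; verify it against the current liveBroadcasts API reference before relying on it:

```typescript
// Sketch: scheduling a 360° broadcast via the YouTube Live Streaming API.
// The `projection` flag below is an assumption to verify against the docs.

async function schedule360Broadcast(accessToken: string): Promise<void> {
  const response = await fetch(
    'https://www.googleapis.com/youtube/v3/liveBroadcasts' +
      '?part=snippet,contentDetails,status',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        snippet: {
          title: 'My 360° live stream',
          scheduledStartTime: new Date(Date.now() + 3600_000).toISOString(),
        },
        contentDetails: { projection: '360' }, // assumed flag for 360 streams
        status: { privacyStatus: 'unlisted' },
      }),
    },
  );
  if (!response.ok) throw new Error(`YouTube API error: ${response.status}`);
  console.log('Scheduled broadcast:', await response.json());
}
```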

Other 360-degree cameras can also be used to live stream to YouTube as long as they produce compatible output, for example, cameras that can act as a webcam over USB (see this guide for details on how to live stream to YouTube). Like 360-degree uploads, 360-degree live streams need to be streamed in the equirectangular projection format. Creators can use our Schedule Events interface to set up 360 live streams using this new option:

[Image: the new 360° checkbox in the Schedule Events interface]


Check out this help center page for more details.
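
Since 360-degree streams must be in the equirectangular projection, here is a quick sketch of what that means: every direction on the sphere maps linearly to a pixel, longitude to x and latitude to y, so a 2:1 frame covers the full 360°×180° field of view. The function and frame size below are illustrative:

```typescript
// Equirectangular projection: longitude maps linearly to x, latitude to y.

interface Pixel { x: number; y: number; }

function directionToPixel(
  yawDeg: number,   // longitude in [-180, 180), 0 = front
  pitchDeg: number, // latitude in [-90, 90], +90 = straight up
  width: number,
  height: number,
): Pixel {
  const x = ((yawDeg + 180) / 360) * width;
  const y = ((90 - pitchDeg) / 180) * height;
  return {
    x: Math.min(Math.floor(x), width - 1),
    y: Math.min(Math.floor(y), height - 1),
  };
}

// Example with a 2:1 frame (3840x1920):
console.log(directionToPixel(0, 0, 3840, 1920));  // center of the frame
console.log(directionToPixel(0, 90, 3840, 1920)); // top row (zenith)
```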



1440p live streaming: Content like live 360 and video games is best enjoyed at high resolutions and high frame rates, so we are also announcing support for 1440p 60fps live streams on YouTube. Live streams at 1440p have roughly 78 percent more pixels than the standard HD resolution of 1080p (2560×1440 versus 1920×1080). To ensure that your stream can be viewed on the broadest possible range of devices and networks, including those that don't support such high resolutions or frame rates, we perform full transcoding on all streams and resolutions. A 1440p60 stream gets transcoded to 1440p60, 1080p60, and 720p60, as well as all resolutions from 1440p30 down to 144p30.

Support for 1440p will be available from our creation dashboard as well as our Live API. Creators interested in using this high resolution should make sure that their encoder can encode at such resolutions and that their network has sufficient upload bandwidth to sustain successful ingestion. A good rule of thumb is to provision at least twice the video bitrate.
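
As a quick illustration of that rule of thumb (the bitrate figure here is an example, not an official encoder setting):

```typescript
// Rule of thumb from above: provision at least twice the video bitrate
// of sustained upload bandwidth.
function minUploadBandwidthMbps(videoBitrateMbps: number): number {
  return 2 * videoBitrateMbps;
}

// e.g. a 1440p60 stream encoded at 18 Mbps (illustrative figure) should
// have at least 36 Mbps of sustained upload bandwidth available.
console.log(minUploadBandwidthMbps(18)); // 36
```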

VP9 ingestion / DASH ingestion: We are also announcing support for VP9 ingestion. VP9 is a modern video codec that lets creators upload higher resolution videos using lower bandwidth, which is particularly important for high resolution 1440p content. To facilitate the ingestion of this new video codec, we are also announcing support for DASH ingestion, a simple, codec-agnostic, HTTP-based protocol. DASH ingestion will support H.264 as well as VP9 and VP8. HTTP-based ingestion is more resilient to corporate firewalls and also allows ingestion over HTTPS. It is also a simpler protocol to implement for game developers who want to offer in-game streaming support with royalty-free video codecs. MediaExcel and Wowza Media Systems will both be demoing DASH VP9 encoding with YouTube live at their NAB booths.
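
As a hedged sketch of what HTTP-based ingestion can look like: an initialization segment carrying the codec configuration is pushed first, followed by a sequence of short media segments, each sent as an HTTP PUT. The URL scheme and segment names below are placeholders; the detailed guide mentioned next defines the actual protocol:

```typescript
// Sketch of HTTP-based DASH ingestion. URL scheme and segment naming are
// placeholders, not YouTube's actual ingestion endpoints.

async function putSegment(url: string, data: ArrayBuffer): Promise<void> {
  const response = await fetch(url, { method: 'PUT', body: data });
  if (!response.ok) throw new Error(`Ingestion failed: ${response.status}`);
}

async function ingest(
  ingestionUrl: string,     // HTTPS endpoint from the stream settings
  initSegment: ArrayBuffer, // codec config (fragmented MP4 or WebM)
  nextMediaSegment: () => Promise<ArrayBuffer | null>,
): Promise<void> {
  // Initialization segment first, then numbered media segments.
  await putSegment(`${ingestionUrl}/init`, initSegment); // placeholder name
  for (let seq = 0; ; seq++) {
    const segment = await nextMediaSegment();
    if (segment === null) break; // stream ended
    await putSegment(`${ingestionUrl}/media-${seq}`, segment); // placeholder
  }
}
```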

We will soon publish a detailed guide to DASH Ingestion on our support web site. For developers interested in DASH Ingestion, please join this Google group to receive updates.

Embedded captions: To provide more support to broadcasters, we now accept embedded EIA-608/CEA-708 captions over RTMP (H.264/AAC). This makes it easier to send captioned video content to YouTube and removes the need to post caption data over side-band channels. Caption support is initially offered while streams are live; soon, caption data will carry over to the recordings of live streams as well. Visit the YouTube Help Center for more information on our live captioning support.



We first launched live streaming back in 2011, and we've live streamed memorable moments since: the 2012 Olympics, the Red Bull Stratos jump, the League of Legends Championship, and the Coachella music festival. We are excited to see what our community can create with these new tools!

Nils Krahnstoever, Engineering Manager for Live
Kurt Wilms, Senior Product Manager for VR and Live
Sanjeev Verma, Product Manager for Video Formats

One step closer to reality: introducing 360-degree live streaming and spatial audio on YouTube

Cross-posted from the Official YouTube Blog.

Growing up, my favorite basketball player was Magic Johnson. I wanted nothing more than to be able to watch him play in person, but unfortunately I never got the chance. Whether it's a sporting event, a concert, or even a family gathering, all of us have had the feeling of wanting to be somewhere we couldn't. But these days, virtual reality and 360-degree video can help get you one step closer to actually being at those places and in those moments. Today, we're taking immersive video even further with 360-degree live streaming on YouTube.

We first launched support for 360-degree videos back in March 2015. From musicians to athletes to brands, creators have done some incredible things with this technology. Now they'll be able to do even more to bring fans directly into their world with 360-degree live streaming. And after years of live streaming Coachella for fans around the world who can't attend the festival, this year we're bringing you the festival like never before by live streaming select artist performances in 360 degrees this weekend.

Starting today, we’re also launching spatial audio for on-demand YouTube videos. Just as watching a concert in 360-degrees can give you an unmatched immersive experience, spatial audio allows you to listen along as you do in real life, where depth, distance and intensity all play a role. Try out this playlist on your Android device.


To make sure all creators can tell awesome stories with virtual reality, we've been working with companies across the industry, like VideoStitch and Two Big Ears, to make their software compatible with 360-degree live streams and spatial audio on YouTube; more compatible tools will be available soon. We'll also make 360-degree live streaming and spatial audio technologies available at all YouTube Space locations around the globe, so you can take them for a spin.

What excites me most about 360-degree storytelling is that it lets us open up the world's experiences to everyone. Students can now experience news events in the classroom as they unfold. Travelers can experience faraway sites and explorers can deep-sea dive, all without the physical constraints of the real world. And today's kids dreaming of going to a basketball game or a concert can access those experiences firsthand, even if they're far away from the court. What were once limited experiences are now available to anyone, anywhere, at any time.

Are you ready to never miss a moment again?

Posted by Neal Mohan, Chief Product Officer, recently watched Dub360: Stephen Curry pregame warmup routine