Tag Archives: Android Developer

Expanding the reach of your Android Auto apps

Posted by Eric Bahna, Product Manager

In December, we opened the Google Play Store for publishing new Android Auto apps to closed testing. Today, you can reach more drivers by publishing navigation, parking, and charging apps to open testing tracks in the Google Play Store. With open testing, there’s no limit to the number of users who can download your app and you don’t need to manage lists of email addresses. This is an important milestone that gets us closer to making these apps available to all users in production. Get started with the Android for Cars App Library and choose an open testing track in the Play Console.

TomTom AmiGO, one of our early access partners

To give you a peek at what’s ahead, we’re working on adding the library to Android Jetpack! This will give you more consistency with other Jetpack APIs and visibility into new features. When the Jetpack library is ready, migrating your app from the existing library will be straightforward - change the namespace and tweak some API calls. After we stabilize the library in Jetpack, we’ll prepare the Google Play Store to publish these new apps to production tracks.

You can get started today - you don’t need to wait for the Jetpack library.

  1. Design your app’s experience using our developer guide and app quality guidelines.
  2. Develop using today’s beta library so you can start gathering user feedback now.
  3. Test using the desktop head unit.
  4. Publish to the Google Play Store, now up to open testing tracks.

We’re excited to see what you’ve built and take it for a spin!

Introducing the Android for Cars App Library

Posted by Eric Bahna, Product Manager

In August, we announced plans to expand Android Auto’s app ecosystem to enable new navigation, parking, and electric vehicle charging apps. We’ve been hard at work collaborating with our early access partners to test and refine the Android for Cars App Library. Today, we’re releasing the library into an open beta, for any developer to use. This means you’ll now be able to design, develop, and test your navigation, parking or charging app on Android Auto. We’re looking forward to enabling Google Play Store publishing for your beta apps in the coming months.


Three of our early access partners: ChargePoint, SpotHero, and Sygic

The design phase is the time to familiarize yourself with our design guidelines and app quality guidelines. Driver safety is core to our mission and we want to help you optimize your app for the car.

When it comes time to build your app, our new library will hopefully make development easy. Get started with the developer guide and please give us feedback via our public issue tracker.

In the testing phase, see your app come alive on the Desktop Head Unit (DHU), our emulator that lets you simulate a car infotainment display. The DHU now supports multiple screen sizes, displaying information in the instrument cluster, and simulating vehicles with touchpad input.


The DHU simulating an instrument cluster, a widescreen head unit, and a touchpad

You can get started with the Android for Cars App Library here. We’re excited to see what you build next!

Android Studio 4.1

Posted by Scott Swarthout, Product Manager


Today, we’re excited to release the stable version of Android Studio 4.1, with a set of features addressing common editing, debugging, and optimization use cases. A major theme for this release was helping you be more productive while using Android Jetpack libraries, Android’s suite of libraries to help developers follow best practices and write code faster. Based on your feedback we made a number of improvements to the code editing experience with IDE integrations for popular Android libraries.

Some highlights of Android Studio 4.1 include a new Database Inspector for querying your app’s database, support for navigating projects that use Dagger or Hilt for dependency injection, and better support for on-device machine learning with support for TensorFlow Lite models in Android projects. We’ve also made updates to Apply Changes to make deployment faster. Based on your feedback, we’ve made several changes to help game developers with a new native memory profiler and standalone profiling tools.

Product quality continues to be a major focus for the team, and we’ve been hard at work tracking down bugs and performance issues. We’ve heard from many developers that they liked the focus on better performance and reliability, so we’re happy to report that during this release cycle we’ve fixed 2,370 bugs and closed 275 public issues. We stay committed to maintaining high quality since we know that is key to your developer productivity.

Thank you to those who gave your early feedback in preview releases. Your feedback helped us iterate and improve features in Android Studio 4.1. If you are ready for the next stable release, and want to use a new set of productivity features, Android Studio 4.1 is ready to download for you to get started.

Below is a full list of new features in Android Studio 4.1, organized by key developer flows.

Design

Material Design Components updates

Android Studio templates in the Create New Project dialog now use Material Design Components (MDC) and conform to updated guidance for themes and styles by default. These changes will make it easier to use recommended Material styling patterns and support modern UI features like dark themes.


Material Design Components updates in Project Templates

Updates include:

  • MDC: Projects depend on com.google.android.material:material in build.gradle (a minimal dependency sketch follows this list). Base app themes use Theme.MaterialComponents.* parents and override updated MDC color and “on” attributes.
  • Color resources: Color resources in colors.xml use literal names (for example, purple_500 instead of colorPrimary).
  • Theme resources: Theme resources are in themes.xml (instead of styles.xml) and use Theme.<ApplicationName> names.
  • Dark theme: Base application themes use DayNight parents and are split between res/values and res/values-night.
  • Theme attributes: Color resources are referenced as theme attributes (for example, ?attr/colorPrimary) in layouts and styles to avoid hard-coded colors.
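
For reference, here is a minimal sketch of that MDC dependency declared with the Gradle Kotlin DSL (the generated templates use Groovy build.gradle, and the version number below is illustrative, so check the Material Components releases for the current one):

dependencies {
    // Material Design Components; version shown is illustrative.
    implementation("com.google.android.material:material:1.2.1")
}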

Develop

Database Inspector

We wanted to make it easier to inspect, query, and modify your app's databases using the new Database Inspector. To get started, deploy your app to a device running API level 26 or higher and select View > Tool Windows > Database Inspector from the menu bar. Whether your app uses the Jetpack Room library or the Android platform version of SQLite directly, you can now easily inspect databases and tables in your running app or run custom queries.

Because Android Studio maintains a live connection while you’re inspecting your app, you can also modify values using the Database Inspector and see those changes in your running app. If you use the Room persistence library, Android Studio also places run buttons next to each query in the code editor to help you quickly run queries you define in your @Query annotations. Learn more
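
For illustration, here is a minimal Room DAO sketch (the entity and queries are hypothetical); each @Query below gets a run button in the editor, and the underlying table can be browsed live in the Database Inspector:

import androidx.room.Dao
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.Query

// Hypothetical entity and DAO, used only to show where the editor run buttons appear.
@Entity(tableName = "user")
data class User(
    @PrimaryKey val id: Long,
    val name: String
)

@Dao
interface UserDao {
    // Each @Query gets a gutter run button; results appear in the Database Inspector.
    @Query("SELECT * FROM user ORDER BY name")
    fun getAll(): List<User>

    @Query("SELECT * FROM user WHERE id = :id")
    fun findById(id: Long): User?
}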


Inspect, query, and modify your app’s databases with the Database Inspector

Run Android Emulator directly in Android Studio

You can now run the Android Emulator directly in Android Studio. Use this feature to conserve screen real estate, to navigate quickly between the emulator and the editor window using hotkeys, and to organize your IDE and emulator workflow in a single application window. You can manage snapshots and common emulator actions like rotating and taking screenshots from within Studio, but access to the full set of options still requires running the stable emulator. You can opt in to this feature by going to File → Settings → Tools → Emulator and selecting Launch in a tool window.


Run the Android Emulator inside of Android Studio

Dagger Navigation Support

Dagger is a popular library for dependency injection on Android. Android Studio makes it easier to navigate between your Dagger-related code by providing new gutter actions and extending support in the Find Usages window. For example, clicking on the go to producer gutter action next to a method that consumes a given type navigates you to the provider of that type. Conversely, clicking on the go to consumer gutter action navigates you to where a type is used as a dependency. Android Studio also supports navigation actions for dependencies defined with the Jetpack Hilt library. Learn more.
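
To make that concrete, here is a small, hypothetical Dagger setup; in Android Studio 4.1 the gutter icon next to the @Provides method navigates to places where AnalyticsService is consumed, and the icon next to the injecting constructor navigates back to this provider:

import dagger.Module
import dagger.Provides
import javax.inject.Inject

// Hypothetical types, used only to illustrate the new Dagger navigation.
class AnalyticsService

@Module
object AnalyticsModule {
    // The "go to consumer" gutter action here jumps to HomePresenter below.
    @Provides
    fun provideAnalyticsService(): AnalyticsService = AnalyticsService()
}

// The "go to producer" gutter action on this constructor jumps to the @Provides method above.
class HomePresenter @Inject constructor(
    private val analytics: AnalyticsService
)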


Navigate between Dagger-related code with gutter actions

Use TensorFlow Lite models

Android developers are using machine learning to create innovative and helpful experiences. TensorFlow Lite is a popular library for writing mobile machine learning models, and we wanted to make it easier to import these models into Android apps. Similar to view binding, Android Studio generates easy-to-use classes so you can run your model with less code and better type safety. The current implementation of ML Model Binding supports image classification and style transfer models, provided they are enhanced with metadata.

To see the details for an imported model and get instructions on how to use it in your app, double-click the .tflite model file in your project to open the model viewer page. Learn more.
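
As a rough sketch of what using a generated binding class can look like, assume an image classification model imported as mobilenet_v1.tflite; the MobilenetV1 class name and the output accessor below are hypothetical and depend on your model’s metadata, and the model viewer page shows the exact code for your model:

import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage

// Hypothetical usage of an ML Model Binding generated class; adapt to the code
// shown in the model viewer for your own .tflite file.
fun classify(context: Context, bitmap: Bitmap): String? {
    val model = MobilenetV1.newInstance(context)               // generated class (hypothetical name)
    val outputs = model.process(TensorImage.fromBitmap(bitmap))
    val best = outputs.probabilityAsCategoryList.maxByOrNull { it.score }
    model.close()
    return best?.label
}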


View TensorFlow Lite model metadata in Android Studio 4.1

Build & Test

Android Emulator - Foldable Hinge Support


In addition to recently adding 5G cellular testing support, we’ve added support for foldables in the Android emulator. With Android emulator 30.0.26 and above, you can configure foldable devices with a variety of fold designs and configurations. When a foldable device is configured, the emulator will publish hinge angle sensor updates and posture changes, so you can test how your app responds to these form factors. See the Developing for Android 11 with the Android Emulator blog post to read more.

Extended controls, device pose

Apply Changes updates

Faster builds help developers make changes to their app more easily and quickly. To help you be more productive as you iterate on your app, we've made multiple enhancements to Apply Changes for devices running Android 11 or higher.

We've invested heavily in optimizing your iteration speed by developing a method to deploy and persist changes on a device without installing the application. After an initial deploy, subsequent deploys to Android 11 devices using either Apply Code Changes or Apply Changes and Restart Activity are now significantly faster. We’ve also added support for additional code changes in Apply Changes. Now if you add a method, you can deploy those changes to a running app by clicking either Apply Code Changes or Apply Changes and Restart Activity.

Export C/C++ dependencies from AARs

Android Gradle Plugin 4.0 added the ability to import Prefab packages in AAR dependencies. We wanted to extend the capability of this feature to support sharing native libraries as well. AGP version 4.1 enables exporting libraries from your external native build in an AAR for an Android Library project. To export your native libraries, add the following to the android block of your library project's build.gradle file:

buildFeatures {
    // Package the native libraries declared below as Prefab modules in the AAR.
    prefabPublishing true
}

prefab {
    // One Prefab module per native library, pointing at its public headers.
    mylibrary {
        headers "src/main/cpp/mylibrary/include"
    }

    myotherlibrary {
        headers "src/main/cpp/myotherlibrary/include"
    }
}
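
On the consuming side, an app module that depends on such an AAR enables Prefab so the packaged native modules become importable from its CMake or ndk-build scripts; here is a minimal sketch with the Gradle Kotlin DSL (exact setup can vary by AGP version):

// build.gradle.kts of the consuming app module - minimal sketch.
android {
    buildFeatures {
        // Import native libraries packaged as Prefab modules inside AAR dependencies,
        // such as mylibrary and myotherlibrary above.
        prefab = true
    }
}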

Symbolication for native crash reports

When a crash or ANR occurs in native code, the system produces a stack trace, which is a snapshot of the sequence of nested functions called in your program up to the moment it crashed. These snapshots can help you to identify and fix any problems in the source, but they must first be symbolicated to translate the machine addresses back into human-readable function names.

If your app or game is developed using native code, like C++, you can now upload debug symbols files to the Play Console for each version of your app. The Play Console uses these debug symbols files to symbolicate your app's stack traces, making it easier to analyze crashes and ANRs. To include debug symbols in your app bundle, add the following line to your project’s build.gradle file:

android.buildTypes.release.ndk.debugSymbolLevel = 'SYMBOL_TABLE'

Optimize

System Trace UI improvements

In Android Studio 4.1 we’ve overhauled System Trace, an optimization tool that gives you a real-time look at how your app is using system resources. We made it easier to select a trace with box selection mode, added a new analysis tab, and added more frame rendering data to help you investigate rendering issues in your app’s UI. Learn more.

Box selection: In the Threads section, you can now drag your mouse to perform a box selection of a rectangular area, which you can zoom into by clicking the Zoom to Selection button on the top right (or use the M keyboard shortcut). When you drag and drop similar threads next to each other, you can select across multiple threads to inspect all of them at once.

Use box selection to more easily select traces.


Summary tab: The new Summary tab in the Analysis panel displays:

  • Aggregate statistics for all occurrences of a specific event, such as an occurrence count and min/max duration.
  • Trace event statistics for the selected occurrence.
  • Data about thread state distribution.
  • Longest-running occurrences of the selected trace event.

View aggregated statistics in the Summary tab

Display data: In the Display section, new timelines for SurfaceFlinger and VSYNC help you investigate rendering issues in your app's UI.

Standalone profilers

It's now possible to access the Android Studio Profilers in a separate window from the primary Android Studio window. This is useful when optimizing Android games built with other tools like Unity or Visual Studio.

To run the standalone profilers, do the following:

  1. Make sure the profilers in Android Studio are not already running on your system.
  2. Go to the installation directory and navigate to the bin directory:

Windows/Linux: <studio-installation-folder>\bin

macOS: <studio-installation-folder>/Contents/bin

  3. Depending on your OS, run profiler.exe or profiler.sh.

The standalone profiler will allow you to connect to the Android emulator or any connected devices.


Optimize your app with the Standalone Android Studio Profilers

Native Memory Profiler

Tracking native memory usage is important for game developers and other developers using C++ to understand how to optimize their app’s memory consumption. The Android Studio Memory Profiler now includes a Native Memory Profiler for apps deployed to physical devices running Android 10 or later. The Native Memory Profiler tracks allocations/deallocations of objects in native code for a specific time period and provides information about total allocations and remaining system heap size.

To initiate a recording, click Record native allocations at the top of the Memory Profiler window:


View native memory allocations with the Native Memory Profiler

To recap, Android Studio 4.1 includes these new enhancements & features:

Design

  • Material Design Components updates

Develop

  • Database Inspector
  • Run Android Emulator directly in Android Studio
  • Dagger navigation support
  • Use TensorFlow Lite models

Build & Test

  • Android Emulator - Foldable Hinge Support
  • Apply Changes updates
  • Export C/C++ dependencies from AARs
  • Symbolication for native crash reports

Optimize

  • System Trace UI Improvements
  • Standalone profilers
  • Native Memory Profiler

These materials are not sponsored by or affiliated with Unity Technologies or its affiliates. “Unity” is a trademark or registered trademark of Unity Technologies or its affiliates in the U.S. and elsewhere.

11 Weeks of Android: That’s a wrap


This is the final blog post for #11WeeksOfAndroid. Thank you for joining us over the past 11 weeks as we dove into key areas of Android development. In case you missed it, here’s a recap of everything we talked about during each week:

Week 1 - People and identity

Discover how to implement the conversation shortcut and bubbles with ‘conversation notifications’. Also, learn more about conversation additions and other System UI news, and discover the people and conversations developer documentation here. Finally, you can also listen to the Android Backstage podcast where the System UI team is interviewed on people and bubbles.

To tackle user and developer complexity that makes identity a challenge for developers, we've been working on One Tap and Block Store, part of our new Google Identity Services Library.

If you’re interested in learning more about Identity, we published the video “Identity on Android: what’s new in sign-in,” where Vishal explains the new libraries in the Google Identity System.

Two teams that worked very early with us are the Facebook Messenger team and the direct messaging team from Twitter. Read the story from Twitter here and find out how we worked with Facebook on the implementation here.

Find out more with the People and Identity learning path, playlist, and the week’s wrap-up blog post.

Week 2 - Machine learning

We kicked off the week by announcing the winners of the #AndroidDevChallenge! Check out all the winning apps and see how they used ML Kit and TensorFlow Lite, all focused on demonstrating how machine learning can come to life in a powerful way to help users get things done, like an app to help the visually impaired navigate crowded spaces or another to help students learn sign language.

We recently made ML Kit a standalone SDK and it no longer requires a Firebase account. Just one line in your build.gradle file and you can start bringing ML functionality into your app.
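
For example, pulling in the on-device image labeling API looks roughly like this (shown with the Gradle Kotlin DSL; the version number is illustrative, so check the ML Kit release notes for the current one):

dependencies {
    // Standalone ML Kit image labeling - no Firebase project required.
    implementation("com.google.mlkit:image-labeling:16.0.0")
}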

Another much anticipated addition is the support for swapping Google models with your own for both Image Labeling as well as Object Detection and Tracking.

Find out about the importance of finding the unique intersection of user problems and ML strengths and how the People + AI Guidebook can help you make ML product decisions. Check out the interview with the Read Along team for more inspiration.

This week we also highlighted how adding a custom model to your Android app has never been easier.

Finally, try out our codelabs:

Find out more with the Machine Learning pathway, playlist, and the week’s wrap-up blog post.

Week 3 - Privacy and security

As shared in the “Privacy and Security” blog post, we’re giving users even more control and transparency over user data access.

In Android 11, we introduced various privacy improvements such as one time permissions that let users give an app access to the device microphone, camera, or location, just that one time. Learn more about building privacy-friendly apps with these new changes. You can also learn about various Android security updates in this video.

Other notable updates include:

  • Permissions auto-reset: If users haven’t used an app that targets Android 11 for an extended period of time, the system will “auto-reset” all of the granted runtime permissions associated with the app and notify the user.
  • Data access auditing APIs: In Android 11, developers will have access to new APIs that will give them more transparency into their app’s usage of private and protected data. Learn more about new tools in Android 11 to make your apps more private and stable.
  • Scoped Storage: In Android 11, scoped storage will be mandatory for all apps that target API level 30. Learn more and check out the storage FAQ.
  • Google Play system updates: Google Play system updates were introduced with Android 10 as part of Project Mainline, making it easier to bring core OS component updates to users.
  • Jetpack Biometric library: The library has been updated to include new BiometricPrompt features in Android 11 in a backward-compatible way (a minimal usage sketch follows this list).
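
Here is a minimal sketch of showing a prompt with the Jetpack Biometric library; the classes come from androidx.biometric, while the strings and callback body are placeholders:

import androidx.biometric.BiometricPrompt
import androidx.core.content.ContextCompat
import androidx.fragment.app.FragmentActivity

// Minimal sketch: show a biometric prompt from an activity.
fun showBiometricPrompt(activity: FragmentActivity) {
    val executor = ContextCompat.getMainExecutor(activity)
    val prompt = BiometricPrompt(activity, executor,
        object : BiometricPrompt.AuthenticationCallback() {
            override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
                // Unlock the biometric-gated feature here.
            }
        })

    val promptInfo = BiometricPrompt.PromptInfo.Builder()
        .setTitle("Sign in")
        .setNegativeButtonText("Cancel")
        .build()

    prompt.authenticate(promptInfo)
}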

Find out more with the ‘privacy, trust and security’ learning pathway, playlist, and documentation on privacy and security best practices.

Week 4 - Android 11 compatibility

We shipped the second Beta of Android 11 and added a new release milestone called Platform Stability to clearly signal to developers that all APIs and system behaviors are complete. Find out more about Beta 2 and platform stability, including what this milestone means for developers, and the Android 11 timeline. Note: since week #4, we shipped the third and final beta and are getting close to releasing Android 11 to AOSP and the ecosystem. Be sure to check that your apps are working!

To get your apps ready for Android 11, check out some of these helpful resources:

In our “Accelerating Android updates” blog post, we looked at how we’re continuing to get the latest OS to reach critical mass by expanding Android’s updatability architecture.

We also highlighted Excelliance Tech, who recently moved their LeBian SDK away from non-SDK interfaces, toward stable, official APIs so they can stay more compatible with the Android OS over time. Check out the Excelliance Tech story.

Find out more with the Android 11 Compatibility learning pathway, playlist, and the week’s wrap-up blog post.

Week 5 - Languages

With the Android 11 beta, we further improved the developer experience for Kotlin on Android by officially recommending coroutines for asynchronous work. If you’re new to coroutines, check out:

Also, check out our new Kotlin case studies page for the latest case studies and data, including the new Google Home case study, and our state of Kotlin on Android video. For beginners, we announced the launch of our new Android basics in Kotlin course.

If you’re a Java language developer, watch the support for newer Java APIs video to learn how we’ve made newer OpenJDK libraries available across versions of Android. With Android 11, we also updated the Android runtime to make app startup even faster with I/O prefetching.

Android 11 included updates across the native toolchain, including better tools for profile-guided optimization (PGO) and improvements to native dependency management in Android Studio 4.0.

Finally, we continue to focus on improvements to the D8 and R8 compilers in Android Studio with better support for Kotlin in the R8 shrinker. Learn more.

Find out more with the languages learning pathway, playlist, and the week’s wrap-up blog post.

Week 6 - Android Jetpack

Interested in what’s new in Jetpack? Check out the #Android11 Beta launch with a quick fly-by introducing many of the updates to our libraries, with tips on how to get started.

  • Dive deeper into major releases like Hilt, with cheat sheets to help you get started, and learn how we migrated our own samples to use Hilt for dependency injection. Less boilerplate = more fun.
  • Discover more about Paging 3.0, a complete rewrite of the library using Kotlin coroutines and adding features like improved error handling, better transformations, and much more.
  • Get to know CameraX Beta, and learn how it helps developers manage edge cases across different devices and OS versions, so that you don’t have to.

This year, we've made several major improvements with the release of Navigation 2.3, which allows you to navigate between different screens of your app with ease while also allowing you to follow Android UI principles.

In Android 11, we continued our work to give users even more control over sensitive permissions. Now there are type-safe contracts for common intents and more via new ActivityResult APIs. These changes simplify how you request permissions, and we’ll continue to work on making permissions easier in the future.
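
For instance, requesting a single runtime permission with the new contracts looks roughly like this; the permission and the callback bodies are placeholders:

import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class CameraActivity : AppCompatActivity() {

    // Type-safe contract: no request codes and no manual onRequestPermissionsResult parsing.
    private val requestCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) {
                // Start the camera.
            } else {
                // Explain why the feature is unavailable.
            }
        }

    fun onTakePhotoClicked() {
        requestCamera.launch(Manifest.permission.CAMERA)
    }
}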

Also learn about our recent releases of the AppStartup library as well as what’s new in WorkManager.

Find out more with the Jetpack learning pathway, playlist, and the week’s wrap-up blog post.

Week 7 - Android developer tools

We have brought together an overview of what is new in Android Developer tools.

Check out the latest updates in design tools, and go even deeper:

Also, find out about debugging your layouts, with updates to the layout inspector. Discover the latest developments for Jetpack Compose Design tools, and also how to use the new database inspector in Android Studio.

Discover the latest development tools we have in place for Jetpack Hilt in Android Studio.

Learn about the build system in Android developer tools:

To learn about the latest updates on virtual testing, read this blog on the Android Emulator. Lastly, to see the latest changes for performance tools, watch the performance profilers content about System Trace. Additionally, check out more about C++ memory profiling with Android Studio 4.1.

Find out more with the Android developer tools learning pathway, playlist, and the week’s wrap-up blog post.

Week 8 - App distribution and monetization

Check out our webinars about the new Google Play Console beta if you weren’t able to tune in live.

We shared recent improvements we’ve made to app bundles, as well as our intention to require new apps and games to publish with this format in the second half of 2021. The new in-app review API means you can now ask for ratings and reviews from within your app!

Don’t forget about our policy around more transparent subscriptions to help increase user trust in Google Play Billing. We also expanded our feature set to help you better reach and retain buyers, and launched Play Billing Library 3, which will be required by mid-2021.

Google Play Pass launched in nine new markets last month. Developers using both Google Play Pass and direct billing on Google Play have earned an average of 2.5 times their US Play Store-only revenue, without diminishing Google Play store earnings. Learn more and express interest in joining.

Find out more with the app distribution and monetization learning pathway, playlist, and the week’s wrap-up blog post.

Week 9 - Android beyond phones

Check out some of the highlights from this week, including:

Find out more with the learning pathways for Android TV and Large Screens, Beyond phones playlist, and the week’s wrap-up blog post.

Week 10 - Games and media

We shared several games updates and presented a special "11 Weeks" episode of The Android Game Developer Show.

You can also take advantage of Android 11's new media controls by making sure your app is using MediaStyle with a valid MediaSession token. Learn how to support media resumption by making your app discoverable with a MediaBrowserServiceCompat, using the EXTRA_RECENT hint to help with resuming content, and handling the onPlay and onGetRoot callbacks. Then check out how to leverage the MediaRouter jetpack library and check out the updated version of the UAMP sample.
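
As a minimal sketch of the MediaStyle piece (channel setup, metadata, and the media resumption callbacks are omitted; names here are placeholders):

import android.content.Context
import android.support.v4.media.session.MediaSessionCompat
import androidx.core.app.NotificationCompat
import androidx.media.app.NotificationCompat.MediaStyle

// Minimal sketch: a playback notification keyed off a valid MediaSession token,
// which Android 11 can surface in the new media controls.
fun buildMediaNotification(
    context: Context,
    channelId: String,
    session: MediaSessionCompat
): NotificationCompat.Builder =
    NotificationCompat.Builder(context, channelId)
        .setSmallIcon(android.R.drawable.ic_media_play)
        .setContentTitle("Now playing")
        .setStyle(MediaStyle().setMediaSession(session.sessionToken))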

Finally, we covered some of the primary ways apps can benefit from 5G. Android 11 adds new APIs and updates existing APIs to help ensure you have all the tools you need to leverage the capabilities of 5G, such as an enhanced bandwidth estimation API, 5G detection capabilities, and a new meteredness flag from cellular carriers. The Android emulator now enables you to develop and test these APIs without needing a 5G device or network connection. All of this and more is available from our dedicated 5G page.
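
On the connectivity side, a default network callback is one way to observe the bandwidth estimate and the new meteredness flag; this is a hedged sketch (it assumes the ACCESS_NETWORK_STATE permission, and how your app reacts is up to you):

import android.content.Context
import android.net.ConnectivityManager
import android.net.Network
import android.net.NetworkCapabilities

// Minimal sketch: watch the default network for bandwidth and meteredness changes.
fun watchNetwork(context: Context) {
    val cm = context.getSystemService(ConnectivityManager::class.java)
    cm.registerDefaultNetworkCallback(object : ConnectivityManager.NetworkCallback() {
        override fun onCapabilitiesChanged(network: Network, caps: NetworkCapabilities) {
            val estimatedKbps = caps.linkDownstreamBandwidthKbps
            val unmeteredForNow = caps.hasCapability(
                NetworkCapabilities.NET_CAPABILITY_TEMPORARILY_NOT_METERED // API 30+
            )
            // e.g. switch to higher-bitrate streams when bandwidth and meteredness allow.
        }
    })
}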

Find out more with the ‘games and media’ learning pathway, playlist, and the wrap-up blog post, and visit d.android.com/games to stay up to date on all of our tools and resources for game developers.

Week 11 - UI

In our final week, we released 4 new codelabs, 9 new samples, new documentation and a podcast from the Compose team. If you prefer videos; we’ve got you covered:

New in Android 11 is the ability for apps to create seamless transitions between the on-screen keyboard being opened and closed. To find out how to add this to your app, slide on over to the video, blog posts and sample app.

We recommend following the Material Design guidelines to ensure that apps operate consistently, enabling patterns learned in one app to be used in another. Find out more about Material Theming (color, type and shape), dark theme and Material’s motion system using the Material Design Components (MDC) library. If you haven’t already migrated to MDC, then check out our migration guide.

It even becomes possible to ease your migration with libraries like the new MDC-Android Compose Theme Adapter which converts an MDC XML theme into a Compose `MaterialTheme`.

Find out more with the Compose learning pathway, the Modern UI learning pathway, playlist, and the week’s wrap-up blog post.

Resources

You can find the entire playlist of #11WeeksOfAndroid video content here. Follow us on Twitter and YouTube, and subscribe to our email list to receive all the latest news and resources. Thanks so much for letting us be a part of this experience with you!

11 Weeks of Android: UI and Compose

Posted by Chris Banes & Nick Butcher


This blog post is part of a weekly series for #11WeeksOfAndroid. Each week we’re diving into a key area of Android so you don’t miss anything. This week, we spotlighted Android UI and Jetpack Compose; here’s a look at what you should know.

The big news: Jetpack Compose Alpha

This week we released the first alpha of Jetpack Compose, Android’s modern UI toolkit with native access to the platform APIs. Compose combines the power of Kotlin with the reactive programming model to make it easier and faster to build UI. We want your feedback to help us build the APIs that you need in your apps, so now is the time to try it out.

To get you up to speed with Compose, this week we’ve released 4 new codelabs, 7 new samples, new documentation and a podcast from the Compose team. If you prefer videos; we’ve got you covered...

To understand the reactive mindset and how to think about building apps with Compose, check out ‘Thinking in Compose’:

Learn how Jetpack Compose makes Android UI easier by walking through concrete examples from our open-source sample apps in ‘Compose by Example’:

Finally, to understand how Jetpack Compose and View-based UIs can co-exist and interact, making it easy to adopt Compose at your own pace, check out ‘Compose for Existing Apps’:

Keyboard (IME) animations

New in Android 11 is the ability for apps to create seamless transitions between the on-screen keyboard being opened and closed, as well as a revamped WindowInsets API to enable control of things such as the keyboard (IME). To find out how to add this to your app, slide on over to the video, blog posts and sample app.
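
If you only need to react to the keyboard appearing or disappearing, the compat insets API makes that a short listener; this sketch assumes androidx.core 1.5.0 or newer (the full animation callback is covered in the linked video and sample):

import android.view.View
import androidx.core.view.ViewCompat
import androidx.core.view.WindowInsetsCompat

// Minimal sketch: observe IME visibility and height with the WindowInsets compat API.
fun observeIme(root: View, onImeChanged: (visible: Boolean, heightPx: Int) -> Unit) {
    ViewCompat.setOnApplyWindowInsetsListener(root) { _, insets ->
        val imeVisible = insets.isVisible(WindowInsetsCompat.Type.ime())
        val imeHeight = insets.getInsets(WindowInsetsCompat.Type.ime()).bottom
        onImeChanged(imeVisible, imeHeight)
        insets
    }
}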

Material Design Components

We recommend following the Material Design guidelines to ensure that apps operate consistently, so that patterns learned in one app can be used in another. Check out our new blog posts on Material Theming (color, type and shape), dark theme and Material’s motion system using the Material Design Components (MDC) library.

Adopting MDC now will prepare your codebase for later adopting Jetpack Compose — it uses the same concepts, design vocabulary and components. It even becomes possible to ease your migration with libraries like the new MDC-Android Compose Theme Adapter which converts an MDC XML theme into a Compose `MaterialTheme`.
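
In practice that bridge is a single wrapper composable; this sketch assumes the compose-theme-adapter artifact is on the classpath, and the content inside is a placeholder:

import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import com.google.android.material.composethemeadapter.MdcTheme

// Minimal sketch: reuse colors, typography and shapes from the MDC XML theme in Compose.
@Composable
fun GreetingCard() {
    MdcTheme {
        // Composables in here read MaterialTheme values derived from the XML theme.
        Text(text = "Hello from Compose")
    }
}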

If you haven’t already migrated to MDC, then check out our migration guide.

Learning path

If you’re looking for an easy way to pick up the highlights of this week, you can check out the learning pathways. This week we have two pathways for you to go through: the Compose pathway, and the ‘Modern UI’ pathway.

A pathway is an ordered tutorial that allows users to complete a pre-defined module that culminates in a quiz. It may include codelabs, videos, articles and blog posts. A virtual badge is awarded to each user who passes the quiz. Test your knowledge in each pathway to earn a limited edition badge.

Key takeaways

Whether you're building with the current UI toolkit or getting ready for the next generation we hope that the resources that we’ve shared this week help you to create beautiful, engaging UIs that your users will love. Thanks to everyone who tuned in or joined us for the AMA. Follow the Modern UI pathway to learn how to leverage Material Design, animation or the latest Android 11 features. Take the Compose pathway to learn about the future of Android UI development and help shape it with your feedback.

Resources

You can find the entire playlist of #11WeeksOfAndroid video content here, and learn more about each week here. We’ll continue to spotlight new areas each week, so keep an eye out and follow us on Twitter and YouTube. Thanks so much for letting us be a part of this experience with you!

ML Kit Pose Detection Makes Staying Active at Home Easier

Posted by Kenny Sulaimon, Product Manager, ML Kit; Chengji Yan and Areeba Abid, Software Engineers, ML Kit


Two months ago we introduced the standalone version of the ML Kit SDK, making it even easier to integrate on-device machine learning into mobile apps. Since then we’ve launched the Digital Ink Recognition API, and also introduced the ML Kit early access program. Our first two early access APIs were Pose Detection and Entity Extraction. We’ve received an overwhelming amount of interest in these new APIs and today, we are thrilled to officially add Pose Detection to the ML Kit lineup.

ML Kit Overview

A New ML Kit API, Pose Detection


Examples of ML Kit Pose Detection

ML Kit Pose Detection is an on-device, cross platform (Android and iOS), lightweight solution that tracks a subject's physical actions in real time. With this technology, building a one-of-a-kind experience for your users is easier than ever.

The API produces a full body 33 point skeletal match that includes facial landmarks (ears, eyes, mouth, and nose), along with hands and feet tracking. The API was also trained on a variety of complex athletic poses, such as Yoga positions.


Skeleton image detailing all 33 landmark points

Under The Hood

Diagram of the ML Kit Pose Detection Pipeline

The power of the ML Kit Pose Detection API is in its ease of use. The API builds on the cutting edge BlazePose pipeline and allows developers to build great experiences on Android and iOS, with little effort. We offer a full body model, support for both video and static image use cases, and have added multiple pre and post processing improvements to help developers get started with only a few lines of code.

The ML Kit Pose Detection API utilizes a two step process for detecting poses. First, the API combines an ultra-fast face detector with a prominent person detection algorithm, in order to detect when a person has entered the scene. The API is capable of detecting a single (highest confidence) person in the scene and requires the face of the user to be present in order to ensure optimal results.

Next, the API applies a full body, 33 landmark point skeleton to the detected person. These points are rendered in 2D space and do not account for depth. The API also contains a streaming mode option for further performance and latency optimization. When enabled, instead of running person detection on every frame, the API only runs this detector when the previous frame no longer detects a pose.

The ML Kit Pose Detection API also features two operating modes, “Fast” and “Accurate”. With the “Fast” mode enabled, you can expect a frame rate of around 30+ FPS on a modern Android device, such as a Pixel 4, and 45+ FPS on a modern iOS device, such as an iPhone X. With the “Accurate” mode enabled, you can expect more stable x,y coordinates on both types of devices, but a slower frame rate overall.

Lastly, we’ve also added a per point “InFrameLikelihood” score to help app developers ensure their users are in the right position and filter out extraneous points. This score is calculated during the landmark detection phase and a low likelihood score suggests that a landmark is outside the image frame.
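
Here is a minimal Kotlin sketch of calling the API on a single image (options, lifecycle handling, and error handling are trimmed; see the developer documentation for the full setup):

import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.pose.PoseDetection
import com.google.mlkit.vision.pose.PoseLandmark
import com.google.mlkit.vision.pose.defaults.PoseDetectorOptions

// Minimal sketch: detect a pose in a single bitmap with the default ("Fast") model.
fun detectPose(bitmap: Bitmap) {
    val options = PoseDetectorOptions.Builder()
        .setDetectorMode(PoseDetectorOptions.SINGLE_IMAGE_MODE)
        .build()
    val detector = PoseDetection.getClient(options)

    detector.process(InputImage.fromBitmap(bitmap, 0))
        .addOnSuccessListener { pose ->
            val nose = pose.getPoseLandmark(PoseLandmark.NOSE)
            // nose?.position gives the 2D coordinates; inFrameLikelihood indicates confidence.
        }
        .addOnFailureListener { /* handle the error */ }
}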

Real World Applications


Examples of a pushup and squat counter using ML Kit Pose Detection

Keeping up with regular physical activity is one of the hardest things to do while at home. We often rely on gym buddies or physical trainers to help us with our workouts, but this has become increasingly difficult. Apps and technology can often help with this, but with existing solutions, many app developers are still struggling to understand and provide feedback on a user’s movement in real time. ML Kit Pose Detection aims to make this problem a whole lot easier.

The most common applications for Pose detection are fitness and yoga trackers. It’s possible to use our API to track pushups, squats and a variety of other physical activities in real time. These complex use cases can be achieved by using the output of the API, either with angle heuristics, tracking the distance between joints, or with your own proprietary classifier model.

To get you jump started with classifying poses, we are sharing additional tips on how to use angle heuristics to classify popular yoga poses. Check it out here.
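
The heuristic itself is a few lines of trigonometry; this sketch computes the angle formed at a middle joint (for example, shoulder-elbow-wrist), which a pushup or squat counter can then compare against thresholds:

import com.google.mlkit.vision.pose.PoseLandmark
import kotlin.math.abs
import kotlin.math.atan2

// Sketch of an angle heuristic: the angle in degrees formed at midPoint by the
// segments toward firstPoint and lastPoint, e.g. shoulder-elbow-wrist for an arm.
fun jointAngle(firstPoint: PoseLandmark, midPoint: PoseLandmark, lastPoint: PoseLandmark): Double {
    var angle = Math.toDegrees(
        (atan2(lastPoint.position.y - midPoint.position.y,
               lastPoint.position.x - midPoint.position.x) -
         atan2(firstPoint.position.y - midPoint.position.y,
               firstPoint.position.x - midPoint.position.x)).toDouble()
    )
    angle = abs(angle)
    if (angle > 180) angle = 360.0 - angle
    return angle
}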

Learning to Dance Without Leaving Home

Learning a new skill is always tough, but learning to dance without the aid of a real time instructor is even tougher. One of our early access partners, Groovetime, has set out to solve this problem.

With the power of ML Kit Pose Detection, Groovetime allows users to learn their favorite dance moves from popular short-form dance videos, while giving users automated real time feedback on their technique. You can join their early access beta here.

Groovetime App using ML Kit Pose Detection

Staying Active Wherever You Are

Our Pose Detection API is also helping adidas Training, another one of our early access partners, build a virtual workout experience that will help you stay active no matter where you are. This one-of-a-kind innovation will help analyze and give feedback on the user’s movements, using nothing more than just your phone. Integration into the adidas Training app is still in the early phases of the development cycle, but stay tuned for more updates in the future.

How to get started?

If you would like to start using the Pose Detection API in your mobile app, head over to the developer documentation or check out the sample apps for Android and iOS to see the API in action. For questions or feedback, please reach out to us through one of our community channels.

11 Weeks of Android: Beyond phones

Posted by Product Leads for Android TV, Android for cars, Wear OS, and Chrome OS


This blog post is part of a weekly series for #11WeeksOfAndroid. Each week we’re diving into a key area of Android so you don’t miss anything. This week, we spotlighted Android Beyond Phones; here’s a look at what you should know.

With Android, people can experience the apps and services that they love across many devices & surfaces. Beyond phones, Android offers distinct yet familiar experiences on devices of all shapes and sizes, ranging from the smallest smartwatch screens, to larger displays on foldables and Chromebooks, to in-car entertainment systems, and all the way up to the largest television screens. For the past 4 days, we featured a daily deep dive into each of these exciting form factors that are providing developers with new and growing ways to engage people. Read on for a recap.

Android TV

We kicked off the week with Android TV, which is now partnering with 7 of the top 10 OEMs and over 160 television operators across the globe. The Android TV team highlighted 6 upcoming launches, like instant app trials right from Google Play and an updated Gboard, to help developers acquire more users, more easily monetize, and build even more engaging experiences. Then, new resources were published to help developers build their first Android TV app, or even go deep on new integrations like Cast Connect and frictionless subscriptions. If you’re excited about developing for TV, pick up an ADT-3, learn the latest from the training pathway, and bring your Android app to the biggest screen in the home!

Android for Cars

We shared new ways to reach more drivers on Android for cars. Android Auto, which allows you to connect your phone to your car display, is currently available with nearly every major car manufacturer and is on track to reach 100 million cars. Soon, new app categories including navigation, parking, and electric vehicle charging will be available and the experience for Android Auto users will become even more seamless as car manufacturers continue to add support for wireless connectivity. We also highlighted the launch of the first car powered by Android Automotive OS with Google apps and services built in — the Polestar 2. As more manufacturers ship cars with this embedded functionality, we’re making it even easier for developers to build media apps on Android Automotive OS with updated documentation and emulators. Get started today to bring your app to cars!

Large Screens

We covered large screens starting with the announcement of ChromeOS.dev — a dedicated resource for technical developers, designers, product managers, and business leaders. We’ve showcased our growth in device sales in Chrome OS, with Chromebook unit sales growing 127% year over year between March and June this year1, as well as some of the new features coming, such as customizable Linux Terminal and Android Emulator support. We’ve also continued to see growth in apps optimized for larger screen experiences, with over 1 million apps optimized for tablets and large screens available in the Google Play store. To help you develop the best-in-class apps for Chrome OS, foldables and tablets, we are continuing to release new features and updates. We released new design recommendations for your apps as well as a few updates in Android Studio. Check out the new sessions and some of the resurfaced content to learn more about bringing the best experiences to your users on these large-screen devices.

Wear OS

To round out the week, we talked about Wear OS where we are investing in the fundamentals with a focus on performance and faster app startup times, which you’ll see in the latest platform release coming in the fall. Wear OS will be launching updates to cornerstone features like Weather in the coming months, and is investing in helpful experiences, such as our recent hand-wash timer to help people maintain hand-hygiene in the Covid-19 pandemic. The team is working hard to bring the best of Android 11 to Wear OS.

Learning paths

Are you building your apps with different screen sizes and form factors in mind? Check out all the resources for Wear OS and Android for cars, and if you’re looking for an easy way to pick up the highlights of this week for Android TV and large screens, consider completing the pathway for each. These include codelabs, videos, articles and blog posts. A virtual badge is awarded to each user who passes the quiz.

We hope that you found the Android Beyond Phones week useful, and we're excited to see all the great experiences that you'll build on these platforms!

Resources

You can find the entire playlist of #11WeeksOfAndroid video content here, and learn more about each week here. We’ll continue to spotlight new areas each week, so keep an eye out and follow us on Twitter and YouTube. Thanks so much for letting us be a part of this experience with you!

Sources:
1 The NPD Group, Inc., U.S. Retail Tracking Service, Notebook Computers, based on unit sales, April–June 2020 and March–June 2020​.

New ways to reach more drivers on Android for cars

Posted by Mickey Kataria, Director of Product Management, Android for cars

This blog post is part of a weekly series for #11WeeksOfAndroid. For each week, we’re diving into a key area and this week we’re focusing on Android Beyond Phones. Today, we’ll be talking about cars.

Since 2014, Google has been committed to bringing the familiarity of apps and services from Android phones into the car in a safe and seamless way. We’re continuing to see strong momentum and adoption of both Android Auto and Android Automotive OS, and are excited to share new improvements that provide app developers the opportunity to reach more users in the car.

Android Auto momentum

We launched Android Auto for users to stay connected on the go and more easily access their Android phones on their car displays, while staying focused on the road. Android Auto is currently available with nearly every major car manufacturer and is on track to be in more than 100 million cars in the coming months. Many car manufacturers, including General Motors, BMW and Kia, have also added support for wireless connections, making it easier for drivers to use Android Auto as soon as they get into their car. We’re continuing to add new features to make the experience more seamless for users and help developers reach more drivers with in-car apps.

Expanding Android Auto’s app ecosystem

One of our most common requests for Android Auto continues to be support for more apps in the car. We currently have over 3,000 apps in Google Play whose in-car experiences have been purpose-built for driving.

Today, we’re showcasing our work with early access partners to build apps in new categories for Android Auto, including navigation, parking and electric vehicle charging. Using our new Android for Cars App Library, we’re able to ensure that all tasks within an app can be achieved with minimal glances or taps.


Early access partners for new apps on Android Auto

To mitigate driver distraction, we collaborated with government, industry and academic institutions to develop our own best practice guidelines that we apply to every aspect of our product development process. With our standard templates and guidelines, developers have the tools to easily optimize their apps for cars, without needing to become an expert in driver distraction.

Our early access partners will be releasing new apps to their beta testers by the end of this year. Pending additional testing and feedback, we then plan to make these APIs publicly available for all developers to build Android Auto apps in these categories.


We're partnering with some of the leading navigation, parking and electric vehicle charging apps around the world including ChargePoint, SpotHero and Sygic.

Android Automotive OS adoption

More recently, we introduced Android Automotive OS as a full-stack, open source and highly customizable platform powering vehicle infotainment systems. With Android Automotive OS, car manufacturers are able to have apps and services like Google Assistant, Google Maps and Google Play built into vehicles so that a mobile device is not required for common activities like navigation, downloading third-party apps and listening to media. Polestar 2, the first car running Android Automotive OS with Google built in, is now on the road and available for customers globally. In addition, Volvo Cars, Renault, General Motors and more have announced plans for infotainment systems powered by Android Automotive OS with Google apps and services built-in.

Extending the reach of media apps in cars

As more manufacturers begin to ship cars with infotainment systems powered by Android Automotive OS, developers have the opportunity to deliver a seamless media experience using Google Play in the car. If you already have a media app for Android Auto, you can extend the reach by adding support for Android Automotive OS. The process for porting over your apps is simple, with most of the work already done; just follow these steps.

Making it easier to develop media apps for Android Automotive OS

For the past year, we have been on a journey to allow app developers to design, develop, test and publish media apps directly on Google Play in the car. We are happy to share that this is now possible.


Polestar 2 and Google Generic Automotive system images for Android emulator

We have made updates to the Android Automotive OS design guidelines and development documentation for you to add support for your media apps. We also launched updates to the emulator to include Google Assistant, Google Maps and Google Play, so you can develop and test your apps in an environment that more closely mirrors the software in the car. The Polestar 2 system image enables you to test your app on similar software that is available on the road today. Lastly, the Play Console now accepts Android Automotive OS APKs, enabling you to simply upload your app for quality review and publishing. These changes allow developers to seamlessly complete the end-to-end development process for Android Automotive OS.


Google Play features many media apps today, including Spotify, iHeartRadio, NPR One and more.

To learn more about how to create an app for Android Automotive OS, look out for updates or post on the automotive-developers Google Group or Stack Overflow using android-automotive tags.

With new app expansion on Android Auto and improved development tools for Android Automotive OS, developers have more opportunity than ever to reach users with app experiences optimized for the car. Head over to developer.android.com/cars to get started!

Resources

You can find the entire playlist of #11WeeksOfAndroid video content here, and learn more about each week here. We’ll continue to spotlight new areas each week, so keep an eye out and follow us on Twitter and YouTube. Thanks so much for letting us be a part of this experience with you!

11 Weeks of Android: App distribution and monetization on Google Play

Posted by Alex Musil, Director of Product Management, Google Play


This blog post is part of a weekly series for #11WeeksOfAndroid. Each week we’re diving into a key area of Android so you don’t miss anything. This week, we spotlighted app distribution and monetization on Google Play; here’s a look at what you should know.

Thanks for joining us for this week of 11 Weeks of Android, where we focused on app distribution and monetization. The developments we announced will enable you to deliver the exciting improvements to the Android platform you’ve been hearing about since week 1.

Google Play partners with developers to deliver amazing digital experiences to billions of Android users. From the start, we’ve committed to providing the tools and insights you need to reach more users and grow your business. This week, we launched even more features — and improved existing ones — to help you continue to maximize your success.

Key takeaways

  1. We released several webinars about the new Google Play Console beta. Check out the videos if you weren’t able to tune in live.
  2. We shared recent improvements we’ve made to app bundles, as well as our intention to require new apps and games to publish with this format in the second half of 2021.
  3. Developers can now ask for ratings and reviews from within your app with the new in-app review API.
  4. To increase user trust in our billing platform, we made some product updates and reminded you of our policy around more transparent subscriptions. We also expanded our feature set to help you better reach and retain buyers, and launched Play Billing Library 3, which will be required by mid-2021.
  5. Google Play Pass launched in nine new markets last month. With an innovative revenue model, participating titles together have earned 2.5x the revenue of Google Play Store-only sales, without diminishing Play Store earnings. You can learn more and express interest in joining.

Google Play Console beta

Thank you to everyone who has already shared their feedback on the new Google Play Console beta, which launched a few months ago at play.google.com/console. As we’ve continued to update the beta, we’ve launched a number of key releases including:

  • Major performance increases across different browsers, which many of you requested
  • New menus and headers on mobile for a better responsive experience
  • Features including Inbox (your Google Play Console messaging hub) and enhanced subscription retention reports

Earlier this week, we hosted three webinars to get you up to speed on what’s new and what’s changed from the classic Play Console. If you weren’t able to tune in live, you can watch the videos on demand below.

If you’re just getting started, join Google Play Console’s lead engineer, Dan White, for a look at new features like Inbox, policy status, app content, and enhanced team management capabilities.

To help you release with even more confidence, check out this webinar with Google Play UX designer Matt McGriskin, who will walk you through the new testing and publishing workflow.

Finally, if you want to grow your audience, join Google Play engineer Ryan Fanelli for app store optimization best practices and an overview of the new acquisition reports.

You can also take our Play Console Play Academy course. And if you haven’t already, please opt in to 2-Step Verification to sign into Google Play Console, which will be required later this year.

Android App Bundle

We’re glad so many of you are already using the Android App Bundle to release your apps and games. We’re continuing to make app bundles a better publishing format with several recent improvements:

  • The recently-launched Play Asset Delivery brings the benefits of app bundles to games and allows developers to improve the user experience while cutting delivery costs and reducing the size of their game
  • You can now shrink resources when building modular apps
  • Install-time modules are now automatically fused by default when app bundles are processed into distribution APKs
  • Feature-to-feature dependency is now stable in Android Studio 4.0

If you haven’t switched to the app bundle yet, we’ve published some FAQs on Play App Signing—which is required for app bundles—as well as guidance on how to test your app bundle. Check out our recent blog post to find out more about the recent improvements we’ve made to developing, testing, and publishing with app bundles.

As we announced as part of the Android 11 Beta launch, we intend to require new apps to publish with the Android App Bundle on Google Play in the second half of 2021. This means that we will also be deprecating APK expansion files (OBBs) and making Play Asset Delivery the standard for publishing games larger than 150MB.

In-app review API

Because ratings and reviews are such an important touchpoint with your users, many of you asked us to give users the ability to leave a review from within your app. Now, with the new in-app review API, you can do just that. Choose when to prompt users for a review and get feedback when it’s most valuable. The in-app review API is available now in the Play Core Library.

We've also released a unified sample for Play Core APIs, which includes in-app reviews as well as on-demand feature modules and in-app updates. Check it out to learn how to use these APIs using our Play Core Kotlin extensions artifact, which makes working with Play Core easier for Kotlin users.
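
Requesting the flow is a short sequence with the Play Core library; this sketch omits error handling, and note that Play itself decides whether the dialog is actually displayed:

import android.app.Activity
import com.google.android.play.core.review.ReviewManagerFactory

// Minimal sketch: ask Play to show the in-app review dialog at a natural break point.
fun askForReview(activity: Activity) {
    val manager = ReviewManagerFactory.create(activity)
    manager.requestReviewFlow().addOnCompleteListener { request ->
        if (request.isSuccessful) {
            // Quotas apply; the flow may complete without showing any UI.
            manager.launchReviewFlow(activity, request.result)
        }
    }
}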

Google Play Commerce

We’ve made a number of updates to Play Commerce aimed at building user trust through clearer, easier payment experiences. The user trust policies we announced in April offer users greater transparency, safer trial experiences, and easier cancellations.

We also launched Play Billing Library 3, which supports cash payments, a better subscription promo code redemption experience, purchase attribution, and more. Play Billing Library 3 will be mandatory for all new apps starting August 2, 2021.

For more information, check out this session with Mrinalini Loew, Group Project Manager for Google Play Commerce.

We’ve also just kicked off a six-article series on Google Play Billing, which you can follow here on Medium.

Google Play Pass

Google Play Pass enables developers to earn additional revenue and connect with untapped audiences by offering experiences free of ads and in-app purchases. Since launching last September, Play Pass has added over 200 new titles to the catalog, from puzzles and racing games to utility and kid-friendly apps. We’re also excited to celebrate the world premieres of Super Glitch Dash and Element this week as the newest “Premiering on Play Pass” titles.

The expanded catalog has enabled rich user experiences and provided a sustainable stream of revenue for developers using an innovative revenue payout model. In aggregate, titles on Play Pass earn more than 2.5x the revenue compared to their Play Store-only earnings in the US.

Last month, we made Google Play Pass available in nine new markets and gave users the option to get started with either an annual subscription or the existing monthly plan.

Today, we are announcing that developers with in-app subscriptions can now nominate their titles to join Play Pass. If you’re building a great experience that Google Play Pass users would love, you can learn more and express interest in participating.

Learning path

If you’re looking for an easy way to pick up the highlights of this week, check out the app distribution and monetization pathway. Test your knowledge of key takeaways to earn a limited-edition virtual badge.

Thanks for joining us for 11 Weeks of Android! We hope you find these recent announcements and resources helpful in powering your success on Google Play.

Resources

You can find the entire playlist of #11WeeksOfAndroid video content here, and learn more about each week here. We’ll continue to spotlight new areas each week, so keep an eye out and follow us on Twitter and YouTube. Thanks so much for letting us be a part of this experience with you!