
Common media processing operations with Jetpack Media3 Transformer

Posted by Nevin Mital – Developer Relations Engineer, and Kristina Simakova – Engineering Manager

Android users have demonstrated an increasing desire to create, personalize, and share video content online, whether to preserve their memories or to make people laugh. As such, media editing is a cornerstone of many engaging Android apps, and historically developers have often relied on external libraries to handle operations such as trimming and resizing. While these solutions are powerful, integrating and managing external library dependencies can introduce complexity and lead to challenges with performance and quality.

The Jetpack Media3 Transformer APIs offer a native Android solution that streamlines media editing with fast performance, extensive customizability, and broad device compatibility. In this blog post, we’ll walk through some of the most common editing operations with Transformer and discuss its performance.

Getting set up with Transformer

To get started with Transformer, check out our Getting Started documentation for details on how to add the dependency to your project and a basic understanding of the workflow when using Transformer. In a nutshell, you’ll:

    • Create one or more MediaItem instances from your video file(s), then
    • Apply item-specific edits to them by building an EditedMediaItem for each MediaItem,
    • Create a Transformer instance configured with settings applicable to the whole exported video,
    • and finally start the export to save your applied edits to a file.
Aside: You can also use a CompositionPlayer to preview your edits before exporting, but this is out of scope for this blog post, as this API is still a work in progress. Please stay tuned for a future post!

Here’s what this looks like in code:

val mediaItem = MediaItem.Builder().setUri(mediaItemUri).build()
val editedMediaItem = EditedMediaItem.Builder(mediaItem).build()
val transformer = 
  Transformer.Builder(context)
    .addListener(/* Add a Transformer.Listener instance here for completion events */)
    .build()
transformer.start(editedMediaItem, outputFilePath)
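
If it helps, here’s a minimal sketch of what that listener might look like; onCompleted and onError are Transformer.Listener’s completion callbacks, and the log messages are only illustrative.

// Sketch of a Transformer.Listener you could pass to addListener() above.
val transformerListener = object : Transformer.Listener {
  override fun onCompleted(composition: Composition, exportResult: ExportResult) {
    // The export finished successfully; exportResult carries details such as durationMs.
    Log.d("Transformer", "Export completed (${exportResult.durationMs} ms of output)")
  }

  override fun onError(
    composition: Composition,
    exportResult: ExportResult,
    exportException: ExportException
  ) {
    // The export failed; inspect exportException for the cause.
    Log.e("Transformer", "Export failed", exportException)
  }
}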

Transcoding, Trimming, Muting, and Resizing with the Transformer API

Let’s now take a look at four of the most common single-asset media editing operations, starting with Transcoding.

Transcoding is the process of re-encoding an input file into a specified output format. For this example, we’ll request the output to have video in HEVC (H265) and audio in AAC. Starting with the code above, here are the lines that change:

val transformer = 
  Transformer.Builder(context)
    .addListener(...)
    .setVideoMimeType(MimeTypes.VIDEO_H265)
    .setAudioMimeType(MimeTypes.AUDIO_AAC)
    .build()

Many of you may already be familiar with FFmpeg, a popular open-source library for processing media files, so we’ll also include FFmpeg commands for each example to serve as a helpful reference. Here’s how you can perform the same transcoding with FFmpeg:

$ ffmpeg -i $inputVideoPath -c:v libx265 -c:a aac $outputFilePath

The next operation we’ll try is Trimming.

Specifically, we’ll set Transformer up to trim the input video from the 3 second mark to the 8 second mark, resulting in a 5 second output video. Starting again from the code in the “Getting set up” section above, here are the lines that change:

// Configure the trim operation by adding a ClippingConfiguration to
// the media item
val clippingConfiguration =
   MediaItem.ClippingConfiguration.Builder()
     .setStartPositionMs(3000)
     .setEndPositionMs(8000)
     .build()
val mediaItem =
   MediaItem.Builder()
     .setUri(mediaItemUri)
     .setClippingConfiguration(clippingConfiguration)
     .build()

// Transformer also has a trim optimization feature we can enable.
// This will prioritize Transmuxing over Transcoding where possible.
// See more about Transmuxing further down in this post.
val transformer = 
  Transformer.Builder(context)
    .addListener(...)
    .experimentalSetTrimOptimizationEnabled(true)
    .build()

With FFmpeg:

$ ffmpeg -ss 00:00:03 -i $inputVideoPath -t 00:00:05 $outputFilePath

Next, we can mute the audio in the exported video file.

val editedMediaItem = 
  EditedMediaItem.Builder(mediaItem)
    .setRemoveAudio(true)
    .build()

The corresponding FFmpeg command:

$ ffmpeg -i $inputVideoPath -c copy -an $outputFilePath

And for our final example, we’ll try resizing the input video by scaling it down to half its original height and width.

val scaleEffect = 
  ScaleAndRotateTransformation.Builder()
    .setScale(0.5f, 0.5f)
    .build()
val editedMediaItem =
  EditedMediaItem.Builder(mediaItem)
    .setEffects(
      Effects(
        /* audioProcessors= */ emptyList(),
        /* videoEffects= */ listOf(scaleEffect)
      )
    )
    .build()

An FFmpeg command could look like this:

$ ffmpeg -i $inputVideoPath -filter:v "scale=w=trunc(iw/4)*2:h=trunc(ih/4)*2" $outputFilePath

Of course, you can also combine these operations to apply multiple edits on the same video, but hopefully these examples serve to demonstrate that the Transformer APIs make configuring these edits simple.
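
For instance, here’s a sketch that combines the trim, mute, and resize configurations from the examples above into a single export, reusing the same placeholders as the earlier snippets:

val clippingConfiguration =
  MediaItem.ClippingConfiguration.Builder()
    .setStartPositionMs(3000)
    .setEndPositionMs(8000)
    .build()
val mediaItem =
  MediaItem.Builder()
    .setUri(mediaItemUri)
    .setClippingConfiguration(clippingConfiguration)
    .build()
val scaleEffect =
  ScaleAndRotateTransformation.Builder()
    .setScale(0.5f, 0.5f)
    .build()
val editedMediaItem =
  EditedMediaItem.Builder(mediaItem)
    // Mute the audio and scale the video down to half size.
    .setRemoveAudio(true)
    .setEffects(
      Effects(/* audioProcessors= */ emptyList(), /* videoEffects= */ listOf(scaleEffect))
    )
    .build()
val transformer =
  Transformer.Builder(context)
    .addListener(...)
    .setVideoMimeType(MimeTypes.VIDEO_H265)
    .build()
transformer.start(editedMediaItem, outputFilePath)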

Transformer API Performance results

Here are some benchmarking measurements for each of the 4 operations taken with the Stopwatch API, running on a Pixel 9 Pro XL device:

(Note that performance for operations like these can depend on a variety of factors, such as the current load the device is under, so the numbers below should be taken as rough estimates.)

Input video format: 10s 720p H264 video with AAC audio

  • Transcoding to H265 video and AAC audio: ~1300ms
  • Trimming video to 00:03-00:08: ~2300ms
  • Muting audio: ~200ms
  • Resizing video to half height and width: ~1200ms

Input video format: 25s 360p VP8 video with Vorbis audio

  • Transcoding to H265 video and AAC audio: ~3400ms
  • Trimming video to 00:03-00:08: ~1700ms
  • Muting audio: ~1600ms
  • Resizing video to half height and width: ~4800ms

Input video format: 4s 8k H265 video with AAC audio

  • Transcoding to H265 video and AAC audio: ~2300ms
  • Trimming video to 00:03-00:08: ~1800ms
  • Muting audio: ~2000ms
  • Resizing video to half height and width: ~3700ms

One technique Transformer uses to speed up editing operations is prioritizing transmuxing for basic video edits where possible. Transmuxing refers to the process of repackaging video streams without re-encoding, which ensures high-quality output and significantly faster processing times.

When that's not possible, Transformer falls back to transcoding, a process that involves first decoding video samples into raw data, then re-encoding them for storage in a new container. Here's how the two approaches compare:

Transmuxing

    • Transformer’s preferred approach when possible - a quick transformation that preserves elementary streams.
    • Only applicable to basic operations, such as rotating, trimming, or container conversion.
    • No quality loss or bitrate change.

Transcoding

    • Transformer's fallback approach when transmuxing isn't possible - involves decoding and re-encoding elementary streams.
    • More extensive modifications to the input video are possible.
    • Loss in quality due to re-encoding, but can achieve a desired bitrate target.

We are continuously implementing further optimizations, such as the recently introduced experimentalSetTrimOptimizationEnabled setting that we used in the Trimming example above.

A trim is usually performed by re-encoding all the samples in the file, but since encoded media samples are stored chronologically in their container, we can improve efficiency by only re-encoding the group of pictures (GOP) between the start point of the trim and the first keyframe at/after the start point, then stream-copying the rest.

Since we only decode and encode a fixed portion of any file, the encoding latency is roughly constant, regardless of the input video duration. For long videos, this improved latency is dramatic. The optimization relies on being able to stitch part of the input file with newly-encoded output, which means that the encoder's output format and the input format must be compatible.

If the optimization fails, Transformer automatically falls back to normal export.

What’s next?

As part of Media3, Transformer is a native solution with low integration complexity, is tested on a wide variety of devices to ensure broad compatibility, and is customizable to fit your specific needs.

To dive deeper, you can explore Media3 Transformer documentation, run our sample apps, or learn how to complement your media editing pipeline with Jetpack Media3. We’ve already seen app developers benefit greatly from adopting Transformer, so we encourage you to try it out yourself to streamline your media editing workflows and enhance your app’s performance!

Apps adopt Transformer to support more reliable and performant media editing use cases

Posted by Caren Chang – Developer Relations Engineer

The Jetpack Media3 library helps developers build high quality media apps on Android. As part of the Media3 library, the Transformer module aims to provide easy to use, reliable, and performant APIs for transcoding and editing media.

For example, apps can use Transformer to apply editing operations such as trimming a long media file, or applying effects to video tracks. Transformer can also be used to convert media files from one format to another, such as adjusting the resolution or encoding of the media file.

Developing Transformer APIs

As part of the process to introduce new APIs, our engineering team works closely with Google apps such as Google Photos to test and experiment with the new APIs. Experimental flags are first introduced to enable performance improvements. Once the results are successful and conclusive, these experimental features are then built into the default API implementations or promoted to public APIs for all apps to use. This approach allows Transformer APIs to be tested on a wide variety of devices.

Transformer Adoption in apps

Apps that have been using Transformer in production observed in-app performance improvements, less code to maintain, and a better developer experience. Let’s take a closer look at how Transformer has helped apps with their media-editing use cases.

One of users’ favorite features in Google Photos is memory sharing, where snippets of your life story that are curated and presented as Google Photos memories can now be shared as videos to social media and chat apps. However, the process of combining media items to create a video on device is resource intensive and subject to significant latency, especially on low-end devices. To reduce this latency and enable the feature on a wider range of devices, Photos adopted Transformer in their media creation pipeline. Along with other improvements made, the team found that Transformer played a part in reducing the median user latency for creating memory videos by 41% on high-end devices and 27% on mid-range devices.

The Photos app also enables users to perform media edits such as trimming or rotating a video. By adopting Transformer APIs for rotating videos, median save latency was reduced by 79% for applicable videos. The app also adopted Transformer’s API for optimizing video trimming, and observed video save latency decrease by 64%.

1 Second Everyday is a personal video journal that helps you create captivating montages and timelapses. One of the app’s main user journeys is sequentially combining short videos to create a meaningful movie. After adopting Transformer for this use case, the app observed that video encoding performance was up to 5x faster, allowing them to explore enabling 4k and HDR support. The Transformer adoption also helped decrease relevant code by 30%, making it easier for the developers to maintain the code base.

BandLab is the next-generation music creation platform used by millions around the world to make and share their music. The app originally used MediaCodecs for their video creation use cases, but found that the low level implementation resulted in native crashes that were difficult to debug. After researching more on Transformer, the team made the decision to migrate from MediaCodecs to Transformer. Overall, it only took the team 12 working days to complete the migration, and this resulted in a simpler codebase and more maintainable pipeline for their media creation use cases. In addition, the previously observed native crashes no longer occurred.

What’s next for Transformer?

We’re excited to see Transformer’s adoption in the developer community, and will continue adding new features to support more media-editing use cases for the Android ecosystem including:

    • Better support for previewing media edits
    • Improving the performance and developer experience for video frame extraction
    • Easier integration with AI effects
    • and much more

Keep an eye out on what we’re working on in the Media3 Github, and file feature requests to help shape the future of Transformer!

Performance Class helps Google Maps deliver premium experiences

Posted by Nevin Mital - Developer Relations Engineer, Android Media

The Android ecosystem features a diverse range of devices, and it can be difficult to build experiences that take advantage of new or premium hardware features while still working well for users on all devices. With Android 12, we introduced the Media Performance Class (MPC) standard to help developers better understand a device’s capabilities and identify high-performing devices. For a refresher on what MPC is, please see our last blog post, Using performance class to optimize your user experience, or check out the Performance Class documentation.

Earlier this year, we published the first stable release of the Jetpack Core Performance library as the recommended solution for more reliably obtaining a device’s MPC level. In particular, this library introduces the PlayServicesDevicePerformance class, an API that queries Google Play Services to get the most up-to-date MPC level for the current device and build. I’ll get into the technical details further down, but let’s start by taking a look at how Google Maps was able to tailor a feature launch to best fit each device with MPC.

Performance Class unblocks premium experience launch for Google Maps

Google Maps recently took advantage of the expanded device coverage enabled by the Play Services module to unblock a feature launch. Google Maps wanted to update their UI by increasing the transparency of some layers, which meant rendering more of the map. They had to stop the initial rollout due to latency increases on many devices, especially towards the low end. To resolve this, the Maps team started by slicing an existing key metric, “seconds to UI item visibility”, by MPC level, which revealed that while all devices had a small increase in this latency, devices without an MPC level had the largest increase.

A bar graph displays A/B test results for Seconds to UI item visibility, comparing control results with those using increased transparency across different Media Performance Class Levels.  A green horizontal line and text indicate the updated experience shipped to devices qualifying for MPC. A vertical green dotted line separates results for devices without a specific MPC level, which kept the previous UI.

With these results in hand, Google Maps started their rollout again, but this time only launching the feature on devices that report an MPC level. As devices continue to get updated and meet the bar for MPC, the updated Google Maps UI will be available to them as well.

The new Play Services module

MPC level requirements are defined in the Android Compatibility Definition Document (CDD), then devices and Android builds are validated against these requirements by the Android Compatibility Test Suite (CTS). The Play Services module of the Jetpack Core Performance library leverages these test results to continually update a device’s reported MPC level without any additional effort on your end. This also means that you’ll immediately have access to the MPC level for new device launches without needing to acquire and test each device yourself, since it already passed CTS. If the MPC level is not available from Google Play Services, the library will fall back to the MPC level declared by the OEM as a build constant.

A flowchart depicts the process of determining Performance Class levels for Android devices, involving manufacturers, CTS tests, a Grader, the Play Services module, and the CDD.

As of writing, more than 190M in-market devices covering over 500 models across 40+ brands report an MPC level. This coverage will continue to grow over time, as older devices update to newer builds, from Android 11 and up.

Using the Core Performance library

To use Jetpack Core Performance, start by adding a dependency for the relevant modules in your Gradle configuration, and create an instance of DevicePerformance. Initializing a DevicePerformance should only happen once in your app, as early as possible - for example, in the onCreate() lifecycle event of your Application. In this example, we’ll use the Google Play services implementation of DevicePerformance.

// Implementation of Jetpack Core library.
implementation("androidx.core:core-ktx:1.12.0")
// Enable APIs to query for device-reported performance class.
implementation("androidx.core:core-performance:1.0.0")
// Enable APIs to query Google Play Services for performance class.
implementation("androidx.core:core-performance-play-services:1.0.0")

import androidx.core.performance.play.services.PlayServicesDevicePerformance

class MyApplication : Application() {
  lateinit var devicePerformance: DevicePerformance

  override fun onCreate() {
    // Use a class derived from the DevicePerformance interface
    devicePerformance = PlayServicesDevicePerformance(applicationContext)
  }
}

Then, later in your app when you want to retrieve the device’s MPC level, you can call getMediaPerformanceClass():

class MyActivity : Activity() {
  private lateinit var devicePerformance: DevicePerformance
  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // Note: Good app architecture is to use a dependency framework. See
    // https://developer.android.com/training/dependency-injection for more
    // information.
    devicePerformance = (application as MyApplication).devicePerformance
  }

  override fun onResume() {
    super.onResume()
    when {
      devicePerformance.mediaPerformanceClass >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE -> {
        // MPC level 34 and later.
        // Provide the most premium experience for the highest performing devices.
      }
      devicePerformance.mediaPerformanceClass == Build.VERSION_CODES.TIRAMISU -> {
        // MPC level 33.
        // Provide a high quality experience.
      }
      else -> {
        // MPC level 31, 30, or undefined.
        // Remove extras to keep experience functional.
      }
    }
  }
}

Strategies for using Performance Class

MPC is intended to identify high-end devices, so you can expect to see MPC levels for the top devices from each year, which are the devices you’re likely to want to be able to support for the longest time. For example, the Pixel 9 Pro launched with Android 14 and reports an MPC level of 34, the latest level definition at the time of its launch.

You should use MPC as a complement to any existing Device Clustering solutions you already use, such as querying a device’s static specs or manually blocklisting problematic devices. An area where MPC can be a particularly helpful tool is new device launches. New devices report their MPC level from launch, so you can use MPC to gauge their capabilities right from the start, without needing to acquire the hardware yourself or manually test each device.

A great first step to get involved is to include MPC levels in your telemetry. This can help you identify patterns in error reports or generally get a better sense of the devices your user base uses if you segment key metrics by MPC level. From there, you might consider using MPC as a dimension in your experimentation pipeline, for example by setting up A/B testing groups based on MPC level, or by starting a feature rollout with the highest MPC level and working your way down. As discussed previously, this is the approach that Google Maps took.

You could further use MPC to tune a user-facing feature, for example by adjusting the number of concurrent video playbacks your app attempts based on the MPC level’s concurrent codec guarantees. However, make sure to still query a device’s runtime capabilities when using this approach, as they may differ depending on the environment and state the device is in.
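
For illustration only, a sketch of that idea might look like the following; the player counts are arbitrary example values rather than recommendations, and you should still cross-check runtime codec capabilities as noted above.

// Illustrative sketch: pick a concurrent-playback budget from the MPC level.
// The counts below are arbitrary example values, not official guidance.
fun maxConcurrentPlayers(devicePerformance: DevicePerformance): Int =
  when {
    devicePerformance.mediaPerformanceClass >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE -> 4
    devicePerformance.mediaPerformanceClass >= Build.VERSION_CODES.TIRAMISU -> 2
    else -> 1
  }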

Get in touch!

If MPC sounds like it could be useful for your app, please give it a try! You can get started by taking a look at our sample code or documentation. We welcome you to share any questions or feedback you have in this short form.


This blog post is a part of Camera and Media Spotlight Week. We're providing resources – blog posts, videos, sample code, and more – all designed to help you uplevel the media experiences in your app.

To learn more about what Spotlight Week has to offer and how it can benefit you, be sure to read our overview blog post.

Spotlight Week: Android Camera and Media

Posted by Caren Chang – Android Developer Relations Engineer

Android offers Camera and Media APIs to help you build apps that can capture, edit, share, and play media. To help you enhance Android Camera and Media experiences to be even more delightful for your users, this week we will be kicking off the Camera and Media Spotlight week.

This Spotlight Week will provide resources—blog posts, videos, sample code, and more—all designed to help you uplevel the media experiences in your app. Check out highlights from the latest releases in Camera and Media APIs, including better Jetpack Compose support in CameraX, motion photo support in Media3 Transformer, simpler ExoPlayer setup, and much more! We’ll also bring in developers from the community to talk about their experiences building Android camera and media apps.


Here’s what we’re covering during Camera and Media Spotlight week:

What’s new in camera and media

Tuesday, January 7

Check out what’s new in the latest CameraX and Media3 releases, including how to get started with building Camera apps with Compose.

Creating delightful and premium experiences

Wednesday, January 8

Building delightful and premium experiences for your users is what can help your app really stand out. Learn about different ways to achieve this, such as utilizing the Media Performance Class or enabling HDR video capture in your app. Hear from developers, including how Google Drive enabled Ultra HDR images in their Android app and how Instagram improved the in-app image capture experience by implementing Night Mode.

Adaptive for camera and media, for large screens and now XR!

Thursday, January 9

Thinking adaptive is important, so your app works just as well on phones as it does on large screens, like foldables, tablets, ChromeOS devices, cars, and the new Android XR platform! On Thursday, we’ll be diving into the media experience on large screen devices, and how you can build in a smooth tabletop mode for your camera applications. Prepare your apps for XR devices by considering Spatial Audio and Video.

Media creation

Friday, January 10

Capturing, editing, and processing media content are fundamental features of the Android ecosystem. Learn about how Media3’s Transformer module can help your app’s media processing use cases, and see case studies of apps that are using Transformer in production. Listen in to how the 1 Second Everyday Android app approaches media use cases, and check out a new API that allows apps to capture concurrent camera streams. Learn from Android Google Developer Tom Colvin how he experimented with building an AI-powered camera app.


These are just some of the things to think about when building camera and media experiences in your app. Keep checking this blog post for updates; we’ll be adding links and more throughout the week.

Media3 1.5.0 — what’s new?

Posted by Kristina Simakova – Engineering Manager

This article is cross-published on Medium

Media3 1.5.0 is now available!

Transformer now supports motion photos and faster image encoding. We’ve also simplified the setup for DefaultPreloadManager and ExoPlayer, making it easier to use. But that’s not all! We’ve included a new IAMF decoder, a Kotlin listener extension, and easier Player optimization through delegation.

To learn more about all new APIs and bug fixes, check out the full release notes.

Transformer improvements

Motion photo support

Transformer now supports exporting motion photos. The motion photo’s image is exported if the corresponding MediaItem’s image duration is set (see MediaItem.Builder().setImageDurationMs()). Otherwise, the motion photo’s video is exported. Note that the EditedMediaItem’s duration should not be set in either case, as it will automatically be set to the corresponding MediaItem’s image duration.
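
As a quick sketch, exporting a motion photo’s image could look like this; motionPhotoUri and the 3000 ms duration are placeholder values chosen only for illustration.

// Sketch: export the motion photo's image by giving the MediaItem an image
// duration. motionPhotoUri and the 3000 ms duration are placeholder values.
val mediaItem =
  MediaItem.Builder()
    .setUri(motionPhotoUri)
    .setImageDurationMs(3000)
    .build()
// Don't set a duration on the EditedMediaItem; it picks up the MediaItem's image duration.
val editedMediaItem = EditedMediaItem.Builder(mediaItem).build()
// ...then export with Transformer as usual.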

Faster image encoding

This release accelerates image-to-video encoding, thanks to optimizations in DefaultVideoFrameProcessor.queueInputBitmap(). DefaultVideoFrameProcessor now treats the Bitmap given to queueInputBitmap() as immutable. The GL pipeline will resample and color-convert the input Bitmap only once. As a result, Transformer operations that take large (e.g. 12 megapixels) images as input execute faster.

AudioEncoderSettings

Similar to VideoEncoderSettings, Transformer now supports AudioEncoderSettings which can be used to set the desired encoding profile and bitrate.
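
As a rough sketch of how this might plug into Transformer, assuming DefaultEncoderFactory.Builder exposes a setRequestedAudioEncoderSettings setter mirroring the existing video one (please verify the exact names against the 1.5.0 API reference):

// Sketch only: request a target audio bitrate for the export. Assumes a
// setRequestedAudioEncoderSettings setter on DefaultEncoderFactory.Builder,
// analogous to setRequestedVideoEncoderSettings; check the 1.5.0 API reference.
val audioEncoderSettings =
  AudioEncoderSettings.Builder()
    .setBitrate(128_000)
    .build()
val transformer =
  Transformer.Builder(context)
    .setEncoderFactory(
      DefaultEncoderFactory.Builder(context)
        .setRequestedAudioEncoderSettings(audioEncoderSettings)
        .build()
    )
    .build()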

Edit list support

Transformer now shifts the first video frame to start from 0. This fixes A/V sync issues in some files where an edit list is present.

Unsupported track type logging

This release includes improved logging for unsupported track types, providing more detailed information for troubleshooting and debugging.

Media3 muxer

In one of the previous releases we added a new muxer library which can be used to create MP4 container files. The media3 muxer offers support for a wide range of audio and video codecs, enabling seamless handling of diverse media formats. This new library also brings advanced features including:

    • B-frame support
    • Fragmented MP4 output
    • Edit list support

The muxer library can be included as a gradle dependency:

implementation ("androidx.media3:media3-muxer:1.5.0")

Media3 muxer with Transformer

To use the media3 muxer with Transformer, set an InAppMuxer.Factory (which internally wraps media3 muxer) as the muxer factory when creating a Transformer:

val transformer = Transformer.Builder(context)
    .setMuxerFactory(InAppMuxer.Factory.Builder().build())
    .build()

Simpler setup for DefaultPreloadManager and ExoPlayer

With Media3 1.5.0, we added DefaultPreloadManager.Builder, which makes it much easier to build the preload components and the player. Previously we asked you to instantiate several required components (RenderersFactory, TrackSelectorFactory, LoadControl, BandwidthMeter and preload / playback Looper) first, and to be careful about correctly sharing those components when injecting them into the DefaultPreloadManager constructor and the ExoPlayer.Builder. With the new DefaultPreloadManager.Builder this becomes a lot simpler:

    • Build DefaultPreloadManager and ExoPlayer instances with all default components.
val preloadManagerBuilder = DefaultPreloadManager.Builder()
val preloadManager = preloadManagerBuilder.build()
val player = preloadManagerBuilder.buildExoPlayer()

    • Build DefaultPreloadManager and ExoPlayer instances with custom shared components.
val preloadManagerBuilder = DefaultPreloadManager.Builder().setRenderersFactory(customRenderersFactory)
// The resulting preloadManager uses customRenderersFactory
val preloadManager = preloadManagerBuilder.build()
// The resulting player uses customRenderersFactory
val player = preloadManagerBuilder.buildExoPlayer()

    • Build DefaultPreloadManager and ExoPlayer instances, while setting custom playback-only configurations on the ExoPlayer.
val preloadManagerBuilder = DefaultPreloadManager.Builder()
val preloadManager = preloadManagerBuilder.build()
// Tune the playback-only configurations
val playerBuilder = ExoPlayer.Builder().setFooEnabled()
// The resulting player will have playback feature "Foo" enabled
val player = preloadManagerBuilder.buildExoPlayer(playerBuilder)

Preloading the next playlist item

We’ve added the ability to preload the next item in the playlist of ExoPlayer. By default, playlist preloading is disabled but can be enabled by setting the duration which should be preloaded to memory:

player.preloadConfiguration =
    PreloadConfiguration(/* targetPreloadDurationUs= */ 5_000_000L)

With the PreloadConfiguration above, the player tries to preload five seconds of media for the next item in the playlist. Preloading is only started when no media is being loaded that is required for the ongoing playback. This way preloading doesn’t compete for bandwidth with the primary playback.

When enabled, preloading can help minimize join latency when a user skips to the next item before the playback buffer reaches the next item. The first period of the next window is prepared and video, audio and text samples are preloaded into its sample queues. The preloaded period is later queued into the player with preloaded samples immediately available and ready to be fed to the codec for rendering.

Once opted in, playlist preloading can be turned off again by using PreloadConfiguration.DEFAULT:

player.preloadConfiguration = PreloadConfiguration.DEFAULT

New IAMF decoder and Kotlin listener extension

The 1.5.0 release includes a new media3-decoder-iamf module, which allows playback of IAMF immersive audio tracks in MP4 files. Apps wanting to try this out will need to build the libiamf decoder locally. See the media3 README for full instructions.

implementation ("androidx.media3:media3-decoder-iamf:1.5.0")

This release also includes a new media3-common-ktx module, a home for Kotlin-specific functionality. The first version of this module contains a suspend function that lets the caller listen to Player.Listener.onEvents. This is a building block that’s used by the upcoming media3-ui-compose module (launching with media3 1.6.0) to power a Jetpack Compose playback UI.

implementation ("androidx.media3:media3-common-ktx:1.5.0")

Easier Player customization via delegation

Media3 has provided a ForwardingPlayer implementation since version 1.0.0, and we have previously suggested that apps should use it when they want to customize the way certain Player operations work, by using the decorator pattern. One very common use-case is to allow or disallow certain player commands (in order to show/hide certain buttons in a UI). Unfortunately, doing this correctly with ForwardingPlayer is surprisingly hard and error-prone, because you have to consistently override multiple methods, and handle the listener as well. The example code demonstrating how fiddly this is is too long for this blog post, so we’ve put it in a gist instead.

In order to make these sorts of customizations easier, 1.5.0 includes a new ForwardingSimpleBasePlayer, which builds on the consistency guarantees provided by SimpleBasePlayer to make it easier to create consistent Player implementations following the decorator pattern. The same command-modifying Player is now much simpler to implement:

class PlayerWithoutSeekToNext(player: Player) : ForwardingSimpleBasePlayer(player) {
  override fun getState(): State {
    val state = super.getState()
    return state
      .buildUpon()
      .setAvailableCommands(
        state.availableCommands.buildUpon().remove(COMMAND_SEEK_TO_NEXT).build()
      )
      .build()
  }

  // We don't need to override handleSeek, because it is guaranteed not to be called for
  // COMMAND_SEEK_TO_NEXT since we've marked that command unavailable.
}

MediaSession: Command button for media items

Command buttons for media items allow a session app to declare commands supported by certain media items that then can be conveniently displayed and executed by a MediaController or MediaBrowser:

Screenshot: Command buttons for media items in the Media Center of Android Automotive OS.

You'll find the detailed documentation on developer.android.com.

This is the Media3 equivalent of the legacy “custom browse actions” API, with which Media3 is fully interoperable. Unlike the legacy API, command buttons for media items do not require a MediaLibraryService but are a feature of the Media3 MediaSession instead. Hence they are available for MediaController and MediaBrowser in the same way.


If you encounter any issues, have feature requests, or want to share feedback, please let us know using the Media3 issue tracker on GitHub. We look forward to hearing from you!



Cloud photos now available in the Android photo picker

Posted by Roxanna Aliabadi Walker – Product Manager

Available now with Google Photos

Our photo picker has always been the gateway to your local media library, providing a secure, date-sorted interface for users to grant apps access to selected images and videos. But now, we're taking it a step further by integrating cloud photos from your chosen cloud media app directly into the photo picker experience.

Moving image of the photo picker access

Unifying your media library

Backed-up photos, also known as "cloud photos," will now be merged with your local ones in the photo picker, eliminating the need to switch between apps. Additionally, any albums you've created in your cloud storage app will be readily accessible within the photo picker's albums tab. If your cloud media provider has a concept of “favorites,” those will be showcased prominently within the albums tab of the photo picker for easy access. This feature is currently rolling out with the February Google System Update to devices running Android 12 and above.

Available now with Google Photos, but open to all

Google Photos is already supporting this new feature, and our APIs are open to any cloud media app that qualifies for our pilot program. Our goal is to make accessing your lifetime of memories effortless, regardless of the app you prefer.

The Android photo picker will attempt to auto-select a cloud media app for you, but you can change or remove your selected cloud media app at any time from photo picker settings.

Image of Cloud media settings in photo picker settings

Migrate today for an enhanced, frictionless experience

The Android photo picker substantially reduces friction by not requiring any runtime permissions. If you switch from using a custom photo picker to the Android photo picker, you can offer this enhanced experience with cloud photos to your users, as well as reduce or entirely eliminate the overhead involved with acquiring and managing access to photos on the device. (Note that apps without a need for persistent and/or broad-scale access to photos, for example to set a profile picture, must adopt the Android photo picker in lieu of any sensitive file permissions to adhere to Google Play policy.)

The photo picker has been backported to Android 4.4 to make it easy to migrate without needing to worry about device compatibility. Access to cloud content will only be available for users running Android 12 and higher, but developers do not need to consider this when implementing the photo picker into their apps. To use the photo picker in your app, update the androidx.activity dependency to version 1.7.x or above and add the following code snippet:

// Registers a photo picker activity launcher in single-select mode.
val pickMedia = registerForActivityResult(PickVisualMedia()) { uri ->
    // Callback is invoked after the user selects a media item or closes the
    // photo picker.
    if (uri != null) {
        Log.d("PhotoPicker", "Selected URI: $uri")
    } else {
        Log.d("PhotoPicker", "No media selected")
    }
}


// Launch the photo picker and let the user choose images and videos.
pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.ImageAndVideo))

// Launch the photo picker and let the user choose only images.
pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.ImageOnly))

// Launch the photo picker and let the user choose only videos.
pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.VideoOnly))

More customization options are listed in our developer documentation.

Android 14 is live in AOSP

Posted by Dave Burke, VP of Engineering

Today we're releasing Android 14 and pushing the Android 14 source to the Android Open Source Project (AOSP). Android 14 is designed to improve your productivity as developers while enhancing performance, privacy, security, and customization for users.

Android 14 is rolling out to select Pixel devices starting today, and will be available later this year on some of your favorite devices from Samsung Galaxy, iQOO, Nothing, OnePlus, Oppo, Realme, Sharp, Sony, Tecno, vivo and Xiaomi.

Thank you for taking the time to take Android 14 for an early spin through our developer preview and beta programs, sharing your feedback, and making sure your apps deliver a great experience on Android 14. Making Android work well for each and every one of the billions of Android users is a collaborative process between us, Android hardware manufacturers, and you, our developer community.

This post covers a selection of Android 14 changes that have the most developer impact. For a complete list of all of the Android 14 changes, visit the Android 14 developer site.

Performance and Efficiency

A big focus of Android 14 was on improving the performance and efficiency of the platform.

Freezing cached applications

Prior to Android 14, cached applications were allowed to run somewhat unconstrained. In Android 14, we freeze cached applications after a short period of time, giving them 0 CPU time. In Android 14 Beta populations, we see that cached processes consume up to 50% fewer CPU cycles as compared to Android 13 public devices. Background work is thus disallowed outside of conventional Android app lifecycle APIs such as foreground services, JobScheduler, or WorkManager.

Optimized broadcasts

To keep frozen applications frozen longer (i.e. not get CPU time), we adjusted how apps receive context-registered broadcasts once they go into a cached state; they may be queued, and repeating ones, such as BATTERY_CHANGED, may be merged into one broadcast.

Faster app launches

With cached app and broadcast optimizations in Android 14, we were able to increase long-standing limits on the maximum number of cached applications in the platform, leading to a reduction in cold app starts that scales with the RAM present on the device. On 8GB devices, the beta group saw 20% fewer cold app starts, and on 12GB devices it was over 30% fewer. Cold startups are slow compared to warm startups and they are expensive in terms of power. This work effectively improves both power usage and overall app startup times.

Reduced memory footprint

Improving the Android Runtime (ART) has a huge impact on the Android user-experience. Code size is one of the key metrics we look at; smaller generated files are better for memory (both RAM and storage). In Android 14, ART includes optimizations that reduce code size by an average of 9.3% without impacting performance.

Customization

Customization is at the core of Android's DNA, and Android 14 continues our commitment to enabling Android users to tune their experience around their individual needs, including enhanced accessibility and internationalization features.

Bigger fonts with non-linear scaling – Starting in Android 14, users will be able to scale up their font to 200%. Previously, the maximum font size scale on Pixel devices was 130%. A non-linear font scaling curve is automatically applied to ensure that text that is already large enough doesn’t increase at the same rate as smaller text. Learn more here.

Image showing the differences between font with no scaling at 100% on the left, standard scaling at 200% in the middle, and non-linear scaling at 200% on the right

Per-app language preferences – You can dynamically update your app's localeConfig with LocaleManager.setOverrideLocaleConfig to customize the set of languages displayed in the per-app language list in Android Settings. IMEs can now use LocaleManager.getApplicationLocales to know the UI language of the current app to update the keyboard language. Starting with Android Studio Giraffe and AGP 8.1, you can configure your app to support Android 13's per-app language preferences automatically.
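
As a minimal sketch (API level 34+, with placeholder language tags), a dynamic localeConfig override could look roughly like this:

// Sketch (API 34+): override the app's localeConfig at runtime so only these
// languages appear in the per-app language list. The language tags below are
// placeholder examples.
val localeManager = context.getSystemService(LocaleManager::class.java)
localeManager.setOverrideLocaleConfig(
  LocaleConfig(LocaleList.forLanguageTags("en-US,ja,fr"))
)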

Regional preferences – Regional preferences enable users to personalize temperature units, the first day of the week, and numbering systems.

Grammatical Inflection – The Grammatical Inflection API allows you to more easily add support for users who speak languages that have grammatical gender. To show personalized translations, you just need to add translations inflected for each grammatical gender for impacted languages and integrate the API.

New media capabilities


Ultra HDR for images - Android 14 adds support for 10-bit high dynamic range (HDR) images, with support for the Ultra HDR image format. The format is fully backward-compatible with JPEG, allowing apps to seamlessly interoperate with HDR images.

Zoom, Focus, Postview, and more in Camera Extensions - Android 14 upgrades and improves Camera Extensions, allowing apps to handle longer processing times, enabling improved images using compute-intensive algorithms like low-light photography on supported devices.

Lossless USB audio - Android 14 devices can support lossless audio formats for audiophile-level experiences over USB wired headsets.

New graphics capabilities

Custom meshes with vertex and fragment shaders - Android 14 adds support for custom meshes, which can be defined as triangles or triangle strips, and can, optionally, be indexed. These meshes are specified with custom attributes, vertex strides, varying, and vertex/fragment shaders written in AGSL.

Hardware buffer renderer for Canvas - Android 14 introduces HardwareBufferRenderer to assist in using Android's Canvas API to draw with hardware acceleration into a HardwareBuffer, which is particularly helpful when your use case involves communication with the system compositor through SurfaceControl for low-latency drawing.

Developing across form factors

Android 14 builds on the work done in Android 12L and 13 to support tablets and foldable form factors including a taskbar supporting enhanced multitasking, large-screen optimized system apps and notification UI, activity embedding, enhanced letterboxing, improved media projection, and more. Our app quality guidance for large screens along with additional learning opportunities around building for large screens and foldables can help optimize your app across all Android surfaces, while the large screen gallery contains design patterns and inspiration for social and communications, media, productivity, shopping, and reading apps.

Improving your productivity

Android 14 contains many updates focused on making your development experience more consistent, fun, and productive. Many of these updates are being made available on older platform releases using a combination of Google Play system updates, Jetpack libraries, and Google Play services, so you can reach more users with them.

OpenJDK 17 Support - Thanks to Google Play system updates (Project Mainline), over 600M devices are enabled to receive the latest Android Runtime (ART) updates that ship with Android 14. Learn more here.

Credential Manager and Passkeys support - Credential Manager is a new Jetpack API that supports multiple sign-in methods, such as username and password, passkeys, and federated sign-in solutions (such as Sign-in with Google) in a single API, simplifying your integration. Using Google Play services, Credential Manager is supported back to Android 4.4 (API level 19). Read more here.

Health Connect - Health Connect is a user controlled on-device repository for user health and fitness data that makes it easier than ever to support an integrated health and fitness experience across apps and connected devices. Health Connect is part of the Android platform and receives updates via Google Play system updates without requiring a separate download, and is available to older devices as an app on the Google Play store. Read about Health Connect and more in What's new in Android Health.

Superior system sharesheets - To make it easy to give your app's users a rich, consistent, sharing experience, the system sharesheets in Android 14 are configurable with custom actions and improved ranking.

More consistent and reliable foreground services - We've collaborated with hardware manufacturers such as Samsung to create both a more consistent developer experience and a more reliable user experience. To this end, Android 14 has a new requirement to declare foreground service types and request type-specific permissions and we have Google Play policies to enforce appropriate use of these APIs. We've also added a new user-initiated data transfer job type, making the experience of managing large user-initiated uploads and downloads smoother by leveraging JobScheduler's constraints (e.g. network constraints such as unmetered WiFi).

User experience

Predictive Back - Android 14 introduces new Predictive Back system animations – cross-activity and cross-task – to join the back-to-home animation introduced in Android 13. The system animations remain behind a developer option to allow time for additional polish and for more apps to opt-into Predictive Back; Material and Jetpack Predictive Back animations are available to users today.

Privacy and security

 
Data sharing updates – Users will see a new section in the location runtime permission dialog that highlights when an app shares location data with third parties, where they can get more information and control the app’s data access.

Partial access to photos and videos – When your app targeting SDK 34 requests any of the visual media permissions introduced in SDK 33 (READ_MEDIA_IMAGES / READ_MEDIA_VIDEO), Android 14 users can now grant your app access to only selected photos and videos. To adapt your app to this change, we recommend following our recent best practices.

Background activity launching – Android 10 (API level 29) and higher place restrictions on when apps can start activities when running in the background. To further reduce instances of unexpected interruptions, apps targeting Android 14 need to grant privileges to start activities in the background when sending a PendingIntent or when binding a service.

Block installation of apps targeting older SDK versions – To protect against malware that targets older API levels to bypass security and privacy protections, apps with a targetSdkVersion lower than 23 cannot be installed on Android 14.

Runtime receivers – Apps targeting Android 14 must indicate if dynamic Context.registerReceiver() usage should be treated as "exported" or "unexported", a continuation of the manifest-level work from previous releases. Learn more here.
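
For example, a context-registered receiver might explicitly declare itself unexported, as in this minimal sketch (the action string and myReceiver are hypothetical):

// Sketch: when targeting Android 14, context-registered receivers must say
// whether they are exported. RECEIVER_NOT_EXPORTED keeps the receiver private
// to the app and the system. The action string and myReceiver are hypothetical.
context.registerReceiver(
  myReceiver,
  IntentFilter("com.example.ACTION_SYNC_COMPLETE"),
  Context.RECEIVER_NOT_EXPORTED
)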

Secure full screen Intent notifications – Since full-screen intent notifications are designed for extremely high-priority notifications demanding the user's immediate attention, Android 14 limits the apps granted this permission on app install to those that provide calling and alarms only. Your app can now launch the settings page where users can grant the permission.
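
In code, that check-and-redirect flow could look roughly like this sketch (assuming the settings intent accepts a package data URI, as the app-details settings intent does):

// Sketch: if the app may no longer post full-screen intents, send the user to
// the system settings screen where they can grant the permission.
val notificationManager = context.getSystemService(NotificationManager::class.java)
if (!notificationManager.canUseFullScreenIntent()) {
  context.startActivity(
    Intent(Settings.ACTION_MANAGE_APP_USE_FULL_SCREEN_INTENT)
      .setData(Uri.fromParts("package", context.packageName, null))
  )
}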

Safer dynamic code loading – Apps targeting Android 14 require dynamically loaded files to be marked as read-only. Learn more here.

Safer implicit intents – For apps targeting Android 14, creating a mutable pending intent with an implicit intent will throw an exception, preventing them from being able to be used to trigger unexpected code paths. Apps need to either make the pending intent immutable or make the intent explicit. Learn more here.
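
A minimal sketch of the safe pattern, with a hypothetical target component:

// Sketch: when targeting Android 14, pair PendingIntents with explicit intents
// (or make them immutable). TargetActivity is a hypothetical component.
val explicitIntent = Intent(context, TargetActivity::class.java)
val pendingIntent =
  PendingIntent.getActivity(
    context,
    /* requestCode= */ 0,
    explicitIntent,
    PendingIntent.FLAG_IMMUTABLE
  )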

App compatibility

We’re working to make updates faster and smoother with each platform release by prioritizing app compatibility. In Android 14 we’ve made most app-facing changes opt-in until your app targets SDK version 34 to give you more time to make any necessary app changes, and we’ve updated our tools and processes to help you get ready sooner.

Easier testing and debugging of changes – To make it easier for you to test the opt-in changes that can affect your app, we’ll make many of them toggleable again this year. With the toggles, you can force-enable or disable the changes individually from Developer options or adb. Check out the details here.

App compatibility toggles in Developer Options.

Get your apps, libraries, tools, and game engines ready!

Now is the time to finish your final compatibility testing and publish any necessary updates to ensure a smooth app experience.

If you develop an SDK, library, tool, or game engine, it's even more important to release any necessary updates now to prevent your downstream app and game developers from being blocked by compatibility issues and allow them to target the latest SDK features. Please make sure to let your developers know if updates are needed to fully support Android 14.

Testing your app involves installing your production app onto a device running Android 14; you can use Google Play or other means. Work through all the app's flows and look for functional or UI issues. Review the behavior changes to focus your testing: each release of Android contains changes to the platform that improve privacy, security, and the overall user experience, and these changes can affect your apps.

Remember to exercise libraries and SDKs that your app is using in your compatibility testing. You may need to update to current SDK versions or reach out to the developer for help.

Once you’ve published the compatible version of your current app, you can start the process to update your app's targetSdkVersion. Review the behavior changes that apply when your app targets Android 14 and use the compatibility framework to help detect issues quickly.

Get started with Android 14

If you have a supported Pixel device, and are not enrolled in the Android Beta program, you will receive the public Android 14 OTA update as it becomes available (it may take a week or longer as this is a phased rollout dependent on device type and carrier). If you are currently enrolled in the Android Beta program running Android 14, you likely have received and installed the beta of the next Android 14 quarterly platform release (QPR1).

System images for Pixel devices are available here for manual download and flash, and you can get the latest 64-bit Android Emulator system images via the Android Studio SDK Manager. If you're looking for the Android 14 source, you'll find it here in the Android Open Source Project repository under the Android 14 branches.

For the best development experience with Android 14, we recommend that you use the latest release of Android Studio Hedgehog. Once you’re set up, here are some of the things you should do:

  • Try the new features and APIs. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility – learn whether your app is affected by default behavior changes in Android 14. Install your app onto a device or emulator running Android 14 and extensively test it.
  • Test your app with opt-in changes – Android 14 has opt-in behavior changes that only affect your app when it’s targeting the new platform. It’s important to understand and assess these changes early. To make it easier to test, you can toggle the changes on and off individually.
  • Update your app with the Android SDK Upgrade Assistant – Android Studio Hedgehog now filters and identifies the specific Android 14 API changes that are relevant to your app, and walks you through the steps to upgrade your targetSdkVersion with the Android SDK Upgrade Assistant.
Screengrab of Android SDK Upgrade Assistant on Google Pixel Fold

Thank you again for participating in our Android developer preview and beta program! We're looking forward to seeing how your apps take advantage of the updates in Android 14.

Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

Android 14 Developer Preview 2

Posted by Dave Burke, VP of Engineering

Today, we're releasing the second Developer Preview of Android 14. It builds on the first developer preview of Android 14 from last month with additional enhancements to privacy, security, performance, developer productivity, and user customization, while continuing to refine the large-screen device experience on tablets, foldables, and more.

Android delivers enhancements and new features year-round, and your feedback on the Android 14 developer preview and Quarterly Platform Release (QPR) beta program plays a key role in helping Android continuously improve. The Android 14 developer site has lots more information about the preview, including downloads for Pixel and the release timeline. We’re looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that works for everyone.

Working across form factors

Android 14 builds on the work done in Android 12L and 13 to support tablets and foldable form factors. See get started with building for large screens and learn about foldables for a quick jumpstart on how to get your apps ready. Our app quality guidance for large screens contains detailed checklists to review your app. We've also recently released libraries supporting low latency stylus and motion prediction.

The large screen gallery contains design inspiration for social and communications, media, productivity, shopping, and reading app experiences.

Privacy and security

Privacy and security have always been a core part of Android's mission, built on the foundation of app sandboxing, open source code, and open app development. In Android 14, we’re building the highest quality platform for all by providing a safer device environment and giving users more controls to protect their information.

Selected photos access

We recommend that you use the Photo Picker if your app needs to access media that the user selects; it provides a permissionless experience on devices running Android 4.4 onwards, using a combination of core platform features, Google Play system updates, and Google Play services.

If you cannot use Photo Picker, when your app requests any of the visual media permissions (READ_MEDIA_IMAGES / READ_MEDIA_VIDEO) introduced in SDK 33, Android 14 users can now grant your app access to only selected photos and videos.

In the new dialog, the permission choices will be:

  • Allow access to all photos: the full library of all on-device photos & videos is available
  • Select photos: only the user's selection of photos & videos will be temporarily available via MediaStore
  • Don’t allow: access to all photos and videos is denied

Apps can prompt users to select media again by requesting the media permissions again and having the READ_MEDIA_VISUAL_USER_SELECTED permission declared in their app manifest.
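
As a minimal sketch, partial access can be detected by checking the new permission alongside the full-access ones:

// Sketch: partial access means READ_MEDIA_VISUAL_USER_SELECTED is granted
// while the full READ_MEDIA_IMAGES / READ_MEDIA_VIDEO permissions are not.
fun hasPartialMediaAccess(context: Context): Boolean =
  Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE &&
    ContextCompat.checkSelfPermission(
      context, Manifest.permission.READ_MEDIA_VISUAL_USER_SELECTED
    ) == PackageManager.PERMISSION_GRANTED &&
    ContextCompat.checkSelfPermission(
      context, Manifest.permission.READ_MEDIA_IMAGES
    ) != PackageManager.PERMISSION_GRANTED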

Please test this new behavior with your apps and adapt your UX to handle the new permission and the media file reselection flow.

Credential manager

Android 14 adds Credential Manager as a platform API, and we're supporting it back to Android 4.4 (API level 19) devices through a Jetpack Library with a Google Play services implementation. It aims to make sign-in easier for users with APIs that retrieve and store credentials with user-configured credential providers. In addition to supporting passwords, the API allows your app to sign-in using passkeys, the new industry standard for passwordless sign-in. Passkeys are built on industry standards, can work across different operating systems and browser ecosystems, and can be used with both websites and apps. Developer Preview 2 features improvements in the UI styling for the account selector, along with changes to the API based upon feedback from Developer Preview 1. Learn more here.

Safer implicit intents

For apps targeting Android 14, creating a mutable pending intent with an implicit intent will throw an exception, preventing these pending intents from being used to trigger unexpected code paths. Apps need to either make the pending intent immutable or make the intent explicit. Learn more here.
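
For example, here's a hedged sketch of the two options, using a hypothetical DownloadReceiver:

import android.app.PendingIntent
import android.content.Context
import android.content.Intent

fun createPendingIntent(context: Context): PendingIntent {
    // Option 1: keep the PendingIntent mutable, but make the wrapped Intent explicit.
    val explicitIntent = Intent(context, DownloadReceiver::class.java) // hypothetical receiver
    return PendingIntent.getBroadcast(
        context, /* requestCode = */ 0, explicitIntent,
        PendingIntent.FLAG_MUTABLE or PendingIntent.FLAG_UPDATE_CURRENT
    )
    // Option 2: keep the Intent implicit, but create the PendingIntent with
    // PendingIntent.FLAG_IMMUTABLE instead of FLAG_MUTABLE.
}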

Background activity launching

Android 10 (API level 29) and higher place restrictions on when apps can start activities when the app is running in the background. These restrictions help minimize interruptions for the user and keep them more in control of what's shown on their screen. To further reduce instances of unexpected interruptions, Android 14 gives foreground apps more control over the ability of apps they interact with to start activities. Specifically, apps targeting Android 14 need to grant privileges to start activities in the background when sending a PendingIntent or when binding a Service.
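
As a hedged sketch based on the preview APIs, a foreground app can attach an ActivityOptions grant when sending a PendingIntent it received from another app; treat the exact constant and method names as provisional until the APIs are final.

import android.app.ActivityOptions
import android.app.PendingIntent
import android.content.Context

fun sendWithBackgroundStartGrant(context: Context, pendingIntent: PendingIntent) {
    // Explicitly opt in to letting the PendingIntent's creator start activities
    // while your app is in the foreground.
    val options = ActivityOptions.makeBasic()
    options.setPendingIntentBackgroundActivityStartMode(
        ActivityOptions.MODE_BACKGROUND_ACTIVITY_START_ALLOWED
    )
    pendingIntent.send(
        context, /* code = */ 0, /* intent = */ null,
        /* onFinished = */ null, /* handler = */ null,
        /* requiredPermission = */ null, options.toBundle()
    )
}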

Streamlining background work

Android 14 continues our effort to optimize the way apps work together, improve system health and battery life, and polish the end-user experience.

Background optimizations

Developer Preview 2 includes optimizations to Android’s memory management system to improve resource usage while applications are running in the background. Several seconds after an app goes into the cached state, background work is disallowed outside of conventional Android app lifecycle APIs such as foreground services, JobScheduler, or WorkManager. This is an order of magnitude faster than on Android 13.

Fewer non-dismissible notifications

Notifications on Android 14 containing FLAG_ONGOING_EVENT will be user dismissible on unlocked handheld devices. Notifications will stay non-dismissible when the device is locked, and notification listeners will not be able to dismiss these notifications. Notifications that are important to device functionality, like system and device policy notifications, will remain fully non-dismissible.

Improved App Store Experiences

Android 14 introduces several new PackageInstaller APIs that allow app stores to improve their user experience:

  • The requestUserPreapproval() method allows the download of APKs to be deferred until after the installation has been approved.
  • The setRequestUpdateOwnership() method allows an installer to indicate that it is responsible for future updates to an app it is installing.
  • The setDontKillApp() method can seamlessly install optional features of an app through split APKs while the app is in use.
  • The InstallConstraints API gives installers a way to ensure that app updates happen at an opportune moment, such as when an app is no longer in use.

If you develop an app store, please give these APIs a try and let us know what you think!
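
Here's a minimal hedged sketch of where these calls fit into a PackageInstaller session; the method names are the ones introduced in this preview, while the surrounding session setup is ordinary PackageInstaller plumbing.

import android.content.Context
import android.content.pm.PackageInstaller

fun createStoreSession(context: Context): Int {
    val installer = context.packageManager.packageInstaller
    val params = PackageInstaller.SessionParams(
        PackageInstaller.SessionParams.MODE_FULL_INSTALL
    ).apply {
        setRequestUpdateOwnership(true) // this installer owns future updates of the app
        setDontKillApp(true)            // install optional split APKs without killing the app
    }
    return installer.createSession(params)
    // To defer the APK download, open the session and call
    // session.requestUserPreapproval(...) before fetching any APKs.
}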

Personalization

Regional Preferences

Regional preferences enable users to personalize temperature units, the first day of the week, and numbering systems. A European living in the United States might prefer temperature units to be in Celsius rather than Fahrenheit and for apps to treat Monday as the beginning of the week instead of the US default of Sunday.

New Android Settings menus for these preferences provide users with a discoverable and centralized location to change app preferences. These preferences also persist through backup and restore. Several APIs and intents grant you read access to the user's preferences so you can adjust how your app displays information (getTemperatureUnit, getFirstDayOfWeek). You can also register a BroadcastReceiver on ACTION_LOCALE_CHANGED to handle locale configuration changes when regional preferences change.
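
As a hedged sketch, the AndroidX LocalePreferences helpers expose these getters, and ACTION_LOCALE_CHANGED lets you refresh cached formats when the user changes a preference; the receiver wiring here is illustrative.

import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import androidx.core.text.util.LocalePreferences

fun readRegionalPreferences() {
    // Returns string constants such as LocalePreferences.TemperatureUnit.CELSIUS
    // and LocalePreferences.FirstDayOfWeek.MONDAY.
    val temperatureUnit = LocalePreferences.getTemperatureUnit()
    val firstDayOfWeek = LocalePreferences.getFirstDayOfWeek()
    // Use these when formatting temperatures and calendars in your UI.
}

fun registerLocaleChangeReceiver(context: Context, onChanged: () -> Unit): BroadcastReceiver {
    val receiver = object : BroadcastReceiver() {
        override fun onReceive(context: Context, intent: Intent) = onChanged()
    }
    context.registerReceiver(receiver, IntentFilter(Intent.ACTION_LOCALE_CHANGED))
    return receiver // unregister it when your component is destroyed
}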

App compatibility

We’re working to make updates faster and smoother with each platform release by prioritizing app compatibility. In Android 14 we’ve made most app-facing changes opt-in to give you more time to make any necessary app changes, and we’ve updated our tools and processes to help you get ready sooner.

With Developer Preview 2, we're in the period where we're looking for input on our APIs, along with details on how platform changes affect your apps, so now is the time to try the new features and give us your feedback.

It’s also a good time to start your compatibility testing and identify any work you’ll need to do. You can test many of the opt-in changes without changing your app's targetSdkVersion using the behavior change toggles in Developer Options. This will help you get a preliminary idea of how your app might be affected by opt-in changes in Android 14.

App compatibility toggles in Developer Options.

Platform Stability is when we’ll deliver final SDK/NDK APIs and app-facing system behaviors. We’re expecting to reach Platform Stability in June 2023, and from that time you’ll have several weeks before the official release to do your final testing. The release timeline details are here.

Get started with Android 14

The Developer Preview has everything you need to try the Android 14 features, test your apps, and give us feedback. For testing your app with tablets and foldables, the easiest way to get started is using the Android Emulator in a tablet or foldable configuration in the latest preview of the Android Studio SDK Manager. For phones, you can get started today by flashing a system image onto a Pixel 7 Pro, Pixel 7, Pixel 6a, Pixel 6 Pro, Pixel 6, Pixel 5a 5G, Pixel 5, or Pixel 4a (5G) device. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio.

For the best development experience with Android 14, we recommend that you use the latest preview of Android Studio Giraffe (or more recent Giraffe+ versions). Once you’re set up, here are some of the things you should do:

  • Try the new features and APIs - your feedback is critical during the early part of the developer preview. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility - learn whether your app is affected by default behavior changes in Android 14; install your app onto a device or emulator running Android 14 and extensively test it.
  • Test your app with opt-in changes - Android 14 has opt-in behavior changes that only affect your app when it’s targeting the new platform. It’s important to understand and assess these changes early. To make it easier to test, you can toggle the changes on and off individually.

We’ll update the preview system images and SDK regularly throughout the Android 14 release cycle. This preview release is for developers only and not intended for daily or consumer use, so it will only be available by manual download for new Android 14 developer preview users. Once you’ve manually installed a preview build, you’ll automatically get future updates over-the-air for all later previews and Betas. Read more here.

If you intend to move from the Android 13 QPR Beta program to the Android 14 Developer Preview program and don't want to have to wipe your device, we recommend that you move to Developer Preview 2 now. Otherwise, you may run into time periods where the Android 13 Beta will have a more recent build date, which will prevent you from going directly to the Android 14 Developer Preview without doing a data wipe.

As we reach our Beta releases, we'll be inviting consumers to try Android 14 as well, and we'll open up enrollment for the Android 14 Beta program at that time. For now, please note that the Android Beta program is not yet available for Android 14.

For complete information, visit the Android 14 developer site.

Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

The first developer preview of Android 14

Posted by Dave Burke, VP of Engineering

Making Android work well for each and every one of the billions of Android users is a collaborative process between us, Android hardware manufacturers, and you, our developer community.

Today we're releasing the first Developer Preview of Android 14, and your feedback in these previews is a critical part of making Android better for everyone. Android 14 continues our work to improve your productivity as developers, along with enhancements to performance, privacy, security, and user customization. This preview is just the beginning, and we’ll have lots more to share as we move through the release cycle.

Android continues to deliver enhancements and new features year-round, and your Android 14 developer preview and Quarterly Platform Release (QPR) beta program feedback plays a key role in helping Android continuously improve. The Android 14 developer site has lots more information about the preview, including downloads for Pixel and the release timeline. We’re looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that works for everyone.

Working across devices and form factors

Android 14 builds on the work done in Android 12L and 13 to support tablets and foldable form factors. To help you build apps that adapt to different screen sizes, we've created window size classes, sliding pane layout, Activity embedding, BoxWithConstraints, and more, all supported in Jetpack Compose. With every release, our goal is to make it easier for you to optimize your app across all Android surfaces.

To help streamline getting your apps ready we have updated our app quality guidance for large screens, and provided additional learning opportunities around building for large screens and foldables. The large screen gallery contains proven design patterns along with design inspiration around the markets that your app supports such as social and communications, media, productivity, shopping, and reading apps.

Multi-device experiences are a big part of the future of Android. You can get started today with the Cross device SDK preview, allowing you to build rich experiences that intuitively work across different devices and form factors, and there's more to come.

Streamlining background work

Android 14 continues our effort to optimize the way apps work together, improve system health and battery life, and polish the end-user experience.

Updates and additions to JobScheduler and Foreground Services

It's more complicated than necessary to perform some background work, such as downloading large files when WiFi is available. We're creating a standard path for this work to simplify your app development and potentially improve the user experience. We're also being more opinionated about how foreground services should be used, reserving them for only the highest priority user-facing tasks so that Android can improve resource consumption and battery life.

In Android 14, we are making changes to existing Android APIs (Foreground Services and JobScheduler), including adding new functionality for user-initiated data transfers, along with an updated requirement to declare foreground service types. The user-initiated data transfer job will make managing user-initiated downloads and uploads easier, particularly when they require constraints such as downloading on Wi-Fi only. The requirement to declare foreground service types allows you to clearly define the intent of your app's background work while making it clear which use cases are appropriate for foreground services. In addition, Google Play will be rolling out new policies to ensure the appropriate use of these APIs, with more details coming soon.
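
To illustrate the declaration requirement, here's a hedged sketch of a data-sync service; the service, notification helper, and choice of type are assumptions, and the manifest's <service> entry would carry a matching android:foregroundServiceType="dataSync" attribute.

import android.app.Notification
import android.app.Service
import android.content.Intent
import android.content.pm.ServiceInfo
import android.os.IBinder

class UploadService : Service() { // hypothetical service

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        // Pass the declared type when promoting the service to the foreground.
        startForeground(
            /* id = */ 1,
            buildUploadNotification(),
            ServiceInfo.FOREGROUND_SERVICE_TYPE_DATA_SYNC
        )
        return START_NOT_STICKY
    }

    override fun onBind(intent: Intent?): IBinder? = null

    private fun buildUploadNotification(): Notification =
        TODO("Build a notification on your app's upload channel")
}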

Optimized broadcasts

We’ve made several optimizations to the internal broadcast system to improve battery life and responsiveness. While most of the optimizations are internal to Android and should not impact your apps, we have adjusted how apps receive context-registered broadcasts once the app goes into a cached state. Broadcasts to context-registered receivers may be queued and only delivered to the app once it comes out of the cached state. Furthermore, some repeating context-registered broadcasts, such as BATTERY_CHANGED, may be merged into one final broadcast before it is delivered once the app comes out of the cached state.

Exact alarms

The invocation of exact alarms can significantly affect the device's resources, such as battery life. So in Android 14, newly installed apps targeting Android 13+ (SDK 33+) that are not clocks or calendars must request the user to grant them the SCHEDULE_EXACT_ALARM special permission before setting exact alarms. Apps can direct users to the settings page via an intent to toggle this permission, but we encourage you to evaluate your use cases and choose more flexibly scheduled alternatives when possible.
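
For example, here's a hedged sketch of checking for the permission before scheduling, and falling back to the settings intent mentioned above:

import android.app.AlarmManager
import android.content.Context
import android.content.Intent
import android.provider.Settings

fun scheduleExactOrAskUser(context: Context, schedule: (AlarmManager) -> Unit) {
    val alarmManager = context.getSystemService(AlarmManager::class.java)
    if (alarmManager.canScheduleExactAlarms()) {
        schedule(alarmManager) // e.g. alarmManager.setExactAndAllowWhileIdle(...)
    } else {
        // Send the user to the system screen where they can grant SCHEDULE_EXACT_ALARM.
        context.startActivity(Intent(Settings.ACTION_REQUEST_SCHEDULE_EXACT_ALARM))
    }
}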

Clock and calendar apps targeting Android 13+ (SDK 33+) that rely on exact alarms as part of their core app workflow will be able to declare the USE_EXACT_ALARM normal permission instead (granted on install). Apps will not be able to publish a version of their app to the Play store with this permission in the manifest unless they qualify based on the policy language.

Customization

We're continuing to make sure that Android users can tune their experience around their individual needs, including enhanced accessibility and internationalization features.

Bigger fonts with non-linear scaling

Starting in Android 14, users will be able to scale up their font to 200%. Previously, the maximum font size scale on Pixel devices was 130%.

To mitigate issues where text gets too large, starting in Android 14, a non-linear font scaling curve is automatically applied. This ensures that text that is already large enough doesn’t increase at the same rate as smaller text.
Examples of text scaling: standard font at 100% (no scaling) on the left, standard scaling (200%) in the middle, and non-linear scaling (200%) on the right.

In Android 14, you should test your app UI with the maximum font size using the Font size option within the Accessibility > Display size and text settings. Ensure that the adjusted large text size setting is reflected in the UI, and that it doesn’t cause text to be cut off. Our documentation has more on best practices.
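
One practical consequence, sketched below under the assumption that your code converts sp to pixels manually: with a non-linear curve, multiplying by DisplayMetrics.scaledDensity no longer gives the right answer, so let the framework do the conversion.

import android.util.DisplayMetrics
import android.util.TypedValue

fun spToPx(sp: Float, metrics: DisplayMetrics): Float =
    TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_SP, sp, metrics)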

Per-app language preferences

You can dynamically update your app's localeConfig with LocaleManager.setOverrideLocaleConfig to customize the set of languages displayed in the per-app language list in Android Settings. This allows you to customize the language list per region, run A/B experiments, and provide updated locales if your app utilizes server-side localization pushes.

IMEs can now use LocaleManager.getApplicationLocales to get the UI language of the current app so they can update the keyboard language to match.
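
Sketched below against the preview APIs, with an illustrative language list; the first function is for a regular app updating its own localeConfig, the second for an IME looking up the focused app's language.

import android.app.LocaleConfig
import android.app.LocaleManager
import android.content.Context
import android.os.LocaleList

// App: restrict the per-app language list shown in Android Settings.
fun updateSupportedLanguages(context: Context) {
    val localeManager = context.getSystemService(LocaleManager::class.java)
    localeManager.overrideLocaleConfig =
        LocaleConfig(LocaleList.forLanguageTags("en-US,fr-FR,pt-BR"))
}

// IME: find the UI language of the current app to match the keyboard language.
fun currentAppLocales(context: Context, packageName: String): LocaleList {
    val localeManager = context.getSystemService(LocaleManager::class.java)
    return localeManager.getApplicationLocales(packageName)
}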

Grammatical Inflection API

The Grammatical Inflection API allows you to more easily add support for users who speak languages which have grammatical gender. For example,

Masculine: “Vous êtes abonné à...”

Feminine: “Vous êtes abonnée à…”

Neutral: “Abonnement à…activé”

Grammatical gender is inherent to the language and cannot be easily worked around in some non-English languages. This new API lowers the effort to support viewer gender (who’s viewing the UI; not who’s being talked about) compared to using ICU's SelectFormat, which must be applied on a per-string basis.

To show personalized translations, you just need to add translations inflected for each grammatical gender for impacted languages and integrate the API.
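
A hedged sketch of the platform call, assuming your app has already asked the user how they'd like to be addressed; the manager and constants follow the Android 14 preview APIs, and the inflected strings themselves are the translations you add for each gender.

import android.app.GrammaticalInflectionManager
import android.content.Context
import android.content.res.Configuration

fun setViewerGrammaticalGender(context: Context) {
    val inflectionManager =
        context.getSystemService(GrammaticalInflectionManager::class.java)
    // Store the viewer's choice so the framework can resolve the matching
    // inflected string resources (e.g. "Vous êtes abonnée à...").
    inflectionManager.setRequestedApplicationGrammaticalGender(
        Configuration.GRAMMATICAL_GENDER_FEMININE
    )
}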

Privacy and Security

Runtime receivers

Apps targeting Android 14 must indicate if dynamic Context.registerReceiver() usage should be treated as "exported" or "unexported", a continuation of the manifest-level work from previous releases. Learn more here.
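
For example, a minimal sketch of registering a receiver as unexported, which is the right default unless you expect broadcasts from other apps:

import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter

fun registerBatteryReceiver(context: Context, receiver: BroadcastReceiver) {
    context.registerReceiver(
        receiver,
        IntentFilter(Intent.ACTION_BATTERY_CHANGED),
        Context.RECEIVER_NOT_EXPORTED // or RECEIVER_EXPORTED if other apps must reach it
    )
}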

Safer implicit intents

To prevent malicious apps from intercepting intents, apps targeting Android 14 are restricted from sending intents internally that don't specify a package. Learn more here.

Safer dynamic code loading

Dynamic code loading (DCL) introduces outlets for malware and exploits, since dynamically downloaded executables can be unexpectedly manipulated, causing code injection. Apps targeting Android 14 require dynamically loaded files to be marked as read-only. Learn more here.
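
Sketched below with an illustrative file and loader setup: the key step is making the downloaded file read-only before loading it.

import dalvik.system.DexClassLoader
import java.io.File

fun loadDownloadedCode(dexFile: File, parent: ClassLoader): DexClassLoader {
    // Android 14 requires dynamically loaded files to be read-only before loading.
    dexFile.setReadOnly()
    return DexClassLoader(
        dexFile.absolutePath,
        /* optimizedDirectory = */ null,
        /* librarySearchPath = */ null,
        parent
    )
}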

Block installation of apps

Malware often targets older API levels to bypass security and privacy protections that have been introduced in newer Android versions. To protect against this, starting with Android 14, apps with a targetSdkVersion lower than 23 cannot be installed. This specific version was chosen because some malware apps use a targetSdkVersion of 22 to avoid being subjected to the runtime permission model introduced in 2015 by Android 6.0 (API level 23).

On devices that upgrade to Android 14, any apps with a targetSdkVersion lower than 23 will remain installed.

You can test apps targeting an older API level using the following ADB command:

adb install --bypass-low-target-sdk-block FILENAME.apk

Credential Manager and Passkeys support

We recently announced the alpha release of Credential Manager, a new Jetpack API that allows you to simplify your users' authentication journey, while also increasing security with support of passkeys. Passkeys are a significantly safer replacement for passwords and other phishable authentication factors and more convenient for users (they require just a biometric swipe to securely sign in on any device). Read more here.

App compatibility

We’re working to make updates faster and smoother with each platform release by prioritizing app compatibility. In Android 14 we’ve made most app-facing changes opt-in to give you more time to make any necessary app changes, and we’ve updated our tools and processes to help you get ready sooner.

OpenJDK 17 Support - This preview includes access to 300 OpenJDK 17 classes. We are working hard to fully enable Java 17 language features in upcoming developer previews. These include record classes, multi-line strings, and pattern matching for instanceof. Thanks to Google Play system updates (Project Mainline), over 600M devices are enabled to receive the latest Android Runtime (ART) updates that include these changes. This is part of our commitment to give apps a more consistent, secure environment across devices, and to deliver new features and capabilities to users independent of platform releases.

Easier testing and debugging of changes - To make it easier for you to test the opt-in changes that can affect your app, we’ll make many of them toggleable again this year. With the toggles, you can force-enable or disable the changes individually from Developer options or adb. Check out the details here.
App compatibility toggles in Developer Options
Platform stability milestone - Like last year, we’re letting you know our Platform Stability milestone well in advance, to give you more time to plan for app compatibility work. At this milestone we’ll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We’re expecting to reach Platform Stability in June 2023, and from that time you’ll have several weeks before the official release to do your final testing. The release timeline details are here.

Get started with Android 14

The Developer Preview has everything you need to try the Android 14 features, test your apps, and give us feedback. For testing your app with tablets and foldables, the easiest way to get started is using the Android Emulator in a tablet or foldable configuration in the latest preview of the Android Studio SDK Manager. For phones, you can get started today by flashing a system image onto a Pixel 7 Pro, Pixel 7, Pixel 6a, Pixel 6 Pro, Pixel 6, Pixel 5a 5G, Pixel 5, or Pixel 4a (5G) device. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio.

For the best development experience with Android 14, we recommend that you use the latest preview of Android Studio Giraffe (or more recent Giraffe+ versions). Once you’re set up, here are some of the things you should do:

  • Try the new features and APIs - your feedback is critical during the early part of the developer preview. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility - learn whether your app is affected by default behavior changes in Android 14; install your app onto a device or emulator running Android 14 and extensively test it.
  • Test your app with opt-in changes - Android 14 has opt-in behavior changes that only affect your app when it’s targeting the new platform. It’s important to understand and assess these changes early. To make it easier to test, you can toggle the changes on and off individually.

We’ll update the preview system images and SDK regularly throughout the Android 14 release cycle. This initial preview release is for developers only and not intended for daily or consumer use, so we're making it available by manual download only. Once you’ve manually installed a preview build, you’ll automatically get future updates over-the-air for all later previews and Betas. Read more here.

If you intend to move from the Android 13 QPR Beta program to the Android 14 Developer Preview program and don't want to have to wipe your device, we recommend that you move to Developer Preview 1 now. Otherwise, you may run into time periods where the Android 13 Beta will have a more recent build date, which will prevent you from going directly to the Android 14 Developer Preview without doing a data wipe.

As we reach our Beta releases, we'll be inviting consumers to try Android 14 as well, and we'll open up enrollment for the Android Beta program at that time. For now, please note that the Android Beta program is not yet available for Android 14.

For complete information, visit the Android 14 developer site.

Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.
