Tag Archives: #11WeeksOfAndroid

Announcing the winners of the #AndroidDevChallenge, powered by on-device machine learning

Posted by Jacob Lehrbaum, Director of Developer Relations, Android

Developers like you have always played an important role in Android innovation. Over 10 years ago, when we first launched the Android SDK, we also announced the Android Developer Challenge to reward model apps and highlight new ways of solving user problems. As Android pushes the boundaries of machine learning, 5G, foldables, and more, developers continue to help shape these new frontiers. To celebrate this work, we revived the challenge in 2019, with a focus on “Helpful Innovation,” powered by on-device machine learning.

We received hundreds of creative projects and, at the end of last year, picked 10 winners who each combined a strong idea with a thirst to bring it to life. Since then, we’ve been working with those winners to help turn their ideas into reality, and today we’re sharing the results. Some are still at the beginning of their journey, but all 10 apps are now ready for you to download and try out!

  • AgroDoc helps farmers diagnose plant disease and make treatment plans. [Navneet Krishna; Kochi, India]
  • AgriFarm helps farmers detect plant diseases and prevent major damage in fruits and vegetables such as tomatoes, corn and potatoes. [Balochistan, Pakistan]
  • Eskke streamlines mobile money management for people in the Congo, letting them transfer money, pay bills, buy subscriptions and essential airtime through SMS. [David Mumbere Kathoh; Goma, Democratic Republic of Congo]
  • Leepi helps students learn hand gestures and symbols for American Sign Language. [Prince Patel; Bengaluru, India]
  • MixPose is a live streaming platform that gives yoga teachers and fitness professionals the opportunity to teach, track alignment, and give feedback in real-time. [Peter Ma; San Francisco, California, USA]
  • Pathfinder could help people with visual impairments navigate complex situations by identifying and calculating the trajectories of objects moving in their path. [Colin Shelton; Addison, Texas, USA]
  • Snore & Cough helps users identify and analyze snoring and coughing, providing information they can bring to a medical professional. [Ethan Fan; Mountain View, California, USA]
  • Stila pairs with a wearable device, like a Fitbit wristband or a device running Wear OS by Google, to monitor and track the body’s stress levels. By monitoring stress levels over time, you have the chance to better understand and manage stress in your life. [Yingdin Wing; Munich, Germany]
  • Trashly makes recycling easier. Just point the on-device camera at an item, and through object detection, the app identifies and classifies plastic and paper cups, bags, bottles, etc. [Elvin Rakhmankulov; Chicago, Illinois, USA]
  • UnoDogs helps owners better support their pet’s wellness, providing customized information and fitness programs. [Chinmany Mishra; New Delhi, India]

Making on-device machine learning more accessible, with ML Kit and TensorFlow Lite

Machine learning is becoming an increasingly accessible tool, even for developers with little or no background in the technology. In fact, for most of the winners of the Android Developer Challenge, this was their first foray into machine learning. That’s thanks in part to two key offerings from Google, which put on-device machine learning within reach for millions of developers around the world.

The first is ML Kit. ML Kit brings Google’s on-device machine learning technologies to mobile app developers, so they can build customized and interactive experiences into their apps. This includes tools such as language translation, text recognition, object detection and more. Eskke, for instance, uses offline text recognition and barcode scanning from ML Kit so users can scan the QR code at a mobile money kiosk and quickly withdraw money. And MixPose uses ML Kit's forthcoming Pose detection API to detect each user’s yoga positions and movements, so teachers can provide feedback.
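Eskke’s exact implementation isn’t shown here, but a minimal sketch of on-device QR scanning with ML Kit’s barcode API might look like the following; the scanKioskCode helper and its callback are illustrative, and import paths vary slightly between ML Kit versions:

    import android.graphics.Bitmap
    import com.google.mlkit.vision.barcode.Barcode
    import com.google.mlkit.vision.barcode.BarcodeScanning
    import com.google.mlkit.vision.common.InputImage

    // Hypothetical helper: scan a bitmap for a QR code and return its raw value, or null.
    fun scanKioskCode(bitmap: Bitmap, onResult: (String?) -> Unit) {
        val scanner = BarcodeScanning.getClient()  // on-device, works offline
        val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
        scanner.process(image)
            .addOnSuccessListener { barcodes ->
                // Keep only QR codes and hand back the first payload found.
                val qr = barcodes.firstOrNull { it.format == Barcode.FORMAT_QR_CODE }
                onResult(qr?.rawValue)
            }
            .addOnFailureListener { onResult(null) }
    }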

The other Google resource that many of the Android Dev Challenge winners used was TensorFlow Lite. This powerful machine learning framework can help run machine learning models on Android, iOS and IoT devices that would never normally be able to support them. Its set of tools can be used for all kinds of powerful neural network-related applications, from image detection to speech recognition, bringing the latest cutting-edge technology to the devices we carry around with us wherever we go. Trashly, for instance, uses a custom TensorFlow Lite model to report if an object is recyclable and how to recycle it.
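Trashly’s model and code aren’t public, but a bare-bones sketch of running a bundled TensorFlow Lite classifier on Android could look like this; the recyclables.tflite asset name and the tensor shapes are placeholders, and the asset is assumed to be stored uncompressed:

    import android.content.Context
    import org.tensorflow.lite.Interpreter
    import java.io.FileInputStream
    import java.nio.MappedByteBuffer
    import java.nio.channels.FileChannel

    // Memory-map a .tflite model shipped in the app's assets folder.
    fun loadModel(context: Context, assetName: String): MappedByteBuffer =
        context.assets.openFd(assetName).use { fd ->
            FileInputStream(fd.fileDescriptor).channel.use { channel ->
                channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
            }
        }

    // Run one inference; assumes a single float input tensor and a [1, numClasses] output.
    fun classify(context: Context, features: FloatArray, numClasses: Int): FloatArray {
        val output = Array(1) { FloatArray(numClasses) }
        val interpreter = Interpreter(loadModel(context, "recyclables.tflite"))
        try {
            interpreter.run(arrayOf(features), output)  // single input, single output
        } finally {
            interpreter.close()
        }
        return output[0]  // per-class scores, e.g. recyclable vs. not recyclable
    }

In a real app you would create the Interpreter once and reuse it across calls rather than rebuilding it for every inference.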

Helpful innovation, such as the 10 winning apps in the Android Developer Challenge, has the potential to change the way we access, use, and interpret information, making it available when we need it, where we need it most. By working with these developers focused on helpful innovation, we hope to inspire the next wave of developers to unlock what’s possible with this new technology.

#11WeeksOfAndroid Week 2 Machine Learning with Android logo head

What’s next in Android Machine Learning week?

As we kick off the second week of #11WeeksOfAndroid, focused on Machine Learning, we will highlight new tools and resources available to Android developers. Here’s a taste of the rest of this week:

  • Tuesday - ML Kit. The turnkey ML SDK went through a major overhaul with its new on-device offering this month. Check out the substantial improvements in developer usability, CameraX support, and where the platform is going next.
  • Wednesday - Custom models. When a prepackaged SDK doesn’t quite satisfy your needs, tools from Android Studio, TensorFlow Lite, and ML Kit might just be the answer. Beyond the individual offerings, we will also highlight how they can be used together.
  • Thursday - ML design. Learn some best practices for making ML product decisions from the People + AI Guidebook. We will go behind the scenes of the Read Along app, an on-device ML app that helps grow universal literacy. Bring your whole team, because everyone, including UXers, engineers, and product managers, is invited!

On Tuesday and Wednesday we will also have a “codelab of the day,” so get the Android Studio 4.1 beta today, block off an hour in your schedule, and take this ML journey with us!

*The apps presented here are the projects of the developers individually, and not Google.

11 Weeks of Android: People & Identity

Posted by Dr. Stefan Frank, Senior Product Manager, Android System UI

#11 weeks of Android with Android logo

This blog post is part of a weekly series for #11WeeksOfAndroid. Each week we’re diving into a key area of Android so you don’t miss anything. This week, we spotlight people & identity. Here's a look at what you should know.

The big news

One of the goals of Android 11 was to make our phones more people-centric, because nothing matters more to people than connecting with loved ones. It is a core human need, especially under our current physical-distancing constraints, when we need to stay social more than ever. Android 11 reimagines how we have conversations on our phones and adds new capabilities to help you maintain your identity across multiple devices.

We are announcing new features in Android 11 that allow you to easily connect with your loved ones, friends, and business colleagues. At the center of this release are the Android Conversation Shortcut API and the Identity Services Library. These new tools power instant connections to your best friend, sharing funny pictures of your dogs, telling your aunt about a tasty seafood recipe you discovered, or congratulating a colleague on her promotion. They also provide a new level of password management that makes it easier for your users to sign up and sign in.

One of our favorite features brings conversations from people that matter most to you right to the lock screen of your phone. You’ll easily recognize them by their avatar and instantly respond to your family, friends, or colleagues. These are people you truly want to connect to. We knew this feature would be useful, but the responses from our beta-testers made us smile. The decision to include the Conversation Shortcut API to improve the lives of our users was one of the easiest decisions for us to make in the release of Android 11.

Creating a bubble from an incoming notification and accessing the conversation from the bubble.

One of the new features building on top of shortcuts is the new conversation space at the top of your notifications. It focuses your attention on what matters most - your conversations. Right from these notifications the user can trigger another new feature in Android 11 - Bubbles. Bubbles are small representations of conversations floating over other content on the side of the screen, which can be expanded to allow quick access to conversations, without changing what you were doing on the device. They are super handy for carrying on conversations while using the device for other tasks.

The new conversations space showing how a conversation is marked as priority and will be displayed on the lock screen.

A long press on a conversation notification lets the user mark it as a priority conversation, giving special prominence to the most important people. Priority conversations are displayed with their individual avatar right on the lock screen and move to the top of your notifications. They can even be set to break through Do Not Disturb. Conversation shortcuts also serve as share targets in the system share sheet, which launched in Android 10.

Another focus of this week was identity. To tackle the user and developer complexity that makes identity such a challenge, we've been working on One Tap and Block Store, part of our new Google Identity Services Library. One Tap is our new cross-platform sign-in mechanism for Web and Android, supporting and streamlining multiple types of credentials. Block Store is our new token-based sign-in mechanism that’s built on top of Backup and Restore, allowing you to keep your users signed in across Android devices.
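As a rough sketch, kicking off One Tap for saved passwords from an Activity might look like the following; the request code is arbitrary, Google ID token options are omitted, and handling the returned credential in onActivityResult is left out for brevity (see the Identity Services documentation for the full flow):

    import android.app.Activity
    import com.google.android.gms.auth.api.identity.BeginSignInRequest
    import com.google.android.gms.auth.api.identity.Identity

    const val REQ_ONE_TAP = 1001  // arbitrary request code for this example

    // Ask One Tap to surface any saved passwords for this app.
    fun startOneTapSignIn(activity: Activity) {
        val request = BeginSignInRequest.builder()
            .setPasswordRequestOptions(
                BeginSignInRequest.PasswordRequestOptions.builder()
                    .setSupported(true)
                    .build()
            )
            .build()

        Identity.getSignInClient(activity)
            .beginSignIn(request)
            .addOnSuccessListener { result ->
                // Show the One Tap UI; the credential comes back in onActivityResult.
                activity.startIntentSenderForResult(
                    result.pendingIntent.intentSender, REQ_ONE_TAP, null, 0, 0, 0
                )
            }
            .addOnFailureListener {
                // No saved credentials or One Tap unavailable; fall back to your own sign-in UI.
            }
    }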

We’re super excited about all these features, since they help all of us connect, communicate and express ourselves to the people that we care about and to the apps we use, which is as important now as it’s ever been.

What to watch

For a high level overview of the people centric functionality, we recommend that you check out the Android 11 launch highlight video on People. Earlier this week, we also launched a new talk on ‘Conversation Notifications’, where Artur describes how to implement the conversation shortcut and bubbles. There is also a great overview talk on the conversation additions and other System UI news from Dan. Finally, you can also listen to Chet’s podcasts where he interviews us on People and Bubbles.

If you’re interested in learning more about Identity, we also published “Identity on Android: What’s new in sign-in” this week. In this video, Vishal explains the new libraries in Google Identity Services: One Tap and Block Store.

Two of the teams that worked very early with us on these conversation specific topics are the Messenger team from Facebook and the direct messaging team from Twitter. Read the stories around both of these implementations here and here.

Learning path

If you’re looking for an easy way to pick up the highlights of this week, check out the People and Identity pathway. A pathway is an ordered tutorial that allows users to complete a pre-defined module that culminates in a quiz. It may include codelabs, videos, articles and blog posts. A virtual badge is awarded to each user who passes the quiz. Test your knowledge of key takeaways about People and Identity to earn a limited edition badge.

Key takeaways

Android 11 is the starting point of an ongoing focus on what matters most to users: people and conversations. Many partners in our ecosystem have introduced amazing apps and services that enable these connections, and we in Android want to elevate and surface those partners more prominently. So if you are working on an app that fosters real-time communication between people, we strongly encourage you to adopt the conversation shortcut based APIs for notifications, bubbles, and sharing when targeting API level 30, in order to put users’ conversations front and center and give them quick access to your app. The developer documentation can be found here.
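As a concrete starting point, publishing a long-lived conversation shortcut with Jetpack’s ShortcutManagerCompat might look like this sketch; the IDs and the Intent that opens your chat screen are placeholders, and your conversation notification then references the same ID via setShortcutId:

    import android.content.Context
    import android.content.Intent
    import androidx.core.app.Person
    import androidx.core.content.pm.ShortcutInfoCompat
    import androidx.core.content.pm.ShortcutManagerCompat

    // Publish (or update) a long-lived shortcut representing one conversation.
    fun publishConversationShortcut(
        context: Context,
        conversationId: String,          // stable ID for this conversation
        contactName: String,
        openConversationIntent: Intent   // opens this conversation in your app; must have an action set
    ) {
        val person = Person.Builder().setName(contactName).build()
        val shortcut = ShortcutInfoCompat.Builder(context, conversationId)
            .setShortLabel(contactName)
            .setLongLived(true)    // required for conversation notifications, bubbles and sharing
            .setPerson(person)     // ties the shortcut to a person
            .setIntent(openConversationIntent)
            .build()
        // pushDynamicShortcut takes care of the dynamic-shortcut limit for you.
        ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
    }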

For apps that handle user accounts, we encourage you to help our users avoid messy password hunting and forgotten credential processes by integrating One Tap to streamline credential management and Block Store to handle device updates. These integrations will work on phones back to Android M.

In the spirit of this week, I wish you meaningful and joyful connections with the people that matter to you and seamless experiences with your favorite apps. We hope you will help us in our journey of supporting these goals.

Resources

You can find the entire playlist of #11WeeksOfAndroid video content here, and learn more about each week here. We’ll continue to spotlight new areas each week, so keep an eye out and follow us on Twitter and YouTube. Thanks so much for letting us be a part of this experience with you!

Messenger and Conversations

Facebook logo

This blogpost is a collaboration between Google and Messenger from Facebook. Authored by Aaron Labiaga with support from Caleb Gomer and Samuel Guirado from Messenger.


Messenger is ubiquitous in the messaging app world and has pioneered the floating chat bubble. Bubbles help users keep conversations in view and accessible while multitasking, overlaying other UI elements in the foreground, and providing users easy access and visibility to their ongoing chats. Bubbles is one way Android 11 is making the platform more people-centric and expressive, reimagining the way we have conversations on our phones. Messenger’s early pioneering of the floating chat bubble, and the strong reception by users, helped lead to its native implementation in the framework.

new conversations ui gif

Bubbles

The Bubbles API is built on top of the notifications API and, in Android 11, is exclusively focused on people. First introduced in Android 10, what was previously an opt-in feature is now on by default. To use bubbles, the developer must create BubbleMetadata, which is set on the notification. This metadata describes the Activity to launch when a bubble is clicked, along with various behaviors relevant to the expanded bubble. The Activity must be embeddable and resizable in order to be used in a bubble.

Notification bubbles are reserved for conversations with people: MessagingStyle notifications associated with a long-lived shortcut ID. Please see the Bubbles code sample to learn how.
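That sample isn’t reproduced here, but the bubble-specific part is small: build BubbleMetadata around a PendingIntent for your chat Activity and set it on the conversation notification. A hedged sketch, using the two-argument BubbleMetadata.Builder from a recent androidx.core release, with the intent, icon, and height as placeholders:

    import android.app.PendingIntent
    import androidx.core.app.NotificationCompat
    import androidx.core.graphics.drawable.IconCompat

    // Attach bubble metadata to an existing conversation notification builder.
    // chatIntent must launch an Activity that is embeddable and resizable.
    fun NotificationCompat.Builder.withBubble(
        chatIntent: PendingIntent,
        chatIcon: IconCompat
    ): NotificationCompat.Builder {
        val bubble = NotificationCompat.BubbleMetadata.Builder(chatIntent, chatIcon)
            .setDesiredHeight(600)        // height of the expanded bubble, in dp
            .setAutoExpandBubble(false)   // let the user decide when to expand
            .build()
        return setBubbleMetadata(bubble)
    }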


A Q&A with the Messenger team

The Messenger team shares their experience with the migration and their outlook on the impact of the changes.

How was the migration to Bubbles in terms of technical challenges, scope, and impact on your codebase?

Prior to Bubbles, Messenger used the SYSTEM_ALERT_WINDOW permission for its own implementation of bubbles. It achieved our purpose, but hosting complex Android UI outside of Activities is challenging to implement and maintain. Using this natively supported API allowed us to build more traditional, Activity-based Android UI that works well in bubbles and full screen. This new Bubbles-based chat experience is much simpler and more maintainable than our SAW-based one. We are excited that Android believes that it is a user experience that will help drive improvements in the conversation space.

Ensuring that bubble shortcuts were up to date with the latest state of the conversation thread was a technical challenge worth noting. A changed picture, a deleted conversation, or a blocked contact are all events that require the bubble shortcut to be updated or even deleted. The Shortcuts API allows for easily registering and unregistering shortcuts and for querying and updating existing ones, which made this whole process very straightforward.

What are your future prospects on the impact of Messenger messages in the conversation space?

The conversation section will give our messages the right visibility. Given that the conversation section ranks high in the notification drawer, we definitely want to be present in that space.

A people-centric experience in Android 11

Bubbles are just one way that Android 11 puts people at the heart of the experience for Android; if you’re a messaging or chat app, you should consider using the Bubbles API to help your users as they multi-task. It’s great to see apps like Messenger navigate the openness of Android to create innovative new experiences, and we’re excited to make Bubbles a native experience in Android 11. For more information, please visit the Conversation API guidelines.

Bringing @Twitter’s DMs into Android 11’s Conversation API

Twitter logo

This blogpost is a collaboration between Google and Twitter. Authored by Aaron Labiaga with support from Fred Lohner, Suzanne Xie, and Alex Ackerman-Greenberg from Twitter.

Direct Messages for the conversation space

Twitter is a social media app and a source for what's happening in the world. With Android 11’s Conversation APIs, surfacing Twitter’s Direct Messages in the conversation space makes perfect sense, as they feature real people having real-time, bidirectional conversations.

Conversation Notification API

To surface notifications in the conversation space, developers need to use MessagingStyle notifications associated with a published, long-lived shortcut ID. The shortcut ID allows the conversation to surface throughout various parts of the UI, including as a shortcut in the launcher. For details and sample code on how this is done, please see the conversation API guidelines.
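Those guidelines have complete samples; the essential pairing, though, is small: publish a long-lived shortcut for the conversation (for example with ShortcutManagerCompat.pushDynamicShortcut) and reference its ID from a MessagingStyle notification. A rough sketch, with the channel and IDs as placeholders:

    import android.content.Context
    import androidx.core.app.NotificationCompat
    import androidx.core.app.NotificationManagerCompat
    import androidx.core.app.Person

    // Post a DM notification into the conversation space by linking it to a
    // previously published long-lived shortcut with the same ID.
    fun postDmNotification(
        context: Context,
        channelId: String,     // an existing notification channel
        shortcutId: String,    // ID of the long-lived conversation shortcut
        me: Person,            // the device user
        sender: Person,        // who sent the DM
        text: String
    ) {
        val notification = NotificationCompat.Builder(context, channelId)
            .setSmallIcon(android.R.drawable.ic_dialog_email)  // placeholder icon
            .setShortcutId(shortcutId)
            .setStyle(
                NotificationCompat.MessagingStyle(me)
                    .addMessage(text, System.currentTimeMillis(), sender)
            )
            .build()
        NotificationManagerCompat.from(context).notify(shortcutId.hashCode(), notification)
    }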

A Q&A with the Twitter team

Frederik Lohner, tech lead of Twitter’s notifications team, and Suzanne Xie, Product Management Director for Conversations on Twitter, share their experience with the migration and the future prospects of its impact:

How was the migration to MessagingStyle notifications in terms of technical scope and impact on your codebase?

Due to legacy reasons we first had to migrate to MessagingStyle notifications for our DM pushes. This offered a great opportunity to clean up a bunch of technical debt, whilst also allowing us to begin testing things like auto-generated replies for notifications with very little additional work on our side.

Were there any challenges in implementing shortcuts, given that they are a requirement for the conversation space?

We found it challenging to manage the maximum number of published dynamic shortcuts, since there is a limited number, requiring us to manually remove older shortcuts as new ones were created. The first Developer Preview didn't include a great way to manage shortcuts, but this changed in Developer Preview 2 with the addition of the pushDynamicShortcut method, which handles the limit for you. The addition of this method is a testament to the importance of exploring the APIs in the early stages and the impact one can have on shaping an API through testing and feedback.

What do you think the future impact of the migration will be, e.g. more engagement given the visibility of DMs in the conversation space?

We think these changes will make DMs much easier to use, and we look forward to getting feedback from our users. The conversation space also fits the purpose of DMs, and it is important that these messages are categorized in the right space.

A people-centric experience in Android 11

Android 11 reimagines the way we have conversations on our phones, building an OS that can recognize and prioritize the most important people in your life. If your application has any notion of messaging between people then consider following the guidelines in the Android developer docs to ensure that you're taking advantage of the new conversation space. We’re excited to work with developers like Twitter to help showcase the importance of a people-centric experience on mobile phones with Android 11.
