
Android Developer Story: Shifty Jelly drives double-digit growth with material design and expansion to the car and wearables

Posted by Lily Sheringham, Google Play team

Pocket Casts is a leading podcasting app on Google Play built by Shifty Jelly, an Australian mobile development company. The company recently achieved $1 million in sales for the first time, reaching more than 500K users.

According to co-founder Russell Ivanovic, the adoption of material design played a significant role in driving user engagement for Pocket Casts by streamlining the user experience. Moreover, users can now access the app beyond the smartphone -- in the car with Android Auto, on a watch with Android Wear, or on the TV with Google Cast. The rapid innovation of Android features helped Pocket Casts increase sales by 30 percent.

We chatted with co-founders and Android developers Russell and Philip Simpson to learn more about how they are growing their business with Android.

Here are some of the features Pocket Casts used:

  • Material Design: Learn more about material design and how it helps you create beautiful, engaging apps.
  • Android Wear: Extend your app to Android Wear devices with enhanced notifications or a standalone wearable app.
  • Android Auto: Extend your app to an interface that’s optimized for driving with Android Auto.
  • Google Cast: Let your users cast your app’s content to Google Cast devices like Chromecast, Android TV, and speakers with Google Cast built-in.

And check out the Pocket Casts app on Google Play!

Enable your messaging app for Android Auto

Posted by Joshua Gordon, Developer Advocate

What if there was a way for drivers to stay connected using your messaging app, while keeping their hands on the wheel and eyes on the road?

Android Auto helps drivers stay connected in a more convenient way that's integrated with the car. It eliminates the need to type and read messages by replacing these activities with a voice-controlled interface.

Enabling your messaging app to work with Android Auto is easy. The developers of Skype and textPlus have already done so. Check out this DevByte for an overview of the messaging APIs, and see the developer training guide for a deep dive. Read on for a look at the key steps involved.


Message notifications on the car’s display

When an Android 5.0+ phone is connected to a compatible car, users receive incoming message notifications from Auto-enabled apps on the car’s head unit display. Your app runs on the phone, but is controlled by the car. To learn more about how this works, watch the Introduction to Android Auto DevByte.

A new message notification from Skype

If your app already uses notifications to alert the user to incoming messages, it’ll be easy to extend these for Auto. It takes just a few lines of code, and you won’t have to change how your app works on the phone.

There are a couple of small differences between message notifications on Auto and on a phone. First, Auto doesn’t show a preview of the message content, because messaging is driven entirely by voice. Second, message notifications are backed by a conversation object, which is simply a collection of unread messages from a particular sender.

Decorate your notification with the CarExtender to add support for the car. Next, use the UnreadConversation.Builder to create a conversation, and populate it by iterating over your app's unread messages (from a certain sender) and adding them to the conversation. Pass your conversation object to the CarExtender, and you’re done!
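
For illustration, here’s a minimal sketch of those steps using the v4 support library. The READ_ACTION string, icon resource, and message data below are hypothetical placeholders for your app’s own constants and models:

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;

import java.util.List;

public class CarNotifier {
    // Hypothetical action string; a BroadcastReceiver in your app filters on it.
    static final String READ_ACTION = "com.example.app.ACTION_MESSAGE_READ";

    void notifyForAuto(Context context, String senderName,
                       List<String> unreadMessages, long latestTimestampMs,
                       int conversationId) {
        // Fired by the system when the user listens to the conversation.
        PendingIntent readPendingIntent = PendingIntent.getBroadcast(
                context, conversationId,
                new Intent(READ_ACTION).putExtra("conversation_id", conversationId),
                PendingIntent.FLAG_UPDATE_CURRENT);

        // A conversation is a collection of unread messages from one sender.
        NotificationCompat.CarExtender.UnreadConversation.Builder conversation =
                new NotificationCompat.CarExtender.UnreadConversation.Builder(senderName)
                        .setReadPendingIntent(readPendingIntent)
                        .setLatestTimestamp(latestTimestampMs);
        for (String message : unreadMessages) {
            conversation.addMessage(message);
        }

        // Decorate an ordinary notification with the CarExtender.
        Notification notification = new NotificationCompat.Builder(context)
                .setSmallIcon(R.drawable.ic_message)  // placeholder icon
                .setContentTitle(senderName)
                .setContentText(unreadMessages.get(unreadMessages.size() - 1))
                .extend(new NotificationCompat.CarExtender()
                        .setUnreadConversation(conversation.build()))
                .build();
        NotificationManagerCompat.from(context).notify(conversationId, notification);
    }
}
```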

Tap to hear messages

Tapping on a message notification plays it back on the car's sound system, via text to speech. This is handled automatically by the framework; no additional code is required. Pretty cool, right?

In order to know when the user hears a message, you provide a PendingIntent that’s triggered by the system. That’s one of just two intents you’ll need to handle to enable your app for Auto.
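
A hypothetical receiver behind that read PendingIntent could be as simple as the sketch below. It assumes the action string and extra from the earlier example, and must be registered in your manifest with a matching intent filter:

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.support.v4.app.NotificationManagerCompat;

// Hypothetical receiver: the system triggers it after the message is heard.
public class MessageReadReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        int conversationId = intent.getIntExtra("conversation_id", -1);
        // Mark the conversation as read in your data store, then dismiss
        // the notification so it also disappears from the car's display.
        NotificationManagerCompat.from(context).cancel(conversationId);
    }
}
```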

Reply by voice

Voice control is the real magic of Android Auto. Users reply to messages by speaking, via voice recognition. This is far faster and more natural than typing.

Enabling this functionality is as simple as adding a RemoteInput instance to your conversation objects, before you issue the notification. Speech recognition is handled entirely by the framework. The recognition result is delivered to your app as a plain text string via a second PendingIntent.
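
Continuing the sketch from above, the reply action and a hypothetical receiver for the recognition result might look like this (EXTRA_VOICE_REPLY and MessageReplyReceiver are illustrative names, not part of the framework):

```java
import android.app.PendingIntent;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.RemoteInput;

public class VoiceReply {
    // Key under which the framework delivers the transcribed reply text.
    static final String EXTRA_VOICE_REPLY = "extra_voice_reply";

    // Call this on the conversation builder before issuing the notification.
    static void addReplyAction(
            NotificationCompat.CarExtender.UnreadConversation.Builder conversation,
            PendingIntent replyPendingIntent) {
        RemoteInput remoteInput = new RemoteInput.Builder(EXTRA_VOICE_REPLY)
                .setLabel("Reply")
                .build();
        conversation.setReplyAction(replyPendingIntent, remoteInput);
    }

    // Hypothetical receiver behind replyPendingIntent.
    public static class MessageReplyReceiver extends BroadcastReceiver {
        @Override
        public void onReceive(Context context, Intent intent) {
            Bundle results = RemoteInput.getResultsFromIntent(intent);
            if (results != null) {
                CharSequence replyText = results.getCharSequence(EXTRA_VOICE_REPLY);
                // Send replyText through your messaging backend.
            }
        }
    }
}
```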

Replying to a message from textPlus by voice.

Next Steps

Make your messaging app more natural to use in the car by enabling it for Android Auto. Now drivers can stay connected without typing or reading messages. It just takes a few lines of code. To learn more, visit developer.android.com/auto.

Developing audio apps for Android Auto

Posted by Joshua Gordon, Developer Advocate

Have you ever wanted to develop apps for the car, but found the variety of OEMs and proprietary platforms too big of a hurdle? Now with Android Auto, you can target a single platform supported by vehicles coming soon from 28 manufacturers.

Using familiar Android APIs, you can easily add a great in-car user experience to your existing audio apps, with just a small amount of code. If you’re new to developing for Auto, watch this DevByte for an overview of the APIs, and check out the training docs for an end-to-end tutorial.


Playback and custom controls


Custom playback controls on NPR One and iHeartRadio.

The first thing to understand about developing audio apps on Auto is that you don’t draw your user interface directly. Instead, the framework has two well-defined UIs (one for playback, one for browsing) that are created automatically. This ensures consistent behavior across audio apps for drivers, and frees you from dealing with car-specific functionality or layouts. Although the layout is predefined, you can customize it with artwork, color themes, and custom controls.

Both NPR One and iHeartRadio customize their UI. NPR One adds controls to mark a story as interesting, to view a list of upcoming stories, and to skip to the next story. iHeartRadio adds controls to favorite stations and to like songs. Both apps store user preferences across form factors.

Because the UI is drawn by the framework, playback commands need to be relayed to your app. This is accomplished with the MediaSession callback, which has methods like onPlay() and onPause(). All car-specific functionality is handled behind the scenes. For example, you don’t need to know whether a command came from the touch screen, the steering wheel buttons, or the user’s voice.
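
Here’s a minimal sketch of wiring up those callbacks with the Lollipop MediaSession API. MyPlayer is a hypothetical stand-in for your app’s playback engine:

```java
import android.content.Context;
import android.media.session.MediaSession;

public class PlaybackSetup {
    // Hypothetical playback engine interface.
    interface MyPlayer {
        void play();
        void pause();
        void next();
    }

    MediaSession createSession(Context context, final MyPlayer player) {
        MediaSession session = new MediaSession(context, "MyMusicService");
        session.setFlags(MediaSession.FLAG_HANDLES_MEDIA_BUTTONS
                | MediaSession.FLAG_HANDLES_TRANSPORT_CONTROLS);
        session.setCallback(new MediaSession.Callback() {
            @Override
            public void onPlay() {
                // Invoked whether the command came from the touch screen,
                // the steering wheel buttons, or the user's voice.
                player.play();
            }

            @Override
            public void onPause() {
                player.pause();
            }

            @Override
            public void onSkipToNext() {
                player.next();
            }
        });
        session.setActive(true);
        return session;
    }
}
```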

Browsing and recommendations


Browsing content on NPR One and iHeartRadio.

The browsing UI is likewise drawn by the framework. You implement the MediaBrowserService to share your content hierarchy with the framework. A content hierarchy is a collection of MediaItems that are either playable (e.g., a song, audio book, or radio station) or browsable (e.g., a favorites folder). Together, these form a tree used to display a browsable menu of your content.
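
A bare-bones service might look like the sketch below. The media IDs and titles are placeholders, and the service also needs to be declared in your manifest with an intent filter for android.media.browse.MediaBrowserService:

```java
import android.media.MediaDescription;
import android.media.browse.MediaBrowser;
import android.os.Bundle;
import android.service.media.MediaBrowserService;

import java.util.ArrayList;
import java.util.List;

public class MyMusicService extends MediaBrowserService {
    @Override
    public BrowserRoot onGetRoot(String clientPackageName, int clientUid,
                                 Bundle rootHints) {
        // Returning null here would reject the connecting client.
        return new BrowserRoot("root", null);
    }

    @Override
    public void onLoadChildren(String parentId,
                               Result<List<MediaBrowser.MediaItem>> result) {
        List<MediaBrowser.MediaItem> items = new ArrayList<>();
        if ("root".equals(parentId)) {
            // A playable leaf; use FLAG_BROWSABLE for folders instead.
            items.add(new MediaBrowser.MediaItem(
                    new MediaDescription.Builder()
                            .setMediaId("station_1")   // placeholder ID
                            .setTitle("Morning News")  // placeholder title
                            .build(),
                    MediaBrowser.MediaItem.FLAG_PLAYABLE));
        }
        result.sendResult(items);
    }
}
```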

With both apps, recommendations are key. NPR One recommends a short list of in-depth stories that can be selected from the browsing menu. These improve over time based on user feedback. iHeartRadio’s browsing menu lets you pick from favorites and recommended stations, and their “For You” feature gives recommendations based on user location. The app also lets users create custom stations from the browsing menu. Doing so is efficient, requiring only three taps (“Create Station” -> “Rock” -> “Foo Fighters”).

When developing for the car, it’s important to connect users with content quickly, to minimize distractions while driving. Note that design considerations on Android Auto are different from those on a mobile device. If you imagine a typical media player on a phone, you may picture browsable menus of “all tracks” or “all artists”. These are not ideal in the car, where the primary focus should be on the road. Both NPR One and iHeartRadio provide good examples of this, because they avoid deep menu hierarchies and lengthy browsable lists.

Voice actions for hands free operation

Voice actions (e.g., “Play KQED”) are an important part of Android Auto. You can support voice actions in your app by implementing onPlayFromSearch() in the MediaSession.Callback. Voice actions may also be used to start your app from the home screen (e.g., “Play KQED on iHeartRadio”). To enable this functionality, declare the MEDIA_PLAY_FROM_SEARCH intent filter in your manifest. For an example, see this sample app.
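
As a sketch, the search callback might resolve the query like this; playDefault() and findAndPlay() are hypothetical hooks into your own catalog and playback engine:

```java
import android.media.session.MediaSession;
import android.os.Bundle;
import android.text.TextUtils;

public class SearchCallback extends MediaSession.Callback {
    @Override
    public void onPlayFromSearch(String query, Bundle extras) {
        if (TextUtils.isEmpty(query)) {
            // "Play music on <your app>" arrives with an empty query.
            playDefault();
        } else {
            // e.g. query == "KQED": resolve it against your catalog.
            findAndPlay(query);
        }
    }

    // Hypothetical helpers into your app's catalog and player.
    private void playDefault() { /* start a sensible default station */ }
    private void findAndPlay(String query) { /* search, then play */ }
}
```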

Next steps

NPR One and iHeartRadio are just two examples of great apps for Android Auto today. They feel like a part of the car, and they look and sound great. You can extend your apps to the car today, too, and developing for Auto is easy. The framework handles the car-specific functionality for you, so you’re free to focus on making your app special. Join the discussion at http://g.co/androidautodev if you have questions or ideas to share. To get started on your app, visit developer.android.com/auto.

Take your apps on the road with Android Auto

Posted by Wayne Piekarski, Developer Advocate

Starting today, anyone can take their apps for a drive with Android Auto using Android 5.0+ devices, connected to compatible cars and aftermarket head units. Android Auto lets you easily extend your apps to the car in an efficient way for drivers, allowing them to stay connected while still keeping their hands on the wheel and their eyes on the road. When users connect their phone to a compatible vehicle, they will see an Android experience optimized for the head unit display that seamlessly integrates voice input, touch screen controls, and steering wheel buttons. Moreover, Android Auto provides consistent UX guidelines to ensure that developers are able to create great experiences across many diverse manufacturers and vehicle models, with a single application available on Google Play.

With the availability of the Pioneer AVIC-8100NEX, AVIC-7100NEX, and AVH-4100NEX aftermarket systems in the US; the AVIC-F77DAB, AVIC-F70DAB, and AVH-X8700BT in the UK; and the AVIC-F70DAB and AVH-X8750BT in Australia, it is now possible to add Android Auto to many cars already on the road. As a developer, you now have a way to test your apps in a realistic environment. These are just the first Android Auto devices to launch, and vehicles from major auto manufacturers with integrated Android Auto support are coming soon.

With the increasing adoption of Android Auto by manufacturers, your users will expect their apps to be supported in the car, so now is a good time to get started with development. If you are new to Android Auto, check out our DevByte video, which explains how this works, along with some live demos.

The SDK for Android Auto was made available to developers a few months ago, and now Google Play is ready to accept your application updates. Your existing apps can take advantage of all these cool new Android Auto features with just a few small changes. You’ll need to add Android Auto support to your application, and then agree to the Android Auto terms in the Pricing & Distribution category in the Google Play Developer Console. Once the application is approved, it will be made available as an update to your users, and shown on the car’s display.

Adding support for Android Auto is easy. We have created an extensive set of documentation to help you add support for messaging (sample) and audio playback (sample). There are also short introductory DevByte videos for messaging and audio. Stay tuned for a series of posts coming up soon discussing these APIs in more detail and how to work with them. We also have simulators to help you test your applications right at your desk during development.

With the launch of Android Auto, a new set of possibilities is available for you to make even more amazing experiences for your users, providing them the right information for the road ahead. Come join the discussion about Android Auto on Google+ at http://g.co/androidautodev where you can share ideas and ask questions with other developers.

A new reference app for multi-device applications

It is now possible to bring the benefits of your app to your users wherever they happen to be, no matter what device they have near them. Today we’re releasing a reference sample that shows how to implement such a multi-device experience with an app that works across multiple Android form factors. This sample, the Universal Music Player, is a bare-bones but functional reference app that supports multiple devices and form factors in a single codebase. It is compatible with Android Auto, Android Wear, and Google Cast devices. Give it a try and easily adapt your own app for wherever your users are, be that a phone, watch, TV, car, or more!

Playback controls and album art in the lock screen.
The Google Cast icon on the application toolbar.

Controlling playback through Android Auto


Controlling playback on an Android Wear watch

This sample uses a number of new features in Android 5.0 Lollipop, like MediaStyle notifications, MediaSession, and MediaBrowserService. They make it easy to implement media browsing and playback on multiple devices with a single version of your app.
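
For example, a MediaStyle notification tied to your session takes only a few lines. In this sketch, the icons, strings, and pause PendingIntent are placeholders:

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;
import android.media.session.MediaSession;

public class MediaNotifications {
    Notification buildMediaNotification(Context context, MediaSession session,
                                        PendingIntent pauseIntent) {
        return new Notification.Builder(context)
                .setSmallIcon(R.drawable.ic_notification)  // placeholder icon
                .setContentTitle("Track title")            // placeholder text
                .setContentText("Artist name")
                .addAction(new Notification.Action(
                        R.drawable.ic_pause, "Pause", pauseIntent))
                .setStyle(new Notification.MediaStyle()
                        .setMediaSession(session.getSessionToken())
                        // Show the Pause action in the compact view.
                        .setShowActionsInCompactView(0))
                .build();
    }
}
```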

Check out the source code and let your users enjoy your app from wherever they like.

Posted by Renato Mangini, Senior Developer Platform Engineer, Google Developer Platform Team

New Code Samples for Lollipop

Posted by Trevor Johns, Developer Programs Engineer

With the launch of Android 5.0 Lollipop, we’ve added more than 20 new code samples demonstrating how to implement some of the great new features of this release. To access the code samples, you can easily import them in Android Studio 1.0 using the new Samples Wizard.

Go to File > Import Sample to browse the available samples, each of which includes a description and preview. Once you’ve made your selection, select “Next” and a new project will be created for you automatically. Run the project on an emulator or device, and feel free to experiment with the code.

Samples Wizard in Android Studio 1.0
Newly imported sample project in Android Studio

Alternatively, you can browse through them via the Samples browser on the developer site. Each sample has an Overview description, a Project page for browsing the app’s file structure, and a Download link for obtaining a ZIP file of the sample. As a third option, code samples can also be accessed in the SDK Manager by downloading the SDK samples for Android 5.0 (API 21) and importing them as existing projects into your IDE.


Sample demonstrating transition animations

Material Design

When adopting material design, you can refer to our collection of sample code highlighting material design elements.

For additional help, please refer to our design checklist, list of key APIs and widgets, and documentation guide.

To view some of these material design elements in action, check out the Google I/O app source code.

Platform

Lollipop brings the most extensive update to the Android platform yet. The Overview screen allows an app to surface multiple tasks as concurrent documents. You can include enhanced notifications with this sample code, which shows you how to use the lockscreen and heads-up notification APIs.
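
As a quick sketch, a notification can opt into both behaviors like this; the icon, text, and contentIntent below are placeholders:

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;

public class AlertNotifications {
    Notification buildAlert(Context context, PendingIntent contentIntent) {
        return new Notification.Builder(context)
                .setSmallIcon(R.drawable.ic_alert)  // placeholder icon
                .setContentTitle("New message")     // placeholder text
                .setContentIntent(contentIntent)
                // Control what shows above the keyguard on the lockscreen.
                .setVisibility(Notification.VISIBILITY_PUBLIC)
                // High priority plus a sound or vibration default lets the
                // notification peek as a heads-up on Android 5.0.
                .setPriority(Notification.PRIORITY_HIGH)
                .setDefaults(Notification.DEFAULT_SOUND)
                .build();
    }
}
```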

We also introduced a new Camera API to provide developers more advanced image capture and processing capabilities. These samples detail how to use the camera preview and take photos, how to record video, and implement a real-time high-dynamic range camera viewfinder.
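
To give a flavor of the camera2 API, here is a minimal sketch that opens the first available camera. It assumes the CAMERA permission is declared in your manifest; a real app would go on to create a capture session:

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;

public class CameraOpener {
    void openFirstCamera(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        String cameraId = manager.getCameraIdList()[0];
        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override
            public void onOpened(CameraDevice camera) {
                // Create a CameraCaptureSession here to start the preview.
            }

            @Override
            public void onDisconnected(CameraDevice camera) {
                camera.close();
            }

            @Override
            public void onError(CameraDevice camera, int error) {
                camera.close();
            }
        }, null);  // null handler: callbacks on the calling thread's looper
    }
}
```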

Elsewhere, Project Volta encourages developers to make their apps more battery-efficient with new APIs and tools. The JobScheduler sample demonstrates how you can schedule background tasks to be completed later or under specific conditions.
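
For instance, scheduling work for when the device is charging and on an unmetered network takes just a few lines; MyJobService is a hypothetical JobService in your app, and the job ID is arbitrary:

```java
import android.app.job.JobInfo;
import android.app.job.JobScheduler;
import android.content.ComponentName;
import android.content.Context;

public class SyncScheduler {
    void scheduleSync(Context context) {
        JobInfo job = new JobInfo.Builder(
                42,  // arbitrary job ID, unique within your app
                new ComponentName(context, MyJobService.class))
                .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED)
                .setRequiresCharging(true)
                .build();
        JobScheduler scheduler = (JobScheduler)
                context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        scheduler.schedule(job);
    }
}
```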

For those interested in the enterprise device administration use case, there are sample apps on setting app restrictions and creating a managed profile.

Android Wear

For Android Wear, we have a speed tracker sample to show how to take advantage of GPS support on wearables. You can browse the rest of the Android Wear samples too; the highlights demonstrate unique wearable capabilities such as data synchronization, notifications, and support for round displays.

Android TV

Extend your app for Android TV using the Leanback library described in this training guide and sample.

To try out a game that is specifically optimized for Android TV, download Pie Noon from Google Play. It’s an open-source game developed in-house at Google that supports multiple players using Bluetooth controllers or touch controls on mobile devices.

Android Auto

For the use cases highlighted in the Introduction to Android Auto DevByte, we have two code samples. The Media Browser sample (DevByte) demonstrates how easy it is to make an audio app compatible with Android Auto by using the new Lollipop media APIs, while the Messaging sample (DevByte) demonstrates how to implement notifications that support replies using speech recognition.

Google Play services

Having covered sample resources for the Android platform and form factors, we also want to mention the existing samples for Google Play services. With Google Play services, your app can take advantage of the latest Google-powered APIs such as Maps, Google Fit, Google Cast, and more. Access samples in the Google Play services SDK or visit the individual pages for each API on the developer site. Game developers can reference the Google Play Games services samples to learn how to add achievements, leaderboards, and multiplayer support to a game.

Check out a sample today to help you with your development!

Begin developing with Android Auto

Posted by Daniel Holle, Product Manager

At Google I/O back in June, we provided a preview of Android Auto. Today, we’re excited to announce the availability of our first APIs for building Auto-enabled apps for audio and messaging. Android apps can now be extended to the car in a way that is optimized for the driving experience.

For users, this means they simply connect their Android handheld to a compatible vehicle and get a car-optimized Android experience that works with the car’s head unit display, steering wheel buttons, and more. For developers, the APIs and UX guidelines make it easy to give users a simple way to get the information they need while on the road. As an added bonus, the Android Auto APIs let developers easily extend their existing apps targeting Android 5.0 (API level 21) or higher to work in the car, without having to worry about vehicle-specific hardware differences. This gives developers wide reach across manufacturers, models, and regions by developing against one set of APIs and UX standards.

There are two use cases that Android Auto supports today:

  • Audio apps that expose content for users to browse and allow audio playback from the car, such as music, podcasts, news, etc.
  • Messaging apps that receive incoming notifications, read messages aloud, and send replies via voice from the car.

To help you get started with Android Auto, check out our Getting Started guide. It’s important to note that while the APIs are available today, apps extended with Android Auto cannot be published quite yet. More app categories will be supported in the future, providing more opportunities for developers and drivers of Android Auto. We encourage you to join the Android Auto Developers Google+ community to stay up-to-date on the latest news and timelines.

We’ve already started working with partners to develop experiences for Android Auto: iHeartRadio, Joyride, Kik, MLB.com, NPR, Pandora, PocketCasts, Songza, SoundCloud, Spotify, Stitcher, TextMe, textPlus, TuneIn, Umano, and WhatsApp. If you happen to be in the Los Angeles area, stop by the LA Auto Show through November 30 and visit us in the Hyundai booth to take Android Auto and an app or two for a test drive.