Use Case – Bringing Universal Google Cast Functionality To Every Android App

Abstract:

Kevin Nilson / Tech Lead & Manager, Solutions Engineer at Google, May 2016: Have you ever wanted to expand your Android app to enable a multi-screen experience, leveraging large display devices like a television? Google Cast allows you to share content with a Google Cast receiver device, such as a Chromecast. Google Cast Android senders use the familiar controls of your Android phone or tablet to control the experience of your application on the big screen.
In this talk, we will cover Android sender app development, as well as some of the best practices to keep your Cast experience simple, intuitive, and predictable.

Talk Transcription:

So how’s everybody doing?

[Audience] Fabulous. Good.

Great. So can we begin?

[Audience] Yes.

Awesome. All right, here we go. So I am Kevin Nilson. I work with Google as a technical solutions engineer on Chromecast. So show of hands, does everybody here have a Chromecast? Has anybody written any apps for Chromecast? A couple. Wow. Awesome. The rest of you, we’re going to change that today. All right? All right, here we go.

So, Chromecast. We all know and enjoy sitting back, watching TV, being a couch potato. But it’s not just about that, it’s also about, you know, sitting with friends and family, kind of collaborating together, enjoying something. And it’s a lot of fun.

So one of the things that we’ve noticed is people are watching a lot of TV. Be it on Chromecast, be it on Apple TV, be it on Roku, Fire Stick, you know, anyone in this space. And really I just want to point out the number of hours per day people are spending on TV. You’ll notice, we in the US, and I’m sure we have folks from all over, but the US is winning this.

[Audience] Yeah.

Go team USA. There we are.

But anyway, it’s an awesome opportunity for you if you do have an app with some kind of content that can go up on a TV, um, you know, it’s an awesome opportunity to be part of this. We’re going to see more and more of this, less through traditional TV, and more through mobile.

So this is the new Chromecast. I imagine some of you have the older one that looks kind of more like a key. This is our new one that we launched a while ago. And one of the exciting things is we have Chromecast Audio. So do any of you have the Audio? No, nobody. So what’s cool about the Audio is it does multi-room, so it’ll sync. You can have one in your kitchen, one in your living room, all over the house. That’s kind of fun. So there they are, TV and audio.

So what’s unique about Chromecast is it sort of changes things and makes the phone the remote control. And so when a user is using your app, what’s really cool is they’re looking at their phone and using their phone for navigation, finding what they want to play. So think about it: if I reached out to everyone in the audience and said, I’m going to give half of you a remote control and half of you a smartphone, now go find Game of Thrones, it’s obvious the people using their phones are going to be a lot happier, it’s going to be a lot easier, and it’s going to have your preferences. And we at Chromecast think that this is really the future and the way people want to consume TV.

So summing it up, what we’re doing is taking this really immersive 17-inch world and the 10-foot experience, lean back, casual, relaxing, and bringing those two worlds together to make the best TV experience.

And then we’ve got SDKs for iOS, Android, and Chrome, so you can kind of play everywhere and have support everywhere. Lots of apps. Here’s HBO Now. We’ve got over a thousand apps out there today.

So now let’s jump in a little bit more into the code, the technology. So some of you may be confused, you know, why is Kevin talking about Google Cast and not Chromecast. So Google Cast is the technology, whereas Chromecast is just one dongle that supports Google Cast. We have a lot of speakers and TVs, and really what we’re hoping is, as time goes on, when you buy a TV or a speaker it will have Google Cast inside of it. So we recently launched the Vizio TVs that have that. And then we have lots of speakers from everyone from Sony to JBL, lots and lots of different people out there producing speakers.

Next is the sender. That’s your iOS app, your Android app, or your web app. And the receiver is the code that runs on the TV. And what’s really unique and nice about Chromecast is the code on the TV is HTML and JavaScript. We’ll talk about how to do some of that here in a minute. And then finally, casting: sending something up to the TV.

So how does it work? Someone’s watching a piece of YouTube here on their phone. They look up and see their TV and say, wow, I wish I could watch this on the 10-foot experience rather than on this small four-inch display. And remember, they’ve got the Chromecast, so they see the Cast icon, they click it, and what happens is the phone connects over to the TV, over to your dongle. And then the dongle reaches out to the cloud directly. And then from the cloud, YouTube is streamed to your TV. So what this means is you’re streaming directly from the cloud to your TV. So you’re not draining your battery, you’re not using up your data connection, your phone’s not getting hot. It’s all done from the cloud to your TV. And then later when you want to control things, you do that from your phone, so your phone can become that remote control. You can change the video or use your phone to change the volume.

And then finally one of the cool features is the phone, once playback has begun, can actually disappear. So it’s pretty common for me to throw something up on TV for the kids and then go in the back yard and, you know, take out the trash, or clean the garage, or whatever it is, while the kids are in the house watching TV.

So here it is. This simple, fun animation. Pop it over your speakers or pop it over your TV.

So now I want to talk about some general tips that apply to, you know, anything you are doing in the TV space. Be it Chromecast or anything else like Roku, Fire Stick, Apple TV, any of those products.

So here’s kind of an interesting example. I wrote this simple Cast receiver that just says hello cast developer. And then when you look at my TV, you can see down here on the TV all you see is hello cast developer, and it’s kind of cut off. So does anyone know why this happens? Any guesses? Anyone? No. So TVs have something called overscan, and this comes from old displays and old technologies from the past, but it’s still here today. And so what I showed back on my TV, that was the out-of-box experience that I got from my TV when I wrote a simple webpage. So you need to keep this in mind, and if you think back to television shows you’ve watched, you’ll remember how […] spaces are typically not near the sides or the top of the screen. Sides especially, because you wouldn’t want them cut off. Any kind of logos or any kind of text, you also have to be careful with.

The next thing I’ll talk about is second-screen interactions. So Cast is a second-screen experience where your phone is the first screen and then the large screen up on the wall is your, you know, your second screen. And so what a lot of people will do is take their web app, since Chromecast is Chrome running in the dongle, and try to run it fairly unchanged up on the television. Then what happens is you wind up with dialogs, error messages, things like that showing up on the TV. And what you have to keep in mind with second screen, be it Cast or another technology, is that a user is looking at their phone when they’re doing interactions, not at the TV.

And the last thing is burn-in. So you guys remember back in the day, old monitors, like being at the library, where the burn-in shows up really well on the screen. But does anybody have any guess which kind of modern televisions today have this as a big problem? Any guesses? Anyone?

[Audience] LCDs.

LCDs? Close.

[Audience] Plasmas.

Plasmas. Somebody said plasmas. Like six of you said plasmas. So how much time do you think it takes before you start to have burn-in issues on a plasma TV? Any guesses? Any guesses?

[Audience] Couple of hours?

Couple of hours? Anyone else?

[Audience] Thirty minutes.

Thirty minutes? Yes. So it’s actually a really small number. It sort of depends on the TV, but there are several models out there where at 10 minutes you start to do permanent damage to your television. And so you have to think about that and code for that as you make your app. Moving logos around. And then making sure in your app, if someone pauses, for example, after 10 minutes you move on and do something else. Or when playback is finished, you’re showing a splash screen, rotating through several images. So definitely when dealing in the TV space, keep burn-in in mind, because there are a lot of expensive plasma screens out there that have some of the best pictures today.

So now I want to talk about the design checklist for Cast. One of the things that we’ve done is put together a rather extensive design checklist. And really what we’re trying to achieve is that when someone comes into your app, it feels like all the other apps out there. So they don’t have to learn, you know, how does Cast work; they already know from other apps. And here’s sort of a tip and trick for Android developers as well. One of the things that happens to me many times at conferences is developers will come to me and say, Kevin, I don’t understand why my app has never been featured by Google. I don’t understand why. And I always ask, you know, what do you think about the design checklist, have you really looked at that? At Google, the design checklist is just as important to us on Android as it is to Apple on iOS. And where Apple has a model where they will reject your app and not let you put it in the store, at Google we believe the best apps will bubble to the top, and we definitely do not drive traffic towards apps that don’t match the UX guidelines.

And so there are guidelines for Android, there are guidelines for Cast, and we treat those just as seriously as Apple does. I really encourage you, if you want your app to be featured by Google, definitely go out and look at the design guidelines. Read them a few times to make sure you’re compliant. And I will tell you that all apps that are featured do get tested.

Cool. Awesome. Thanks man.

All right. It just drops back down. Cool. So now let’s talk about the Cast SDK for Android. The Cast SDK for Android is built in as part of Google Play Services. Right. So Google Play Services is a big, monolithic, you know, project that contains all the services that are within Google. And what’s nice is, it keeps the user always on the latest version, because it contains the code for Maps as well as Cast as well as all the other Android services. And so, what’s nice is when your app goes out, when your APK goes out to the Play Store, it doesn’t get bloated with a big library to add Cast, because all the heavy lifting is done within Google Play Services.

So the Android SDK has a few domain objects. There’s the CastDevice, which represents the Google Cast device, or the dongle. Then we have Cast and GoogleApiClient, which are what you use to control that Cast device. Then we have a MessageReceivedCallback, which is for, you know, being notified that a message has been received from the receiver. And RemoteMediaPlayer, which allows you to control all the playback. So play and pause and things like that.
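That MessageReceivedCallback, for example, is just an interface with a single method. A minimal sketch of a custom channel might look like the following (the class name and the namespace string are placeholders; it gets registered against the connected GoogleApiClient once the receiver application has launched, which we’ll get to in a minute):

import android.util.Log;
import com.google.android.gms.cast.Cast;
import com.google.android.gms.cast.CastDevice;

class HelloWorldChannel implements Cast.MessageReceivedCallback {
    // Hypothetical custom namespace; it must match the one the receiver listens on.
    public String getNamespace() {
        return "urn:x-cast:com.example.custom";
    }

    @Override
    public void onMessageReceived(CastDevice castDevice, String namespace, String message) {
        // Called whenever the receiver sends a message on this namespace.
        Log.d("HelloWorldChannel", "onMessageReceived: " + message);
    }
}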

We’ll walk through some of the code now. So the first thing I want to show is how easy it is to add the Cast icon to your app. Typically you’ll have a menu XML, and within that menu you can add a single item that basically declares your MediaRouteActionProvider, the provider that handles Cast. You just add this one item in your menu and then the Cast icon will show up in your action bar.
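On the Java side, the wiring for that menu item looks roughly like this (a minimal sketch against the pre-CAF v2 SDK; the resource names R.menu.cast_menu and R.id.media_route_menu_item, the APP_ID constant, and the mMediaRouteSelector field are assumptions):

import android.support.v4.view.MenuItemCompat;
import android.support.v7.app.MediaRouteActionProvider;
import android.support.v7.media.MediaRouteSelector;
import android.view.Menu;
import android.view.MenuItem;
import com.google.android.gms.cast.CastMediaControlIntent;

@Override
public boolean onCreateOptionsMenu(Menu menu) {
    super.onCreateOptionsMenu(menu);
    getMenuInflater().inflate(R.menu.cast_menu, menu);

    // The menu item declares a MediaRouteActionProvider in XML; here we just
    // hand it a selector that filters routes to devices running our receiver app.
    MenuItem mediaRouteMenuItem = menu.findItem(R.id.media_route_menu_item);
    MediaRouteActionProvider provider =
            (MediaRouteActionProvider) MenuItemCompat.getActionProvider(mediaRouteMenuItem);

    mMediaRouteSelector = new MediaRouteSelector.Builder()
            .addControlCategory(CastMediaControlIntent.categoryForCast(APP_ID))
            .build();
    provider.setRouteSelector(mMediaRouteSelector);
    return true;
}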

Now we’re going to show you how to plug in the code to make it work. To show that button you need to add a callback, basically a MediaRouter callback that gives you an onRouteSelected. This is when someone has clicked the Cast icon, they get a list of devices, and then they select one. They say, I’d like to go to the living room. That’s where you’re going to get this onRouteSelected callback. And here’s some more detail on what that code looks like. So onResume, that’s basically the foregrounding. When that activity loads or when the phone foregrounds, you’ll want to add the callback. The mediaRouterCallback has an onRouteSelected telling you that a particular device has been selected, and then from its bundle you can get information about which device it is. Be it your living room, your family room, your office, whichever device it is within your household. You’ll use that information about the selected device to connect up the GoogleApiClient soon.
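Continuing the same sketch (mMediaRouter, mMediaRouteSelector, mMediaRouterCallback, and mSelectedDevice are assumed fields of the activity; launchReceiver is a hypothetical helper that builds the GoogleApiClient, which comes next):

@Override
protected void onResume() {
    super.onResume();
    // Listen for Cast devices matching our selector while in the foreground.
    mMediaRouter.addCallback(mMediaRouteSelector, mMediaRouterCallback,
            MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY);
}

private class MyMediaRouterCallback extends MediaRouter.Callback {
    @Override
    public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo info) {
        // The user picked a device (e.g. "Living Room") from the Cast dialog;
        // pull the CastDevice out of the route's extras bundle.
        mSelectedDevice = CastDevice.getFromBundle(info.getExtras());
        launchReceiver();
    }

    @Override
    public void onRouteUnselected(MediaRouter router, MediaRouter.RouteInfo info) {
        mSelectedDevice = null;
    }
}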

So the next thing you need to do is create a GoogleApiClient, and you’ll do that with a builder for Cast options, which holds all the options that you’d like to use within your application. And then finally you call connect, and this connection is really like opening a socket, your connection between the phone and the dongle itself.
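In the sketch, that step looks roughly like this (mCastListener is an assumed Cast.Listener field, and mConnectionCallbacks / mConnectionFailedListener are the connection listeners, one of which is shown next):

// Build Cast options for the device the user just selected.
Cast.CastOptions.Builder apiOptionsBuilder =
        Cast.CastOptions.builder(mSelectedDevice, mCastListener);

mApiClient = new GoogleApiClient.Builder(this)
        .addApi(Cast.API, apiOptionsBuilder.build())
        .addConnectionCallbacks(mConnectionCallbacks)
        .addOnConnectionFailedListener(mConnectionFailedListener)
        .build();

// Open the connection from the phone to the selected Cast device.
mApiClient.connect();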

So the next piece you want to do is launch your application. You’ve connected and now you need to launch, and when you launch, that’s when you see your app up on the TV. To do that, you’ll get a callback for onConnected telling you that the connection was successful, and then from there, from the Cast API, you’ll call launchApplication. And when you do this, that’s when the magic happens. That’s when your app gets launched and the phone is connected.
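Roughly, in the same sketch (APP_ID is your registered receiver application ID; hooking up the earlier HelloWorldChannel here is just one reasonable place to do it):

private class ConnectionCallbacks implements GoogleApiClient.ConnectionCallbacks {
    @Override
    public void onConnected(Bundle connectionHint) {
        Cast.CastApi.launchApplication(mApiClient, APP_ID)
                .setResultCallback(new ResultCallback<Cast.ApplicationConnectionResult>() {
                    @Override
                    public void onResult(Cast.ApplicationConnectionResult result) {
                        if (result.getStatus().isSuccess()) {
                            // The receiver is now up on the TV; register the
                            // custom channel from earlier.
                            try {
                                Cast.CastApi.setMessageReceivedCallbacks(mApiClient,
                                        mHelloWorldChannel.getNamespace(), mHelloWorldChannel);
                            } catch (IOException e) {
                                Log.e("Cast", "Failed to set up custom channel", e);
                            }
                        }
                    }
                });
    }

    @Override
    public void onConnectionSuspended(int cause) {
        // The client will try to re-establish the connection automatically.
    }
}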

The next thing a Cast app typically wants to do is work with media. And so we have a MediaInfo object, which is the main class that represents a piece of media. Be it a song, or a movie, or an image. And then we have a RemoteMediaPlayer that deals with playback. And what happens internally is it uses the media session for handling things like the lock screen and notifications.

So here’s an example of using the MediaInfo. It has a builder, which makes it really easy. You can point it at your mp4, set its content type, and then the stream type, is it buffered or live, and you build that. And the next thing you do is call load. Think of load kind of like loading a DVD: this is the API client I’d like to use to load this piece of media. And then at the end there’s a value, true. Can anyone guess what that true is? What would the true be when you load a piece of media?

[Audience] It would be an image.

An image? Anyone else? Anyone else? So that’s autoplay. Basically telling it I want to go ahead and load this and play it right away, so, you know, you don’t want to start in a paused state. You want to start with that video playing right out of the gate. And then you have other commands such as play and seek and requestStatus, and the status is really important, because one of the things that’s unique about Chromecast is it is multi-sender. What that means is if all of us together are sitting and watching a movie, we can all use our phones to control the content. Or maybe we’re going to have a dance party and we’re all controlling the music and the playlist. Right, so you want to be able to request that status so that you get a media status, so you know what the current metadata is and you can update your screen with that information.
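Put together, that part of the sketch looks something like this (the URL and title are placeholders; mRemoteMediaPlayer is assumed to have been created and registered as the message callback for its own namespace on mApiClient):

MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
metadata.putString(MediaMetadata.KEY_TITLE, "Big Buck Bunny");

MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/video.mp4")
        .setContentType("video/mp4")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED) // buffered vs. live
        .setMetadata(metadata)
        .build();

// "Load the DVD" and start playing immediately (autoplay = true).
mRemoteMediaPlayer.load(mApiClient, mediaInfo, true)
        .setResultCallback(new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
            @Override
            public void onResult(RemoteMediaPlayer.MediaChannelResult result) {
                if (result.getStatus().isSuccess()) {
                    // Cast is multi-sender, so ask for status to keep this
                    // sender's UI in sync with whatever is actually playing.
                    mRemoteMediaPlayer.requestStatus(mApiClient);
                }
            }
        });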

So we found, as I went through the code earlier, there are a few extra steps, such as connect before launch. It’s a little bit heavy to do all the work, and so we created an open source library called the Cast Companion Library. And the Cast Companion Library brings all kinds of things: the lock screen, a mini-player, notifications, and support through the media session. So it does a lot of the heavy lifting for you and really simplifies doing a Cast integration. For the big partners that we work with, we have probably about an 80 percent, if not larger, adoption of this library for Android. So I highly recommend that those of you who are going to Cast-enable something go out and use it.

But one of the things that we found was there are some drawbacks to open source for a library like this; it sort of causes a lot of problems. Right. What we found was when our partners checked out, you know, cloned this repo on GitHub, and then wanted to make a change, be it the theming or any kind of minor change, they were modifying the source directly rather than subclassing, which is how you would like someone to use your library. And so what happened was, a year, year and a half down the road, we found that all of our partners were stuck on year-old or year-and-a-half-old versions of our library. Since then we’ve done over 10 releases, we’ve fixed tons of bugs, and we’ve added tons of features. And so the problem we had, like I said, was just partners looking at what is the easiest way to get the behavior they want out of the library, and that would be, you know, by changing that code directly. Especially the way things worked in Eclipse, that was the case for sure. Before AAR files, we had a lot of problems.

And so how did we solve that problem? What did we go out there and do to try and fix things? We really looked to jCenter and at publishing our binary in jCenter. So rather than telling our partners to clone this project, here’s all of its source, we gave them a binary. And so we’ve recently made this shift. I’m really excited about what it’s going to do. You know, I think it really encourages people to take the path of least resistance for making the changes they want, and that would be subclassing and using the library correctly.

And so far it’s been a heavily requested feature from our partners. A lot of people wanted it because they didn’t want the overhead of building it themselves. So this is something we did, and I think it not only simplifies what partners need to do to integrate, it also reduces and eliminates that maintenance headache that we’ve had. So many times, myself and others on my team had to get on airplanes to sit with partners to help them with merge conflicts, getting them up to our latest. I can tell you, lots of status miles. By dealing with this problem we’re hoping that jCenter really will save the day for us. We just did that back on February 8th, so just a few months ago, but so far, so good.

Cool. So, I’ve been talking about the sender, I’ve been talking about Android. Now, so we can see how the whole thing works together, I’m going to talk a little bit about the receiver. And that’s the code that runs in the dongle, runs in your TV, or runs in your speaker.

So it’s HTML5, JavaScript, CSS. And so it’s really simple, kind of a low barrier to entry for everyone wanting to do an integration. You can use all the tools you’re used to, you know, the Chrome DevTools, things like that, and it supports EME, Web Audio, and MSE, Media Source Extensions, as well.

So to get started, for folks who are Android developers, there’s a default receiver where you just plug in a constant; you get that receiver for free, you don’t have to do anything. One of the drawbacks is that it’s not customized at all, so it doesn’t really look and feel like your app. The next thing we have is a styled receiver that lets you provide one CSS file. We have a few people out there who are using this and are really happy. And so here are some examples. This is our sample app that uses it, where we’ve done some theming to make the progress bar yellow. You can define a splash screen with an image, you know, sort of advertising new content. Things like that.

But what most people do is a Custom Receiver. And so here’s a Custom Receiver where you write all the JavaScript, HTML, and CSS yourself. And here’s just the simplest, most hello-world version. I’m going to walk through that.

So you start by adding the cast_receiver.js file. That’s one JavaScript file that contains the library. Then you need one media element, so here I have the video tag. Then what I do is create a MediaManager and pass it that one media element. And then finally, get an instance of the CastReceiverManager and call start. What this does is give you the most hello-world receiver that will listen to all media playback. So basically play, pause, scrub, all of that is handled for you. It does everything for you. So it doesn’t get any simpler than this. And then there are lots and lots of callbacks: people joining, people leaving, errors dealing with the media player. There’s a lot more, but I don’t want to go through those details today because we’re here to talk more about Android. But there is a nice sample Custom Receiver on GitHub that most of our partners use. That’s a few thousand lines of code and you can kind of work with that.

So now I want to talk about debugging, which is kind of one of the strong suits for Chromecast when compared to other platforms. And I think, you know, from a developer perspective, it’s sort of unanimous among all of our partners and all of the developers out there that we’re the easiest in this space and have the best set of tools. And the way that’s done is using the Chrome DevTools. We use Chrome remote debugging, so you can sit on your laptop in Chrome and debug up to your TV. And so, has anyone used Chrome remote debugging before? For anything besides Chromecast? Anywhere else? Well, it’s really, really awesome. Yeah, […] has. So what did you use it for?

[Audience]

Oh it wasn’t […] I think you both, like, right behind each other raised your hands, that’s very interesting. Yeah.

So for Android. Where it’s really cool for Android is if you have a website that has some issues that are only showing up on mobile. You know, at the startup I worked at before I joined Google, we had some scrolling performance issues that were only showing up on mobile. They weren’t showing up in a desktop browser. We couldn’t reproduce the problems, and so what we did was use the Chrome remote debugger, through the ADB bridge, and we were able to remote debug our website in Chrome on Android and get all the logging and debugging. Most importantly for something like scrolling performance was logging, so we could output logs, plus all the breakpoints, and you can, you know, have the console and change things. So you get all of that with the Chrome DevTools and Chrome remote debugging. And that’s the same thing we do for Chromecast. So it looks and feels and drives exactly the same as it does if you’re writing any other web app.

And then a little tip and trick, for those of you who probably don’t do a lot of web development, and maybe for folks who do: there’s a debugger command, which is one of my favorites, so I always put it in the slides here. That’s basically a manual breakpoint. I like it because often when I’m, you know, working on code, I put debugger in there so that I can force it to stop, and I don’t have to connect and set my breakpoint by hand. I can comment it in and out at the places where I commonly want a breakpoint. So definitely check out the debugger command, you’ll love it. And then console.log, obviously, and console.dir, which gives you more of an object to walk through. That’s all there. Cool.

So, before we wrap things up: I’ve talked a lot about more traditional Cast development, right, working with an Android app and then having an HTML5-based receiver. Now, one of the things I wanted to point out is we also have, if you’re doing games development, something called Remote Display, and this is really great for, like, games. And the way it works is basically you create a second screen on your phone, a sort of second surface view that sits behind and isn’t shown, and that screen gets mirrored up to the TV. And so if you have a game that you’ve written, often in like an hour or so folks can get a hello-world version of it working on Cast. So basically, typically, they’ll take their main screen, push it behind, and then paint some joysticks or plug in the, you know, the gyro or tap controls, and they can just build that by creating that second screen and passing it up to the TV. So it’s really, really simple and fast.
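As a rough idea of that second screen, here is a minimal sketch using the Remote Display API’s CastPresentation (the class and layout names are hypothetical; in a full app the presentation would be created and shown from a CastRemoteDisplayLocalService once a device is selected):

import android.content.Context;
import android.os.Bundle;
import android.view.Display;
import com.google.android.gms.cast.CastPresentation;

public class GamePresentation extends CastPresentation {

    public GamePresentation(Context context, Display display) {
        super(context, display);
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // This view hierarchy is rendered on the remote Display (the TV),
        // while the phone's own screen shows the touch and joystick controls.
        setContentView(R.layout.game_surface);
    }
}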

Cool. So where can you go from here? What are some of the resources that we’ve got? You can go out to developers.google.com/cast. That’s where all the documentation is. The design checklist, there’s a link for that. And then at Google we highly encourage you to use Stack Overflow. So if you do have questions about Cast, that’s really the best place to go ask them. We actually have a team of people, and a lot of their job is just moderating Stack Overflow, making sure all the right answers are there, because we want to be able to share that information with others. So we encourage being part of that community, asking your questions there, and even helping others answer their questions there. And then finally we have a Google Plus developer community, and there’s a link for that. That’s kind of the place where we do a lot of announcements, and you can also share some of the things you’ve done with other Cast developers.

So with that, happy casting and if anyone has any questions, feel free.
