Android Dev Summit is an annual event hosted by Google that brings together Android developers from all around the world with Google engineers and developer advocates for two days packed with technical Android content, delivered as sessions, workshops and more. One of our Android tech leads attended this year’s edition, so here’s our recap of the event!
The event was hosted at the Google Event Center (MP7) in Sunnyvale, California. The venue consisted of a main stage and a second stage for the keynote, sessions and lightning talks, a Codelabs room and a Sandbox area. There were also several gathering points where complimentary food and drinks were served, an Android Car booth and even an Introverts Lounge for people who just wanted to chill or write code between sessions.
The Sandbox included several rotating stands, each dedicated to a specific technical topic on Android. Google engineers were on hand at each of them to answer any questions developers might have, all in a face-to-face, unstructured manner. They were also open to more light-hearted conversations when no one was waiting to ask questions.
On top of all this, and for the first time ever, there was a complete filming set streaming a new segment called #AskAndroid. It consisted of a panel of Google experts who took questions posted by the community on Twitter and the YouTube livestream under the hashtag #AskAndroid and answered them live. The questions were grouped by topic and answered in different segments, each of which had Google engineers and developer advocates as guests. Spoiler alert: they picked our question and answered it brilliantly!
The whole vibe of the event was nothing short of what you would expect from Google and the Android community, and we had an absolute blast! Chatting and sharing thoughts (and beers, at the after-party) with hundreds of Android developers from all around the world was awesome. Everyone was cheerful, excited about the new features to be presented, and eager to know more about you, your experience and what brought you to the summit, as well as to share their own experiences. A lot of games were hosted, including a social media scavenger hunt you could complete to get a prize (Android socks and/or a Rubik’s cube) and a chance to win a Google Pixel Slate and Keyboard. All of this combined made for an amazing mood.
The Sandbox area was where we got the most out of the conference. Meeting eminences such as Yigit Boyar, Ian Lake and Florina Muntenescu, among others, and being able to ask them specific technical questions in person was by far one of the highlights of the event. We received a lot of tips and tricks on how to use many of Android Jetpack’s libraries, alongside a sneak peek at what the life of a Googler is like.
Now, onto the meat of the matter! These are some of the hottest topics discussed at the summit:
Jetpack Compose
Jetpack Compose is the new UI toolkit for Android being developed by Google. It brings a lot of ideas from modern UI frameworks like React, Vue and Flutter to Android and, by doing so, helps developers build richer, more robust UI/UX. An amazing fact about Compose is that it is being built from scratch in Kotlin (which we love) and promises a much cleaner API than the current UI toolkit. Needless to say, Google is very excited about this new toolkit, and they showed it by dedicating a permanent Sandbox booth, a codelab, an #AskAndroid segment and two sessions exclusively to it. Compose is still in alpha and not yet recommended for production use, but it sure does look promising. We highly recommend watching the sessions ‘What's New in Jetpack Compose’ and ‘Understanding Compose’ to get a grasp of what the future of Android UI/UX will look like.
Android Studio 4.0
Android Studio 4.0 has been released in the Canary channel, and it brings a lot of new toys to play with. As the main tool we use every day to deliver awesome apps to our clients, every minor update is exciting; this, however, is no small update at all. Previous versions of Android Studio focused on Project Marble, a batch of performance improvements and bug fixes that made the IDE much better to work with. With that housekeeping done, it is time for new features. These include, but are not limited to, previewing Jetpack Compose components, the Motion Editor for creating awesome animations easily, a brand new Layout Inspector and even running an emulator inside the IDE itself!
To get the full picture, you can watch the sessions ‘What's New in Android Studio’ and ‘What's New in Android Studio Design Tools’.
Android Automotive
Android Automotive had its own Sandbox booth outside the event halls, where we could get into a brand new Volvo running the technology. In contrast with Android Auto, Automotive does not require a separate smartphone: it runs Android directly on the car as its embedded infotainment platform. This allows a much richer experience and lets users enjoy the advantages of Android in their car regardless of which phone they have. We cannot wait to see the awesome experiences we could create with it!
Kotlin Coroutines
Kotlin Coroutines are awesome. Handling complex asynchronous operations natively in Kotlin using suspend functions, flows and channels is easier and cleaner than ever. We have been using RxJava for this purpose, but we're eager to adopt coroutines in our new projects. If you want to know how RxJava compares to Kotlin Coroutines, you can see the experts' answer to our question in the #AskAndroid segment. For more on coroutines, we suggest these two sessions: ‘LiveData with Coroutines and Flow’ and ‘Testing Coroutines on Android’.
The Android Developer Challenge
The Android Developer Challenge is back! This initiative was first introduced back in 2008 and helped the platform grow a lot in its infancy. A lot has changed since then: with Android now having more than 2.5 billion users all around the world and the technology being as advanced as it is, the limits of what can be done on Android are far beyond anything we could have expected back then. Google is now very interested in helpful innovation: features that can drastically improve people’s lives, or even save them.
Examples of this are how Android now uses machine learning to detect when you have been in a car crash and helps you reach emergency services, or how Live Caption converts audio from videos to text in real time, helping deaf people have a better experience on the platform. With this in mind, this edition of the challenge focuses on using on-device machine learning to build this kind of helpful solution. Developers have until December 2nd to submit their ideas, after which Google will select 10 of them and provide guidance and a bootcamp at its HQ to help bring those ideas to life. Once developed, the apps will receive special listing on Google Play and be showcased at the next Google I/O.
Needless to say, there were a lot of exciting announcements, and we can't wait to play with some of the shiny new toys that were released. You can find all the sessions, lightning talks and #AskAndroid segments on the Android Developers YouTube channel in this playlist.
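As a small taste of the coroutines style we mentioned above, here is a minimal, self-contained sketch of a suspend function and a cold Flow. The names (`fetchGreeting`, `numbers`) are hypothetical, made up purely for illustration, and the snippet assumes the standard `kotlinx-coroutines-core` library is on the classpath:

```kotlin
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.flow
import kotlinx.coroutines.flow.map
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking

// A suspend function: it can pause without blocking a thread.
// Hypothetical example; in a real app this might be a network call.
suspend fun fetchGreeting(name: String): String {
    delay(10)  // stands in for real asynchronous work
    return "Hello, $name!"
}

// A cold Flow: nothing runs until a collector starts consuming values.
fun numbers(): Flow<Int> = flow {
    for (i in 1..3) emit(i)
}

fun main() = runBlocking {
    println(fetchGreeting("Android Dev Summit"))  // Hello, Android Dev Summit!
    println(numbers().map { it * 2 }.toList())    // [2, 4, 6]
}
```

The appeal the sessions highlighted is exactly this: asynchronous code reads top-to-bottom like ordinary sequential Kotlin, and operators like `map` compose on flows much the way they do on RxJava streams.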