We released the first Developer Preview of Android 15, which focuses on providing access to superior media capabilities, minimizing battery impact, maintaining buttery-smooth app performance, and protecting user privacy and security, all while enabling a diverse ecosystem of devices.
Android 15 includes updates to Privacy Sandbox and Health Connect, while introducing new file integrity protection APIs. It provides enhanced camera controls and virtual MIDI 2.0 devices to help power creative applications. It expands the Android Dynamic Performance Framework to support a power-efficiency mode, report GPU work durations, and return thermal headroom thresholds. It also adds quality-of-life OpenJDK APIs that will be updated on over a billion devices through Google Play system updates.
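As a rough illustration of the thermal side of this, here is a hedged sketch of polling thermal headroom with the existing `PowerManager.getThermalHeadroom()` API (available since API 30); Android 15's ADPF additions build on this by also exposing the headroom thresholds at which throttling status changes. The `0.9f` cutoff below is an arbitrary illustrative value, not a recommendation from the release notes.

```kotlin
import android.content.Context
import android.os.Build
import android.os.PowerManager

// Sketch: ask the platform how close the device is to severe thermal
// throttling, forecast 10 seconds ahead. A headroom of 1.0 means the
// device is at (or predicted to reach) the severe throttling threshold.
fun shouldReduceWorkload(context: Context): Boolean {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.R) return false
    val powerManager = context.getSystemService(Context.POWER_SERVICE) as PowerManager
    val headroom = powerManager.getThermalHeadroom(10)
    // getThermalHeadroom returns NaN when no forecast is available.
    return !headroom.isNaN() && headroom >= 0.9f // illustrative cutoff
}
```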
Get started today testing your app with Android 15 in the emulator, or by flashing a system image onto a Pixel 6 or later, Pixel Fold, or Pixel Tablet device.
We launched Android Studio Iguana in the stable release channel to make it easier for you to create high-quality apps.
Enhancements include Compose UI Check, which automatically audits Compose UI for accessibility and adaptive issues across different screen sizes, and progressive rendering in Compose Preview, which speeds up iteration on complex layouts by lowering the detail of out-of-view previews. Iguana adds Version Control System support in App Quality Insights, built-in support for creating Baseline Profiles, and enhanced support for Gradle Version Catalogs. The Espresso Device API enables configuration change testing. The integrated IntelliJ 2023.2 update includes many enhancements, such as support for GitLab and text search in Search Everywhere. The blog has information on all these changes and more.
Android’s photo picker now integrates cloud photos, giving apps a unified way to let users browse and grant access to selected local and cloud photos and videos. It’s currently integrated with Google Photos and is open to other cloud media apps that meet the eligibility criteria. The cloud photos feature is rolling out with the February Google Play system update to devices running Android 12 and above.
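A minimal sketch of launching the photo picker from an Activity using `PickVisualMedia` from AndroidX Activity; with the cloud integration enabled, this same flow surfaces both local and cloud-backed media, and no storage permission is required. The `GalleryActivity` class and the click handler are hypothetical names for illustration.

```kotlin
import android.net.Uri
import androidx.activity.ComponentActivity
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts

class GalleryActivity : ComponentActivity() {

    // Register the photo picker result callback; the returned Uri may point
    // at a local or a cloud-backed item.
    private val pickMedia =
        registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
            if (uri != null) {
                // Use the Uri, e.g. load it into an image view.
            }
        }

    // Hypothetical click handler that opens the picker restricted to images;
    // use ImageAndVideo or VideoOnly for other media types.
    fun onPickPhotoClicked() {
        pickMedia.launch(
            PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
        )
    }
}
```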
We launched the ML Kit Document Scanner API, enabling you to easily integrate advanced document scanning capabilities into your apps.
The API offers a standardized and user-friendly interface for document scanning, includes precise corner and edge detection for accurate document capture, and allows users to further crop scanned documents, apply filters, and remove fingers or blemishes. It processes documents on the device, eliminates the need for camera permissions, and is supported on devices with Android API level 21 or above.
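The flow described above can be sketched roughly as follows, assuming the `play-services-mlkit-document-scanner` artifact and its `GmsDocumentScanning` entry point; the `ScanActivity` name and page limit of 5 are illustrative choices, not values from the announcement.

```kotlin
import androidx.activity.ComponentActivity
import androidx.activity.result.IntentSenderRequest
import androidx.activity.result.contract.ActivityResultContracts
import com.google.mlkit.vision.documentscanner.GmsDocumentScannerOptions
import com.google.mlkit.vision.documentscanner.GmsDocumentScanning
import com.google.mlkit.vision.documentscanner.GmsDocumentScanningResult

class ScanActivity : ComponentActivity() {

    // The scanner runs in its own activity, so we receive results through an
    // IntentSender launcher rather than requesting camera permission ourselves.
    private val scannerLauncher =
        registerForActivityResult(ActivityResultContracts.StartIntentSenderForResult()) { result ->
            val scan = GmsDocumentScanningResult.fromActivityResultIntent(result.data)
            scan?.pages?.forEach { page ->
                // page.imageUri points at a captured, cropped page image.
            }
        }

    fun startScan() {
        val options = GmsDocumentScannerOptions.Builder()
            .setScannerMode(GmsDocumentScannerOptions.SCANNER_MODE_FULL) // cropping, filters, cleanup UI
            .setResultFormats(GmsDocumentScannerOptions.RESULT_FORMAT_JPEG)
            .setPageLimit(5) // illustrative limit
            .build()

        GmsDocumentScanning.getClient(options)
            .getStartScanIntent(this)
            .addOnSuccessListener { intentSender ->
                scannerLauncher.launch(IntentSenderRequest.Builder(intentSender).build())
            }
    }
}
```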
The Wear OS-powered OnePlus Watch 2 launched with a dual-chipset architecture that works with our hybrid interface to dramatically extend battery life, up to 100 hours of regular use in Smart Mode.
You can leverage existing Wear OS APIs, such as NotificationCompat and Health Services on Wear OS, to take advantage of these optimizations. With Wear OS 4, we launched the Watch Face Format; the new format helps future-proof watch faces to take advantage of emerging optimizations in future devices.
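For example, posting notifications through NotificationCompat lets the platform decide how to surface them efficiently on a hybrid device. A minimal sketch, assuming a previously created `"workout_channel"` notification channel and the POST_NOTIFICATIONS permission (both hypothetical setup outside this snippet):

```kotlin
import android.content.Context
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat

// Sketch: an ongoing workout notification built with NotificationCompat.
// The channel id "workout_channel" is assumed to have been registered
// elsewhere, and POST_NOTIFICATIONS must be granted on API 33+.
fun postWorkoutNotification(context: Context) {
    val notification = NotificationCompat.Builder(context, "workout_channel")
        .setSmallIcon(android.R.drawable.ic_media_play)
        .setContentTitle("Run in progress")
        .setContentText("2.4 km · 14:05")
        .setOngoing(true)
        .setCategory(NotificationCompat.CATEGORY_WORKOUT)
        .build()
    NotificationManagerCompat.from(context).notify(1, notification)
}
```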
Articles
There are a bunch of other articles worth checking out.
Levi covered Nested Scrolling in Jetpack Compose, giving a deep dive into how you can implement custom nested scrolling behaviors, such as what Material 3’s TopAppBar scrollBehavior parameter does.
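The scrollBehavior mechanism mentioned above can be sketched like this: the behavior exposes a NestedScrollConnection that observes the list's scroll deltas and collapses or expands the app bar accordingly. A minimal sketch using Material 3's stable APIs:

```kotlin
import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.lazy.LazyColumn
import androidx.compose.material3.ExperimentalMaterial3Api
import androidx.compose.material3.ListItem
import androidx.compose.material3.Scaffold
import androidx.compose.material3.Text
import androidx.compose.material3.TopAppBar
import androidx.compose.material3.TopAppBarDefaults
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.input.nestedscroll.nestedScroll

@OptIn(ExperimentalMaterial3Api::class)
@Composable
fun CollapsingToolbarScreen() {
    // enterAlwaysScrollBehavior hides the bar when scrolling down and shows
    // it again as soon as the user scrolls up.
    val scrollBehavior = TopAppBarDefaults.enterAlwaysScrollBehavior()

    Scaffold(
        // Wire the behavior's NestedScrollConnection into the scroll chain.
        modifier = Modifier.nestedScroll(scrollBehavior.nestedScrollConnection),
        topBar = {
            TopAppBar(
                title = { Text("Nested scrolling") },
                scrollBehavior = scrollBehavior
            )
        }
    ) { innerPadding ->
        LazyColumn(modifier = Modifier.padding(innerPadding)) {
            items(50) { index ->
                ListItem(headlineContent = { Text("Item $index") })
            }
        }
    }
}
```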
Ben explained Jetpack Compose’s Strong Skipping Mode, an experimental feature in Jetpack Compose Compiler 1.5.4+ that changes the rules for which composables can skip recomposition; it should greatly reduce recomposition and improve performance.
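Since the feature is opt-in at this stage, here is a sketch of enabling it in a module's build.gradle.kts, assuming the experimental compiler-plugin argument name used by Compose Compiler 1.5.4+ (`experimentalStrongSkipping`); check the article for the exact flag for your compiler version.

```kotlin
// Module-level build.gradle.kts (sketch).
android {
    // ...existing android configuration...
    kotlinOptions {
        // Pass the experimental strong skipping flag to the Compose compiler plugin.
        freeCompilerArgs += listOf(
            "-P",
            "plugin:androidx.compose.compiler.plugins.kotlin:experimentalStrongSkipping=true"
        )
    }
}
```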
Rebecca showed how you can use shapes in Compose to create a cool progress bar that morphs between two shapes, using the graphics-shapes library, which has new documentation to help you add these effects to your apps.
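A rough sketch of the morphing idea, assuming the `androidx.graphics:graphics-shapes` artifact and its `RoundedPolygon`/`Morph` types; the vertex counts and rounding radii below are arbitrary illustrative values, and the exact `toPath` overload may differ by library version.

```kotlin
import android.graphics.Path
import androidx.graphics.shapes.CornerRounding
import androidx.graphics.shapes.Morph
import androidx.graphics.shapes.RoundedPolygon
import androidx.graphics.shapes.star
import androidx.graphics.shapes.toPath

// Sketch: interpolate between a rounded polygon and a star as progress
// animates from 0f to 1f, returning a Path you can draw each frame.
fun progressPath(progress: Float): Path {
    val rounded = RoundedPolygon(
        numVertices = 12,
        rounding = CornerRounding(radius = 0.2f) // illustrative rounding
    )
    val star = RoundedPolygon.star(
        numVerticesPerRadius = 12,
        innerRadius = 0.6f,
        rounding = CornerRounding(radius = 0.1f)
    )
    // Morph matches curves between the two shapes so intermediate frames
    // stay smooth rather than cross-fading.
    return Morph(start = rounded, end = star).toPath(progress.coerceIn(0f, 1f))
}
```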
Videos
Over in videos, #WeArePlay highlighted the developers behind We Spot Turtles!, whose app crowdsources pictures that a machine learning model uses to help collect extensive data on sea turtles in the wild. There’s also an associated blog post if you’d rather read about them!
AndroidX releases
There was a bunch of activity over in Android Jetpack, including the first alphas of Annotation 1.8, Benchmark 1.3, Core-RemoteViews 1.1, Glance 1.1, ProfileInstaller 1.4, Lint 1.0, Wear Watchface 1.3, Webkit 1.11, and Compose Material 3 1.3. Highlights include:
- Compose Material 3 1.3 includes more support for predictive back, and updates to the Slider and ProgressIndicator to improve accessibility.
- The new Lint library is a set of lint checks for Gradle Plugin authors on projects that apply java-gradle-plugin to help catch mistakes in their code.
- Glance 1.1 adds a new unit test library (that doesn’t require UI Automator), higher level components, new modifiers, and a new API for getting a flow of RemoteViews from a composition.
We also released Hilt Version 1.2 with assisted injection support for hiltViewModel() and hiltNavGraphViewModels() as well as Test Uiautomator 2.3, which adds support for multiple displays and custom wait conditions.
In Episode 204: Fan’otations, Tor, Romain, and Chet talk about one of Tor’s favorite topics: Lint! Specifically, they talk about Lint checks and the annotations that power them, which enable better, more robust, and more self-documenting APIs.
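To make the annotation idea concrete, here is a small sketch using annotations from `androidx.annotation` that Lint enforces at call sites; the `Camera` class and its methods are hypothetical examples, not a real API.

```kotlin
import androidx.annotation.FloatRange
import androidx.annotation.RequiresPermission

// Hypothetical API illustrating self-documenting annotations that Lint checks.
class Camera {
    // Lint flags callers that pass a literal outside [0.0, 1.0].
    fun setZoom(@FloatRange(from = 0.0, to = 1.0) fraction: Float) {
        // ...
    }

    // Lint warns if the caller hasn't declared or checked the CAMERA permission.
    @RequiresPermission(android.Manifest.permission.CAMERA)
    fun open() {
        // ...
    }
}
```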
As Chet says, “Lint: It’s not just for pockets anymore.” Thank you, Chet, for all you’ve done for Android and the community, and for helping us keep our sense of humor.
Now then…
That’s it for this week, with Android 15 Developer Preview 1, the stable release of Android Studio Iguana, cloud photos now available in the photo picker, ML Kit Document Scanning, the Wear OS hybrid interface, nested scrolling, strong skipping, and shape morphing in Compose, annotations with Lint, and more!
Check back soon for your next update from the Android developer universe!