This is a guest post from amo Engineer, Cyril Mottier, on how they've leveraged Android to create magic in their Android app.
At amo, we're redefining what it means to build social applications. Our mission is to create a new kind of social company, one that prioritizes high-quality, thoughtfully designed mobile experiences. One of our flagship applications, Bump, puts your friends on the map — whether you're checking in on your crew or making moves to meet up.
Our app leverages multiplatform technologies for its foundation. At the core lies a shared Rust-based library that powers all of our iOS and Android apps. This library, managed by our backend engineers, is responsible for persistence and networking. The library exposes its APIs as Kotlin Flow. In addition to making everything reactive and realtime-enabled by default, it integrates effortlessly with Jetpack Compose, the technology we use to build our UI. This architecture ensures a consistent and high-performance experience across platforms. It also allows mobile engineers to spend more time on the user experience, where they can focus on crafting innovative and immersive user interactions.
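As a rough illustration of this pattern (the repository and composable names below are our own placeholders, not amo's actual API), a Kotlin Flow exposed by a shared library can be collected directly as Compose state:

import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import kotlinx.coroutines.flow.Flow

// Hypothetical stand-in for the shared Rust-backed library's API surface.
interface FriendsRepository {
    fun friends(): Flow<List<String>>
}

@Composable
fun FriendsBanner(repository: FriendsRepository) {
    // Each emission from the library's Flow becomes new Compose state,
    // so the UI is reactive and realtime-enabled by default.
    val friends by repository.friends().collectAsState(initial = emptyList())
    Text(text = friends.joinToString())
}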
In this post, we'll explore how we leverage the Android SDK, Jetpack Compose, the Kotlin programming language, and Google Play Services to build unique, delightful experiences in Bump. Our goal is to break mental barriers and show what's truly possible on Android — often in just a few lines of code. We want to inspire engineers to think beyond conventional UI paradigms and explore new ways to create magical moments for users. By the end of this article, you'll have a deeper understanding of how to harness Android's capabilities to build experiences that feel like magic.
At amo, we ship, learn, and iterate quickly across our feature set. That means that the design of some of the features highlighted in this article has already changed, or will do so in the coming weeks and months.
Great touch-based UX isn't just about flashy visuals. It's about delivering meaningful feedback through graphics, haptics, sounds, and more. Said differently, it's about designing for all senses, not just for the eyes. We take this very seriously when designing applications and always take all of these potential dimensions into consideration.
One example is our in-app notification center. The notification center is a visual entry point, accessible from anywhere in the app, which shows all of your notifications from the entire amo suite of apps. It can be moved anywhere on the screen. Its style also changes regularly thanks to in-house or external artists. But styling doesn't stop at the visual level; we also style it at the audio level: when it's dragged around, a short and repeating sound is played.
To make it fun and joyful, we pushed this even further to let the user be a DJ. The volume, the speed and the pitch of the audio change depending on where the user drags it. It's a "be your own DJ" moment. The implementation of this experience can be split in two parts. The first part deals with the audio and the second part handles the rendering of the entry point and its interactions (dragging, tapping, and so on).
Let's first dive into the code handling the audio. It consists of a Composable requiring a URL pointing to the music, a flag indicating whether it should play or not (true only when dragging) and a two-dimensional offset: the X axis controls the volume, the Y axis controls the playback speed & pitch.
@Composable
fun GalaxyGateAccessPointMusicPlayer(
    musicUrl: String,
    isActive: Boolean,
    offset: Offset,
) {
    val audioPlayer = rememberAudioPlayer(
        uri = Uri.parse(musicUrl),
    )
    LaunchedEffect(audioPlayer, isActive) {
        if (isActive) {
            audioPlayer.play(isLooped = true)
        } else {
            audioPlayer.pause()
        }
    }
    SideEffect {
        audioPlayer.setSpeedPitch(
            speed = 0.75f + offset.y * 0.5f,
            pitch = offset.x + 0.5f
        )
        audioPlayer.setVolume(
            (1.0f - ((offset.x - 0.5f) * 2f).coerceIn(0f, 1f)),
            (1.0f - ((0.5f - offset.x) * 2f).coerceIn(0f, 1f)),
        )
    }
}
@Composable
fun rememberAudioPlayer(
    uri: Uri,
): AudioPlayer {
    val context = LocalContext.current
    val lifecycle = LocalLifecycleOwner.current.lifecycle
    return remember(context, lifecycle, uri) {
        DefaultAudioPlayer(
            context = context,
            lifecycle = lifecycle,
            uri = uri,
        )
    }
}
DefaultAudioPlayer is just an in-house wrapper around the ExoPlayer provided by Jetpack Media3 that deals with initialization, lifecycle management, fading when starting/stopping music, and so on. It exposes two methods, setSpeedPitch and setVolume, delegating to the underlying ExoPlayer.
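The wrapper itself isn't shown in this post; as a minimal sketch under our own assumptions (only the method names come from the description above, and lifecycle handling and fading are omitted), the delegation to Media3's ExoPlayer could look like this:

import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.common.PlaybackParameters
import androidx.media3.common.Player
import androidx.media3.exoplayer.ExoPlayer

// Minimal sketch, not amo's actual implementation.
class SketchAudioPlayer(context: Context, uri: Uri) {
    private val player = ExoPlayer.Builder(context).build().apply {
        setMediaItem(MediaItem.fromUri(uri))
        prepare()
    }

    fun play(isLooped: Boolean) {
        player.repeatMode = if (isLooped) Player.REPEAT_MODE_ONE else Player.REPEAT_MODE_OFF
        player.play()
    }

    fun pause() = player.pause()

    // PlaybackParameters carries both speed and pitch in a single object.
    fun setSpeedPitch(speed: Float, pitch: Float) {
        player.playbackParameters = PlaybackParameters(speed, pitch)
    }

    // ExoPlayer exposes a single master volume (0f..1f); the per-channel pair
    // used by the Composable above is collapsed here for simplicity.
    fun setVolume(left: Float, right: Float) {
        player.volume = maxOf(left, right)
    }

    fun release() = player.release()
}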
By combining gesture readings with audio pitch, speed and volume, we added delight and surprise where users didn't expect it.
We named our application "Bump" as a nod to its core feature: people close to each other can "bump" their phones together. If they aren't registered as friends on the app, bumping will automatically send a friend request. And if they are, a mesmerizing animation triggers. We also notify mutual friends that they "bumped" and that they should join.
This Bump feature is central to the app's experience. It stands out in its interaction, functionality, and the unique value it provides. To express its significance, we wanted the feature to have a special visual appeal. Here's a glimpse of how it currently looks in the app:
There's a lot going on in this video, but it can be summarized in three animations: the "wave" animation when the device detects a nearby bump/shake, the animation showing the two friends bumping, and finally a "ring pulsing" animation to finish. While the second animation is plain Compose, the two others are custom. Creating such custom effects involved venturing into what is often considered "unknown territory" in Android development: custom shaders. While daunting at first, it's actually quite accessible and unlocks immense creative potential for truly unique experiences.
Simply put, shaders are highly parallelizable code segments. Each shader runs once per pixel per frame. This might sound intense, but this is precisely where GPUs excel. In Android 13, shaders were integrated as first-class citizens with AGSL shaders and RuntimeShader for Views and Compose.
Since our app requires a minimum of API 30 (Android 11), we opted for a more traditional approach using a custom OpenGL renderer.
We extract a Bitmap of the view we want to apply the effect to, pass it to the OpenGL renderer, and run the shader. While this technique ensures backwards compatibility, its main drawback is that it operates on a snapshot of the view hierarchy throughout the animation. Consequently, any changes occurring on the view during the animation aren't reflected on screen until the animation concludes. Keen observers might notice a slight glitch at the end of the animation when the screenshot is removed and normal rendering resumes.
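The snapshot step itself is straightforward; as a small sketch of our own (not amo's renderer code), the view can be drawn into an offscreen Bitmap that is then uploaded as an OpenGL texture:

import android.graphics.Bitmap
import android.graphics.Canvas
import android.view.View

// Sketch only: freeze the view's current appearance into a Bitmap. Later
// changes to the live view won't be visible until the shader effect ends,
// which is exactly the limitation described above.
fun captureSnapshot(view: View): Bitmap {
    val bitmap = Bitmap.createBitmap(view.width, view.height, Bitmap.Config.ARGB_8888)
    view.draw(Canvas(bitmap))
    return bitmap
}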
In our apps, profile pictures are a bit different. Instead of static images, you record a live profile picture, a transparent, animated boomerang-like cutout. This approach feels more personal, as it literally brings your friends to life on-screen. You see them smiling or making faces, rather than just viewing curated or filtered photos from their camera roll.
From a product perspective, this feature involves two key phases: the recording and the rendering. Before diving into these specific areas, let's discuss the format we use for data transport between mobile devices (Android & iOS) and the server. To optimize bandwidth and decoding time, we chose the H.265 HEVC format in an MP4 container and perform the face detection on device. Most modern devices have hardware decoders, making decoding extremely fast. Since cross-platform videos with transparency aren't widely supported or optimized, we developed a custom in-house solution. Our videos contain two "planes":
- The original video on top
- A mask video at the bottom
We haven't yet optimized this process. Currently, we don't pre-apply the mask to the top plane. Doing so could reduce the final encoded video size by replacing the original background with a plain color.
This format is fairly efficient. For instance, the video above is only 64KB. Once we aligned all mobile platforms on the format for our animated profile pictures, we began implementing it.
Recording a Live Profile Picture
The first step is capturing the video, which is handled by Jetpack CameraX. To provide users with visual feedback, we also utilize ML Kit Face Detection. Initially, we tried to map detected facial expressions (such as eyes closed or smiling) to a 3D model rendered with Filament. However, achieving real-time performance proved too challenging for the timeframe we had. We instead decided to detect the face contour and to move a default avatar image on the screen accordingly.
Once the recording is complete, Jetpack CameraX provides a video file containing the recorded sequence. This marks the beginning of the second step. The video is decoded frame by frame, and each frame is processed using ML Kit Selfie Segmentation. This API computes the face contour from the input image (our frames) and produces an output mask of the same size. Next, a composite image is generated, with the original video frame on top and the mask frame at the bottom. These composite frames are then fed into an H.265 video encoder. Once all frames are processed, the video meets the specifications described earlier and is ready to be sent to our servers.
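As a rough sketch of the compositing step (our own illustration; the helper name and the Bitmap-based approach are assumptions), each frame and its segmentation mask can be stacked into the double-height frame that is fed to the encoder:

import android.graphics.Bitmap
import android.graphics.Canvas

// Sketch only: stack the original frame on top and its segmentation mask
// underneath, producing the double-height composite described above.
fun composeFrame(original: Bitmap, mask: Bitmap): Bitmap {
    require(original.width == mask.width && original.height == mask.height)
    val composite = Bitmap.createBitmap(
        original.width,
        original.height * 2,
        Bitmap.Config.ARGB_8888,
    )
    Canvas(composite).apply {
        drawBitmap(original, 0f, 0f, null)                     // top plane: the video frame
        drawBitmap(mask, 0f, original.height.toFloat(), null)  // bottom plane: the mask
    }
    return composite
}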
While the process could be improved with better interframe selfie segmentation, usage of depth sensors, or more advanced AI techniques, it performs well and has been successfully running in production for over a year.
Rendering Your Friends on the Map
Playing back animated profile pictures presented another challenge. The main difficulty arose from what seemed like a simple product requirement: displaying 10+ real-time moving profile pictures simultaneously on the screen, animating in a back-and-forth loop (similar to boomerang videos). Video decoders, especially hardware ones, excel at decoding videos forward. However, they struggle with reverse playback. Additionally, decoding is computationally intensive. While decoding a single video is manageable, decoding 10+ videos in parallel is not. Our requirement was akin to wanting to watch 10+ movies simultaneously in your favorite streaming app, all in reverse mode. This is an uncommon and unique use case.
We overcame this challenge by trading computational needs for increased memory consumption. Instead of repeatedly decoding video, we opted to store all frames of the animation in memory. The video is a 30fps, 2.5-second video with a resolution of 256×320 pixels and transparency. This results in a memory consumption of roughly 24MB per video (75 frames × 256 × 320 pixels × 4 bytes per pixel). A queue-based system handling decoding requests sequentially can manage this efficiently. For each request, we:
- Decode the video frame by frame using the Jetpack Media3 Transformer APIs
- For each frame:
  - Apply the lower part of the video as a mask to the upper part (see the sketch after this list).
  - Append the generated Bitmap to the list of frames.
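A simple (and deliberately unoptimized) sketch of that masking step, under our own assumptions rather than amo's actual code, could use the bottom plane's brightness as the alpha channel of the top plane:

import android.graphics.Bitmap
import android.graphics.Color

// Sketch only: split the decoded double-height frame into its two planes and
// turn the mask plane's brightness into transparency on the video plane.
// A production version would avoid per-pixel getPixel/setPixel calls.
fun applyMask(decodedFrame: Bitmap): Bitmap {
    val width = decodedFrame.width
    val height = decodedFrame.height / 2
    val result = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    for (y in 0 until height) {
        for (x in 0 until width) {
            val pixel = decodedFrame.getPixel(x, y)                      // top plane: video frame
            val alpha = Color.red(decodedFrame.getPixel(x, y + height))  // bottom plane: mask
            result.setPixel(x, y, Color.argb(alpha, Color.red(pixel), Color.green(pixel), Color.blue(pixel)))
        }
    }
    return result
}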
Upon completing this process, we obtain a List<Bitmap> containing all of the ordered, transformed (mask-applied) frames of the video. To animate the profile picture in a boomerang manner, we simply run a linear, infinite transition. This transition starts from the first frame, proceeds to the last frame, and then returns to the first frame, repeating this cycle indefinitely.
@Immutable
class MovingCutout(
    val duration: Int,
    val bitmaps: List<ImageBitmap>,
) : Cutout

@Composable
fun rememberMovingCutoutPainter(cutout: MovingCutout): Painter {
    val state = rememberUpdatedState(newValue = cutout)
    val infiniteTransition = rememberInfiniteTransition(label = "MovingCutoutTransition")
    val currentBitmap by infiniteTransition.animateValue(
        initialValue = cutout.bitmaps.first(),
        targetValue = cutout.bitmaps.last(),
        typeConverter = state.VectorConverter,
        animationSpec = infiniteRepeatable(
            animation = tween(cutout.duration, easing = LinearEasing),
            repeatMode = RepeatMode.Reverse
        ),
        label = "MovingCutoutFrame"
    )
    return remember(cutout) {
        // A custom BitmapPainter implementation to allow delegation when getting
        // 1. Intrinsic size
        // 2. Current Bitmap
        CallbackBitmapPainter(
            getIntrinsicSize = {
                with(cutout.bitmaps[0]) { Size(width.toFloat(), height.toFloat()) }
            },
            getImageBitmap = { currentBitmap }
        )
    }
}

private val State<MovingCutout>.VectorConverter: TwoWayConverter<ImageBitmap, AnimationVector1D>
    get() = TwoWayConverter(
        convertToVector = { AnimationVector1D(value.bitmaps.indexOf(it).toFloat()) },
        convertFromVector = { value.bitmaps[it.value.roundToInt()] }
    )
As a map-based social app, Bump relies heavily on the Google Maps Android SDK. While the framework provides default interactions, we wanted to push the boundaries of what's possible. Specifically, users want to zoom out and in quickly. Although Google Maps offers pinch-to-zoom and double-tap gestures, these have limitations. Pinch-to-zoom requires two fingers, and double-tap doesn't cover the full zoom range.
For a better user experience, we've added our own gestures. One particularly useful feature is edge zoom, which allows quick zooming out and in using a single finger. Simply swipe up or down from the left or right edge of the screen. Swiping down to the bottom zooms out completely, while swiping up to the top zooms in fully.
Like Google Maps gestures, there are no visual cues for this feature, but that's acceptable for a power gesture. We provide visual and haptic feedback to help users remember it. Currently, this is achieved with a glue-like effect that follows the finger, as shown below:
Implementing this feature involves two tasks: detecting edge zoom gestures and rendering the visual effect. Thanks to Jetpack Compose's versatility, this can be achieved in just a few lines of code. We use the draggable2D Modifier to detect drags, which triggers an onDragUpdate callback to update the Google Maps camera and triggers a recomposition by updating a point variable.
@Composable
fun EdgeZoomGestureDetector(
    side: EdgeZoomSide,
    onDragStarted: () -> Unit,
    onDragUpdate: (Float) -> Unit,
    onDragStopped: () -> Unit,
    modifier: Modifier = Modifier,
    curveSize: Dp = 160.dp,
) {
    var heightPx by remember { mutableIntStateOf(Int.MAX_VALUE) }
    var point by remember { mutableStateOf(Offset.Zero) }
    val draggableState = rememberDraggable2DState { delta ->
        point = when (side) {
            EdgeZoomSide.Start -> point + delta
            EdgeZoomSide.End -> point + Offset(-delta.x, delta.y)
        }
        onDragUpdate(delta.y / heightPx)
    }
    val curveSizePx = with(LocalDensity.current) { curveSize.toPx() }
    Box(
        modifier = modifier
            .fillMaxHeight()
            .onPlaced {
                heightPx = it.size.height
            }
            .draggable2D(
                state = draggableState,
                onDragStarted = {
                    point = it
                    onDragStarted()
                },
                onDragStopped = {
                    point = point.copy(x = 0f)
                    onDragStopped()
                },
            )
            .drawWithCache {
                val path = Path()
                onDrawBehind {
                    path.apply {
                        reset()
                        val x = point.x.coerceAtMost(curveSizePx / 2f)
                        val y = point.y
                        val top = y - (curveSizePx - x)
                        val bottom = y + (curveSizePx - x)
                        moveTo(0f, top)
                        cubicTo(
                            0f, top + (y - top) / 2f,
                            x, top + (y - top) / 2f,
                            x, y
                        )
                        cubicTo(
                            x, y + (bottom - y) / 2f,
                            0f, y + (bottom - y) / 2f,
                            0f, bottom,
                        )
                    }
                    scale(side.toXScale(), 1f) {
                        drawPath(path, Palette.black)
                    }
                }
            }
    )
}

enum class EdgeZoomSide(val alignment: Alignment) {
    Start(Alignment.CenterStart),
    End(Alignment.CenterEnd),
}

private fun EdgeZoomSide.toXScale(): Float = when (this) {
    EdgeZoomSide.Start -> 1f
    EdgeZoomSide.End -> -1f
}
The drawing part is handled by the drawBehind Modifier, which creates a Path consisting of two simple cubic curves, emulating a Gaussian curve. Before rendering it, the path is flipped on the X axis based on the screen side.
This effect looks good but it also feels static, directly following the finger without any animation. To improve this, we added a spring-based animation. By extracting the computation of x (representing the tip of the Gaussian curve) from drawBehind into an animatable state, we achieve a smoother visual effect:
val x by animateFloatAsState(
    targetValue = point.x.coerceAtMost(curveSizePx / 2f),
    label = "animated-curve-width",
)
This creates a visually appealing effect that feels natural. However, we wanted to engage other senses too, so we introduced haptic feedback to mimic the feel of a toothed wheel on an old safe. Using Kotlin Flow, LaunchedEffect and snapshotFlow, this was implemented in just a few lines of code:
val haptic = LocalHapticFeedback.current
LaunchedEffect(heightPx, slotCount) {
    val slotHeight = heightPx / slotCount
    snapshotFlow { (point.y / slotHeight).toInt() }
        .drop(1) // Drop the initial "tick"
        .collect {
            haptic.performHapticFeedback(HapticFeedbackType.SegmentTick)
        }
}
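On the consuming side, the onDragUpdate fraction drives the map camera. A possible wiring, sketched under our own assumptions (the CameraPositionState usage and zoom scaling are ours, not amo's code), using the Maps Compose library:

import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import com.google.android.gms.maps.model.CameraPosition
import com.google.maps.android.compose.CameraPositionState

// Sketch only: translate the normalized drag delta reported by the detector
// into a zoom change on the Google Maps camera.
@Composable
fun EdgeZoomOverlay(cameraPositionState: CameraPositionState, modifier: Modifier = Modifier) {
    EdgeZoomGestureDetector(
        side = EdgeZoomSide.End,
        onDragStarted = {},
        onDragUpdate = { fraction ->
            val current = cameraPositionState.position
            // Dragging down (positive fraction) zooms out, dragging up zooms in.
            val newZoom = (current.zoom - fraction * 20f).coerceIn(2f, 21f)
            cameraPositionState.position = CameraPosition.Builder(current).zoom(newZoom).build()
        },
        onDragStopped = {},
        modifier = modifier,
    )
}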
Bump is filled with many other innovative features. We invite you to explore the product further to discover more of these gems. Overall, the entire Android ecosystem — including the platform, developer tools, Jetpack Compose, Google Play Services — provided most of the essential building blocks. It offered the flexibility needed to design and implement these unique interactions. Thanks to Android, creating a standout product is just a matter of passion, time, and quite a few lines of code!