Beetles & Lenses, or How I spent the quarantine building custom Android UI with OpenGL

Once in a while, I have an itch for building custom Android UI. No wonder I turned the quarantine home hibernation into an opportunity to play around with OpenGL metaballs, refraction, and other stuff:

  • View-to-OpenGL rendering using samplerExternalOES
  • TextureView + GLSurfaceView = GLTextureView
  • SpringAnimation & ViewPropertyAnimator

TL;DR — [Source code]


For this article to be more than just a lame coding demo, I needed to come up with an application idea: a trivial task, especially when you don’t have to think about users, monetization strategy, or common sense.

Naturally, I ended up with an app that has a sole purpose: observing beetles with a magnifying glass. But with one catch, the only interactive element on the screen is the FAB, a.k.a. the Floating Action Button.

Satisfied with the idea, I started coding ❤

Metaballs

One can render metaballs on Android in multiple ways, for example, using RenderScript or ColorMatrixColorFilter. Intredasting… but let’s reinvent the wheel with OpenGL.

Mixing different colors makes for some nice eye candy 🤩

I wrote a shader that takes the position, size, and color of two metaballs as input arguments and blends them together into a beautiful render. Basically, you have two blobs of alpha-channel radial gradient, opaque at the center and transparent at the radius. The total alpha value is then compared against a threshold and multiplied by the color.
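Here is a minimal sketch of that idea, written as a fragment shader embedded in a Kotlin string the way GL code usually ships on Android. The uniform names, the smoothstep falloff, and the 0.5 threshold are my assumptions, not the app’s actual source:

```kotlin
// Sketch of the two-metaball fragment shader; uniform names and the
// threshold value are illustrative guesses, not the article's real code.
private val METABALLS_FRAGMENT_SHADER = """
    precision mediump float;

    uniform vec2 uCenter[2];  // metaball centers, in pixels
    uniform float uRadius[2]; // metaball radii, in pixels
    uniform vec3 uColor[2];   // per-metaball colors

    void main() {
        float totalAlpha = 0.0;
        vec3 mixedColor = vec3(0.0);
        for (int i = 0; i < 2; i++) {
            // Radial gradient: opaque at the center, transparent at the radius.
            float d = distance(gl_FragCoord.xy, uCenter[i]);
            float alpha = 1.0 - smoothstep(0.0, uRadius[i], d);
            totalAlpha += alpha;
            mixedColor += uColor[i] * alpha;
        }
        // Compare the summed alpha against a threshold, then tint by color.
        float inside = step(0.5, totalAlpha);
        gl_FragColor = vec4(mixedColor / max(totalAlpha, 0.0001), 1.0) * inside;
    }
""".trimIndent()
```

Summing the two radial fields before thresholding is what makes the blobs visually merge as they approach each other.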

Navigation

The next thing I wanted to do was to use the FAB to switch between the different images.

I used ViewPager2 to display the beetles. So at first I tried to convert the FAB position offset into the argument for the ViewPager2.fakeDragBy() method, but it didn’t play out well: too many edge cases to handle, and the animation wasn’t pleasing.

Well, no big deal, there is another option: ViewPager2.setCurrentItem(page, smoothScroll = true). I call this method as soon as the FAB is dragged past the horizontal distance limit, left or right. There is also a minimum delay between page turns so the motion feels natural.
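In code, the gist looks roughly like this sketch; the class, thresholds, and delay constants are hypothetical, not lifted from the app’s source:

```kotlin
import android.os.SystemClock
import androidx.viewpager2.widget.ViewPager2
import kotlin.math.abs

// Hypothetical page-turn trigger: flip a page once the drag passes the
// horizontal limit, but never more often than the minimum interval.
class PageTurnController(private val viewPager: ViewPager2) {
    private val horizontalLimitPx = 120f // drag distance that triggers a turn
    private val minTurnIntervalMs = 400L // minimum delay between page turns
    private var lastTurnUptimeMs = 0L

    fun onFabDragged(offsetXpx: Float) {
        if (abs(offsetXpx) < horizontalLimitPx) return
        val now = SystemClock.uptimeMillis()
        if (now - lastTurnUptimeMs < minTurnIntervalMs) return

        val target = viewPager.currentItem + if (offsetXpx > 0) 1 else -1
        val itemCount = viewPager.adapter?.itemCount ?: return
        if (target in 0 until itemCount) {
            viewPager.setCurrentItem(target, /* smoothScroll = */ true)
            lastTurnUptimeMs = now
        }
    }
}
```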

Zoom

Yeah yeah, I know: sane people are already used to the pinch-to-zoom gesture. But I believe it’s not much fun, so let me get you out of the UX comfort zone.

I found this lens shader some time ago, knowing that I’d use it sooner or later.

The original artistic lens style was quite nice but didn’t fit my idea, so I modified the code to increase the lens curvature and make it look like a magnifying glass.
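The core trick looks roughly like the sketch below: inside the lens radius, remap the texture coordinates so the sampling converges toward the lens center. The uniform names and the quadratic curvature term are my assumptions approximating the effect:

```kotlin
// Rough sketch of the lens distortion; the quadratic curvature is an
// assumed approximation of the "magnifying glass" look.
private val LENS_FRAGMENT_SHADER = """
    precision mediump float;

    uniform sampler2D uTexture;  // the image to magnify
    uniform vec2 uLensCenter;    // lens center, in texture coordinates
    uniform float uLensRadius;   // lens radius, in texture coordinates
    uniform float uZoom;         // magnification factor, e.g. 2.0
    varying vec2 vTexCoord;

    void main() {
        vec2 toCenter = vTexCoord - uLensCenter;
        float d = length(toCenter);
        if (d < uLensRadius) {
            // Stronger magnification near the center, easing out at the rim,
            // which fakes the curvature of a real magnifying glass.
            float k = d / uLensRadius;
            float distortion = mix(1.0 / uZoom, 1.0, k * k);
            gl_FragColor = texture2D(uTexture, uLensCenter + toCenter * distortion);
        } else {
            gl_FragColor = texture2D(uTexture, vTexCoord);
        }
    }
""".trimIndent()
```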

As for the reveal animation, the FAB can be dragged upwards, simultaneously scaling up the magnifying glass.

As soon as the FAB position offset goes beyond the upward distance limit, it detaches from the other metaball and turns into the magnifying glass handle.

But why stop? Let’s use more APIs and add some finishing touches here and there.

Physics-based motion

As soon as the FAB is let go, the SpringAnimation class is used to mimic a physically natural rebound effect.

A larger offset leads to more resistance, followed by a physically natural rebound.
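A minimal sketch of the release handling with androidx.dynamicanimation; the stiffness and damping values here are illustrative, not the app’s tuned ones:

```kotlin
import android.view.View
import androidx.dynamicanimation.animation.DynamicAnimation
import androidx.dynamicanimation.animation.SpringAnimation
import androidx.dynamicanimation.animation.SpringForce

// Spring the FAB back to its resting position once the finger lifts.
// Stiffness and damping ratio are illustrative values.
fun releaseFab(fab: View) {
    listOf(DynamicAnimation.TRANSLATION_X, DynamicAnimation.TRANSLATION_Y)
        .forEach { property ->
            SpringAnimation(fab, property, 0f).apply {
                spring.stiffness = SpringForce.STIFFNESS_MEDIUM
                spring.dampingRatio = SpringForce.DAMPING_RATIO_MEDIUM_BOUNCY
                start()
            }
        }
}
```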

Slow-down drag

I also added custom drag “slow-down” code so it feels like the FAB is attached with a rubber band.
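The damping itself can be as simple as an ease-out curve applied to the raw finger travel; this formula and its constants are my guess at the effect, not the app’s exact math:

```kotlin
import kotlin.math.abs
import kotlin.math.pow
import kotlin.math.sign

// "Rubber band" damping sketch: the further the finger travels, the smaller
// the fraction of that travel the FAB actually moves. Constants are guesses.
fun dampedDragOffset(rawOffsetPx: Float, maxTravelPx: Float = 300f): Float {
    val t = (abs(rawOffsetPx) / maxTravelPx).coerceAtMost(1f)
    val easedOut = 1f - (1f - t).pow(2) // resistance grows near the limit
    return sign(rawOffsetPx) * easedOut * maxTravelPx * 0.5f
}
```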

Haptic feedback

The FAB state change triggers haptic feedback via the View.performHapticFeedback() method, which pulses the device’s vibration hardware.
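The call itself is a one-liner; the choice of constant below is mine, as any of the standard HapticFeedbackConstants would do:

```kotlin
import android.view.HapticFeedbackConstants
import android.view.View

// Pulse the vibration hardware whenever the FAB changes state.
fun notifyFabStateChanged(fab: View) {
    fab.performHapticFeedback(HapticFeedbackConstants.VIRTUAL_KEY)
}
```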

Onboarding

I get it: this design is outstanding and maybe a tiny bit confusing. Let’s help users figure out what’s going on using icons and a fake drag.

The drag is simulated to show that the FAB is not static and can be interacted with.

The icons fade out as soon as the corresponding functions are triggered.
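Since ViewPropertyAnimator is already in play, the simulated drag can be a simple nudge-and-return animation; this sketch, names and timings included, is hypothetical:

```kotlin
import android.view.View

// Nudge the FAB sideways and let it settle back, hinting it can be dragged.
// Distances and durations are illustrative.
fun playDragHint(fab: View) {
    fab.animate()
        .translationX(60f)
        .setDuration(350L)
        .withEndAction {
            fab.animate().translationX(0f).setDuration(350L).start()
        }
        .start()
}
```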

Rendering a view to a GL texture in real time

The magnifying glass shader takes the target image (the one you want to magnify) as its input. Typically you’d use a sampler2D texture, but it performs badly if the texture gets invalidated too often, let alone every single frame. Instead, I created a SurfaceTexture backed by an external GL texture and access it in the shader code as a samplerExternalOES input.
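On the shader side, the external texture needs an extension directive and a special sampler type; the uniform name below is an assumption:

```kotlin
// Fragment shader header for sampling an external (SurfaceTexture-backed)
// texture; GLSL ES requires the extension directive for samplerExternalOES.
private val EXTERNAL_TEXTURE_SHADER_HEADER = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;

    uniform samplerExternalOES uViewTexture; // fed by the SurfaceTexture
    varying vec2 vTexCoord;
""".trimIndent()
```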

🎩 That’s the magic 🐇

On the one hand, it is easy to draw any Android view directly into the SurfaceTexture canvas. In my case, it is just a ViewPager2 with images, but this method can render any view hierarchy. On the other hand, the samplerExternalOES GL texture is shared with the Android view rendering pipeline, so no costly copy is needed, which improves the app’s performance.
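A sketch of that path, assuming an OES texture id already created and bound on the GL thread; the class and method names are mine:

```kotlin
import android.graphics.SurfaceTexture
import android.view.Surface
import android.view.View

// Sketch: render a view hierarchy into an external GL texture. The texture
// id is assumed to be bound to GL_TEXTURE_EXTERNAL_OES on the GL thread.
class ViewToGlRenderer(oesTextureId: Int, width: Int, height: Int) {
    private val surfaceTexture = SurfaceTexture(oesTextureId).apply {
        setDefaultBufferSize(width, height)
    }
    private val surface = Surface(surfaceTexture)

    // Call from the UI thread whenever the view needs re-rendering.
    fun drawView(view: View) {
        val canvas = surface.lockCanvas(null)
        try {
            view.draw(canvas)
        } finally {
            surface.unlockCanvasAndPost(canvas)
        }
    }

    // Call on the GL thread before sampling the samplerExternalOES uniform.
    fun updateTexture() = surfaceTexture.updateTexImage()
}
```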

GLSurfaceView + TextureView = GLTextureView

The composition and hierarchy of the views on the screen ruled out the possibility of using GLSurfaceView. But you simply cannot replace GLSurfaceView with TextureView and use the same Renderer object as if nothing happened. Or can you?

What’s the first thing you do before thinking on your own? That’s right, you google it, and you find a fantastic Stack Overflow thread on the topic.

... I wanted to just replace GLSurfaceView with TextureView, and keep the rest of my code the same, and just receive the advantages of the TextureView.

He asked, and the Android community delivered. Now both the metaballs and the magnifying glass can be treated just like regular views.
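In spirit, the answer boils down to hosting your own EGL context on a thread behind a TextureView. Here is a heavily condensed sketch of that idea; real code must handle pausing, config errors, and clean thread shutdown, all of which are glossed over:

```kotlin
import android.content.Context
import android.graphics.SurfaceTexture
import android.opengl.EGL14
import android.opengl.EGLConfig
import android.opengl.GLSurfaceView
import android.view.TextureView

// Condensed sketch: a TextureView that owns an EGL context on its own
// thread and drives a plain GLSurfaceView.Renderer.
class GLTextureView(context: Context) : TextureView(context),
    TextureView.SurfaceTextureListener {

    var renderer: GLSurfaceView.Renderer? = null
    private var thread: RenderThread? = null

    init { surfaceTextureListener = this }

    override fun onSurfaceTextureAvailable(st: SurfaceTexture, w: Int, h: Int) {
        thread = RenderThread(st, renderer ?: return, w, h).also { it.start() }
    }

    override fun onSurfaceTextureDestroyed(st: SurfaceTexture): Boolean {
        thread?.quit = true
        return true
    }

    override fun onSurfaceTextureSizeChanged(st: SurfaceTexture, w: Int, h: Int) = Unit
    override fun onSurfaceTextureUpdated(st: SurfaceTexture) = Unit

    private class RenderThread(
        private val surfaceTexture: SurfaceTexture,
        private val renderer: GLSurfaceView.Renderer,
        private val width: Int,
        private val height: Int,
    ) : Thread("GLTextureView") {
        @Volatile var quit = false

        override fun run() {
            // Bring up an ES 2.0 context targeting the TextureView's surface.
            val display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY)
            val version = IntArray(2)
            EGL14.eglInitialize(display, version, 0, version, 1)

            val configs = arrayOfNulls<EGLConfig>(1)
            val numConfigs = IntArray(1)
            EGL14.eglChooseConfig(
                display,
                intArrayOf(
                    EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                    EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8,
                    EGL14.EGL_BLUE_SIZE, 8, EGL14.EGL_ALPHA_SIZE, 8,
                    EGL14.EGL_NONE
                ), 0, configs, 0, 1, numConfigs, 0
            )
            val context = EGL14.eglCreateContext(
                display, configs[0], EGL14.EGL_NO_CONTEXT,
                intArrayOf(EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE), 0
            )
            val surface = EGL14.eglCreateWindowSurface(
                display, configs[0], surfaceTexture, intArrayOf(EGL14.EGL_NONE), 0
            )
            EGL14.eglMakeCurrent(display, surface, surface, context)

            // Renderer's GL10/EGLConfig params are legacy; renderers built on
            // GLES20 statics ignore them, so passing null is fine here.
            renderer.onSurfaceCreated(null, null)
            renderer.onSurfaceChanged(null, width, height)
            while (!quit) {
                renderer.onDrawFrame(null)
                EGL14.eglSwapBuffers(display, surface)
            }

            EGL14.eglMakeCurrent(
                display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT
            )
            EGL14.eglDestroySurface(display, surface)
            EGL14.eglDestroyContext(display, context)
            EGL14.eglTerminate(display)
        }
    }
}
```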

The rest is history: it’s a tiny app after all, and modern Android development with Kotlin and AndroidX is a breeze. For anyone interested, please dive into the full source code or read our other articles 😉

Farewell 👋

Alex, CTO at Krootl