Android HDR | Migrating from TextureView to SurfaceView (Part #1) — How to Migrate | by Mozart Louis | Android Developers | Dec 2022

Users are creating and consuming more content every day, especially video content. Whether it's social media, video calls, or watching your favorite movie on demand, we know that users expect the highest quality content they can get their hands on.

That's why on Android, we're committed to supporting app developers in adopting 10-bit HDR (High Dynamic Range).

Recently at the Android Developer Summit, we talked about improving your social experience quality with Android Camera. In that talk, we described how to get HDR video capture working in Android 13 and how beneficial HDR is with 10-bit color support, resulting in brighter, higher-contrast videos.

Mozart Louis @ ADS 22 speaking about HDR Video Capture with Android 13

My colleague Ray also talked about the full scope of HDR 10-bit capture, playback, and sharing.

Raymond Tiong @ ADS 22 speaking about HDR Capture, Playback and Sharing

To support HDR 10-bit, you need to use SurfaceView, either with ExoPlayer (which we recommend) or your own custom decoder with 10-bit support.

Many developers opt to use TextureView instead of SurfaceView for the ability to draw content offscreen, but this comes at the expense of several benefits that come with using SurfaceView.

This series is meant to help developers migrate away from TextureView and fully use SurfaceView for displaying all of your content. We'll go in-depth on the inner workings of SurfaceView and how you can replace some, if not all, of the existing functionality you have with TextureView.

Let's start this series off with how to migrate from TextureView to SurfaceView. There's a lot to cover here, so buckle up and get ready to journey into the land of Android HDR!

SurfaceView has been around since the beginning of Android, in API 1. By definition, it provides a dedicated drawing surface embedded inside a view hierarchy. This means each instance of a SurfaceView lives on its own plane. In fact, when displaying a SurfaceView, it essentially punches a hole to display the content directly on screen. Here's a visual example of what this looks like:

Here you can imagine your app being the "App Surface." SurfaceView punches a hole in your app's surface and displays content from the underlying surface directly on your screen.

The underlying Surface driving the SurfaceView is assigned a hardware overlay. This means the content is made available directly to the display controller for scanout, without a copy into the application UI. This leads to the following benefits:

  • Better power efficiency.
  • 10-bit HDR support (dependent on support from the display on the device).
  • DRM playback support.

With all these great benefits, and the fact that it's been here since the beginning of Android, why are developers using TextureView?

TextureView was introduced in API 14. The general assumption is that because TextureView is a newer API, it should be better than SurfaceView in every way. TextureView does have some advantages over SurfaceView. For instance, unlike SurfaceView, TextureView doesn't create a separate window, but instead behaves as a regular view. This key difference allows a TextureView to have translucency, arbitrary rotations, and complex clipping.

For example, you can make a TextureView semi-translucent by calling:
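The snippet didn't survive extraction here; assuming a TextureView already inflated as `textureView`, the standard `View` alpha property is what's meant:

```kotlin
// `textureView` is the already-inflated TextureView from your layout.
// View.setAlpha() accepts 0f (fully transparent) through 1f (fully opaque).
textureView.alpha = 0.5f
```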


While this can be very useful in some cases, it comes with a performance penalty and extra battery drain (we have some numbers here). That's because TextureView contents must be copied, internally, from the underlying surface into the view displaying those contents. This copy operation makes TextureView less efficient than SurfaceView, which displays its contents directly on the screen.

Another reason developers use TextureView is that SurfaceView rendering wasn't properly synchronized with view animations until API 24. On earlier releases this could result in unwanted effects when a SurfaceView was placed into a scrolling container, or when it was subjected to animation. Such effects included the view's contents appearing to lag slightly behind where they should be displayed, and the view turning black when animated.

To achieve smooth animation or scrolling of video prior to API 24, it was therefore necessary to use TextureView rather than SurfaceView.

With that being said, we still recommend using SurfaceView for most cases. Use TextureView when:

  • You need translucency, arbitrary rotations, or complex clipping.
  • You need to support API 23 and below; use SurfaceView for API 24+.

If at all possible, developers should use ExoPlayer, since it alleviates the need to decide between TextureView and SurfaceView and handles all of the complexities of HDR display.

In this section, we'll give guidance on how to move over from TextureView to SurfaceView. We'll provide code and overall suggestions on what to do when something that was available in TextureView isn't available in SurfaceView. Here are the examples we'll go through:

  • Using MediaPlayer to display a local video on either a TextureView or a SurfaceView.
  • Creating a simple decoder with MediaFormat and MediaCodec to decode and display 10-bit HLG HEVC video content.
  • Identifying color quality issues when displaying 10-bit content on a TextureView, and how to address them (Part #2).
  • An HDR vertical video carousel implementation to show how SurfaceView handles transformations (Part #3).

In these examples, we'll observe the differences between TextureView and SurfaceView. All code samples are in Kotlin and can be found in our graphics samples repository! Download the repository to follow along as we deep dive into the transition steps between TextureView and SurfaceView.

TextureView and SurfaceView don't automatically adjust their frame size to the aspect ratio of the displayed content, which can result in distorted views. Because of this, we need to create a FixedAspectTextureView and a FixedAspectSurfaceView.

These are simply helper classes that override the onMeasure() function to allow us to set a specific aspect ratio, like 16 by 9. We'll set it by creating a setAspectRatio() function.
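As a rough sketch of the idea (the real implementations live in the graphics samples; the measuring logic below is a common pattern, not the exact sample code):

```kotlin
import android.content.Context
import android.util.AttributeSet
import android.view.SurfaceView

// Illustrative FixedAspectSurfaceView: constrains the measured size so the
// view keeps a caller-supplied aspect ratio (e.g. 16:9).
class FixedAspectSurfaceView @JvmOverloads constructor(
    context: Context, attrs: AttributeSet? = null
) : SurfaceView(context, attrs) {

    private var aspectRatio = 0f // width / height; 0 means "unconstrained"

    fun setAspectRatio(width: Int, height: Int) {
        require(width > 0 && height > 0) { "Size must be positive" }
        aspectRatio = width.toFloat() / height.toFloat()
        requestLayout()
    }

    override fun onMeasure(widthMeasureSpec: Int, heightMeasureSpec: Int) {
        var width = MeasureSpec.getSize(widthMeasureSpec)
        var height = MeasureSpec.getSize(heightMeasureSpec)
        if (aspectRatio != 0f) {
            // Shrink one dimension so that width / height == aspectRatio.
            if (width / aspectRatio <= height) {
                height = (width / aspectRatio).toInt()
            } else {
                width = (height * aspectRatio).toInt()
            }
        }
        setMeasuredDimension(width, height)
    }
}
```

FixedAspectTextureView follows the same shape, extending TextureView instead.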

You can view the full implementations in the FixedAspectTextureView.kt & FixedAspectSurfaceView.kt files inside our graphics samples.

We need some HDR content to display on screen. Not every device is currently capable of HDR video capture, so we've included sample files in our graphics samples. These were all shot on a Pixel 7 Pro, some with HDR and some without. This is a great starting point for seeing the differences, especially when displaying 10-bit HDR content.

You're always welcome to use your own content as well.

And reference them from a Constants.kt class (change if necessary):
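The listing for that step was lost here; a minimal sketch of what such a Constants.kt might hold (the file names below are placeholders, not the real sample assets; check the graphics samples repository for those):

```kotlin
// Hypothetical constants object; swap in the actual asset names from
// the graphics samples repository.
object Constants {
    const val VIDEO_ASSET_SDR = "sample_sdr.mp4"
    const val VIDEO_ASSET_HDR = "sample_hlg_hevc.mp4"
}
```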

Okay, with all of that out of the way, let's get into the logic!

We're going to use MediaPlayer to play our non-HDR video to show the difference in implementation between TextureView and SurfaceView. MediaPlayer will render media to any surface it's given, so this works on both TextureView and SurfaceView. As mentioned earlier, if you're using ExoPlayer for video playback, this is all handled for you by the library. To learn how to get started with ExoPlayer, see the ExoPlayer documentation.

Let's create TextureViewVideoPlayer.kt. We're using view binding here to bind to our texture_view_player.xml:
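The original listing isn't reproduced here; a sketch of the shape of such a class (the binding class and view id are assumptions based on the layout name in the text; `TextureViewPlayerBinding` would be generated by view binding):

```kotlin
import android.graphics.SurfaceTexture
import android.media.MediaPlayer
import android.os.Bundle
import android.view.TextureView
import androidx.appcompat.app.AppCompatActivity

// Sketch: an Activity hosting a TextureView and listening for its
// SurfaceTexture lifecycle events.
class TextureViewVideoPlayer : AppCompatActivity(),
    TextureView.SurfaceTextureListener {

    private lateinit var binding: TextureViewPlayerBinding
    private var mediaPlayer: MediaPlayer? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Inflate texture_view_player.xml via view binding.
        binding = TextureViewPlayerBinding.inflate(layoutInflater)
        setContentView(binding.root)
        // Register for the SurfaceTexture callbacks described below.
        binding.textureViewPlayerView.surfaceTextureListener = this
    }

    override fun onSurfaceTextureAvailable(st: SurfaceTexture, w: Int, h: Int) { /* covered below */ }
    override fun onSurfaceTextureSizeChanged(st: SurfaceTexture, w: Int, h: Int) {}
    override fun onSurfaceTextureDestroyed(st: SurfaceTexture): Boolean = true
    override fun onSurfaceTextureUpdated(st: SurfaceTexture) {}
}
```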

The key thing to point out here is that our class implements SurfaceTextureListener, an interface we need in order to receive important callbacks for TextureView operations. SurfaceView has similar callbacks that we'll review later.

As of API 33, SurfaceTextureListener requires that you implement four callbacks:

  • onSurfaceTextureAvailable() — Invoked when a TextureView's SurfaceTexture is ready for use.
  • onSurfaceTextureSizeChanged() — Invoked when the SurfaceTexture's buffer size changes.
  • onSurfaceTextureDestroyed() — Invoked when the specified SurfaceTexture is about to be destroyed.
  • onSurfaceTextureUpdated() — Invoked when the specified SurfaceTexture is updated through SurfaceTexture.updateTexImage(). You can make per-frame mutations here if need be.

We can use these callbacks to hook our MediaPlayer up to the SurfaceTexture and provide media playback. The implementation would look something like this:
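The listing was dropped here; a hedged sketch of the relevant callbacks inside TextureViewVideoPlayer (`VIDEO_PATH` is a stand-in for your media source):

```kotlin
override fun onSurfaceTextureAvailable(
    surfaceTexture: SurfaceTexture, width: Int, height: Int
) {
    mediaPlayer = MediaPlayer().apply {
        // MediaPlayer renders into a Surface wrapped around the SurfaceTexture.
        setSurface(Surface(surfaceTexture))
        setDataSource(VIDEO_PATH)
        setOnPreparedListener { it.start() }
        prepareAsync()
    }
}

override fun onSurfaceTextureDestroyed(surfaceTexture: SurfaceTexture): Boolean {
    // Release the player when the texture is going away.
    mediaPlayer?.release()
    mediaPlayer = null
    return true
}
```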

With this implemented, you should get a result like this!

Now, let's create the same exact scenario using SurfaceView instead of TextureView, in SurfaceViewVideoPlayer.kt:
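Again, the original listing is missing; a sketch with illustrative binding and view names (`SurfaceViewPlayerBinding` would be the generated view-binding class):

```kotlin
import android.media.MediaPlayer
import android.os.Bundle
import android.view.SurfaceHolder
import androidx.appcompat.app.AppCompatActivity

// Sketch: the SurfaceView counterpart, implementing SurfaceHolder.Callback
// instead of TextureView.SurfaceTextureListener.
open class SurfaceViewVideoPlayer : AppCompatActivity(), SurfaceHolder.Callback {

    private lateinit var binding: SurfaceViewPlayerBinding
    protected var mediaPlayer: MediaPlayer? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = SurfaceViewPlayerBinding.inflate(layoutInflater)
        setContentView(binding.root)
        // Every SurfaceView exposes a SurfaceHolder; register for its callbacks.
        binding.surfaceViewVideoPlayerView.holder.addCallback(this)
    }

    override fun surfaceCreated(holder: SurfaceHolder) { /* covered below */ }
    override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {}
    override fun surfaceDestroyed(holder: SurfaceHolder) {}
}
```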

It's essentially the same deal, but you can see that instead of using a SurfaceTextureListener, we're using SurfaceHolder.Callback. This interface receives information about changes to the surface. Every SurfaceView has one, accessible via SurfaceView.getHolder(). So we can call binding.surfaceViewVideoPlayerView.holder.addCallback(this) and implement the SurfaceHolder callbacks. As of API 33, the available callbacks are:

  • surfaceCreated() — Called immediately after the surface is created.
  • surfaceChanged() — Called immediately after any structural changes (format or size) have been made to the surface.
  • surfaceDestroyed() — Called immediately before the surface is destroyed.

We can use the SurfaceHolder callbacks to set up our MediaPlayer the same way we did with SurfaceTextureListener:
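A sketch of those overrides (again, `VIDEO_PATH` is a placeholder for your media source):

```kotlin
override fun surfaceCreated(holder: SurfaceHolder) {
    mediaPlayer = MediaPlayer().apply {
        // The SurfaceHolder hands us the Surface directly; no wrapping needed.
        setSurface(holder.surface)
        setDataSource(VIDEO_PATH)
        setOnPreparedListener { it.start() }
        prepareAsync()
    }
}

override fun surfaceDestroyed(holder: SurfaceHolder) {
    mediaPlayer?.release()
    mediaPlayer = null
}
```

Note that, unlike onSurfaceTextureAvailable(), surfaceCreated() already receives a Surface via the holder, so there is no SurfaceTexture-to-Surface conversion step.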

With this implemented, we should see the same exact behavior as with TextureView. It's as if nothing has changed!

You've seen that TextureView's callback interface (SurfaceTextureListener) and SurfaceView's (SurfaceHolder.Callback) are nearly identical in functionality. Essentially,

  • onSurfaceTextureAvailable() == surfaceCreated()
  • onSurfaceTextureDestroyed() == surfaceDestroyed()
  • onSurfaceTextureSizeChanged() is similar to surfaceChanged()

This allowed us to easily transition to using SurfaceView with MediaPlayer without many changes.

Where you will find a difference between the two is that SurfaceHolder.Callback has no equivalent to SurfaceTextureListener's onSurfaceTextureUpdated(). onSurfaceTextureUpdated() is called every time there is a new frame of video, which can be useful for per-frame processing and analysis. Depending on your workload, this can be significant.

With SurfaceView, you'd need to intercept frames between MediaCodec and the SurfaceView to get the same functionality.
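One possible way to do that interception (this is not the article's code, just a common pattern): decode into an ImageReader's Surface so each frame can be inspected before you forward it on. The `width`, `height`, and `handler` values are assumptions you'd supply:

```kotlin
// Decode into an ImageReader instead of the SurfaceView's surface so each
// frame is observable. maxImages bounds how many frames can be held at once.
val imageReader = ImageReader.newInstance(
    width, height, ImageFormat.YUV_420_888, /* maxImages = */ 3
)
imageReader.setOnImageAvailableListener({ reader ->
    reader.acquireLatestImage()?.use { image ->
        // Per-frame inspection happens here (timestamps, analysis, etc.).
        Log.d("FrameTap", "frame @ ${image.timestamp}")
    }
}, handler)
// Pass imageReader.surface to MediaCodec.configure(...); rendering the
// inspected frames onto the SurfaceView is then up to you.
```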

Another key difference between the two views is how the underlying Surface lifecycle is handled. When a SurfaceView is made invisible, the underlying surface is destroyed. This isn't true for TextureView: a TextureView's SurfaceTexture is tied to attachment to and detachment from the window, rather than to visibility events.

This means that, depending on your use cases, you can see a SurfaceView go blank or black because its surface has been destroyed. We'll cover more on this in Part #3, when we implement our vertical HDR video carousel.

Playing HDR content requires the use of the MediaCodec class. This class gives access to low-level media codecs, i.e. encoder/decoder components. It's part of Android's low-level multimedia support infrastructure.

Creating an optimized decoder is beyond the scope of this guide, but we created a simple CustomVideoDecoder.kt in our graphics samples that can decode our HDR files. It's important to note that this decoder is only meant to decode video and skips audio.

In CustomVideoDecoder.kt, we don't need to explicitly set KEY_PROFILE to HEVCProfileMain10 on our MediaFormat instance, since MediaExtractor takes care of that for us internally.

If you need to set a format manually, use the setInteger() method on your MediaFormat instance like so:
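The snippet is missing here; given a MediaFormat instance named `format`, the call would look like:

```kotlin
// Manually tag the format as 10-bit HEVC (Main 10 profile).
format.setInteger(
    MediaFormat.KEY_PROFILE,
    MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10
)
```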

At the top of the process method, we call the setUpDecoder() method, which uses our MediaFormat MIME type to create a decoder for the HEVCProfileMain10 content in the file.

Depending on what kind of codec you're using, it may or may not be supported by the device.

You can check whether the codec is supported ahead of time by calling MediaCodecList.findDecoderForFormat():
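The listing didn't survive here; assuming a MediaFormat instance named `format`, the check looks like:

```kotlin
// findDecoderForFormat() returns the name of a decoder that supports the
// given format, or null if no decoder on this device does.
val codecList = MediaCodecList(MediaCodecList.REGULAR_CODECS)
val decoderName: String? = codecList.findDecoderForFormat(format)
if (decoderName == null) {
    // No supported decoder on this device; fall back or bail out gracefully.
}
```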

Now, with the decoder created, we can implement another SurfaceView video player, but this time using our custom decoder instead of MediaPlayer. Let's call this one SurfaceViewVideoPlayerHDR.kt and just override a couple of members of our open base class, SurfaceViewVideoPlayer.kt:

To get our custom decoder attached, we just need to override the surfaceCreated() and surfaceDestroyed() methods and initialize our decoder there.
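A hedged sketch of that subclass (CustomVideoDecoder's constructor and method names here are assumptions; see the graphics samples for the real signatures):

```kotlin
// Illustrative HDR variant: the SurfaceView's Surface goes straight to the
// custom decoder rather than to MediaPlayer.
class SurfaceViewVideoPlayerHDR : SurfaceViewVideoPlayer() {

    private var decoder: CustomVideoDecoder? = null

    override fun surfaceCreated(holder: SurfaceHolder) {
        decoder = CustomVideoDecoder(this, HDR_VIDEO_PATH, holder.surface)
            .also { it.start() }
    }

    override fun surfaceDestroyed(holder: SurfaceHolder) {
        decoder?.release()
        decoder = null
    }
}
```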

With that, you should now be able to see a 10-bit HLG HDR file being played back on screen.

With everything above, you should be able to get some HDR content playing back and see the difference in implementation between TextureView and SurfaceView!

As a reminder, a complete sample app is available in our graphics samples so you can see this code in action.

In Part #2, we'll discuss how to deal with "color washout," which happens when you try to display 10-bit content on an 8-bit TextureView, and how to resolve the issue.
