Android | Backporting Graphics Code to 4.1.2

I've created a framework for displaying streamed or stored video from any source on an Android display at 60 fps, provided the device has a GPU, supports OpenGL ES 2.0+, and runs Android 4.4.4+. It has been in use in several systems at Tonbo, and all was great until we received a head-mounted display - the Recon Jet, which runs Android 4.1.2 (SDK version 16). This is a post on how I backported the framework and the thought process involved in debugging why the Recon wouldn't play the stream. I'm certain it will only fully make sense to someone who has worked with Android graphics, GLES, and GStreamer, but it may interest anyone who enjoys debugging in general.

-

Receive the Recon. Run my application.

Heartbreak.

My streaming library isn't compatible here. The minimum SDK version my code runs on is 19, but the Recon is at SDK 16. First thought: I will have to backport all my base graphics rendering calls from android.opengl.EGL14, introduced in SDK 19, to the older javax.microedition.khronos.egl.EGL10.

Hold on. It's a tiny device. Does it even have a GPU? And does it support OpenGL ES 2.0?
Yes: the application installs after declaring <uses-feature android:glEsVersion="0x00020000" android:required="true" /> in the manifest, which proves it.
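
(There's also a quick runtime check for this, noted here for completeness; a small sketch using the stock ActivityManager API, with the variable names being mine:)

// Reports the GLES version the device supports; available well before SDK 16.
ActivityManager am = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
ConfigurationInfo info = am.getDeviceConfigurationInfo();
boolean supportsEs2 = info.reqGlEsVersion >= 0x20000;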

Alright - 4 hours pass and I've successfully backported my code from EGL14 to EGL10, and I've tested my graphics rendering library using EGL10 on a phone running Android 4.4.4. This was quicker than I expected it to be.
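
For reference, the EGL10 path boils down to roughly this (a minimal sketch, not the framework's actual code; the SurfaceHolder and config attributes are illustrative, and all the EGL classes come from javax.microedition.khronos.egl):

EGL10 egl = (EGL10) EGLContext.getEGL();
EGLDisplay display = egl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);
egl.eglInitialize(display, new int[2]);

// EGL10 has no named constants for ES2, so the raw values are spelled out.
final int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
final int EGL_OPENGL_ES2_BIT = 4;

int[] configAttribs = {
        EGL10.EGL_RED_SIZE, 8, EGL10.EGL_GREEN_SIZE, 8, EGL10.EGL_BLUE_SIZE, 8,
        EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL10.EGL_NONE };
EGLConfig[] configs = new EGLConfig[1];
egl.eglChooseConfig(display, configAttribs, configs, 1, new int[1]);

EGLContext context = egl.eglCreateContext(display, configs[0], EGL10.EGL_NO_CONTEXT,
        new int[] { EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE });
EGLSurface surface = egl.eglCreateWindowSurface(display, configs[0], surfaceHolder, null);
egl.eglMakeCurrent(display, surface, surface, context);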

I test it on the Recon. It doesn't work.

Is it because the CPU doesn't have enough in it to play a video at all?
I highly doubt this, and I prove it wrong by playing a 640x480 video at 30 fps in a <VideoView>.
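
Something along these lines is enough for that test (the layout id and file path are placeholders, not my actual test code):

VideoView videoView = (VideoView) findViewById(R.id.video_view); // placeholder layout id
videoView.setVideoPath("/sdcard/test_640x480_30fps.mp4");        // placeholder local file
videoView.start();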

Could it be that GStreamer, the media framework receiving the encoded video packets, is not receiving or decoding them?
Possible. GStreamer works on Android past SDK 8. I check the elements it dynamically chooses for decodebin. Wait. Could it be picking some software decoder that isn't able to decode the H.264-encoded stream fast enough? Highly possible. Is there a hardware decoder present at all?
My first instinct is to try the MediaCodec APIs, which Android provides specifically to use the hardware decoder. They were first introduced in SDK 16, but most of the useful APIs only arrived in SDK 18, so they're practically unusable to me here.
But wait, we were able to play the .mp4 video at 30 fps, so the device probably does have a hardware decoder. I double-check the processor: an old TI dual-core 1 GHz chip. It should have a hardware decoder.
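
For what it's worth, even on SDK 16 you can at least enumerate the registered codecs to confirm an H.264 decoder exists; a rough sketch (the helper name and log tag are mine):

// Uses android.media.MediaCodecList / MediaCodecInfo, both available since SDK 16.
private boolean hasAvcDecoder() {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                Log.d("CodecCheck", "H.264 decoder found: " + info.getName());
                return true;
            }
        }
    }
    return false;
}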

Still clueless. 3 more hours have passed.

GStreamer is probably working fine - are the frames received from GStreamer corrupt?
Can I convert the individual YUV420 frames received into a bitmap and display it in an ImageView?
YES! That's a relief. So there's nothing wrong up until GStreamer. It has to be the drawing to the display using GLES.
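
The sanity check itself is simple; a rough sketch, assuming the frame bytes are in (or repacked into) NV21, since YuvImage doesn't take planar I420 directly:

// Convert one YUV frame to a Bitmap via an in-memory JPEG, purely as a sanity check.
Bitmap frameToBitmap(byte[] nv21, int width, int height) {
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
    byte[] jpeg = out.toByteArray();
    return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
}
// imageView.setImageBitmap(frameToBitmap(frameBytes, 640, 480));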

I'm certain my EGL10 rewrite is working because I can load bitmaps and draw - and that works on the Recon. I just can't seem to convert my frames received as a byte array to a GLES texture.

What is going wrong?! My code runs on an Android 4.4.4 phone, but not on the 4.1.2 Recon.

Could it be the way I am updating the texture with the latest byte array?
I start by checking all the Android graphics components being used. Hmm, SurfaceTexture: could something have changed between SDK 16 and 25?
I read the documentation and changelog. Nothing.

So, drawing everything but my video works and GStreamer is fine; what am I doing wrong?

Wait, do we need the SurfaceTexture at all? Why do I need that in my case? Hold the thought.
What specifically is different in my call to create a texture handle between a simple bitmap and my video frame array?

Aha! I diff the method that creates the texture from a ByteBuffer against the one that creates it from a Bitmap. Turns out there's a one-line difference.

GLES20.glTexParameteri(mTextureTarget, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);

Cannot use GLES20.GL_REPEAT: on this older device, which lacks full non-power-of-two (NPOT) texture support, GL_REPEAT requires the texture width and height to be powers of two (POT). Switched to GLES20.GL_CLAMP_TO_EDGE.
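
After the change, the texture setup for the video frame looks roughly like this (a sketch; the luminance upload at the end is illustrative, the point is the wrap mode):

int[] handles = new int[1];
GLES20.glGenTextures(1, handles, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handles[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
// GL_REPEAT needs power-of-two dimensions without NPOT support; clamp instead.
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height, 0,
        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, ByteBuffer.wrap(frameBytes));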

Works! What! It took me a whole day to get to this.

Alright, back to SurfaceTexture. Now that the streamed video is playing on the Recon, why do I have the SurfaceTexture in my code in the first place?
Remove it; can we still draw?
Yes! Why was it there? Because the Android Camera was using it internally and I mimicked that 10 months ago. For 10 months we'd been wasting about 4 milliseconds of CPU time per frame on this, which is quite a bit considering a 30 fps video gives you only about 33 milliseconds to process each frame.

-

This was a fun challenge, and I'm glad the Recon was so stubborn. Really, it was Android 4.1.2 that was stubborn. I did try to set up an emulator, but I was unable to get my ndk-build output running on it. I should have tried harder; that might have saved me a lot of time spent doubting the Recon's CPU and hardware decoder, and doubting whether GStreamer worked on it at all.

Though these seemingly silly, one-line fixes take up so much time to pinpoint, I'm glad they show up (as long as they stay one-line/one-class fixes). I ended up going through my entire codebase line by line, refactoring quite a bit of it, understanding EGL better, questioning every little element I use, and reading a lot of Android documentation.
