In Android, it is easy to create an app that plays video clips, and it is also easy to create a game app with a 3D graphics surface. It is not straightforward, however, to create an app that plays video on a 3D graphics object. This article describes an app I created to meet this challenge: it renders video playback onto a 3D surface and lets the user interact with that surface while the video plays.
Three implementation challenges had to be overcome for this app: controlling video playback at the level of individual frames, keeping the audio and video tracks in sync, and synchronizing the decoder and render threads when they access the shared frame buffer.
The app also challenges the device's capabilities, since it forces the video decoder and the graphics pipeline to work at the same time. This article introduces the features and usage of the app, and then briefly describes its design.
This app has the following features:
- Playback of any video clip supported by the Android platform, chosen by the user through a file chooser
- Rendering of the video onto a 3D OpenGL* surface
- Three play modes, selectable from an options menu: normal, random spin, and touch spin
Note: There is no video session control, so the user can't stop, fast-forward, or rewind the video. If the app is pushed to the background or stopped, the video can only be played from the beginning.
The app doesn't include any video, so the user can choose any video clip supported by the Android platform. It is recommended to use a clip with the H.264/AVC codec in an MPEG-4 container, because this was the format tested during development. When the app is launched for the first time, the file chooser screen is displayed.
Once the user picks a preferred file chooser, onscreen directions guide the selection. After a video clip is selected, playback starts.
Note: Sometimes it takes a while (about 5 to 10 seconds) before the video starts.
The app starts in normal mode. To change modes, the user taps the options button in the UI to open the options menu.
The options menu has four choices: the first three select the play mode, and the last lets the user choose a different video clip.
Selecting the random spin option causes the surface that renders the video to spin randomly, with a 3D effect, around the horizontal and vertical axes. It also periodically casts a reddish tint over the screen.
In touch spin mode, the user can swipe left or right to spin the surface around the vertical axis in either direction. The spin accelerates with a faster swipe and decelerates naturally when the user stops swiping.
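As an illustration, the sketch below shows one way such swipe-driven spin could be implemented with Android's GestureDetector; the class name, scaling constant, and friction factor here are assumptions for the sketch, not the app's actual code.

    import android.view.GestureDetector;
    import android.view.MotionEvent;

    // Hypothetical helper: a horizontal fling adds angular velocity, and the
    // velocity decays a little each rendered frame so the spin slows naturally.
    public class TouchSpinController extends GestureDetector.SimpleOnGestureListener {
        private static final float VELOCITY_SCALE = 0.01f; // fling speed -> degrees/frame
        private static final float FRICTION = 0.98f;       // per-frame decay factor

        private volatile float angularVelocity; // degrees per frame
        private float angle;                    // rotation about the vertical axis

        @Override
        public boolean onFling(MotionEvent e1, MotionEvent e2,
                               float velocityX, float velocityY) {
            // A faster horizontal fling spins faster; the sign gives the direction.
            angularVelocity += velocityX * VELOCITY_SCALE;
            return true;
        }

        // Called once per rendered frame, e.g., from Renderer.onDrawFrame().
        public float nextAngle() {
            angle = (angle + angularVelocity) % 360f;
            angularVelocity *= FRICTION; // decelerate once swiping stops
            return angle;
        }
    }

The view would wrap this listener in a GestureDetector, forward its touch events to it, and have the renderer call nextAngle() on every frame.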
The three major components in this app are: the UI component, which gives the user an interactive presentation of the app; the video component, which performs the video playback; and the OpenGL* surface, which renders the video frame buffer and creates the 3D effect in the special modes.
The video playback component has two threads. The video engine thread uses the MediaCodec class, which was introduced in the Android Jelly Bean release (API 16). It exposes the low-level API of the Android media framework, so playback can be controlled at the frame level: on each video frame, the graphic component can hold the image data and change the way the image is rendered.
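The decode loop might look roughly like the sketch below, which uses the API 16 style of MediaCodec buffer access together with MediaExtractor; the method name and the caller-supplied path and Surface are assumptions for the sketch.

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    // Minimal frame-level decode loop; "surface" is backed by the GL texture.
    void decodeVideo(String path, Surface surface) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        // Find and select the first video track (the sketch assumes one exists).
        MediaFormat format = null;
        String mime = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            format = extractor.getTrackFormat(i);
            mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("video/")) { extractor.selectTrack(i); break; }
        }

        MediaCodec decoder = MediaCodec.createDecoderByType(mime);
        decoder.configure(format, surface, null, 0); // output goes to the GL surface
        decoder.start();

        ByteBuffer[] inputBuffers = decoder.getInputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false, outputDone = false;

        while (!outputDone) {
            if (!inputDone) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    int size = extractor.readSampleData(inputBuffers[inIndex], 0);
                    if (size < 0) {
                        decoder.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        decoder.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                // This is the per-frame hook: the app can pace the frame here
                // before releasing it; "true" renders it to the surface.
                decoder.releaseOutputBuffer(outIndex, info.size != 0);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
        decoder.stop(); decoder.release(); extractor.release();
    }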
The playback component also implements an audio engine thread that plays the audio track of the video clip. To synchronize the video and audio threads, so that the characters' lips match their voices when they speak, an A/V sync algorithm is implemented: the video playback constantly compares its time stamp with the audio time stamp and tweaks the playback pace so that the video frame never drifts more than 30 milliseconds from the audio.
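The pacing check might look roughly like this; the AvSync class and its getAudioTimeUs() method are hypothetical stand-ins for the app's audio clock (which could, for example, be derived from AudioTrack.getPlaybackHeadPosition()).

    // Hypothetical pacing helper for the video thread.
    class AvSync {
        static final long AV_SYNC_THRESHOLD_US = 30000; // 30 ms, per the design

        long getAudioTimeUs() { /* query the audio engine's clock */ return 0; }

        void waitForAudio(long videoPtsUs) throws InterruptedException {
            long driftUs = videoPtsUs - getAudioTimeUs();
            if (driftUs > AV_SYNC_THRESHOLD_US) {
                Thread.sleep(driftUs / 1000); // video is ahead: hold the frame
            }
            // If video lags behind by more than the threshold, the frame can
            // instead be dropped (released without rendering) to catch up.
        }
    }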
The graphic component extends GLSurfaceView and embeds a customized rendering class that implements the Renderer interface. This renderer carries out the OpenGL* rendering algorithm: it applies the 3D spin effect to the textured surface and switches between algorithms according to the play mode the user selects.
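The structure might look roughly like this skeleton (the class names are assumptions):

    import android.content.Context;
    import android.opengl.GLSurfaceView;
    import javax.microedition.khronos.egl.EGLConfig;
    import javax.microedition.khronos.opengles.GL10;

    public class VideoSurfaceView extends GLSurfaceView {
        public VideoSurfaceView(Context context) {
            super(context);
            setEGLContextClientVersion(2);     // OpenGL ES 2.0 for shader support
            setRenderer(new VideoRenderer());  // embed the customized renderer
        }

        private static class VideoRenderer implements GLSurfaceView.Renderer {
            private volatile int playMode; // switched from the UI thread

            @Override public void onSurfaceCreated(GL10 gl, EGLConfig config) {
                // Compile the shaders and create the external video texture.
            }
            @Override public void onSurfaceChanged(GL10 gl, int width, int height) {
                // Update the viewport and projection matrix.
            }
            @Override public void onDrawFrame(GL10 gl) {
                // Pick the algorithm for the current play mode and draw the quad.
            }
        }
    }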
In the renderer, the spin and coloring of the video frame are handled by a simple vertex shader and pixel shader. The spin of the video plane is performed by applying a series of Euler-angle rotations, and the colorization of the frame is achieved by linearly mixing the video frame contents with a fixed color (red in this case) inside the pixel shader.
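The shader pair might resemble the sketch below, written as string fields of the renderer class in the usual Android OpenGL ES 2.0 style; the uniform and attribute names are assumptions. The Euler-angle rotations would be composed on the CPU (for example with android.opengl.Matrix.rotateM about the x and y axes) and uploaded as part of the model-view-projection matrix.

    // Vertex shader: rotates the video plane by the precomputed MVP matrix.
    static final String VERTEX_SHADER =
            "uniform mat4 uMvpMatrix;\n" +   // includes the Euler-angle rotations
            "attribute vec4 aPosition;\n" +
            "attribute vec2 aTexCoord;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_Position = uMvpMatrix * aPosition;\n" +
            "  vTexCoord = aTexCoord;\n" +
            "}\n";

    // Pixel shader: mixes the video frame with a fixed red by uMixAmount.
    static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "uniform samplerExternalOES uVideoFrame;\n" + // decoded video texture
            "uniform float uMixAmount;\n" +               // 0.0 = no tint, 1.0 = full red
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  vec4 frame = texture2D(uVideoFrame, vTexCoord);\n" +
            "  gl_FragColor = mix(frame, vec4(1.0, 0.0, 0.0, 1.0), uMixAmount);\n" +
            "}\n";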
The graphic component also works as a video frame listener for the video engine; it is passed to the MediaCodec object during construction so that the rendering function is called each time a frame is available. The graphic component also exposes a function to the UI component so that the user can change the play mode.
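One common way to wire such a listener, assuming the app follows the standard SurfaceTexture pattern inside the GLSurfaceView subclass (the method name and glTextureId parameter are assumptions):

    import android.graphics.SurfaceTexture;
    import android.view.Surface;

    private SurfaceTexture videoTexture;

    // glTextureId is a GL_TEXTURE_EXTERNAL_OES texture created on the GL thread.
    Surface createDecoderSurface(int glTextureId) {
        videoTexture = new SurfaceTexture(glTextureId);
        videoTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override public void onFrameAvailable(SurfaceTexture st) {
                requestRender(); // wake the GL thread for the new video frame
            }
        });
        // The decoder renders into this Surface, e.g.:
        // decoder.configure(format, decoderSurface, null, 0);
        return new Surface(videoTexture);
    }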
The MediaCodec decoding process works at the pace of the video frame rate, and the render function in the graphic component works at the pace of the graphics frame rate, so the functions in both components must be synchronized when they access the frame buffer.
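A typical way to do this, again assuming the standard SurfaceTexture hand-off, is to guard a frame-available flag with a shared lock inside the renderer: the decoder-paced callback only sets the flag, and the GL-paced draw call consumes the frame under the same lock.

    private final Object frameLock = new Object();
    private boolean frameAvailable = false;

    @Override
    public void onFrameAvailable(SurfaceTexture st) { // runs at the decoder's pace
        synchronized (frameLock) {
            frameAvailable = true;
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) { // runs at the graphics frame rate
        synchronized (frameLock) {
            if (frameAvailable) {
                videoTexture.updateTexImage(); // latch the newest decoded frame
                frameAvailable = false;
            }
        }
        // ... draw the textured, spinning quad ...
    }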
For more Android resources and tools from Intel, please visit the Intel® Developer Zone.