Commit 70abbb96 by sofijajvc Committed by Ian Baker

Add test AV1 video to demo app

PiperOrigin-RevId: 273706425
parent 4026c8a0
@@ -414,6 +414,10 @@
{
"name": "Big Buck Bunny (FLV Video)",
"uri": "https://vod.leasewebcdn.com/bbb.flv?ri=1024&rs=150&start=0"
},
{
"name": "Big Buck Bunny 480p (MP4, AV1 video)",
"uri": "https://storage.googleapis.com/downloads.webmproject.org/av1/exoplayer/bbb-av1-480p.mp4"
}
]
},
......
# ExoPlayer AV1 extension #

The AV1 extension provides `Libgav1VideoRenderer`, which uses the libgav1
native library to decode AV1 videos.

## License note ##

Please note that whilst the code in this repository is licensed under
[Apache 2.0][], using this extension also requires building and including one or
more external libraries as described below. These are licensed separately.

[Apache 2.0]: https://github.com/google/ExoPlayer/blob/release-v2/LICENSE

## Build instructions ##

To use this extension you need to clone the ExoPlayer repository and depend on
its modules locally. Instructions for doing this can be found in ExoPlayer's
[top level README][]. In addition, it's necessary to fetch libgav1 and its
dependencies as follows:

* Set the following environment variables:
```
cd "<path to exoplayer checkout>"
EXOPLAYER_ROOT="$(pwd)"
AV1_EXT_PATH="${EXOPLAYER_ROOT}/extensions/av1/src/main"
```
* Fetch libgav1:
```
cd "${AV1_EXT_PATH}/jni" && \
git clone https://chromium.googlesource.com/codecs/libgav1 libgav1
```
* Fetch Abseil:
```
cd "${AV1_EXT_PATH}/jni/libgav1" && \
git clone https://github.com/abseil/abseil-cpp.git third_party/abseil-cpp
```

libgav1 and the [JNI wrapper library][] are built using [CMake][] with
[Ninja][]. After following the instructions above to fetch libgav1, Gradle will
build the extension automatically when run from the command line or via Android
Studio.

[top level README]: https://github.com/google/ExoPlayer/blob/release-v2/README.md
[JNI wrapper library]: https://github.com/google/ExoPlayer/blob/release-v2/extensions/av1/src/main/jni/gav1_jni.cc
[CMake]: https://cmake.org/
[Ninja]: https://ninja-build.org
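
With libgav1 and Abseil in place, a typical command-line build might look like
the following. This is a sketch: the module name `extension-av1` is an
assumption here — check `settings.gradle` in your checkout for the exact name.

```shell
# Assumes EXOPLAYER_ROOT was set as in the steps above, and that the
# extension module is named "extension-av1" (verify against settings.gradle).
cd "${EXOPLAYER_ROOT}"
./gradlew :extension-av1:assembleRelease
```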

## Using the extension ##

Once you've followed the instructions above to check out, build and depend on
the extension, the next step is to tell ExoPlayer to use `Libgav1VideoRenderer`.
How you do this depends on which player API you're using:

* If you're passing a `DefaultRenderersFactory` to `SimpleExoPlayer.Builder`,
you can enable using the extension by setting the `extensionRendererMode`
parameter of the `DefaultRenderersFactory` constructor to
`EXTENSION_RENDERER_MODE_ON`. This will use `Libgav1VideoRenderer` for
playback if `MediaCodecVideoRenderer` doesn't support decoding the input AV1
stream. Pass `EXTENSION_RENDERER_MODE_PREFER` to give `Libgav1VideoRenderer`
priority over `MediaCodecVideoRenderer`.
* If you've subclassed `DefaultRenderersFactory`, add a `Libgav1VideoRenderer`
to the output list in `buildVideoRenderers`. ExoPlayer will use the first
`Renderer` in the list that supports the input media format.
* If you've implemented your own `RenderersFactory`, return a
`Libgav1VideoRenderer` instance from `createRenderers`. ExoPlayer will use the
first `Renderer` in the returned array that supports the input media format.
* If you're using `ExoPlayer.Builder`, pass a `Libgav1VideoRenderer` in the
array of `Renderer`s. ExoPlayer will use the first `Renderer` in the list that
supports the input media format.
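
As a sketch of the first option above (assuming the ExoPlayer 2.x API at this
revision, and a `Context` named `context` from the surrounding app code):

```java
// Prefer Libgav1VideoRenderer over MediaCodecVideoRenderer. Use
// EXTENSION_RENDERER_MODE_ON instead to fall back to the extension only
// when the platform decoder doesn't support the stream.
DefaultRenderersFactory renderersFactory =
    new DefaultRenderersFactory(
        context, DefaultRenderersFactory.EXTENSION_RENDERER_MODE_PREFER);
SimpleExoPlayer player =
    new SimpleExoPlayer.Builder(context, renderersFactory).build();
```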

Note: These instructions assume you're using `DefaultTrackSelector`. If you have
a custom track selector, the choice of `Renderer` is up to your implementation.
You need to make sure you are passing a `Libgav1VideoRenderer` to the player and
then implement your own logic to use the renderer for a given track.

There are two possibilities for rendering the output `Libgav1VideoRenderer`
gets from the libgav1 decoder:

* Native rendering with `ANativeWindow`.
* OpenGL rendering.

`SimpleExoPlayer` uses `ANativeWindow` rendering. To enable this mode, send the
renderer a message of type `C.MSG_SET_SURFACE` with a `Surface` as its payload.

`Libgav1VideoRenderer` can also output to a `VideoDecoderSurfaceView` when not
being used via `SimpleExoPlayer`, in which case color space conversion is
performed using a GL shader. To enable this mode, send the renderer a message
of type `C.MSG_SET_OUTPUT_BUFFER_RENDERER` with the `VideoDecoderSurfaceView`
as its payload.

Note: Although the default option uses `ANativeWindow`, the GL rendering mode
had better performance in our testing, so it should be preferred by apps that
can use `VideoDecoderSurfaceView`.
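
For example, when driving the renderer directly (a sketch: `player`,
`libgav1VideoRenderer`, `surface` and `videoDecoderSurfaceView` are assumed to
exist in the surrounding code, and `VideoDecoderSurfaceView` is assumed to
implement `VideoDecoderOutputBufferRenderer`):

```java
// ANativeWindow rendering: pass a Surface as the message payload.
player
    .createMessage(libgav1VideoRenderer)
    .setType(C.MSG_SET_SURFACE)
    .setPayload(surface)
    .send();

// GL rendering: pass the VideoDecoderSurfaceView as the payload for
// C.MSG_SET_OUTPUT_BUFFER_RENDERER.
player
    .createMessage(libgav1VideoRenderer)
    .setType(C.MSG_SET_OUTPUT_BUFFER_RENDERER)
    .setPayload(videoDecoderSurfaceView)
    .send();
```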

## Links ##

* [Javadoc][]: Classes matching `com.google.android.exoplayer2.ext.av1.*`
  belong to this module.

[Javadoc]: https://exoplayer.dev/doc/reference/index.html
// Copyright (C) 2019 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
apply from: '../../constants.gradle'
apply plugin: 'com.android.library'
android {
compileSdkVersion project.ext.compileSdkVersion
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
defaultConfig {
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.targetSdkVersion
consumerProguardFiles 'proguard-rules.txt'
externalNativeBuild {
cmake {
// The Debug CMake build type causes video frames to drop,
// so the native library should always use the Release build type.
arguments "-DCMAKE_BUILD_TYPE=Release"
targets "gav1JNI"
}
}
}
// This option resolves the problem of finding libgav1JNI.so
// on multiple paths. The first one found is picked.
packagingOptions {
pickFirst 'lib/arm64-v8a/libgav1JNI.so'
pickFirst 'lib/armeabi-v7a/libgav1JNI.so'
pickFirst 'lib/x86/libgav1JNI.so'
pickFirst 'lib/x86_64/libgav1JNI.so'
}
sourceSets.main {
// As the native JNI library build is invoked from Gradle, this is
// not needed. However, it exposes the built library and keeps
// consistency with the other extensions.
jniLibs.srcDir 'src/main/libs'
}
}
// Check that the Android NDK and CMake are present, to avoid Gradle sync
// failures if they aren't installed. To use the extension it is necessary to
// install the NDK and CMake from the SDK Manager.
def ndkDirectory = project.android.ndkDirectory
def sdkDirectory = project.android.sdkDirectory
if (ndkDirectory != null && !ndkDirectory.toString().isEmpty()
&& new File(sdkDirectory.toString() + '/cmake').exists()) {
android.externalNativeBuild.cmake.path = 'src/main/jni/CMakeLists.txt'
android.externalNativeBuild.cmake.version = '3.7.1+'
}
dependencies {
implementation project(modulePrefix + 'library-core')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
}
ext {
javadocTitle = 'AV1 extension'
}
apply from: '../../javadoc_library.gradle'
# Proguard rules specific to the AV1 extension.
# This prevents the names of native methods from being obfuscated.
-keepclasseswithmembernames class * {
native <methods>;
}
# Some members of this class are being accessed from native methods. Keep them unobfuscated.
-keep class com.google.android.exoplayer2.ext.av1.Gav1OutputBuffer {
*;
}
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright (C) 2019 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<manifest package="com.google.android.exoplayer2.ext.av1"/>
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.av1;
import android.view.Surface;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.decoder.SimpleDecoder;
import com.google.android.exoplayer2.util.Util;
import com.google.android.exoplayer2.video.VideoDecoderInputBuffer;
import com.google.android.exoplayer2.video.VideoDecoderOutputBuffer;
import java.nio.ByteBuffer;
/** Gav1 decoder. */
/* package */ final class Gav1Decoder
extends SimpleDecoder<VideoDecoderInputBuffer, VideoDecoderOutputBuffer, Gav1DecoderException> {
// LINT.IfChange
private static final int GAV1_ERROR = 0;
private static final int GAV1_OK = 1;
private static final int GAV1_DECODE_ONLY = 2;
// LINT.ThenChange(../../../../../../../jni/gav1_jni.cc)
private final long gav1DecoderContext;
@C.VideoOutputMode private volatile int outputMode;
/**
* Creates a Gav1Decoder.
*
* @param numInputBuffers Number of input buffers.
* @param numOutputBuffers Number of output buffers.
* @param initialInputBufferSize The initial size of each input buffer, in bytes.
* @param threads Number of threads libgav1 will use to decode.
* @throws Gav1DecoderException Thrown if an exception occurs when initializing the decoder.
*/
public Gav1Decoder(
int numInputBuffers, int numOutputBuffers, int initialInputBufferSize, int threads)
throws Gav1DecoderException {
super(
new VideoDecoderInputBuffer[numInputBuffers],
new VideoDecoderOutputBuffer[numOutputBuffers]);
if (!Gav1Library.isAvailable()) {
throw new Gav1DecoderException("Failed to load decoder native library.");
}
gav1DecoderContext = gav1Init(threads);
if (gav1DecoderContext == GAV1_ERROR || gav1CheckError(gav1DecoderContext) == GAV1_ERROR) {
throw new Gav1DecoderException(
"Failed to initialize decoder. Error: " + gav1GetErrorMessage(gav1DecoderContext));
}
setInitialInputBufferSize(initialInputBufferSize);
}
@Override
public String getName() {
return "libgav1";
}
/**
* Sets the output mode for frames rendered by the decoder.
*
* @param outputMode The output mode.
*/
public void setOutputMode(@C.VideoOutputMode int outputMode) {
this.outputMode = outputMode;
}
@Override
protected VideoDecoderInputBuffer createInputBuffer() {
return new VideoDecoderInputBuffer();
}
@Override
protected VideoDecoderOutputBuffer createOutputBuffer() {
return new VideoDecoderOutputBuffer(this::releaseOutputBuffer);
}
@Nullable
@Override
protected Gav1DecoderException decode(
VideoDecoderInputBuffer inputBuffer, VideoDecoderOutputBuffer outputBuffer, boolean reset) {
ByteBuffer inputData = Util.castNonNull(inputBuffer.data);
int inputSize = inputData.limit();
if (gav1Decode(gav1DecoderContext, inputData, inputSize) == GAV1_ERROR) {
return new Gav1DecoderException(
"gav1Decode error: " + gav1GetErrorMessage(gav1DecoderContext));
}
boolean decodeOnly = inputBuffer.isDecodeOnly();
if (!decodeOnly) {
@Nullable
ByteBuffer supplementalData =
inputBuffer.hasSupplementalData() ? inputBuffer.supplementalData : null;
outputBuffer.init(inputBuffer.timeUs, outputMode, supplementalData);
}
// We need to dequeue the decoded frame from the decoder even when the input data is
// decode-only.
int getFrameResult = gav1GetFrame(gav1DecoderContext, outputBuffer, decodeOnly);
if (getFrameResult == GAV1_ERROR) {
return new Gav1DecoderException(
"gav1GetFrame error: " + gav1GetErrorMessage(gav1DecoderContext));
}
if (getFrameResult == GAV1_DECODE_ONLY) {
outputBuffer.addFlag(C.BUFFER_FLAG_DECODE_ONLY);
}
if (!decodeOnly) {
outputBuffer.colorInfo = inputBuffer.colorInfo;
}
return null;
}
@Override
protected Gav1DecoderException createUnexpectedDecodeException(Throwable error) {
return new Gav1DecoderException("Unexpected decode error", error);
}
@Override
public void release() {
super.release();
gav1Close(gav1DecoderContext);
}
@Override
protected void releaseOutputBuffer(VideoDecoderOutputBuffer buffer) {
// Decode-only frames do not acquire a reference on the internal decoder buffer and thus do not
// require a call to gav1ReleaseFrame.
if (buffer.mode == C.VIDEO_OUTPUT_MODE_SURFACE_YUV && !buffer.isDecodeOnly()) {
gav1ReleaseFrame(gav1DecoderContext, buffer);
}
super.releaseOutputBuffer(buffer);
}
/**
* Renders output buffer to the given surface. Must only be called when in {@link
* C#VIDEO_OUTPUT_MODE_SURFACE_YUV} mode.
*
* @param outputBuffer Output buffer.
* @param surface Output surface.
* @throws Gav1DecoderException Thrown if called with invalid output mode or frame rendering
* fails.
*/
public void renderToSurface(VideoDecoderOutputBuffer outputBuffer, Surface surface)
throws Gav1DecoderException {
if (outputBuffer.mode != C.VIDEO_OUTPUT_MODE_SURFACE_YUV) {
throw new Gav1DecoderException("Invalid output mode.");
}
if (gav1RenderFrame(gav1DecoderContext, surface, outputBuffer) == GAV1_ERROR) {
throw new Gav1DecoderException(
"Buffer render error: " + gav1GetErrorMessage(gav1DecoderContext));
}
}
/**
* Initializes libgav1 decoder.
*
* @param threads Number of threads used by libgav1 decoder.
* @return The address of the decoder context or {@link #GAV1_ERROR} if there was an error.
*/
private native long gav1Init(int threads);
/**
* Deallocates the decoder context.
*
* @param context Decoder context.
*/
private native void gav1Close(long context);
/**
* Decodes the encoded data passed.
*
* @param context Decoder context.
* @param encodedData Encoded data.
* @param length Length of the data buffer.
* @return {@link #GAV1_OK} if successful, {@link #GAV1_ERROR} if an error occurred.
*/
private native int gav1Decode(long context, ByteBuffer encodedData, int length);
/**
* Gets the decoded frame.
*
* @param context Decoder context.
* @param outputBuffer Output buffer for the decoded frame.
* @return {@link #GAV1_OK} if successful, {@link #GAV1_DECODE_ONLY} if successful but the frame
* is decode-only, {@link #GAV1_ERROR} if an error occurred.
*/
private native int gav1GetFrame(
long context, VideoDecoderOutputBuffer outputBuffer, boolean decodeOnly);
/**
* Renders the frame to the surface. Used with {@link C#VIDEO_OUTPUT_MODE_SURFACE_YUV} only.
*
* @param context Decoder context.
* @param surface Output surface.
* @param outputBuffer Output buffer with the decoded frame.
* @return {@link #GAV1_OK} if successful, {@link #GAV1_ERROR} if an error occurred.
*/
private native int gav1RenderFrame(
long context, Surface surface, VideoDecoderOutputBuffer outputBuffer);
/**
* Releases the frame. Used with {@link C#VIDEO_OUTPUT_MODE_SURFACE_YUV} only.
*
* @param context Decoder context.
* @param outputBuffer Output buffer.
*/
private native void gav1ReleaseFrame(long context, VideoDecoderOutputBuffer outputBuffer);
/**
* Returns a human-readable string describing the last error encountered in the given context.
*
* @param context Decoder context.
* @return A string describing the last encountered error.
*/
private native String gav1GetErrorMessage(long context);
/**
* Returns whether an error occurred.
*
* @param context Decoder context.
* @return {@link #GAV1_OK} if there was no error, {@link #GAV1_ERROR} if an error occurred.
*/
private native int gav1CheckError(long context);
}
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.av1;
import com.google.android.exoplayer2.video.VideoDecoderException;
/** Thrown when a libgav1 decoder error occurs. */
public final class Gav1DecoderException extends VideoDecoderException {
/* package */ Gav1DecoderException(String message) {
super(message);
}
/* package */ Gav1DecoderException(String message, Throwable cause) {
super(message, cause);
}
}
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.av1;
import com.google.android.exoplayer2.ExoPlayerLibraryInfo;
import com.google.android.exoplayer2.util.LibraryLoader;
/** Configures and queries the underlying native library. */
public final class Gav1Library {
static {
ExoPlayerLibraryInfo.registerModule("goog.exo.gav1");
}
private static final LibraryLoader LOADER = new LibraryLoader("gav1JNI");
private Gav1Library() {}
/**
* Overrides the names of the Gav1 native libraries. If an application wishes to call this method,
* it must do so before calling any other method defined by this class, and before instantiating a
* {@link Libgav1VideoRenderer} instance.
*
* @param libraries The names of the Gav1 native libraries.
*/
public static void setLibraries(String... libraries) {
LOADER.setLibraries(libraries);
}
/** Returns whether the underlying library is available, loading it if necessary. */
public static boolean isAvailable() {
return LOADER.isAvailable();
}
}
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.av1;
import static java.lang.Runtime.getRuntime;
import android.os.Handler;
import android.view.Surface;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.ExoPlaybackException;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.PlayerMessage.Target;
import com.google.android.exoplayer2.decoder.SimpleDecoder;
import com.google.android.exoplayer2.drm.DrmSessionManager;
import com.google.android.exoplayer2.drm.ExoMediaCrypto;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.exoplayer2.util.TraceUtil;
import com.google.android.exoplayer2.util.Util;
import com.google.android.exoplayer2.video.SimpleDecoderVideoRenderer;
import com.google.android.exoplayer2.video.VideoDecoderException;
import com.google.android.exoplayer2.video.VideoDecoderInputBuffer;
import com.google.android.exoplayer2.video.VideoDecoderOutputBuffer;
import com.google.android.exoplayer2.video.VideoDecoderOutputBufferRenderer;
import com.google.android.exoplayer2.video.VideoRendererEventListener;
/**
* Decodes and renders video using the native libgav1 decoder.
*
* <p>This renderer accepts the following messages sent via {@link ExoPlayer#createMessage(Target)}
* on the playback thread:
*
* <ul>
* <li>Message with type {@link C#MSG_SET_SURFACE} to set the output surface. The message payload
* should be the target {@link Surface}, or null.
* <li>Message with type {@link C#MSG_SET_OUTPUT_BUFFER_RENDERER} to set the output buffer
* renderer. The message payload should be the target {@link
* VideoDecoderOutputBufferRenderer}, or null.
* </ul>
*/
public class Libgav1VideoRenderer extends SimpleDecoderVideoRenderer {
private static final int DEFAULT_NUM_OF_INPUT_BUFFERS = 4;
private static final int DEFAULT_NUM_OF_OUTPUT_BUFFERS = 4;
/* Default size based on 720p resolution video compressed by a factor of two. */
private static final int DEFAULT_INPUT_BUFFER_SIZE =
Util.ceilDivide(1280, 64) * Util.ceilDivide(720, 64) * (64 * 64 * 3 / 2) / 2;
/** The number of input buffers. */
private final int numInputBuffers;
/**
* The number of output buffers. The renderer may limit the minimum possible value due to
* requiring multiple output buffers to be dequeued at a time for it to make progress.
*/
private final int numOutputBuffers;
private final int threads;
@Nullable private Gav1Decoder decoder;
/**
* Creates a Libgav1VideoRenderer.
*
* @param allowedJoiningTimeMs The maximum duration in milliseconds for which this video renderer
* can attempt to seamlessly join an ongoing playback.
* @param eventHandler A handler to use when delivering events to {@code eventListener}. May be
* null if delivery of events is not required.
* @param eventListener A listener of events. May be null if delivery of events is not required.
* @param maxDroppedFramesToNotify The maximum number of frames that can be dropped between
* invocations of {@link VideoRendererEventListener#onDroppedFrames(int, long)}.
*/
public Libgav1VideoRenderer(
long allowedJoiningTimeMs,
@Nullable Handler eventHandler,
@Nullable VideoRendererEventListener eventListener,
int maxDroppedFramesToNotify) {
this(
allowedJoiningTimeMs,
eventHandler,
eventListener,
maxDroppedFramesToNotify,
/* threads= */ getRuntime().availableProcessors(),
DEFAULT_NUM_OF_INPUT_BUFFERS,
DEFAULT_NUM_OF_OUTPUT_BUFFERS);
}
/**
* Creates a Libgav1VideoRenderer.
*
* @param allowedJoiningTimeMs The maximum duration in milliseconds for which this video renderer
* can attempt to seamlessly join an ongoing playback.
* @param eventHandler A handler to use when delivering events to {@code eventListener}. May be
* null if delivery of events is not required.
* @param eventListener A listener of events. May be null if delivery of events is not required.
* @param maxDroppedFramesToNotify The maximum number of frames that can be dropped between
* invocations of {@link VideoRendererEventListener#onDroppedFrames(int, long)}.
* @param threads Number of threads libgav1 will use to decode.
* @param numInputBuffers Number of input buffers.
* @param numOutputBuffers Number of output buffers.
*/
public Libgav1VideoRenderer(
long allowedJoiningTimeMs,
@Nullable Handler eventHandler,
@Nullable VideoRendererEventListener eventListener,
int maxDroppedFramesToNotify,
int threads,
int numInputBuffers,
int numOutputBuffers) {
super(
allowedJoiningTimeMs,
eventHandler,
eventListener,
maxDroppedFramesToNotify,
/* drmSessionManager= */ null,
/* playClearSamplesWithoutKeys= */ false);
this.threads = threads;
this.numInputBuffers = numInputBuffers;
this.numOutputBuffers = numOutputBuffers;
}
@Override
protected int supportsFormatInternal(
@Nullable DrmSessionManager<ExoMediaCrypto> drmSessionManager, Format format) {
if (!MimeTypes.VIDEO_AV1.equalsIgnoreCase(format.sampleMimeType)
|| !Gav1Library.isAvailable()) {
return FORMAT_UNSUPPORTED_TYPE;
}
if (!supportsFormatDrm(drmSessionManager, format.drmInitData)) {
return FORMAT_UNSUPPORTED_DRM;
}
return FORMAT_HANDLED | ADAPTIVE_SEAMLESS;
}
@Override
protected SimpleDecoder<
VideoDecoderInputBuffer,
? extends VideoDecoderOutputBuffer,
? extends VideoDecoderException>
createDecoder(Format format, @Nullable ExoMediaCrypto mediaCrypto)
throws VideoDecoderException {
TraceUtil.beginSection("createGav1Decoder");
int initialInputBufferSize =
format.maxInputSize != Format.NO_VALUE ? format.maxInputSize : DEFAULT_INPUT_BUFFER_SIZE;
Gav1Decoder decoder =
new Gav1Decoder(numInputBuffers, numOutputBuffers, initialInputBufferSize, threads);
this.decoder = decoder;
TraceUtil.endSection();
return decoder;
}
@Override
protected void renderOutputBufferToSurface(VideoDecoderOutputBuffer outputBuffer, Surface surface)
throws Gav1DecoderException {
if (decoder == null) {
throw new Gav1DecoderException(
"Failed to render output buffer to surface: decoder is not initialized.");
}
decoder.renderToSurface(outputBuffer, surface);
outputBuffer.release();
}
@Override
protected void setDecoderOutputMode(@C.VideoOutputMode int outputMode) {
if (decoder != null) {
decoder.setOutputMode(outputMode);
}
}
// PlayerMessage.Target implementation.
@Override
public void handleMessage(int messageType, @Nullable Object message) throws ExoPlaybackException {
if (messageType == C.MSG_SET_SURFACE) {
setOutputSurface((Surface) message);
} else if (messageType == C.MSG_SET_OUTPUT_BUFFER_RENDERER) {
setOutputBufferRenderer((VideoDecoderOutputBufferRenderer) message);
} else {
super.handleMessage(messageType, message);
}
}
}
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
@NonNullApi
package com.google.android.exoplayer2.ext.av1;
import com.google.android.exoplayer2.util.NonNullApi;
# libgav1JNI requires modern CMake.
cmake_minimum_required(VERSION 3.7.1 FATAL_ERROR)
# libgav1JNI requires C++11.
set(CMAKE_CXX_STANDARD 11)
project(libgav1JNI C CXX)
# Devices using armeabi-v7a are not required to support
# Neon, which is why Neon is disabled by default for
# armeabi-v7a builds. This flag enables it.
if(${ANDROID_ABI} MATCHES "armeabi-v7a")
add_compile_options("-mfpu=neon")
endif()
set(libgav1_jni_root "${CMAKE_CURRENT_SOURCE_DIR}")
set(libgav1_jni_build "${CMAKE_BINARY_DIR}")
set(libgav1_jni_output_directory
${libgav1_jni_root}/../libs/${ANDROID_ABI}/)
set(libgav1_root "${libgav1_jni_root}/libgav1")
set(libgav1_build "${libgav1_jni_build}/libgav1")
# Build libgav1.
add_subdirectory("${libgav1_root}"
"${libgav1_build}"
EXCLUDE_FROM_ALL)
# Build libgav1JNI.
add_library(gav1JNI
SHARED
gav1_jni.cc)
# Locate NDK log library.
find_library(android_log_lib log)
# Build cpufeatures library.
include(AndroidNdkModules)
android_ndk_import_module_cpufeatures()
# Link libgav1JNI against used libraries.
target_link_libraries(gav1JNI
PRIVATE android
PRIVATE cpufeatures
PRIVATE libgav1_static
PRIVATE ${android_log_lib})
# Specify output directory for libgav1JNI.
set_target_properties(gav1JNI PROPERTIES
LIBRARY_OUTPUT_DIRECTORY
${libgav1_jni_output_directory})
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#include <android/log.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>
#ifdef __ARM_NEON
#include <arm_neon.h>
#endif // __ARM_NEON
#include <cpu-features.h>
#include <jni.h>
#include <cstring>
#include <mutex> // NOLINT
#include <new>
#include "gav1/decoder.h"
#define LOG_TAG "gav1_jni"
#define LOGE(...) \
((void)__android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__))
#define DECODER_FUNC(RETURN_TYPE, NAME, ...) \
extern "C" { \
JNIEXPORT RETURN_TYPE \
Java_com_google_android_exoplayer2_ext_av1_Gav1Decoder_##NAME( \
JNIEnv* env, jobject thiz, ##__VA_ARGS__); \
} \
JNIEXPORT RETURN_TYPE \
Java_com_google_android_exoplayer2_ext_av1_Gav1Decoder_##NAME( \
JNIEnv* env, jobject thiz, ##__VA_ARGS__)
jint JNI_OnLoad(JavaVM* vm, void* reserved) {
JNIEnv* env;
if (vm->GetEnv(reinterpret_cast<void**>(&env), JNI_VERSION_1_6) != JNI_OK) {
return -1;
}
return JNI_VERSION_1_6;
}
namespace {
// YUV plane indices.
const int kPlaneY = 0;
const int kPlaneU = 1;
const int kPlaneV = 2;
const int kMaxPlanes = 3;
// Android YUV format. See:
// https://developer.android.com/reference/android/graphics/ImageFormat.html#YV12.
const int kImageFormatYV12 = 0x32315659;
// LINT.IfChange
// Output modes.
const int kOutputModeYuv = 0;
const int kOutputModeSurfaceYuv = 1;
// LINT.ThenChange(../../../../../library/core/src/main/java/com/google/android/exoplayer2/C.java)
// LINT.IfChange
const int kColorSpaceUnknown = 0;
// LINT.ThenChange(../../../../../library/core/src/main/java/com/google/android/exoplayer2/video/VideoDecoderOutputBuffer.java)
// LINT.IfChange
// Return codes for jni methods.
const int kStatusError = 0;
const int kStatusOk = 1;
const int kStatusDecodeOnly = 2;
// LINT.ThenChange(../java/com/google/android/exoplayer2/ext/av1/Gav1Decoder.java)
// Status codes specific to the JNI wrapper code.
enum JniStatusCode {
kJniStatusOk = 0,
kJniStatusOutOfMemory = -1,
kJniStatusBufferAlreadyReleased = -2,
kJniStatusInvalidNumOfPlanes = -3,
kJniStatusBitDepth12NotSupportedWithYuv = -4,
kJniStatusHighBitDepthNotSupportedWithSurfaceYuv = -5,
kJniStatusANativeWindowError = -6,
kJniStatusBufferResizeError = -7,
kJniStatusNeonNotSupported = -8
};
const char* GetJniErrorMessage(JniStatusCode error_code) {
switch (error_code) {
case kJniStatusOutOfMemory:
return "Out of memory.";
case kJniStatusBufferAlreadyReleased:
return "JNI buffer already released.";
case kJniStatusBitDepth12NotSupportedWithYuv:
return "Bit depth 12 is not supported with YUV.";
case kJniStatusHighBitDepthNotSupportedWithSurfaceYuv:
return "High bit depth (10 or 12 bits per pixel) output format is not "
"supported with YUV surface.";
case kJniStatusInvalidNumOfPlanes:
return "Libgav1 decoded buffer has invalid number of planes.";
case kJniStatusANativeWindowError:
return "ANativeWindow error.";
case kJniStatusBufferResizeError:
return "Buffer resize failed.";
case kJniStatusNeonNotSupported:
return "Neon is not supported.";
default:
return "Unrecognized error code.";
}
}
// Manages Libgav1FrameBuffer and reference information.
class JniFrameBuffer {
public:
explicit JniFrameBuffer(int id) : id_(id), reference_count_(0) {
gav1_frame_buffer_.private_data = &id_;
}
~JniFrameBuffer() {
for (int plane_index = kPlaneY; plane_index < kMaxPlanes; plane_index++) {
delete[] gav1_frame_buffer_.data[plane_index];
}
}
void SetFrameData(const libgav1::DecoderBuffer& decoder_buffer) {
for (int plane_index = kPlaneY; plane_index < decoder_buffer.NumPlanes();
plane_index++) {
stride_[plane_index] = decoder_buffer.stride[plane_index];
plane_[plane_index] = decoder_buffer.plane[plane_index];
displayed_width_[plane_index] =
decoder_buffer.displayed_width[plane_index];
displayed_height_[plane_index] =
decoder_buffer.displayed_height[plane_index];
}
}
int Stride(int plane_index) const { return stride_[plane_index]; }
uint8_t* Plane(int plane_index) const { return plane_[plane_index]; }
int DisplayedWidth(int plane_index) const {
return displayed_width_[plane_index];
}
int DisplayedHeight(int plane_index) const {
return displayed_height_[plane_index];
}
// Methods maintaining reference count are not thread-safe. They must be
// called with a lock held.
void AddReference() { ++reference_count_; }
void RemoveReference() { reference_count_--; }
bool InUse() const { return reference_count_ != 0; }
const Libgav1FrameBuffer& GetGav1FrameBuffer() const {
return gav1_frame_buffer_;
}
// Attempts to reallocate data planes if the existing ones don't have enough
// capacity. Returns true if the allocation was successful or wasn't needed,
// false if the allocation failed.
bool MaybeReallocateGav1DataPlanes(int y_plane_min_size,
int uv_plane_min_size) {
for (int plane_index = kPlaneY; plane_index < kMaxPlanes; plane_index++) {
const int min_size =
(plane_index == kPlaneY) ? y_plane_min_size : uv_plane_min_size;
if (gav1_frame_buffer_.size[plane_index] >= min_size) continue;
delete[] gav1_frame_buffer_.data[plane_index];
gav1_frame_buffer_.data[plane_index] =
new (std::nothrow) uint8_t[min_size];
if (!gav1_frame_buffer_.data[plane_index]) {
gav1_frame_buffer_.size[plane_index] = 0;
return false;
}
gav1_frame_buffer_.size[plane_index] = min_size;
}
return true;
}
private:
int stride_[kMaxPlanes];
uint8_t* plane_[kMaxPlanes];
int displayed_width_[kMaxPlanes];
int displayed_height_[kMaxPlanes];
int id_;
int reference_count_;
Libgav1FrameBuffer gav1_frame_buffer_ = {};
};
// Manages frame buffers used by libgav1 decoder and ExoPlayer.
// Handles synchronization between libgav1 and ExoPlayer threads.
class JniBufferManager {
public:
~JniBufferManager() {
// This lock does not do anything since libgav1 has released all the frame
// buffers. It exists merely to be consistent with all other usage of
// |all_buffers_| and |all_buffer_count_|.
std::lock_guard<std::mutex> lock(mutex_);
while (all_buffer_count_--) {
delete all_buffers_[all_buffer_count_];
}
}
JniStatusCode GetBuffer(size_t y_plane_min_size, size_t uv_plane_min_size,
Libgav1FrameBuffer* frame_buffer) {
std::lock_guard<std::mutex> lock(mutex_);
JniFrameBuffer* output_buffer;
if (free_buffer_count_) {
output_buffer = free_buffers_[--free_buffer_count_];
} else if (all_buffer_count_ < kMaxFrames) {
output_buffer = new (std::nothrow) JniFrameBuffer(all_buffer_count_);
if (output_buffer == nullptr) return kJniStatusOutOfMemory;
all_buffers_[all_buffer_count_++] = output_buffer;
} else {
// Maximum number of buffers is being used.
return kJniStatusOutOfMemory;
}
if (!output_buffer->MaybeReallocateGav1DataPlanes(y_plane_min_size,
uv_plane_min_size)) {
return kJniStatusOutOfMemory;
}
output_buffer->AddReference();
*frame_buffer = output_buffer->GetGav1FrameBuffer();
return kJniStatusOk;
}
JniFrameBuffer* GetBuffer(int id) const { return all_buffers_[id]; }
void AddBufferReference(int id) {
std::lock_guard<std::mutex> lock(mutex_);
all_buffers_[id]->AddReference();
}
JniStatusCode ReleaseBuffer(int id) {
std::lock_guard<std::mutex> lock(mutex_);
JniFrameBuffer* buffer = all_buffers_[id];
if (!buffer->InUse()) {
return kJniStatusBufferAlreadyReleased;
}
buffer->RemoveReference();
if (!buffer->InUse()) {
free_buffers_[free_buffer_count_++] = buffer;
}
return kJniStatusOk;
}
private:
static const int kMaxFrames = 32;
JniFrameBuffer* all_buffers_[kMaxFrames];
int all_buffer_count_ = 0;
JniFrameBuffer* free_buffers_[kMaxFrames];
int free_buffer_count_ = 0;
std::mutex mutex_;
};
struct JniContext {
~JniContext() {
if (native_window) {
ANativeWindow_release(native_window);
}
}
bool MaybeAcquireNativeWindow(JNIEnv* env, jobject new_surface) {
if (surface == new_surface) {
return true;
}
if (native_window) {
ANativeWindow_release(native_window);
}
native_window_width = 0;
native_window_height = 0;
native_window = ANativeWindow_fromSurface(env, new_surface);
if (native_window == nullptr) {
jni_status_code = kJniStatusANativeWindowError;
surface = nullptr;
return false;
}
surface = new_surface;
return true;
}
jfieldID decoder_private_field;
jfieldID output_mode_field;
jfieldID data_field;
jmethodID init_for_private_frame_method;
jmethodID init_for_yuv_frame_method;
JniBufferManager buffer_manager;
// The libgav1 decoder instance has to be deleted before |buffer_manager| is
// destructed. This will make sure that libgav1 releases all the frame
// buffers that it might be holding references to. So this has to be declared
// after |buffer_manager| since the destruction happens in reverse order of
// declaration.
libgav1::Decoder decoder;
ANativeWindow* native_window = nullptr;
jobject surface = nullptr;
int native_window_width = 0;
int native_window_height = 0;
Libgav1StatusCode libgav1_status_code = kLibgav1StatusOk;
JniStatusCode jni_status_code = kJniStatusOk;
};
int Libgav1GetFrameBuffer(void* private_data, size_t y_plane_min_size,
size_t uv_plane_min_size,
Libgav1FrameBuffer* frame_buffer) {
JniContext* const context = reinterpret_cast<JniContext*>(private_data);
context->jni_status_code = context->buffer_manager.GetBuffer(
y_plane_min_size, uv_plane_min_size, frame_buffer);
if (context->jni_status_code != kJniStatusOk) {
LOGE("%s", GetJniErrorMessage(context->jni_status_code));
return -1;
}
return 0;
}
int Libgav1ReleaseFrameBuffer(void* private_data,
Libgav1FrameBuffer* frame_buffer) {
JniContext* const context = reinterpret_cast<JniContext*>(private_data);
const int buffer_id = *reinterpret_cast<int*>(frame_buffer->private_data);
context->jni_status_code = context->buffer_manager.ReleaseBuffer(buffer_id);
if (context->jni_status_code != kJniStatusOk) {
LOGE("%s", GetJniErrorMessage(context->jni_status_code));
return -1;
}
return 0;
}
constexpr int AlignTo16(int value) { return (value + 15) & (~15); }
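// A couple of compile-time spot checks of the rounding behavior: values
// round up to the next multiple of 16, and aligned values are unchanged.
static_assert(AlignTo16(17) == 32, "17 rounds up to 32");
static_assert(AlignTo16(32) == 32, "aligned values are unchanged");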
void CopyPlane(const uint8_t* source, int source_stride, uint8_t* destination,
int destination_stride, int width, int height) {
while (height--) {
std::memcpy(destination, source, width);
source += source_stride;
destination += destination_stride;
}
}
void CopyFrameToDataBuffer(const libgav1::DecoderBuffer* decoder_buffer,
jbyte* data) {
for (int plane_index = kPlaneY; plane_index < decoder_buffer->NumPlanes();
plane_index++) {
const uint64_t length = decoder_buffer->stride[plane_index] *
decoder_buffer->displayed_height[plane_index];
memcpy(data, decoder_buffer->plane[plane_index], length);
data += length;
}
}
void Convert10BitFrameTo8BitDataBuffer(
const libgav1::DecoderBuffer* decoder_buffer, jbyte* data) {
for (int plane_index = kPlaneY; plane_index < decoder_buffer->NumPlanes();
plane_index++) {
int sample = 0;
const uint8_t* source = decoder_buffer->plane[plane_index];
for (int i = 0; i < decoder_buffer->displayed_height[plane_index]; i++) {
const uint16_t* source_16 = reinterpret_cast<const uint16_t*>(source);
for (int j = 0; j < decoder_buffer->displayed_width[plane_index]; j++) {
// Lightweight dither. Carry over the remainder of each 10->8 bit
// conversion to the next pixel.
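// For example, two successive 10-bit samples of 514 map to 128 and then
// 129: (0 + 514) >> 2 == 128 with remainder 2, then (2 + 514) >> 2 == 129.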
sample += source_16[j];
data[j] = sample >> 2;
sample &= 3; // Remainder.
}
source += decoder_buffer->stride[plane_index];
data += decoder_buffer->stride[plane_index];
}
}
}
#ifdef __ARM_NEON
void Convert10BitFrameTo8BitDataBufferNeon(
const libgav1::DecoderBuffer* decoder_buffer, jbyte* data) {
uint32x2_t lcg_value = vdup_n_u32(random());
lcg_value = vset_lane_u32(random(), lcg_value, 1);
// LCG values recommended in "Numerical Recipes".
const uint32x2_t LCG_MULT = vdup_n_u32(1664525);
const uint32x2_t LCG_INCR = vdup_n_u32(1013904223);
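// Per 32-bit lane, the vmla_u32 call below computes the recurrence
// x(n+1) = 1664525 * x(n) + 1013904223 (mod 2^32).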
for (int plane_index = kPlaneY; plane_index < kMaxPlanes; plane_index++) {
const uint8_t* source = decoder_buffer->plane[plane_index];
for (int i = 0; i < decoder_buffer->displayed_height[plane_index]; i++) {
const uint16_t* source_16 = reinterpret_cast<const uint16_t*>(source);
uint8_t* destination = reinterpret_cast<uint8_t*>(data);
// Each read consumes 4 2-byte samples, but to reduce branches and
// random steps we unroll to 4 rounds, so each loop consumes 16
// samples.
const int j_max = decoder_buffer->displayed_width[plane_index] & ~15;
int j;
for (j = 0; j < j_max; j += 16) {
// Run a round of the RNG.
lcg_value = vmla_u32(LCG_INCR, lcg_value, LCG_MULT);
// Round 1.
// The lower two bits of this LCG parameterization are garbage,
// leaving streaks on the image. We access the upper bits of each
// 16-bit lane by shifting. (We use this both as an 8- and 16-bit
// vector, so the choice of which one to keep it as is arbitrary.)
uint8x8_t randvec =
vreinterpret_u8_u16(vshr_n_u16(vreinterpret_u16_u32(lcg_value), 8));
// We retrieve the values and shift them so that the bits we'll
// shift out (after biasing) are in the upper 8 bits of each 16-bit
// lane.
uint16x4_t values = vshl_n_u16(vld1_u16(source_16), 6);
// We add the bias bits in the lower 8 to the shifted values to get
// the final values in the upper 8 bits.
uint16x4_t added_1 = vqadd_u16(values, vreinterpret_u16_u8(randvec));
source_16 += 4;
// Round 2.
// Shifting the randvec bits left by 2 bits, as an 8-bit vector,
// should leave us with enough bias to get the needed rounding
// operation.
randvec = vshl_n_u8(randvec, 2);
// Retrieve and sum the next 4 pixels.
values = vshl_n_u16(vld1_u16(source_16), 6);
uint16x4_t added_2 = vqadd_u16(values, vreinterpret_u16_u8(randvec));
source_16 += 4;
// Reinterpret the two added vectors as 8x8, zip them together, and
// discard the lower portions.
uint8x8_t zipped =
vuzp_u8(vreinterpret_u8_u16(added_1), vreinterpret_u8_u16(added_2))
.val[1];
vst1_u8(destination, zipped);
destination += 8;
// Run it again with the next two rounds using the remaining
// entropy in randvec.
// Round 3.
randvec = vshl_n_u8(randvec, 2);
values = vshl_n_u16(vld1_u16(source_16), 6);
added_1 = vqadd_u16(values, vreinterpret_u16_u8(randvec));
source_16 += 4;
// Round 4.
randvec = vshl_n_u8(randvec, 2);
values = vshl_n_u16(vld1_u16(source_16), 6);
added_2 = vqadd_u16(values, vreinterpret_u16_u8(randvec));
source_16 += 4;
zipped =
vuzp_u8(vreinterpret_u8_u16(added_1), vreinterpret_u8_u16(added_2))
.val[1];
vst1_u8(destination, zipped);
destination += 8;
}
uint32_t randval = 0;
// For the remaining pixels in each row - usually none, as most
// standard sizes are divisible by 32 - convert them "by hand".
for (; j < decoder_buffer->displayed_width[plane_index]; j++) {
if (!randval) randval = random();
destination[j] = (source_16[j] + (randval & 3)) >> 2;
randval >>= 2;
}
source += decoder_buffer->stride[plane_index];
data += decoder_buffer->stride[plane_index];
}
}
}
#endif // __ARM_NEON
} // namespace
DECODER_FUNC(jlong, gav1Init, jint threads) {
JniContext* context = new (std::nothrow) JniContext();
if (context == nullptr) {
return kStatusError;
}
#ifdef __arm__
// Libgav1 requires NEON with arm ABIs.
#ifdef __ARM_NEON
if (!(android_getCpuFeatures() & ANDROID_CPU_ARM_FEATURE_NEON)) {
context->jni_status_code = kJniStatusNeonNotSupported;
return reinterpret_cast<jlong>(context);
}
#else
context->jni_status_code = kJniStatusNeonNotSupported;
return reinterpret_cast<jlong>(context);
#endif // __ARM_NEON
#endif // __arm__
libgav1::DecoderSettings settings;
settings.threads = threads;
settings.get = Libgav1GetFrameBuffer;
settings.release = Libgav1ReleaseFrameBuffer;
settings.callback_private_data = context;
context->libgav1_status_code = context->decoder.Init(&settings);
if (context->libgav1_status_code != kLibgav1StatusOk) {
return reinterpret_cast<jlong>(context);
}
// Populate JNI References.
const jclass outputBufferClass = env->FindClass(
"com/google/android/exoplayer2/video/VideoDecoderOutputBuffer");
context->decoder_private_field =
env->GetFieldID(outputBufferClass, "decoderPrivate", "I");
context->output_mode_field = env->GetFieldID(outputBufferClass, "mode", "I");
context->data_field =
env->GetFieldID(outputBufferClass, "data", "Ljava/nio/ByteBuffer;");
context->init_for_private_frame_method =
env->GetMethodID(outputBufferClass, "initForPrivateFrame", "(II)V");
context->init_for_yuv_frame_method =
env->GetMethodID(outputBufferClass, "initForYuvFrame", "(IIIII)Z");
return reinterpret_cast<jlong>(context);
}
DECODER_FUNC(void, gav1Close, jlong jContext) {
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
delete context;
}
DECODER_FUNC(jint, gav1Decode, jlong jContext, jobject encodedData,
jint length) {
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
const uint8_t* const buffer = reinterpret_cast<const uint8_t*>(
env->GetDirectBufferAddress(encodedData));
context->libgav1_status_code =
context->decoder.EnqueueFrame(buffer, length, /*user_private_data=*/0);
if (context->libgav1_status_code != kLibgav1StatusOk) {
return kStatusError;
}
return kStatusOk;
}
DECODER_FUNC(jint, gav1GetFrame, jlong jContext, jobject jOutputBuffer,
jboolean decodeOnly) {
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
const libgav1::DecoderBuffer* decoder_buffer;
context->libgav1_status_code = context->decoder.DequeueFrame(&decoder_buffer);
if (context->libgav1_status_code != kLibgav1StatusOk) {
return kStatusError;
}
if (decodeOnly || decoder_buffer == nullptr) {
// This is not an error. The input data was decode-only or no displayable
// frames are available.
return kStatusDecodeOnly;
}
const int output_mode =
env->GetIntField(jOutputBuffer, context->output_mode_field);
if (output_mode == kOutputModeYuv) {
// Resize the buffer if required. Default color conversion will be used as
// libgav1::DecoderBuffer doesn't expose color space info.
const jboolean init_result = env->CallBooleanMethod(
jOutputBuffer, context->init_for_yuv_frame_method,
decoder_buffer->displayed_width[kPlaneY],
decoder_buffer->displayed_height[kPlaneY],
decoder_buffer->stride[kPlaneY], decoder_buffer->stride[kPlaneU],
kColorSpaceUnknown);
if (env->ExceptionCheck()) {
// Exception is thrown in Java when returning from the native call.
return kStatusError;
}
if (!init_result) {
context->jni_status_code = kJniStatusBufferResizeError;
return kStatusError;
}
const jobject data_object =
env->GetObjectField(jOutputBuffer, context->data_field);
jbyte* const data =
reinterpret_cast<jbyte*>(env->GetDirectBufferAddress(data_object));
switch (decoder_buffer->bitdepth) {
case 8:
CopyFrameToDataBuffer(decoder_buffer, data);
break;
case 10:
#ifdef __ARM_NEON
Convert10BitFrameTo8BitDataBufferNeon(decoder_buffer, data);
#else
Convert10BitFrameTo8BitDataBuffer(decoder_buffer, data);
#endif // __ARM_NEON
break;
default:
context->jni_status_code = kJniStatusBitDepth12NotSupportedWithYuv;
return kStatusError;
}
} else if (output_mode == kOutputModeSurfaceYuv) {
if (decoder_buffer->bitdepth != 8) {
context->jni_status_code =
kJniStatusHighBitDepthNotSupportedWithSurfaceYuv;
return kStatusError;
}
if (decoder_buffer->NumPlanes() > kMaxPlanes) {
context->jni_status_code = kJniStatusInvalidNumOfPlanes;
return kStatusError;
}
const int buffer_id =
*reinterpret_cast<int*>(decoder_buffer->buffer_private_data);
context->buffer_manager.AddBufferReference(buffer_id);
JniFrameBuffer* const jni_buffer =
context->buffer_manager.GetBuffer(buffer_id);
jni_buffer->SetFrameData(*decoder_buffer);
env->CallVoidMethod(jOutputBuffer, context->init_for_private_frame_method,
decoder_buffer->displayed_width[kPlaneY],
decoder_buffer->displayed_height[kPlaneY]);
if (env->ExceptionCheck()) {
// Exception is thrown in Java when returning from the native call.
return kStatusError;
}
env->SetIntField(jOutputBuffer, context->decoder_private_field, buffer_id);
}
return kStatusOk;
}
DECODER_FUNC(jint, gav1RenderFrame, jlong jContext, jobject jSurface,
jobject jOutputBuffer) {
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
const int buffer_id =
env->GetIntField(jOutputBuffer, context->decoder_private_field);
JniFrameBuffer* const jni_buffer =
context->buffer_manager.GetBuffer(buffer_id);
if (!context->MaybeAcquireNativeWindow(env, jSurface)) {
return kStatusError;
}
if (context->native_window_width != jni_buffer->DisplayedWidth(kPlaneY) ||
context->native_window_height != jni_buffer->DisplayedHeight(kPlaneY)) {
if (ANativeWindow_setBuffersGeometry(
context->native_window, jni_buffer->DisplayedWidth(kPlaneY),
jni_buffer->DisplayedHeight(kPlaneY), kImageFormatYV12)) {
context->jni_status_code = kJniStatusANativeWindowError;
return kStatusError;
}
context->native_window_width = jni_buffer->DisplayedWidth(kPlaneY);
context->native_window_height = jni_buffer->DisplayedHeight(kPlaneY);
}
ANativeWindow_Buffer native_window_buffer;
if (ANativeWindow_lock(context->native_window, &native_window_buffer,
/*inOutDirtyBounds=*/nullptr) ||
native_window_buffer.bits == nullptr) {
context->jni_status_code = kJniStatusANativeWindowError;
return kStatusError;
}
// Y plane
CopyPlane(jni_buffer->Plane(kPlaneY), jni_buffer->Stride(kPlaneY),
reinterpret_cast<uint8_t*>(native_window_buffer.bits),
native_window_buffer.stride, jni_buffer->DisplayedWidth(kPlaneY),
jni_buffer->DisplayedHeight(kPlaneY));
const int y_plane_size =
native_window_buffer.stride * native_window_buffer.height;
const int32_t native_window_buffer_uv_height =
(native_window_buffer.height + 1) / 2;
const int native_window_buffer_uv_stride =
AlignTo16(native_window_buffer.stride / 2);
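// Per Android's YV12 layout, each chroma stride is the 16-byte-aligned
// half of the luma stride, hence the AlignTo16 above.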
// TODO(b/140606738): Handle monochrome videos.
// V plane
// Since the ANativeWindow format is YV12, the V plane is copied before the
// U plane.
const int v_plane_height = std::min(native_window_buffer_uv_height,
jni_buffer->DisplayedHeight(kPlaneV));
CopyPlane(
jni_buffer->Plane(kPlaneV), jni_buffer->Stride(kPlaneV),
reinterpret_cast<uint8_t*>(native_window_buffer.bits) + y_plane_size,
native_window_buffer_uv_stride, jni_buffer->DisplayedWidth(kPlaneV),
v_plane_height);
const int v_plane_size = v_plane_height * native_window_buffer_uv_stride;
// U plane
CopyPlane(jni_buffer->Plane(kPlaneU), jni_buffer->Stride(kPlaneU),
reinterpret_cast<uint8_t*>(native_window_buffer.bits) +
y_plane_size + v_plane_size,
native_window_buffer_uv_stride, jni_buffer->DisplayedWidth(kPlaneU),
std::min(native_window_buffer_uv_height,
jni_buffer->DisplayedHeight(kPlaneU)));
if (ANativeWindow_unlockAndPost(context->native_window)) {
context->jni_status_code = kJniStatusANativeWindowError;
return kStatusError;
}
return kStatusOk;
}
DECODER_FUNC(void, gav1ReleaseFrame, jlong jContext, jobject jOutputBuffer) {
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
const int buffer_id =
env->GetIntField(jOutputBuffer, context->decoder_private_field);
env->SetIntField(jOutputBuffer, context->decoder_private_field, -1);
context->jni_status_code = context->buffer_manager.ReleaseBuffer(buffer_id);
if (context->jni_status_code != kJniStatusOk) {
LOGE("%s", GetJniErrorMessage(context->jni_status_code));
}
}
DECODER_FUNC(jstring, gav1GetErrorMessage, jlong jContext) {
if (jContext == 0) {
return env->NewStringUTF("Failed to initialize JNI context.");
}
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
if (context->libgav1_status_code != kLibgav1StatusOk) {
return env->NewStringUTF(
libgav1::GetErrorString(context->libgav1_status_code));
}
if (context->jni_status_code != kJniStatusOk) {
return env->NewStringUTF(GetJniErrorMessage(context->jni_status_code));
}
return env->NewStringUTF("None.");
}
DECODER_FUNC(jint, gav1CheckError, jlong jContext) {
JniContext* const context = reinterpret_cast<JniContext*>(jContext);
if (context->libgav1_status_code != kLibgav1StatusOk ||
context->jni_status_code != kJniStatusOk) {
return kStatusError;
}
return kStatusOk;
}
// TODO(b/139902005): Add functions for getting libgav1 version and build
// configuration once libgav1 ABI provides this information.