Commit 6bfdf8f8 by Oliver Woodman, committed by GitHub

Merge pull request #7437 from google/dev-v2-r2.11.5

r2.11.5
parents 7d3f54a3 46d29b25
Showing with 1688 additions and 1055 deletions
# Release notes #
### 2.11.5 (2020-06-05) ###
* Improve the smoothness of video playback immediately after starting, seeking
or resuming a playback
([#6901](https://github.com/google/ExoPlayer/issues/6901)).
* Add `SilenceMediaSource.Factory` to support tags.
* Enable the configuration of `SilenceSkippingAudioProcessor`
([#6705](https://github.com/google/ExoPlayer/issues/6705)).
* Fix bug where `PlayerMessages` throw an exception after `MediaSources`
are removed from the playlist
([#7278](https://github.com/google/ExoPlayer/issues/7278)).
* Fix "Not allowed to start service" `IllegalStateException` in
`DownloadService`
([#7306](https://github.com/google/ExoPlayer/issues/7306)).
* Fix issue in `AudioTrackPositionTracker` that could cause negative positions
to be reported at the start of playback and immediately after seeking
  ([#7456](https://github.com/google/ExoPlayer/issues/7456)).
* Fix further cases where downloads would sometimes not resume after their
network requirements are met
  ([#7453](https://github.com/google/ExoPlayer/issues/7453)).
* DASH:
* Merge trick play adaptation sets (i.e., adaptation sets marked with
`http://dashif.org/guidelines/trickmode`) into the same `TrackGroup` as
the main adaptation sets to which they refer. Trick play tracks are
marked with the `C.ROLE_FLAG_TRICK_PLAY` flag.
* Fix assertion failure in `SampleQueue` when playing DASH streams with
EMSG tracks ([#7273](https://github.com/google/ExoPlayer/issues/7273)).
* MP4: Store the Android capture frame rate only in `Format.metadata`.
`Format.frameRate` now stores the calculated frame rate.
* FMP4: Avoid throwing an exception while parsing default sample values whose
most significant bits are set
([#7207](https://github.com/google/ExoPlayer/issues/7207)).
* MP3: Fix issue parsing the XING headers belonging to files larger than 2GB
([#7337](https://github.com/google/ExoPlayer/issues/7337)).
* MPEG-TS: Fix issue where SEI NAL units were incorrectly dropped from H.265
samples ([#7113](https://github.com/google/ExoPlayer/issues/7113)).
* UI:
* Fix `DefaultTimeBar` to respect touch transformations
([#7303](https://github.com/google/ExoPlayer/issues/7303)).
* Add `showScrubber` and `hideScrubber` methods to `DefaultTimeBar`.
* Text:
* Use anti-aliasing and bitmap filtering when displaying bitmap
subtitles.
* Fix `SubtitlePainter` to render `EDGE_TYPE_OUTLINE` using the correct
color.
* IMA extension:
* Upgrade to IMA SDK version 3.19.0, and migrate to new
preloading APIs
([#6429](https://github.com/google/ExoPlayer/issues/6429)). This fixes
several issues involving preloading and handling of ad loading error
cases: ([#4140](https://github.com/google/ExoPlayer/issues/4140),
[#5006](https://github.com/google/ExoPlayer/issues/5006),
[#6030](https://github.com/google/ExoPlayer/issues/6030),
[#6097](https://github.com/google/ExoPlayer/issues/6097),
[#6425](https://github.com/google/ExoPlayer/issues/6425),
[#6967](https://github.com/google/ExoPlayer/issues/6967),
[#7041](https://github.com/google/ExoPlayer/issues/7041),
[#7161](https://github.com/google/ExoPlayer/issues/7161),
[#7212](https://github.com/google/ExoPlayer/issues/7212),
[#7340](https://github.com/google/ExoPlayer/issues/7340)).
* Add support for timing out ad preloading, to avoid playback getting
stuck if an ad group unexpectedly fails to load
([#5444](https://github.com/google/ExoPlayer/issues/5444),
[#5966](https://github.com/google/ExoPlayer/issues/5966),
[#7002](https://github.com/google/ExoPlayer/issues/7002)).
* Fix `AdsMediaSource` child `MediaSource`s not being released.
* Cronet extension: Default to using the Cronet implementation in Google Play
Services rather than Cronet Embedded. This allows Cronet to be used with a
negligible increase in application size, compared to approximately 8MB when
embedding the library.
* OkHttp extension: Upgrade OkHttp dependency to 3.12.11.
* MediaSession extension:
* Only set the playback state to `BUFFERING` if `playWhenReady` is true
([#7206](https://github.com/google/ExoPlayer/issues/7206)).
* Add missing `@Nullable` annotations to `MediaSessionConnector`
([#7234](https://github.com/google/ExoPlayer/issues/7234)).
* AV1 extension: Add a heuristic to determine the default number of threads
used for AV1 playback using the extension.
### 2.11.4 (2020-04-08) ###
* Add `SimpleExoPlayer.setWakeMode` to allow automatic `WifiLock` and `WakeLock`
......
@@ -13,8 +13,8 @@
// limitations under the License.
project.ext {
  // ExoPlayer version and version code.
-  releaseVersion = '2.11.4'
-  releaseVersionCode = 2011004
+  releaseVersion = '2.11.5'
+  releaseVersionCode = 2011005
  minSdkVersion = 16
  appTargetSdkVersion = 29
  targetSdkVersion = 28 // TODO: Bump once b/143232359 is resolved
......
@@ -2,3 +2,24 @@
This directory contains applications that demonstrate how to use ExoPlayer.
Browse the individual demos and their READMEs to learn more.
## Running a demo ##
### From Android Studio ###
* File -> New -> Import Project -> Specify the root ExoPlayer folder.
* Choose the demo from the run configuration dropdown list.
* Click Run.
### Using gradle from the command line: ###
* Open a Terminal window at the root ExoPlayer folder.
* Run `./gradlew projects` to show all projects. Demo projects start with `demo`.
* Run `./gradlew :<demo name>:tasks` to view the list of available tasks for
the demo project. Choose an install option from the `Install tasks` section.
* Run `./gradlew :<demo name>:<install task>`.
**Example**:
`./gradlew :demo:installNoExtensionsDebug` installs the main ExoPlayer demo app
in debug mode with no extensions.
@@ -2,3 +2,6 @@
This folder contains a demo application that showcases ExoPlayer integration
with Google Cast.
Please see the [demos README](../README.md) for instructions on how to build and
run this demo.
@@ -8,4 +8,7 @@ drawn using an Android canvas, and includes the current frame's presentation
timestamp, to show how to get the timestamp of the frame currently in the
off-screen surface texture.
Please see the [demos README](../README.md) for instructions on how to build and
run this demo.
[GLSurfaceView]: https://developer.android.com/reference/android/opengl/GLSurfaceView
@@ -3,3 +3,6 @@
This is the main ExoPlayer demo application. It uses ExoPlayer to play a number
of test streams. It can be used as a starting point or reference project when
developing other applications that make use of the ExoPlayer library.
Please see the [demos README](../README.md) for instructions on how to build and
run this demo.
@@ -628,7 +628,10 @@ public class PlayerActivity extends AppCompatActivity
    @Override
    public MediaSourceFactory setDrmSessionManager(DrmSessionManager<?> drmSessionManager) {
-     this.drmSessionManager = drmSessionManager;
+     this.drmSessionManager =
+         drmSessionManager != null
+             ? drmSessionManager
+             : DrmSessionManager.getDummyDrmSessionManager();
      return this;
    }
......
@@ -18,4 +18,7 @@ called, and because you can move output off-screen easily (`setOutputSurface`
can't take a `null` surface, so the player has to use a `DummySurface`, which
doesn't handle protected output on all devices).
Please see the [demos README](../README.md) for instructions on how to build and
run this demo.
[SurfaceControl]: https://developer.android.com/reference/android/view/SurfaceControl
@@ -15,6 +15,8 @@
 */
package com.google.android.exoplayer2.ext.av1;
import static java.lang.Runtime.getRuntime;
import android.view.Surface;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
@@ -44,7 +46,9 @@ import java.nio.ByteBuffer;
   * @param numInputBuffers Number of input buffers.
   * @param numOutputBuffers Number of output buffers.
   * @param initialInputBufferSize The initial size of each input buffer, in bytes.
-  * @param threads Number of threads libgav1 will use to decode.
+  * @param threads Number of threads libgav1 will use to decode. If {@link
+  *     Libgav1VideoRenderer#THREAD_COUNT_AUTODETECT} is passed, then this class will auto detect
+  *     the number of threads to be used.
   * @throws Gav1DecoderException Thrown if an exception occurs when initializing the decoder.
   */
  public Gav1Decoder(
@@ -56,6 +60,16 @@ import java.nio.ByteBuffer;
    if (!Gav1Library.isAvailable()) {
      throw new Gav1DecoderException("Failed to load decoder native library.");
    }
if (threads == Libgav1VideoRenderer.THREAD_COUNT_AUTODETECT) {
// Try to get the optimal number of threads from the AV1 heuristic.
threads = gav1GetThreads();
if (threads <= 0) {
// If that is not available, default to the number of available processors.
threads = getRuntime().availableProcessors();
}
}
    gav1DecoderContext = gav1Init(threads);
    if (gav1DecoderContext == GAV1_ERROR || gav1CheckError(gav1DecoderContext) == GAV1_ERROR) {
      throw new Gav1DecoderException(
@@ -231,4 +245,11 @@ import java.nio.ByteBuffer;
   * @return {@link #GAV1_OK} if there was no error, {@link #GAV1_ERROR} if an error occurred.
   */
  private native int gav1CheckError(long context);
/**
* Returns the optimal number of threads to be used for AV1 decoding.
*
* @return Optimal number of threads if there was no error, 0 if an error occurred.
*/
private native int gav1GetThreads();
}
@@ -15,8 +15,6 @@
 */
package com.google.android.exoplayer2.ext.av1;
-import static java.lang.Runtime.getRuntime;
import android.os.Handler;
import android.view.Surface;
import androidx.annotation.Nullable;
@@ -55,6 +53,13 @@ import com.google.android.exoplayer2.video.VideoRendererEventListener;
 */
public class Libgav1VideoRenderer extends SimpleDecoderVideoRenderer {
/**
* Attempts to use as many threads as performance processors available on the device. If the
* number of performance processors cannot be detected, the number of available processors is
* used.
*/
public static final int THREAD_COUNT_AUTODETECT = 0;
  private static final int DEFAULT_NUM_OF_INPUT_BUFFERS = 4;
  private static final int DEFAULT_NUM_OF_OUTPUT_BUFFERS = 4;
  /* Default size based on 720p resolution video compressed by a factor of two. */
@@ -94,7 +99,7 @@ public class Libgav1VideoRenderer extends SimpleDecoderVideoRenderer {
        eventHandler,
        eventListener,
        maxDroppedFramesToNotify,
-       /* threads= */ getRuntime().availableProcessors(),
+       THREAD_COUNT_AUTODETECT,
        DEFAULT_NUM_OF_INPUT_BUFFERS,
        DEFAULT_NUM_OF_OUTPUT_BUFFERS);
  }
@@ -109,7 +114,9 @@ public class Libgav1VideoRenderer extends SimpleDecoderVideoRenderer {
   * @param eventListener A listener of events. May be null if delivery of events is not required.
   * @param maxDroppedFramesToNotify The maximum number of frames that can be dropped between
   *     invocations of {@link VideoRendererEventListener#onDroppedFrames(int, long)}.
-  * @param threads Number of threads libgav1 will use to decode.
+  * @param threads Number of threads libgav1 will use to decode. If
+  *     {@link #THREAD_COUNT_AUTODETECT} is passed, then the number of threads to use is
+  *     auto-detected based on CPU capabilities.
   * @param numInputBuffers Number of input buffers.
   * @param numOutputBuffers Number of output buffers.
   */
......
@@ -44,7 +44,9 @@ add_subdirectory("${libgav1_root}"
# Build libgav1JNI.
add_library(gav1JNI
  SHARED
- gav1_jni.cc)
+ gav1_jni.cc
+ cpu_info.cc
+ cpu_info.h)
# Locate NDK log library.
find_library(android_log_lib log)
......
#include "cpu_info.h" // NOLINT
#include <unistd.h>
#include <cerrno>
#include <climits>
#include <cstdio>
#include <cstdlib>
#include <cstring>
namespace gav1_jni {
namespace {
// Note: The code in this file needs to use the 'long' type because it is the
// return type of the Standard C Library function strtol(). The linter warnings
// are suppressed with NOLINT comments since the values fit in an int at runtime.
// Returns the number of online processor cores.
int GetNumberOfProcessorsOnline() {
// See https://developer.android.com/ndk/guides/cpu-features.
long num_cpus = sysconf(_SC_NPROCESSORS_ONLN); // NOLINT
if (num_cpus < 0) {
return 0;
}
// It is safe to cast num_cpus to int. sysconf(_SC_NPROCESSORS_ONLN) returns
// the return value of get_nprocs(), which is an int.
return static_cast<int>(num_cpus);
}
} // namespace
// These CPUs support heterogeneous multiprocessing.
#if defined(__arm__) || defined(__aarch64__)
// A helper function used by GetNumberOfPerformanceCoresOnline().
//
// Returns the cpuinfo_max_freq value (in kHz) of the given CPU. Returns 0 on
// failure.
long GetCpuinfoMaxFreq(int cpu_index) { // NOLINT
char buffer[128];
const int rv = snprintf(
buffer, sizeof(buffer),
"/sys/devices/system/cpu/cpu%d/cpufreq/cpuinfo_max_freq", cpu_index);
if (rv < 0 || rv >= sizeof(buffer)) {
return 0;
}
FILE* file = fopen(buffer, "r");
if (file == nullptr) {
return 0;
}
char* const str = fgets(buffer, sizeof(buffer), file);
fclose(file);
if (str == nullptr) {
return 0;
}
const long freq = strtol(str, nullptr, 10); // NOLINT
if (freq <= 0 || freq == LONG_MAX) {
return 0;
}
return freq;
}
// Returns the number of performance CPU cores that are online. The number of
// efficiency CPU cores is subtracted from the total number of CPU cores. Uses
// cpuinfo_max_freq to determine whether a CPU is a performance core or an
// efficiency core.
//
// This function is not perfect. For example, the Snapdragon 632 SoC used in
// Motorola Moto G7 has performance and efficiency cores with the same
// cpuinfo_max_freq but different cpuinfo_min_freq. This function fails to
// differentiate the two kinds of cores and reports all the cores as
// performance cores.
int GetNumberOfPerformanceCoresOnline() {
// Get the online CPU list. Some examples of the online CPU list are:
// "0-7"
// "0"
// "0-1,2,3,4-7"
FILE* file = fopen("/sys/devices/system/cpu/online", "r");
if (file == nullptr) {
return 0;
}
char online[512];
char* const str = fgets(online, sizeof(online), file);
fclose(file);
file = nullptr;
if (str == nullptr) {
return 0;
}
// Count the number of the slowest CPUs. Some SoCs such as Snapdragon 855
// have performance cores with different max frequencies, so only the slowest
// CPUs are efficiency cores. If we count the number of the fastest CPUs, we
// will fail to count the second fastest performance cores.
long slowest_cpu_freq = LONG_MAX; // NOLINT
int num_slowest_cpus = 0;
int num_cpus = 0;
const char* cp = online;
int range_begin = -1;
while (true) {
char* str_end;
const int cpu = static_cast<int>(strtol(cp, &str_end, 10)); // NOLINT
if (str_end == cp) {
break;
}
cp = str_end;
if (*cp == '-') {
range_begin = cpu;
} else {
if (range_begin == -1) {
range_begin = cpu;
}
num_cpus += cpu - range_begin + 1;
for (int i = range_begin; i <= cpu; ++i) {
const long freq = GetCpuinfoMaxFreq(i); // NOLINT
if (freq <= 0) {
return 0;
}
if (freq < slowest_cpu_freq) {
slowest_cpu_freq = freq;
num_slowest_cpus = 0;
}
if (freq == slowest_cpu_freq) {
++num_slowest_cpus;
}
}
range_begin = -1;
}
if (*cp == '\0') {
break;
}
++cp;
}
// If there are faster CPU cores than the slowest CPU cores, exclude the
// slowest CPU cores.
if (num_slowest_cpus < num_cpus) {
num_cpus -= num_slowest_cpus;
}
return num_cpus;
}
#else
// Assume symmetric multiprocessing.
int GetNumberOfPerformanceCoresOnline() {
return GetNumberOfProcessorsOnline();
}
#endif
} // namespace gav1_jni
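The counting logic in `GetNumberOfPerformanceCoresOnline` above can be sketched in a self-contained form. This is an illustrative translation, not the shipped code: `countPerformanceCores` is a hypothetical helper, and it takes the per-CPU max frequencies as a parameter instead of reading them from sysfs the way the native code does.

```java
import java.util.ArrayList;
import java.util.List;

public class PerfCores {
  /**
   * Counts performance cores given an online-CPU list such as "0-1,2,3,4-7"
   * and each CPU's cpuinfo_max_freq in kHz. Cores running at the slowest
   * frequency are treated as efficiency cores and excluded, unless every
   * core shares that frequency.
   */
  static int countPerformanceCores(String online, long[] maxFreqKhz) {
    // Expand the online list into individual CPU indices.
    List<Integer> cpus = new ArrayList<>();
    for (String part : online.trim().split(",")) {
      String[] bounds = part.split("-");
      int begin = Integer.parseInt(bounds[0]);
      int end = Integer.parseInt(bounds[bounds.length - 1]);
      for (int i = begin; i <= end; i++) {
        cpus.add(i);
      }
    }
    // Find the slowest max frequency and count how many cores run at it.
    long slowestFreq = Long.MAX_VALUE;
    int numSlowest = 0;
    for (int cpu : cpus) {
      long freq = maxFreqKhz[cpu];
      if (freq <= 0) {
        return 0; // Mirror the native code: bail out if a frequency is unreadable.
      }
      if (freq < slowestFreq) {
        slowestFreq = freq;
        numSlowest = 0;
      }
      if (freq == slowestFreq) {
        numSlowest++;
      }
    }
    // Exclude the slowest cores only if faster cores exist.
    return numSlowest < cpus.size() ? cpus.size() - numSlowest : cpus.size();
  }
}
```

As the comment in the native file notes, this heuristic misclassifies SoCs whose performance and efficiency cores share the same `cpuinfo_max_freq`.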
#ifndef EXOPLAYER_V2_EXTENSIONS_AV1_SRC_MAIN_JNI_CPU_INFO_H_
#define EXOPLAYER_V2_EXTENSIONS_AV1_SRC_MAIN_JNI_CPU_INFO_H_
namespace gav1_jni {
// Returns the number of performance cores that are available for AV1 decoding.
// This is a heuristic that works on most common android devices. Returns 0 on
// error or if the number of performance cores cannot be determined.
int GetNumberOfPerformanceCoresOnline();
} // namespace gav1_jni
#endif // EXOPLAYER_V2_EXTENSIONS_AV1_SRC_MAIN_JNI_CPU_INFO_H_
@@ -32,6 +32,7 @@
#include <mutex>  // NOLINT
#include <new>
#include "cpu_info.h"  // NOLINT
#include "gav1/decoder.h"
#define LOG_TAG "gav1_jni"
@@ -774,5 +775,9 @@ DECODER_FUNC(jint, gav1CheckError, jlong jContext) {
  return kStatusOk;
}
DECODER_FUNC(jint, gav1GetThreads) {
return gav1_jni::GetNumberOfPerformanceCoresOnline();
}
// TODO(b/139902005): Add functions for getting libgav1 version and build
// configuration once libgav1 ABI provides this information.
@@ -20,6 +20,10 @@ Alternatively, you can clone the ExoPlayer repository and depend on the module
locally. Instructions for doing this can be found in ExoPlayer's
[top level README][].
Note that by default, the extension will use the Cronet implementation in
Google Play Services. If you prefer, it's also possible to embed the Cronet
implementation directly into your application. See below for more details.
[top level README]: https://github.com/google/ExoPlayer/blob/release-v2/README.md
## Using the extension ##
@@ -47,6 +51,46 @@ new DefaultDataSourceFactory(
```
respectively.
## Choosing between Google Play Services Cronet and Cronet Embedded ##
The underlying Cronet implementation is available both via a [Google Play
Services](https://developers.google.com/android/guides/overview) API, and as a
library that can be embedded directly into your application. When you depend on
`com.google.android.exoplayer:extension-cronet:2.X.X`, the library will _not_ be
embedded into your application by default. The extension will attempt to use the
Cronet implementation in Google Play Services. The benefits of this approach
are:
* A negligible increase in the size of your application.
* The Cronet implementation is updated automatically by Google Play Services.
If Google Play Services is not available on a device, `CronetDataSourceFactory`
will fall back to creating `DefaultHttpDataSource` instances, or
`HttpDataSource` instances created by a `fallbackFactory` that you can specify.
It's also possible to embed the Cronet implementation directly into your
application. To do this, add an additional gradle dependency to the Cronet
Embedded library:
```gradle
implementation 'com.google.android.exoplayer:extension-cronet:2.X.X'
implementation 'org.chromium.net:cronet-embedded:XX.XXXX.XXX'
```
where `XX.XXXX.XXX` is the version of the library that you wish to use. The
extension will automatically detect and use the library. Embedding will add
approximately 8MB to your application; however, it may be suitable if:
* Your application is likely to be used in markets where Google Play Services is
not widely available.
* You want to control the exact version of the Cronet implementation being used.
If you do embed the library, you can specify which implementation should
be preferred if the Google Play Services implementation is also available. This
is controlled by a `preferGMSCoreCronet` parameter, which can be passed to the
`CronetEngineWrapper` constructor (GMS Core is another name for Google Play
Services).
## Links ##
* [Javadoc][]: Classes matching `com.google.android.exoplayer2.ext.cronet.*`
......
@@ -31,7 +31,7 @@ android {
}
dependencies {
-  api 'org.chromium.net:cronet-embedded:76.3809.111'
+  api "com.google.android.gms:play-services-cronet:17.0.0"
  implementation project(modulePrefix + 'library-core')
  implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
  compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion
......
@@ -32,6 +32,7 @@ import com.google.android.exoplayer2.util.ConditionVariable;
import com.google.android.exoplayer2.util.Log;
import com.google.android.exoplayer2.util.Predicate;
import java.io.IOException;
import java.io.InterruptedIOException;
import java.net.SocketTimeoutException;
import java.net.UnknownHostException;
import java.nio.ByteBuffer;
@@ -83,14 +84,6 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
  }
- /** Thrown on catching an InterruptedException. */
- public static final class InterruptedIOException extends IOException {
-   public InterruptedIOException(InterruptedException e) {
-     super(e);
-   }
- }
  static {
    ExoPlayerLibraryInfo.registerModule("goog.exo.cronet");
  }
@@ -440,7 +433,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
      }
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
-     throw new OpenException(new InterruptedIOException(e), dataSpec, Status.INVALID);
+     throw new OpenException(new InterruptedIOException(), dataSpec, Status.INVALID);
    }
    // Check for a valid response code.
@@ -705,7 +698,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
    if (dataSpec.httpBody != null && !requestHeaders.containsKey(CONTENT_TYPE)) {
      throw new IOException("HTTP request with non-empty body must set Content-Type");
    }
    // Set the Range header.
    if (dataSpec.position != 0 || dataSpec.length != C.LENGTH_UNSET) {
      StringBuilder rangeValue = new StringBuilder();
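The Range header built in this hunk follows standard HTTP byte-range semantics: an open-ended `bytes=position-` when the length is unset, otherwise an inclusive end offset. A minimal self-contained sketch of that construction, assuming `C.LENGTH_UNSET == -1` (the helper name `buildRangeValue` is hypothetical):

```java
public class RangeHeader {
  static final long LENGTH_UNSET = -1; // Mirrors C.LENGTH_UNSET in ExoPlayer.

  /** Builds the HTTP Range header value for a read of `length` bytes starting at `position`. */
  static String buildRangeValue(long position, long length) {
    StringBuilder rangeValue = new StringBuilder("bytes=");
    rangeValue.append(position).append("-");
    if (length != LENGTH_UNSET) {
      // HTTP byte ranges are inclusive, so the end offset is position + length - 1.
      rangeValue.append(position + length - 1);
    }
    return rangeValue.toString();
  }
}
```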
@@ -769,7 +762,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
      }
      Thread.currentThread().interrupt();
      throw new HttpDataSourceException(
-         new InterruptedIOException(e),
+         new InterruptedIOException(),
          castNonNull(currentDataSpec),
          HttpDataSourceException.TYPE_READ);
    } catch (SocketTimeoutException e) {
......
@@ -40,6 +40,7 @@ import com.google.android.exoplayer2.upstream.TransferListener;
import com.google.android.exoplayer2.util.Clock;
import com.google.android.exoplayer2.util.Util;
import java.io.IOException;
import java.io.InterruptedIOException;
import java.net.SocketTimeoutException;
import java.net.UnknownHostException;
import java.nio.ByteBuffer;
@@ -47,6 +48,7 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executor;
@@ -56,7 +58,6 @@ import org.chromium.net.CronetEngine;
import org.chromium.net.NetworkException;
import org.chromium.net.UrlRequest;
import org.chromium.net.UrlResponseInfo;
-import org.chromium.net.impl.UrlResponseInfoImpl;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -139,15 +140,62 @@ public final class CronetDataSourceTest {
  private UrlResponseInfo createUrlResponseInfoWithUrl(String url, int statusCode) {
    ArrayList<Map.Entry<String, String>> responseHeaderList = new ArrayList<>();
-   responseHeaderList.addAll(testResponseHeader.entrySet());
-   return new UrlResponseInfoImpl(
-       Collections.singletonList(url),
-       statusCode,
-       null, // httpStatusText
-       responseHeaderList,
-       false, // wasCached
-       null, // negotiatedProtocol
-       null); // proxyServer
+   Map<String, List<String>> responseHeaderMap = new HashMap<>();
+   for (Map.Entry<String, String> entry : testResponseHeader.entrySet()) {
+     responseHeaderList.add(entry);
+     responseHeaderMap.put(entry.getKey(), Collections.singletonList(entry.getValue()));
+   }
+   return new UrlResponseInfo() {
+     @Override
+     public String getUrl() {
+       return url;
+     }
+     @Override
+     public List<String> getUrlChain() {
+       return Collections.singletonList(url);
+     }
+     @Override
+     public int getHttpStatusCode() {
+       return statusCode;
+     }
+     @Override
+     public String getHttpStatusText() {
+       return null;
+     }
+     @Override
+     public List<Map.Entry<String, String>> getAllHeadersAsList() {
+       return responseHeaderList;
+     }
+     @Override
+     public Map<String, List<String>> getAllHeaders() {
+       return responseHeaderMap;
+     }
+     @Override
+     public boolean wasCached() {
+       return false;
+     }
+     @Override
+     public String getNegotiatedProtocol() {
+       return null;
+     }
+     @Override
+     public String getProxyServer() {
+       return null;
+     }
+     @Override
+     public long getReceivedByteCount() {
+       return 0;
+     }
+   };
  }

  @Test
@@ -282,7 +330,7 @@ public final class CronetDataSourceTest {
      fail("HttpDataSource.HttpDataSourceException expected");
    } catch (HttpDataSourceException e) {
      // Check for connection not automatically closed.
-     assertThat(e.getCause() instanceof UnknownHostException).isFalse();
+     assertThat(e).hasCauseThat().isNotInstanceOf(UnknownHostException.class);
      verify(mockUrlRequest, never()).cancel();
      verify(mockTransferListener, never())
          .onTransferStart(dataSourceUnderTest, testDataSpec, /* isNetwork= */ true);
@@ -320,7 +368,7 @@ public final class CronetDataSourceTest {
      fail("HttpDataSource.HttpDataSourceException expected");
    } catch (HttpDataSourceException e) {
      // Check for connection not automatically closed.
-     assertThat(e.getCause() instanceof UnknownHostException).isTrue();
+     assertThat(e).hasCauseThat().isInstanceOf(UnknownHostException.class);
      verify(mockUrlRequest, never()).cancel();
      verify(mockTransferListener, never())
          .onTransferStart(dataSourceUnderTest, testDataSpec, /* isNetwork= */ true);
@@ -336,7 +384,7 @@ public final class CronetDataSourceTest {
      dataSourceUnderTest.open(testDataSpec);
      fail("HttpDataSource.HttpDataSourceException expected");
    } catch (HttpDataSourceException e) {
-      assertThat(e instanceof HttpDataSource.InvalidResponseCodeException).isTrue();
+      assertThat(e).isInstanceOf(HttpDataSource.InvalidResponseCodeException.class);
      // Check for connection not automatically closed.
      verify(mockUrlRequest, never()).cancel();
      verify(mockTransferListener, never())
@@ -359,7 +407,7 @@ public final class CronetDataSourceTest {
      dataSourceUnderTest.open(testDataSpec);
      fail("HttpDataSource.HttpDataSourceException expected");
    } catch (HttpDataSourceException e) {
-      assertThat(e instanceof HttpDataSource.InvalidContentTypeException).isTrue();
+      assertThat(e).isInstanceOf(HttpDataSource.InvalidContentTypeException.class);
      // Check for connection not automatically closed.
      verify(mockUrlRequest, never()).cancel();
      assertThat(testedContentTypes).hasSize(1);
@@ -890,8 +938,8 @@ public final class CronetDataSourceTest {
      fail();
    } catch (HttpDataSourceException e) {
      // Expected.
-      assertThat(e instanceof CronetDataSource.OpenException).isTrue();
-      assertThat(e.getCause() instanceof SocketTimeoutException).isTrue();
+      assertThat(e).isInstanceOf(CronetDataSource.OpenException.class);
+      assertThat(e).hasCauseThat().isInstanceOf(SocketTimeoutException.class);
      assertThat(((CronetDataSource.OpenException) e).cronetConnectionStatus)
          .isEqualTo(TEST_CONNECTION_STATUS);
      timedOutLatch.countDown();
@@ -928,8 +976,8 @@ public final class CronetDataSourceTest {
      fail();
    } catch (HttpDataSourceException e) {
      // Expected.
-      assertThat(e instanceof CronetDataSource.OpenException).isTrue();
-      assertThat(e.getCause() instanceof CronetDataSource.InterruptedIOException).isTrue();
+      assertThat(e).isInstanceOf(CronetDataSource.OpenException.class);
+      assertThat(e).hasCauseThat().isInstanceOf(InterruptedIOException.class);
      assertThat(((CronetDataSource.OpenException) e).cronetConnectionStatus)
          .isEqualTo(TEST_INVALID_CONNECTION_STATUS);
      timedOutLatch.countDown();
@@ -999,8 +1047,8 @@ public final class CronetDataSourceTest {
      fail();
    } catch (HttpDataSourceException e) {
      // Expected.
-      assertThat(e instanceof CronetDataSource.OpenException).isTrue();
-      assertThat(e.getCause() instanceof SocketTimeoutException).isTrue();
+      assertThat(e).isInstanceOf(CronetDataSource.OpenException.class);
+      assertThat(e).hasCauseThat().isInstanceOf(SocketTimeoutException.class);
      openExceptions.getAndIncrement();
      timedOutLatch.countDown();
    }
@@ -1224,7 +1272,7 @@ public final class CronetDataSourceTest {
      fail();
    } catch (HttpDataSourceException e) {
      // Expected.
-      assertThat(e.getCause() instanceof CronetDataSource.InterruptedIOException).isTrue();
+      assertThat(e).hasCauseThat().isInstanceOf(InterruptedIOException.class);
      timedOutLatch.countDown();
    }
  }
@@ -1255,7 +1303,7 @@ public final class CronetDataSourceTest {
      fail();
    } catch (HttpDataSourceException e) {
      // Expected.
-      assertThat(e.getCause() instanceof CronetDataSource.InterruptedIOException).isTrue();
+      assertThat(e).hasCauseThat().isInstanceOf(InterruptedIOException.class);
      timedOutLatch.countDown();
    }
  }
...
@@ -32,11 +32,12 @@ android {
}
dependencies {
-    api 'com.google.ads.interactivemedia.v3:interactivemedia:3.11.3'
+    api 'com.google.ads.interactivemedia.v3:interactivemedia:3.19.0'
    implementation project(modulePrefix + 'library-core')
    implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
    implementation 'com.google.android.gms:play-services-ads-identifier:17.0.0'
    testImplementation project(modulePrefix + 'testutils')
+    testImplementation 'com.google.guava:guava:' + guavaVersion
    testImplementation 'org.robolectric:robolectric:' + robolectricVersion
}
...
/*
* Copyright (C) 2018 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.ima;
import com.google.ads.interactivemedia.v3.api.Ad;
import com.google.ads.interactivemedia.v3.api.AdPodInfo;
import com.google.ads.interactivemedia.v3.api.CompanionAd;
import com.google.ads.interactivemedia.v3.api.UiElement;
import java.util.List;
import java.util.Set;
/** A fake ad for testing. */
/* package */ final class FakeAd implements Ad {
private final boolean skippable;
private final AdPodInfo adPodInfo;
public FakeAd(boolean skippable, int podIndex, int totalAds, int adPosition) {
this.skippable = skippable;
adPodInfo =
new AdPodInfo() {
@Override
public int getTotalAds() {
return totalAds;
}
@Override
public int getAdPosition() {
return adPosition;
}
@Override
public int getPodIndex() {
return podIndex;
}
@Override
public boolean isBumper() {
throw new UnsupportedOperationException();
}
@Override
public double getMaxDuration() {
throw new UnsupportedOperationException();
}
@Override
public double getTimeOffset() {
throw new UnsupportedOperationException();
}
};
}
@Override
public int getVastMediaWidth() {
throw new UnsupportedOperationException();
}
@Override
public int getVastMediaHeight() {
throw new UnsupportedOperationException();
}
@Override
public int getVastMediaBitrate() {
throw new UnsupportedOperationException();
}
@Override
public boolean isSkippable() {
return skippable;
}
@Override
public AdPodInfo getAdPodInfo() {
return adPodInfo;
}
@Override
public String getAdId() {
throw new UnsupportedOperationException();
}
@Override
public String getCreativeId() {
throw new UnsupportedOperationException();
}
@Override
public String getCreativeAdId() {
throw new UnsupportedOperationException();
}
@Override
public String getUniversalAdIdValue() {
throw new UnsupportedOperationException();
}
@Override
public String getUniversalAdIdRegistry() {
throw new UnsupportedOperationException();
}
@Override
public String getAdSystem() {
throw new UnsupportedOperationException();
}
@Override
public String[] getAdWrapperIds() {
throw new UnsupportedOperationException();
}
@Override
public String[] getAdWrapperSystems() {
throw new UnsupportedOperationException();
}
@Override
public String[] getAdWrapperCreativeIds() {
throw new UnsupportedOperationException();
}
@Override
public boolean isLinear() {
throw new UnsupportedOperationException();
}
@Override
public double getSkipTimeOffset() {
throw new UnsupportedOperationException();
}
@Override
public boolean isUiDisabled() {
throw new UnsupportedOperationException();
}
@Override
public String getDescription() {
throw new UnsupportedOperationException();
}
@Override
public String getTitle() {
throw new UnsupportedOperationException();
}
@Override
public String getContentType() {
throw new UnsupportedOperationException();
}
@Override
public String getAdvertiserName() {
throw new UnsupportedOperationException();
}
@Override
public String getSurveyUrl() {
throw new UnsupportedOperationException();
}
@Override
public String getDealId() {
throw new UnsupportedOperationException();
}
@Override
public int getWidth() {
throw new UnsupportedOperationException();
}
@Override
public int getHeight() {
throw new UnsupportedOperationException();
}
@Override
public String getTraffickingParameters() {
throw new UnsupportedOperationException();
}
@Override
public double getDuration() {
throw new UnsupportedOperationException();
}
@Override
public Set<UiElement> getUiElements() {
throw new UnsupportedOperationException();
}
@Override
public List<CompanionAd> getCompanionAds() {
throw new UnsupportedOperationException();
}
}
/*
* Copyright (C) 2018 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.ima;
import com.google.ads.interactivemedia.v3.api.AdErrorEvent.AdErrorListener;
import com.google.ads.interactivemedia.v3.api.AdsManager;
import com.google.ads.interactivemedia.v3.api.AdsManagerLoadedEvent;
import com.google.ads.interactivemedia.v3.api.AdsRequest;
import com.google.ads.interactivemedia.v3.api.ImaSdkSettings;
import com.google.ads.interactivemedia.v3.api.StreamManager;
import com.google.ads.interactivemedia.v3.api.StreamRequest;
import com.google.android.exoplayer2.util.Assertions;
import java.util.ArrayList;
/** Fake {@link com.google.ads.interactivemedia.v3.api.AdsLoader} implementation for tests. */
public final class FakeAdsLoader implements com.google.ads.interactivemedia.v3.api.AdsLoader {
private final ImaSdkSettings imaSdkSettings;
private final AdsManager adsManager;
private final ArrayList<AdsLoadedListener> adsLoadedListeners;
private final ArrayList<AdErrorListener> adErrorListeners;
public FakeAdsLoader(ImaSdkSettings imaSdkSettings, AdsManager adsManager) {
this.imaSdkSettings = Assertions.checkNotNull(imaSdkSettings);
this.adsManager = Assertions.checkNotNull(adsManager);
adsLoadedListeners = new ArrayList<>();
adErrorListeners = new ArrayList<>();
}
@Override
public void contentComplete() {
// Do nothing.
}
@Override
public ImaSdkSettings getSettings() {
return imaSdkSettings;
}
@Override
public void requestAds(AdsRequest adsRequest) {
for (AdsLoadedListener listener : adsLoadedListeners) {
listener.onAdsManagerLoaded(
new AdsManagerLoadedEvent() {
@Override
public AdsManager getAdsManager() {
return adsManager;
}
@Override
public StreamManager getStreamManager() {
throw new UnsupportedOperationException();
}
@Override
public Object getUserRequestContext() {
return adsRequest.getUserRequestContext();
}
});
}
}
@Override
public String requestStream(StreamRequest streamRequest) {
throw new UnsupportedOperationException();
}
@Override
public void addAdsLoadedListener(AdsLoadedListener adsLoadedListener) {
adsLoadedListeners.add(adsLoadedListener);
}
@Override
public void removeAdsLoadedListener(AdsLoadedListener adsLoadedListener) {
adsLoadedListeners.remove(adsLoadedListener);
}
@Override
public void addAdErrorListener(AdErrorListener adErrorListener) {
adErrorListeners.add(adErrorListener);
}
@Override
public void removeAdErrorListener(AdErrorListener adErrorListener) {
adErrorListeners.remove(adErrorListener);
}
}
/*
* Copyright (C) 2018 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.ima;
import com.google.ads.interactivemedia.v3.api.AdDisplayContainer;
import com.google.ads.interactivemedia.v3.api.AdsRequest;
import com.google.ads.interactivemedia.v3.api.player.ContentProgressProvider;
import java.util.List;
import java.util.Map;
/** Fake {@link AdsRequest} implementation for tests. */
public final class FakeAdsRequest implements AdsRequest {
private String adTagUrl;
private String adsResponse;
private Object userRequestContext;
private AdDisplayContainer adDisplayContainer;
private ContentProgressProvider contentProgressProvider;
@Override
public void setAdTagUrl(String adTagUrl) {
this.adTagUrl = adTagUrl;
}
@Override
public String getAdTagUrl() {
return adTagUrl;
}
@Override
public void setExtraParameter(String s, String s1) {
throw new UnsupportedOperationException();
}
@Override
public String getExtraParameter(String s) {
throw new UnsupportedOperationException();
}
@Override
public Map<String, String> getExtraParameters() {
throw new UnsupportedOperationException();
}
@Override
public void setUserRequestContext(Object userRequestContext) {
this.userRequestContext = userRequestContext;
}
@Override
public Object getUserRequestContext() {
return userRequestContext;
}
@Override
public AdDisplayContainer getAdDisplayContainer() {
return adDisplayContainer;
}
@Override
public void setAdDisplayContainer(AdDisplayContainer adDisplayContainer) {
this.adDisplayContainer = adDisplayContainer;
}
@Override
public ContentProgressProvider getContentProgressProvider() {
return contentProgressProvider;
}
@Override
public void setContentProgressProvider(ContentProgressProvider contentProgressProvider) {
this.contentProgressProvider = contentProgressProvider;
}
@Override
public String getAdsResponse() {
return adsResponse;
}
@Override
public void setAdsResponse(String adsResponse) {
this.adsResponse = adsResponse;
}
@Override
public void setAdWillAutoPlay(boolean b) {
throw new UnsupportedOperationException();
}
@Override
public void setAdWillPlayMuted(boolean b) {
throw new UnsupportedOperationException();
}
@Override
public void setContentDuration(float v) {
throw new UnsupportedOperationException();
}
@Override
public void setContentKeywords(List<String> list) {
throw new UnsupportedOperationException();
}
@Override
public void setContentTitle(String s) {
throw new UnsupportedOperationException();
}
@Override
public void setVastLoadTimeout(float v) {
throw new UnsupportedOperationException();
}
@Override
public void setLiveStreamPrefetchSeconds(float v) {
throw new UnsupportedOperationException();
}
}
/*
* Copyright (C) 2018 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.ext.ima;
import android.content.Context;
import com.google.ads.interactivemedia.v3.api.AdDisplayContainer;
import com.google.ads.interactivemedia.v3.api.AdsLoader;
import com.google.ads.interactivemedia.v3.api.AdsRenderingSettings;
import com.google.ads.interactivemedia.v3.api.AdsRequest;
import com.google.ads.interactivemedia.v3.api.ImaSdkSettings;
/** {@link ImaAdsLoader.ImaFactory} that returns provided instances from each getter, for tests. */
final class SingletonImaFactory implements ImaAdsLoader.ImaFactory {
private final ImaSdkSettings imaSdkSettings;
private final AdsRenderingSettings adsRenderingSettings;
private final AdDisplayContainer adDisplayContainer;
private final AdsRequest adsRequest;
private final com.google.ads.interactivemedia.v3.api.AdsLoader adsLoader;
public SingletonImaFactory(
ImaSdkSettings imaSdkSettings,
AdsRenderingSettings adsRenderingSettings,
AdDisplayContainer adDisplayContainer,
AdsRequest adsRequest,
com.google.ads.interactivemedia.v3.api.AdsLoader adsLoader) {
this.imaSdkSettings = imaSdkSettings;
this.adsRenderingSettings = adsRenderingSettings;
this.adDisplayContainer = adDisplayContainer;
this.adsRequest = adsRequest;
this.adsLoader = adsLoader;
}
@Override
public ImaSdkSettings createImaSdkSettings() {
return imaSdkSettings;
}
@Override
public AdsRenderingSettings createAdsRenderingSettings() {
return adsRenderingSettings;
}
@Override
public AdDisplayContainer createAdDisplayContainer() {
return adDisplayContainer;
}
@Override
public AdsRequest createAdsRequest() {
return adsRequest;
}
@Override
public AdsLoader createAdsLoader(
Context context, ImaSdkSettings imaSdkSettings, AdDisplayContainer adDisplayContainer) {
return adsLoader;
}
}
@@ -218,25 +218,25 @@ public final class MediaSessionConnector {
     *
     * @param mediaId The media id of the media item to be prepared.
     * @param playWhenReady Whether playback should be started after preparation.
-     * @param extras A {@link Bundle} of extras passed by the media controller.
+     * @param extras A {@link Bundle} of extras passed by the media controller, may be null.
     */
-    void onPrepareFromMediaId(String mediaId, boolean playWhenReady, Bundle extras);
+    void onPrepareFromMediaId(String mediaId, boolean playWhenReady, @Nullable Bundle extras);
    /**
     * See {@link MediaSessionCompat.Callback#onPrepareFromSearch(String, Bundle)}.
     *
     * @param query The search query.
     * @param playWhenReady Whether playback should be started after preparation.
-     * @param extras A {@link Bundle} of extras passed by the media controller.
+     * @param extras A {@link Bundle} of extras passed by the media controller, may be null.
     */
-    void onPrepareFromSearch(String query, boolean playWhenReady, Bundle extras);
+    void onPrepareFromSearch(String query, boolean playWhenReady, @Nullable Bundle extras);
    /**
     * See {@link MediaSessionCompat.Callback#onPrepareFromUri(Uri, Bundle)}.
     *
     * @param uri The {@link Uri} of the media item to be prepared.
     * @param playWhenReady Whether playback should be started after preparation.
-     * @param extras A {@link Bundle} of extras passed by the media controller.
+     * @param extras A {@link Bundle} of extras passed by the media controller, may be null.
     */
-    void onPrepareFromUri(Uri uri, boolean playWhenReady, Bundle extras);
+    void onPrepareFromUri(Uri uri, boolean playWhenReady, @Nullable Bundle extras);
  }
  /**
@@ -336,7 +336,7 @@ public final class MediaSessionConnector {
    void onSetRating(Player player, RatingCompat rating);
    /** See {@link MediaSessionCompat.Callback#onSetRating(RatingCompat, Bundle)}. */
-    void onSetRating(Player player, RatingCompat rating, Bundle extras);
+    void onSetRating(Player player, RatingCompat rating, @Nullable Bundle extras);
  }
  /** Handles requests for enabling or disabling captions. */
@@ -381,7 +381,7 @@ public final class MediaSessionConnector {
     * @param controlDispatcher A {@link ControlDispatcher} that should be used for dispatching
     *     changes to the player.
     * @param action The name of the action which was sent by a media controller.
-     * @param extras Optional extras sent by a media controller.
+     * @param extras Optional extras sent by a media controller, may be null.
     */
    void onCustomAction(
        Player player, ControlDispatcher controlDispatcher, String action, @Nullable Bundle extras);
@@ -987,7 +987,9 @@ public final class MediaSessionConnector {
      @Player.State int exoPlayerPlaybackState, boolean playWhenReady) {
    switch (exoPlayerPlaybackState) {
      case Player.STATE_BUFFERING:
-        return PlaybackStateCompat.STATE_BUFFERING;
+        return playWhenReady
+            ? PlaybackStateCompat.STATE_BUFFERING
+            : PlaybackStateCompat.STATE_PAUSED;
      case Player.STATE_READY:
        return playWhenReady ? PlaybackStateCompat.STATE_PLAYING : PlaybackStateCompat.STATE_PAUSED;
      case Player.STATE_ENDED:
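The hunk above stops advertising a buffering state to the media session when `playWhenReady` is false, so controllers show "paused" rather than a spinner for a paused-while-buffering player. A standalone sketch of the resulting mapping (the integer constants below are simplified stand-ins for `Player.STATE_*` and `PlaybackStateCompat.STATE_*`, and the `STATE_ENDED` branch is an assumption for the sketch):

```java
public class SessionStateMapping {
  // Simplified stand-ins, not the real framework constant values.
  static final int BUFFERING = 2, READY = 3, ENDED = 4;
  static final int PS_NONE = 0, PS_STOPPED = 1, PS_PAUSED = 2, PS_PLAYING = 3, PS_BUFFERING = 6;

  // Mirrors the patched switch: a player that is buffering but not set to
  // play when ready is now reported as paused instead of buffering.
  static int toSessionState(int playerState, boolean playWhenReady) {
    switch (playerState) {
      case BUFFERING:
        return playWhenReady ? PS_BUFFERING : PS_PAUSED;
      case READY:
        return playWhenReady ? PS_PLAYING : PS_PAUSED;
      case ENDED:
        return PS_STOPPED;
      default:
        return PS_NONE;
    }
  }

  public static void main(String[] args) {
    System.out.println(toSessionState(BUFFERING, false)); // 2 (paused)
    System.out.println(toSessionState(BUFFERING, true));  // 6 (buffering)
  }
}
```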
@@ -1319,42 +1321,42 @@ public final class MediaSessionConnector {
    }
    @Override
-    public void onPrepareFromMediaId(String mediaId, Bundle extras) {
+    public void onPrepareFromMediaId(String mediaId, @Nullable Bundle extras) {
      if (canDispatchToPlaybackPreparer(PlaybackStateCompat.ACTION_PREPARE_FROM_MEDIA_ID)) {
        playbackPreparer.onPrepareFromMediaId(mediaId, /* playWhenReady= */ false, extras);
      }
    }
    @Override
-    public void onPrepareFromSearch(String query, Bundle extras) {
+    public void onPrepareFromSearch(String query, @Nullable Bundle extras) {
      if (canDispatchToPlaybackPreparer(PlaybackStateCompat.ACTION_PREPARE_FROM_SEARCH)) {
        playbackPreparer.onPrepareFromSearch(query, /* playWhenReady= */ false, extras);
      }
    }
    @Override
-    public void onPrepareFromUri(Uri uri, Bundle extras) {
+    public void onPrepareFromUri(Uri uri, @Nullable Bundle extras) {
      if (canDispatchToPlaybackPreparer(PlaybackStateCompat.ACTION_PREPARE_FROM_URI)) {
        playbackPreparer.onPrepareFromUri(uri, /* playWhenReady= */ false, extras);
      }
    }
    @Override
-    public void onPlayFromMediaId(String mediaId, Bundle extras) {
+    public void onPlayFromMediaId(String mediaId, @Nullable Bundle extras) {
      if (canDispatchToPlaybackPreparer(PlaybackStateCompat.ACTION_PLAY_FROM_MEDIA_ID)) {
        playbackPreparer.onPrepareFromMediaId(mediaId, /* playWhenReady= */ true, extras);
      }
    }
    @Override
-    public void onPlayFromSearch(String query, Bundle extras) {
+    public void onPlayFromSearch(String query, @Nullable Bundle extras) {
      if (canDispatchToPlaybackPreparer(PlaybackStateCompat.ACTION_PLAY_FROM_SEARCH)) {
        playbackPreparer.onPrepareFromSearch(query, /* playWhenReady= */ true, extras);
      }
    }
    @Override
-    public void onPlayFromUri(Uri uri, Bundle extras) {
+    public void onPlayFromUri(Uri uri, @Nullable Bundle extras) {
      if (canDispatchToPlaybackPreparer(PlaybackStateCompat.ACTION_PLAY_FROM_URI)) {
        playbackPreparer.onPrepareFromUri(uri, /* playWhenReady= */ true, extras);
      }
    }
@@ -1368,7 +1370,7 @@ public final class MediaSessionConnector {
    }
    @Override
-    public void onSetRating(RatingCompat rating, Bundle extras) {
+    public void onSetRating(RatingCompat rating, @Nullable Bundle extras) {
      if (canDispatchSetRating()) {
        ratingCallback.onSetRating(player, rating, extras);
      }
...
@@ -41,7 +41,7 @@ dependencies {
    // https://cashapp.github.io/2019-02-05/okhttp-3-13-requires-android-5
    // Since OkHttp is distributed as a jar rather than an aar, Gradle won't
    // stop us from making this mistake!
-    api 'com.squareup.okhttp3:okhttp:3.12.8'
+    api 'com.squareup.okhttp3:okhttp:3.12.11'
}
ext {
...
@@ -1056,7 +1056,8 @@ public final class C {
   * #ROLE_FLAG_DUB}, {@link #ROLE_FLAG_EMERGENCY}, {@link #ROLE_FLAG_CAPTION}, {@link
   * #ROLE_FLAG_SUBTITLE}, {@link #ROLE_FLAG_SIGN}, {@link #ROLE_FLAG_DESCRIBES_VIDEO}, {@link
   * #ROLE_FLAG_DESCRIBES_MUSIC_AND_SOUND}, {@link #ROLE_FLAG_ENHANCED_DIALOG_INTELLIGIBILITY},
-   * {@link #ROLE_FLAG_TRANSCRIBES_DIALOG} and {@link #ROLE_FLAG_EASY_TO_READ}.
+   * {@link #ROLE_FLAG_TRANSCRIBES_DIALOG}, {@link #ROLE_FLAG_EASY_TO_READ} and {@link
+   * #ROLE_FLAG_TRICK_PLAY}.
   */
  @Documented
  @Retention(RetentionPolicy.SOURCE)
@@ -1076,7 +1077,8 @@ public final class C {
    ROLE_FLAG_DESCRIBES_MUSIC_AND_SOUND,
    ROLE_FLAG_ENHANCED_DIALOG_INTELLIGIBILITY,
    ROLE_FLAG_TRANSCRIBES_DIALOG,
-    ROLE_FLAG_EASY_TO_READ
+    ROLE_FLAG_EASY_TO_READ,
+    ROLE_FLAG_TRICK_PLAY
  })
  public @interface RoleFlags {}
  /** Indicates a main track. */
@@ -1122,6 +1124,8 @@ public final class C {
  public static final int ROLE_FLAG_TRANSCRIBES_DIALOG = 1 << 12;
  /** Indicates the track contains a text that has been edited for ease of reading. */
  public static final int ROLE_FLAG_EASY_TO_READ = 1 << 13;
+  /** Indicates the track is intended for trick play. */
+  public static final int ROLE_FLAG_TRICK_PLAY = 1 << 14;
  /**
   * Converts a time in microseconds to the corresponding time in milliseconds, preserving
...
@@ -120,7 +120,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
  private int pendingPrepareCount;
  private SeekPosition pendingInitialSeekPosition;
  private long rendererPositionUs;
-  private int nextPendingMessageIndex;
+  private int nextPendingMessageIndexHint;
  private boolean deliverPendingMessageAtStartPositionRequired;
  public ExoPlayerImplInternal(
@@ -928,7 +928,6 @@ import java.util.concurrent.atomic.AtomicBoolean;
        pendingMessageInfo.message.markAsProcessed(/* isDelivered= */ false);
      }
      pendingMessages.clear();
-      nextPendingMessageIndex = 0;
    }
    MediaPeriodId mediaPeriodId =
        resetPosition
@@ -954,7 +953,12 @@ import java.util.concurrent.atomic.AtomicBoolean;
            startPositionUs);
    if (releaseMediaSource) {
      if (mediaSource != null) {
-        mediaSource.releaseSource(/* caller= */ this);
+        try {
+          mediaSource.releaseSource(/* caller= */ this);
+        } catch (RuntimeException e) {
+          // There's nothing we can do.
+          Log.e(TAG, "Failed to release child source.", e);
+        }
        mediaSource = null;
      }
    }
@@ -1077,6 +1081,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
    // Correct next index if necessary (e.g. after seeking, timeline changes, or new messages)
    int currentPeriodIndex =
        playbackInfo.timeline.getIndexOfPeriod(playbackInfo.periodId.periodUid);
+    int nextPendingMessageIndex = Math.min(nextPendingMessageIndexHint, pendingMessages.size());
    PendingMessageInfo previousInfo =
        nextPendingMessageIndex > 0 ? pendingMessages.get(nextPendingMessageIndex - 1) : null;
    while (previousInfo != null
@@ -1122,6 +1127,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
            ? pendingMessages.get(nextPendingMessageIndex)
            : null;
    }
+    nextPendingMessageIndexHint = nextPendingMessageIndex;
  }
  private void ensureStopped(Renderer renderer) throws ExoPlaybackException {
......
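The change above stops resetting the pending-message index to zero and instead keeps it as a hint that is clamped against the current list size on each use, so a stale hint can never index past the end of the list after media sources are removed. A minimal standalone sketch of that pattern (hypothetical names, not ExoPlayer API):

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the index-hint pattern: rather than resetting a cached index when
// the list shrinks, clamp the hint to the list size on every read.
public final class IndexHintDemo {
    private static int nextIndexHint = 7; // May be stale after the list shrank.

    static int nextIndex(List<String> pendingMessages) {
        // Clamp so a stale hint can never index past the end of the list.
        int next = Math.min(nextIndexHint, pendingMessages.size());
        nextIndexHint = next;
        return next;
    }

    public static void main(String[] args) {
        List<String> messages = Arrays.asList("a", "b");
        if (nextIndex(messages) != 2) {
            throw new AssertionError("hint should be clamped to list size");
        }
    }
}
```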
@@ -29,11 +29,11 @@ public final class ExoPlayerLibraryInfo {
   /** The version of the library expressed as a string, for example "1.2.3". */
   // Intentionally hardcoded. Do not derive from other constants (e.g. VERSION_INT) or vice versa.
-  public static final String VERSION = "2.11.4";
+  public static final String VERSION = "2.11.5";

   /** The version of the library expressed as {@code "ExoPlayerLib/" + VERSION}. */
   // Intentionally hardcoded. Do not derive from other constants (e.g. VERSION) or vice versa.
-  public static final String VERSION_SLASHY = "ExoPlayerLib/2.11.4";
+  public static final String VERSION_SLASHY = "ExoPlayerLib/2.11.5";

   /**
    * The version of the library expressed as an integer, for example 1002003.
@@ -43,7 +43,7 @@ public final class ExoPlayerLibraryInfo {
    * integer version 123045006 (123-045-006).
    */
   // Intentionally hardcoded. Do not derive from other constants (e.g. VERSION) or vice versa.
-  public static final int VERSION_INT = 2011004;
+  public static final int VERSION_INT = 2011005;

   /**
    * Whether the library was compiled with {@link com.google.android.exoplayer2.util.Assertions}
...
@@ -19,6 +19,7 @@ import android.util.Pair;
 import androidx.annotation.Nullable;
 import com.google.android.exoplayer2.source.ads.AdPlaybackState;
 import com.google.android.exoplayer2.util.Assertions;
+import com.google.android.exoplayer2.util.Util;

 /**
  * A flexible representation of the structure of media. A timeline is able to represent the
@@ -278,6 +279,48 @@ public abstract class Timeline {
       return positionInFirstPeriodUs;
     }

+    @Override
+    public boolean equals(@Nullable Object obj) {
+      if (this == obj) {
+        return true;
+      }
+      if (obj == null || !getClass().equals(obj.getClass())) {
+        return false;
+      }
+      Window that = (Window) obj;
+      return Util.areEqual(uid, that.uid)
+          && Util.areEqual(tag, that.tag)
+          && Util.areEqual(manifest, that.manifest)
+          && presentationStartTimeMs == that.presentationStartTimeMs
+          && windowStartTimeMs == that.windowStartTimeMs
+          && isSeekable == that.isSeekable
+          && isDynamic == that.isDynamic
+          && isLive == that.isLive
+          && defaultPositionUs == that.defaultPositionUs
+          && durationUs == that.durationUs
+          && firstPeriodIndex == that.firstPeriodIndex
+          && lastPeriodIndex == that.lastPeriodIndex
+          && positionInFirstPeriodUs == that.positionInFirstPeriodUs;
+    }
+
+    @Override
+    public int hashCode() {
+      int result = 7;
+      result = 31 * result + uid.hashCode();
+      result = 31 * result + (tag == null ? 0 : tag.hashCode());
+      result = 31 * result + (manifest == null ? 0 : manifest.hashCode());
+      result = 31 * result + (int) (presentationStartTimeMs ^ (presentationStartTimeMs >>> 32));
+      result = 31 * result + (int) (windowStartTimeMs ^ (windowStartTimeMs >>> 32));
+      result = 31 * result + (isSeekable ? 1 : 0);
+      result = 31 * result + (isDynamic ? 1 : 0);
+      result = 31 * result + (isLive ? 1 : 0);
+      result = 31 * result + (int) (defaultPositionUs ^ (defaultPositionUs >>> 32));
+      result = 31 * result + (int) (durationUs ^ (durationUs >>> 32));
+      result = 31 * result + firstPeriodIndex;
+      result = 31 * result + lastPeriodIndex;
+      result = 31 * result + (int) (positionInFirstPeriodUs ^ (positionInFirstPeriodUs >>> 32));
+      return result;
+    }
   }

   /**
@@ -423,8 +466,8 @@ public abstract class Timeline {
      * microseconds.
      *
      * @param adGroupIndex The ad group index.
-     * @return The time of the ad group at the index, in microseconds, or {@link
-     *     C#TIME_END_OF_SOURCE} for a post-roll ad group.
+     * @return The time of the ad group at the index relative to the start of the enclosing {@link
+     *     Period}, in microseconds, or {@link C#TIME_END_OF_SOURCE} for a post-roll ad group.
      */
     public long getAdGroupTimeUs(int adGroupIndex) {
       return adPlaybackState.adGroupTimesUs[adGroupIndex];
@@ -467,22 +510,23 @@ public abstract class Timeline {
     }

     /**
-     * Returns the index of the ad group at or before {@code positionUs}, if that ad group is
-     * unplayed. Returns {@link C#INDEX_UNSET} if the ad group at or before {@code positionUs} has
-     * no ads remaining to be played, or if there is no such ad group.
+     * Returns the index of the ad group at or before {@code positionUs} in the period, if that ad
+     * group is unplayed. Returns {@link C#INDEX_UNSET} if the ad group at or before {@code
+     * positionUs} has no ads remaining to be played, or if there is no such ad group.
      *
-     * @param positionUs The position at or before which to find an ad group, in microseconds.
+     * @param positionUs The period position at or before which to find an ad group, in
+     *     microseconds.
      * @return The index of the ad group, or {@link C#INDEX_UNSET}.
      */
     public int getAdGroupIndexForPositionUs(long positionUs) {
-      return adPlaybackState.getAdGroupIndexForPositionUs(positionUs);
+      return adPlaybackState.getAdGroupIndexForPositionUs(positionUs, durationUs);
     }

     /**
-     * Returns the index of the next ad group after {@code positionUs} that has ads remaining to be
-     * played. Returns {@link C#INDEX_UNSET} if there is no such ad group.
+     * Returns the index of the next ad group after {@code positionUs} in the period that has ads
+     * remaining to be played. Returns {@link C#INDEX_UNSET} if there is no such ad group.
      *
-     * @param positionUs The position after which to find an ad group, in microseconds.
+     * @param positionUs The period position after which to find an ad group, in microseconds.
      * @return The index of the ad group, or {@link C#INDEX_UNSET}.
      */
     public int getAdGroupIndexAfterPositionUs(long positionUs) {
@@ -534,6 +578,34 @@ public abstract class Timeline {
       return adPlaybackState.adResumePositionUs;
     }

+    @Override
+    public boolean equals(@Nullable Object obj) {
+      if (this == obj) {
+        return true;
+      }
+      if (obj == null || !getClass().equals(obj.getClass())) {
+        return false;
+      }
+      Period that = (Period) obj;
+      return Util.areEqual(id, that.id)
+          && Util.areEqual(uid, that.uid)
+          && windowIndex == that.windowIndex
+          && durationUs == that.durationUs
+          && positionInWindowUs == that.positionInWindowUs
+          && Util.areEqual(adPlaybackState, that.adPlaybackState);
+    }
+
+    @Override
+    public int hashCode() {
+      int result = 7;
+      result = 31 * result + (id == null ? 0 : id.hashCode());
+      result = 31 * result + (uid == null ? 0 : uid.hashCode());
+      result = 31 * result + windowIndex;
+      result = 31 * result + (int) (durationUs ^ (durationUs >>> 32));
+      result = 31 * result + (int) (positionInWindowUs ^ (positionInWindowUs >>> 32));
+      result = 31 * result + (adPlaybackState == null ? 0 : adPlaybackState.hashCode());
+      return result;
+    }
   }

   /** An empty timeline. */
@@ -834,4 +906,50 @@ public abstract class Timeline {
    * @return The unique id of the period.
    */
   public abstract Object getUidOfPeriod(int periodIndex);

+  @Override
+  public boolean equals(@Nullable Object obj) {
+    if (this == obj) {
+      return true;
+    }
+    if (!(obj instanceof Timeline)) {
+      return false;
+    }
+    Timeline other = (Timeline) obj;
+    if (other.getWindowCount() != getWindowCount() || other.getPeriodCount() != getPeriodCount()) {
+      return false;
+    }
+    Timeline.Window window = new Timeline.Window();
+    Timeline.Period period = new Timeline.Period();
+    Timeline.Window otherWindow = new Timeline.Window();
+    Timeline.Period otherPeriod = new Timeline.Period();
+    for (int i = 0; i < getWindowCount(); i++) {
+      if (!getWindow(i, window).equals(other.getWindow(i, otherWindow))) {
+        return false;
+      }
+    }
+    for (int i = 0; i < getPeriodCount(); i++) {
+      if (!getPeriod(i, period, /* setIds= */ true)
+          .equals(other.getPeriod(i, otherPeriod, /* setIds= */ true))) {
+        return false;
+      }
+    }
+    return true;
+  }
+
+  @Override
+  public int hashCode() {
+    Window window = new Window();
+    Period period = new Period();
+    int result = 7;
+    result = 31 * result + getWindowCount();
+    for (int i = 0; i < getWindowCount(); i++) {
+      result = 31 * result + getWindow(i, window).hashCode();
+    }
+    result = 31 * result + getPeriodCount();
+    for (int i = 0; i < getPeriodCount(); i++) {
+      result = 31 * result + getPeriod(i, period, /* setIds= */ true).hashCode();
+    }
+    return result;
+  }
 }
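The hashCode implementations added above fold each 64-bit field into 32 bits with `(int) (v ^ (v >>> 32))`. This is the same XOR-fold that `Long.hashCode(long)` performs, which a small standalone check confirms:

```java
// Demonstrates that the XOR-fold used in the new hashCode methods matches
// Long.hashCode for arbitrary 64-bit values.
public final class LongHashFold {
    static int fold(long v) {
        return (int) (v ^ (v >>> 32)); // High 32 bits mixed into the low 32.
    }

    public static void main(String[] args) {
        long[] samples = {0L, -1L, 123_456_789_012L, Long.MIN_VALUE};
        for (long v : samples) {
            if (fold(v) != Long.hashCode(v)) {
                throw new AssertionError("mismatch for " + v);
            }
        }
    }
}
```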
@@ -29,7 +29,6 @@ import java.util.HashMap;
 import java.util.Iterator;
 import java.util.Random;
 import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
-import org.checkerframework.checker.nullness.qual.RequiresNonNull;

 /**
  * Default {@link PlaybackSessionManager} which instantiates a new session for each window in the
@@ -48,8 +47,7 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
   private @MonotonicNonNull Listener listener;
   private Timeline currentTimeline;
-  @Nullable private MediaPeriodId currentMediaPeriodId;
-  @Nullable private String activeSessionId;
+  @Nullable private String currentSessionId;

   /** Creates session manager. */
   public DefaultPlaybackSessionManager() {
@@ -83,22 +81,34 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
   @Override
   public synchronized void updateSessions(EventTime eventTime) {
-    boolean isObviouslyFinished =
-        eventTime.mediaPeriodId != null
-            && currentMediaPeriodId != null
-            && eventTime.mediaPeriodId.windowSequenceNumber
-                < currentMediaPeriodId.windowSequenceNumber;
-    if (!isObviouslyFinished) {
-      SessionDescriptor descriptor =
-          getOrAddSession(eventTime.windowIndex, eventTime.mediaPeriodId);
-      if (!descriptor.isCreated) {
-        descriptor.isCreated = true;
-        Assertions.checkNotNull(listener).onSessionCreated(eventTime, descriptor.sessionId);
-        if (activeSessionId == null) {
-          updateActiveSession(eventTime, descriptor);
-        }
+    Assertions.checkNotNull(listener);
+    @Nullable SessionDescriptor currentSession = sessions.get(currentSessionId);
+    if (eventTime.mediaPeriodId != null && currentSession != null) {
+      // If we receive an event associated with a media period, then it needs to be either part of
+      // the current window if it's the first created media period, or a window that will be played
+      // in the future. Otherwise, we know that it belongs to a session that was already finished
+      // and we can ignore the event.
+      boolean isAlreadyFinished =
+          currentSession.windowSequenceNumber == C.INDEX_UNSET
+              ? currentSession.windowIndex != eventTime.windowIndex
+              : eventTime.mediaPeriodId.windowSequenceNumber < currentSession.windowSequenceNumber;
+      if (isAlreadyFinished) {
+        return;
       }
     }
+    SessionDescriptor eventSession =
+        getOrAddSession(eventTime.windowIndex, eventTime.mediaPeriodId);
+    if (currentSessionId == null) {
+      currentSessionId = eventSession.sessionId;
+    }
+    if (!eventSession.isCreated) {
+      eventSession.isCreated = true;
+      listener.onSessionCreated(eventTime, eventSession.sessionId);
+    }
+    if (eventSession.sessionId.equals(currentSessionId) && !eventSession.isActive) {
+      eventSession.isActive = true;
+      listener.onSessionActive(eventTime, eventSession.sessionId);
+    }
   }

   @Override
@@ -112,8 +122,8 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
       if (!session.tryResolvingToNewTimeline(previousTimeline, currentTimeline)) {
         iterator.remove();
         if (session.isCreated) {
-          if (session.sessionId.equals(activeSessionId)) {
-            activeSessionId = null;
+          if (session.sessionId.equals(currentSessionId)) {
+            currentSessionId = null;
           }
           listener.onSessionFinished(
               eventTime, session.sessionId, /* automaticTransitionToNextPlayback= */ false);
@@ -136,36 +146,55 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
       if (session.isFinishedAtEventTime(eventTime)) {
         iterator.remove();
         if (session.isCreated) {
-          boolean isRemovingActiveSession = session.sessionId.equals(activeSessionId);
-          boolean isAutomaticTransition = hasAutomaticTransition && isRemovingActiveSession;
-          if (isRemovingActiveSession) {
-            activeSessionId = null;
+          boolean isRemovingCurrentSession = session.sessionId.equals(currentSessionId);
+          boolean isAutomaticTransition =
+              hasAutomaticTransition && isRemovingCurrentSession && session.isActive;
+          if (isRemovingCurrentSession) {
+            currentSessionId = null;
           }
           listener.onSessionFinished(eventTime, session.sessionId, isAutomaticTransition);
         }
       }
     }
-    SessionDescriptor activeSessionDescriptor =
+    @Nullable SessionDescriptor previousSessionDescriptor = sessions.get(currentSessionId);
+    SessionDescriptor currentSessionDescriptor =
         getOrAddSession(eventTime.windowIndex, eventTime.mediaPeriodId);
+    currentSessionId = currentSessionDescriptor.sessionId;
     if (eventTime.mediaPeriodId != null
         && eventTime.mediaPeriodId.isAd()
-        && (currentMediaPeriodId == null
-            || currentMediaPeriodId.windowSequenceNumber
+        && (previousSessionDescriptor == null
+            || previousSessionDescriptor.windowSequenceNumber
                 != eventTime.mediaPeriodId.windowSequenceNumber
-            || currentMediaPeriodId.adGroupIndex != eventTime.mediaPeriodId.adGroupIndex
-            || currentMediaPeriodId.adIndexInAdGroup != eventTime.mediaPeriodId.adIndexInAdGroup)) {
+            || previousSessionDescriptor.adMediaPeriodId == null
+            || previousSessionDescriptor.adMediaPeriodId.adGroupIndex
+                != eventTime.mediaPeriodId.adGroupIndex
+            || previousSessionDescriptor.adMediaPeriodId.adIndexInAdGroup
+                != eventTime.mediaPeriodId.adIndexInAdGroup)) {
       // New ad playback started. Find corresponding content session and notify ad playback started.
       MediaPeriodId contentMediaPeriodId =
          new MediaPeriodId(
              eventTime.mediaPeriodId.periodUid, eventTime.mediaPeriodId.windowSequenceNumber);
      SessionDescriptor contentSession =
          getOrAddSession(eventTime.windowIndex, contentMediaPeriodId);
-      if (contentSession.isCreated && activeSessionDescriptor.isCreated) {
+      if (contentSession.isCreated && currentSessionDescriptor.isCreated) {
         listener.onAdPlaybackStarted(
-            eventTime, contentSession.sessionId, activeSessionDescriptor.sessionId);
+            eventTime, contentSession.sessionId, currentSessionDescriptor.sessionId);
       }
     }
-    updateActiveSession(eventTime, activeSessionDescriptor);
   }

+  @Override
+  public void finishAllSessions(EventTime eventTime) {
+    currentSessionId = null;
+    Iterator<SessionDescriptor> iterator = sessions.values().iterator();
+    while (iterator.hasNext()) {
+      SessionDescriptor session = iterator.next();
+      iterator.remove();
+      if (session.isCreated && listener != null) {
+        listener.onSessionFinished(
+            eventTime, session.sessionId, /* automaticTransitionToNextPlayback= */ false);
+      }
+    }
+  }

   private SessionDescriptor getOrAddSession(
@@ -199,18 +228,6 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
     return bestMatch;
   }

-  @RequiresNonNull("listener")
-  private void updateActiveSession(EventTime eventTime, SessionDescriptor sessionDescriptor) {
-    currentMediaPeriodId = eventTime.mediaPeriodId;
-    if (sessionDescriptor.isCreated) {
-      activeSessionId = sessionDescriptor.sessionId;
-      if (!sessionDescriptor.isActive) {
-        sessionDescriptor.isActive = true;
-        listener.onSessionActive(eventTime, sessionDescriptor.sessionId);
-      }
-    }
-  }

   private static String generateSessionId() {
     byte[] randomBytes = new byte[SESSION_ID_LENGTH];
     RANDOM.nextBytes(randomBytes);
@@ -284,8 +301,7 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
         int eventWindowIndex, @Nullable MediaPeriodId eventMediaPeriodId) {
       if (windowSequenceNumber == C.INDEX_UNSET
           && eventWindowIndex == windowIndex
-          && eventMediaPeriodId != null
-          && !eventMediaPeriodId.isAd()) {
+          && eventMediaPeriodId != null) {
         // Set window sequence number for this session as soon as we have one.
         windowSequenceNumber = eventMediaPeriodId.windowSequenceNumber;
       }
...
@@ -117,4 +117,12 @@ public interface PlaybackSessionManager {
    * @param reason The {@link DiscontinuityReason}.
    */
   void handlePositionDiscontinuity(EventTime eventTime, @DiscontinuityReason int reason);

+  /**
+   * Finishes all existing sessions and calls their respective {@link
+   * Listener#onSessionFinished(EventTime, String, boolean)} callback.
+   *
+   * @param eventTime The event time at which sessions are finished.
+   */
+  void finishAllSessions(EventTime eventTime);
 }
@@ -37,7 +37,7 @@ import java.lang.annotation.RetentionPolicy;
  *
  * <p>If {@link #hasTimestamp()} returns {@code true}, call {@link #getTimestampSystemTimeUs()} to
  * get the system time at which the latest timestamp was sampled and {@link
- * #getTimestampPositionFrames()} to get its position in frames. If {@link #isTimestampAdvancing()}
+ * #getTimestampPositionFrames()} to get its position in frames. If {@link #hasAdvancingTimestamp()}
  * returns {@code true}, the caller should assume that the timestamp has been increasing in real
  * time since it was sampled. Otherwise, it may be stationary.
  *
@@ -68,7 +68,7 @@ import java.lang.annotation.RetentionPolicy;
   private static final int STATE_ERROR = 4;

   /** The polling interval for {@link #STATE_INITIALIZING} and {@link #STATE_TIMESTAMP}. */
-  private static final int FAST_POLL_INTERVAL_US = 5_000;
+  private static final int FAST_POLL_INTERVAL_US = 10_000;

   /**
    * The polling interval for {@link #STATE_TIMESTAMP_ADVANCING} and {@link #STATE_NO_TIMESTAMP}.
    */
@@ -110,7 +110,7 @@ import java.lang.annotation.RetentionPolicy;
    * timestamp is available via {@link #getTimestampSystemTimeUs()} and {@link
    * #getTimestampPositionFrames()}, and the caller should call {@link #acceptTimestamp()} if the
    * timestamp was valid, or {@link #rejectTimestamp()} otherwise. The values returned by {@link
-   * #hasTimestamp()} and {@link #isTimestampAdvancing()} may be updated.
+   * #hasTimestamp()} and {@link #hasAdvancingTimestamp()} may be updated.
    *
    * @param systemTimeUs The current system time, in microseconds.
    * @return Whether the timestamp was updated.
@@ -200,12 +200,12 @@ import java.lang.annotation.RetentionPolicy;
   }

   /**
-   * Returns whether the timestamp appears to be advancing. If {@code true}, call {@link
+   * Returns whether this instance has an advancing timestamp. If {@code true}, call {@link
    * #getTimestampSystemTimeUs()} and {@link #getTimestampPositionFrames()} to access the timestamp.
    * A current position for the track can be extrapolated based on elapsed real time since the
    * system time at which the timestamp was sampled.
    */
-  public boolean isTimestampAdvancing() {
+  public boolean hasAdvancingTimestamp() {
     return state == STATE_TIMESTAMP_ADVANCING;
   }
...
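When `hasAdvancingTimestamp()` returns true, the javadoc above says a current position can be extrapolated from the sampled timestamp plus the real time elapsed since the sample. A minimal sketch of that arithmetic with illustrative numbers (not ExoPlayer code):

```java
// Sketch of the extrapolation described in the javadoc: position at sample
// time, plus the real time elapsed since the sample was taken.
public final class TimestampExtrapolation {
    static long extrapolatePositionUs(
            long timestampPositionUs, long timestampSystemTimeUs, long nowUs) {
        return timestampPositionUs + (nowUs - timestampSystemTimeUs);
    }

    public static void main(String[] args) {
        // 250 ms of real time elapsed since a sample taken at position 2 s.
        long positionUs = extrapolatePositionUs(2_000_000, 10_000_000, 10_250_000);
        if (positionUs != 2_250_000) {
            throw new AssertionError();
        }
    }
}
```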
@@ -123,6 +123,8 @@ import java.lang.reflect.Method;
    * <p>This is a fail safe that should not be required on correctly functioning devices.
    */
   private static final long MAX_LATENCY_US = 5 * C.MICROS_PER_SECOND;
+  /** The duration of time used to smooth over an adjustment between position sampling modes. */
+  private static final long MODE_SWITCH_SMOOTHING_DURATION_US = C.MICROS_PER_SECOND;

   private static final long FORCE_RESET_WORKAROUND_TIMEOUT_MS = 200;
@@ -160,6 +162,15 @@ import java.lang.reflect.Method;
   private long stopPlaybackHeadPosition;
   private long endPlaybackHeadPosition;

+  // Results from the previous call to getCurrentPositionUs.
+  private long lastPositionUs;
+  private long lastSystemTimeUs;
+  private boolean lastSampleUsedGetTimestampMode;
+
+  // Results from the last call to getCurrentPositionUs that used a different sample mode.
+  private long previousModePositionUs;
+  private long previousModeSystemTimeUs;

   /**
    * Creates a new audio track position tracker.
    *
@@ -206,6 +217,7 @@ import java.lang.reflect.Method;
     hasData = false;
     stopTimestampUs = C.TIME_UNSET;
     forceResetWorkaroundTimeMs = C.TIME_UNSET;
+    lastLatencySampleTimeUs = 0;
     latencyUs = 0;
   }
@@ -217,18 +229,16 @@ import java.lang.reflect.Method;
     // If the device supports it, use the playback timestamp from AudioTrack.getTimestamp.
     // Otherwise, derive a smoothed position by sampling the track's frame position.
     long systemTimeUs = System.nanoTime() / 1000;
+    long positionUs;
     AudioTimestampPoller audioTimestampPoller = Assertions.checkNotNull(this.audioTimestampPoller);
-    if (audioTimestampPoller.hasTimestamp()) {
+    boolean useGetTimestampMode = audioTimestampPoller.hasAdvancingTimestamp();
+    if (useGetTimestampMode) {
       // Calculate the speed-adjusted position using the timestamp (which may be in the future).
       long timestampPositionFrames = audioTimestampPoller.getTimestampPositionFrames();
       long timestampPositionUs = framesToDurationUs(timestampPositionFrames);
-      if (!audioTimestampPoller.isTimestampAdvancing()) {
-        return timestampPositionUs;
-      }
       long elapsedSinceTimestampUs = systemTimeUs - audioTimestampPoller.getTimestampSystemTimeUs();
-      return timestampPositionUs + elapsedSinceTimestampUs;
+      positionUs = timestampPositionUs + elapsedSinceTimestampUs;
     } else {
-      long positionUs;
       if (playheadOffsetCount == 0) {
         // The AudioTrack has started, but we don't have any samples to compute a smoothed position.
         positionUs = getPlaybackHeadPositionUs();
@@ -239,10 +249,31 @@ import java.lang.reflect.Method;
         positionUs = systemTimeUs + smoothedPlayheadOffsetUs;
       }
       if (!sourceEnded) {
-        positionUs -= latencyUs;
+        positionUs = Math.max(0, positionUs - latencyUs);
       }
-      return positionUs;
     }
+    if (lastSampleUsedGetTimestampMode != useGetTimestampMode) {
+      // We've switched sampling mode.
+      previousModeSystemTimeUs = lastSystemTimeUs;
+      previousModePositionUs = lastPositionUs;
+    }
+    long elapsedSincePreviousModeUs = systemTimeUs - previousModeSystemTimeUs;
+    if (elapsedSincePreviousModeUs < MODE_SWITCH_SMOOTHING_DURATION_US) {
+      // Use a ramp to smooth between the old mode and the new one to avoid introducing a sudden
+      // jump if the two modes disagree.
+      long previousModeProjectedPositionUs = previousModePositionUs + elapsedSincePreviousModeUs;
+      // A ramp consisting of 1000 points distributed over MODE_SWITCH_SMOOTHING_DURATION_US.
+      long rampPoint = (elapsedSincePreviousModeUs * 1000) / MODE_SWITCH_SMOOTHING_DURATION_US;
+      positionUs *= rampPoint;
+      positionUs += (1000 - rampPoint) * previousModeProjectedPositionUs;
+      positionUs /= 1000;
+    }
+    lastSystemTimeUs = systemTimeUs;
+    lastPositionUs = positionUs;
+    lastSampleUsedGetTimestampMode = useGetTimestampMode;
+    return positionUs;
   }

   /** Starts position tracking. Must be called immediately before {@link AudioTrack#play()}. */
@@ -353,7 +384,7 @@ import java.lang.reflect.Method;
   }

   /**
-   * Resets the position tracker. Should be called when the audio track previous passed to {@link
+   * Resets the position tracker. Should be called when the audio track previously passed to {@link
    * #setAudioTrack(AudioTrack, int, int, int)} is no longer in use.
    */
   public void reset() {
@@ -457,6 +488,8 @@ import java.lang.reflect.Method;
     playheadOffsetCount = 0;
     nextPlayheadOffsetIndex = 0;
     lastPlayheadSampleTimeUs = 0;
+    lastSystemTimeUs = 0;
+    previousModeSystemTimeUs = 0;
   }

   /**
...
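The smoothing added to `getCurrentPositionUs` blends the new sampling mode's position with the old mode's projected position over a fixed window, using a 1000-point linear ramp. A self-contained sketch of that blend (hypothetical names, not the actual ExoPlayer code):

```java
// Sketch of the linear ramp used above to smooth between two position
// sampling modes over a fixed duration, avoiding a sudden position jump.
public final class PositionRamp {
    private static final long MODE_SWITCH_SMOOTHING_DURATION_US = 1_000_000; // 1 second.

    /** Blends the new-mode position with the projected old-mode position. */
    static long smooth(
            long newPositionUs,
            long previousModePositionUs,
            long previousModeSystemTimeUs,
            long systemTimeUs) {
        long elapsedUs = systemTimeUs - previousModeSystemTimeUs;
        if (elapsedUs >= MODE_SWITCH_SMOOTHING_DURATION_US) {
            return newPositionUs; // Ramp complete: use the new mode only.
        }
        // Project the old-mode position forward by the elapsed real time.
        long projectedOldUs = previousModePositionUs + elapsedUs;
        // Weight moves linearly from the old mode (0) to the new mode (1000).
        long rampPoint = (elapsedUs * 1000) / MODE_SWITCH_SMOOTHING_DURATION_US;
        return (newPositionUs * rampPoint + (1000 - rampPoint) * projectedOldUs) / 1000;
    }

    public static void main(String[] args) {
        // At the instant of the switch the old mode dominates entirely.
        if (smooth(500, 400, 0, 0) != 400) {
            throw new AssertionError();
        }
        // Once the smoothing window has elapsed, only the new mode is used.
        if (smooth(1_500_000, 400, 0, 1_000_000) != 1_500_000) {
            throw new AssertionError();
        }
    }
}
```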
...@@ -120,9 +120,20 @@ public final class DefaultAudioSink implements AudioSink { ...@@ -120,9 +120,20 @@ public final class DefaultAudioSink implements AudioSink {
/** /**
* Creates a new default chain of audio processors, with the user-defined {@code * Creates a new default chain of audio processors, with the user-defined {@code
* audioProcessors} applied before silence skipping and playback parameters. * audioProcessors} applied before silence skipping and speed adjustment processors.
*/ */
public DefaultAudioProcessorChain(AudioProcessor... audioProcessors) { public DefaultAudioProcessorChain(AudioProcessor... audioProcessors) {
this(audioProcessors, new SilenceSkippingAudioProcessor(), new SonicAudioProcessor());
}
/**
* Creates a new default chain of audio processors, with the user-defined {@code
* audioProcessors} applied before silence skipping and speed adjustment processors.
*/
public DefaultAudioProcessorChain(
AudioProcessor[] audioProcessors,
SilenceSkippingAudioProcessor silenceSkippingAudioProcessor,
SonicAudioProcessor sonicAudioProcessor) {
// The passed-in type may be more specialized than AudioProcessor[], so allocate a new array // The passed-in type may be more specialized than AudioProcessor[], so allocate a new array
// rather than using Arrays.copyOf. // rather than using Arrays.copyOf.
this.audioProcessors = new AudioProcessor[audioProcessors.length + 2]; this.audioProcessors = new AudioProcessor[audioProcessors.length + 2];
...@@ -132,8 +143,8 @@ public final class DefaultAudioSink implements AudioSink { ...@@ -132,8 +143,8 @@ public final class DefaultAudioSink implements AudioSink {
/* dest= */ this.audioProcessors, /* dest= */ this.audioProcessors,
/* destPos= */ 0, /* destPos= */ 0,
/* length= */ audioProcessors.length); /* length= */ audioProcessors.length);
silenceSkippingAudioProcessor = new SilenceSkippingAudioProcessor(); this.silenceSkippingAudioProcessor = silenceSkippingAudioProcessor;
sonicAudioProcessor = new SonicAudioProcessor(); this.sonicAudioProcessor = sonicAudioProcessor;
this.audioProcessors[audioProcessors.length] = silenceSkippingAudioProcessor; this.audioProcessors[audioProcessors.length] = silenceSkippingAudioProcessor;
this.audioProcessors[audioProcessors.length + 1] = sonicAudioProcessor; this.audioProcessors[audioProcessors.length + 1] = sonicAudioProcessor;
} }
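The new constructor lets callers inject pre-configured silence skipping and Sonic processors while still appending them after the user-supplied ones. A stripped-down sketch of that array layout (the `AudioProcessor` interface is stubbed here purely for illustration):

```java
// Demonstrates the copy-then-append layout used by DefaultAudioProcessorChain:
// user processors first, then the two fixed processors in slots N and N+1.
interface AudioProcessor {}

final class Chain {
  final AudioProcessor[] audioProcessors;

  Chain(AudioProcessor[] userProcessors,
      AudioProcessor silenceSkippingProcessor, AudioProcessor sonicProcessor) {
    // Allocate a fresh AudioProcessor[] rather than Arrays.copyOf, since the
    // incoming array may have a more specialized component type.
    audioProcessors = new AudioProcessor[userProcessors.length + 2];
    System.arraycopy(userProcessors, 0, audioProcessors, 0, userProcessors.length);
    audioProcessors[userProcessors.length] = silenceSkippingProcessor;
    audioProcessors[userProcessors.length + 1] = sonicProcessor;
  }
}
```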
......
...@@ -17,11 +17,13 @@ package com.google.android.exoplayer2.audio; ...@@ -17,11 +17,13 @@ package com.google.android.exoplayer2.audio;
import androidx.annotation.IntDef; import androidx.annotation.IntDef;
import com.google.android.exoplayer2.C; import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.util.Assertions;
import com.google.android.exoplayer2.util.Util; import com.google.android.exoplayer2.util.Util;
import java.lang.annotation.Documented; import java.lang.annotation.Documented;
import java.lang.annotation.Retention; import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy; import java.lang.annotation.RetentionPolicy;
import java.nio.ByteBuffer; import java.nio.ByteBuffer;
import java.nio.ByteOrder;
/** /**
* An {@link AudioProcessor} that skips silence in the input stream. Input and output are 16-bit * An {@link AudioProcessor} that skips silence in the input stream. Input and output are 16-bit
...@@ -30,27 +32,20 @@ import java.nio.ByteBuffer; ...@@ -30,27 +32,20 @@ import java.nio.ByteBuffer;
public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor { public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor {
/** /**
* The minimum duration of audio that must be below {@link #SILENCE_THRESHOLD_LEVEL} to classify * The default value for {@link #SilenceSkippingAudioProcessor(long, long, short)
* that part of audio as silent, in microseconds. * minimumSilenceDurationUs}.
*/ */
private static final long MINIMUM_SILENCE_DURATION_US = 150_000; public static final long DEFAULT_MINIMUM_SILENCE_DURATION_US = 150_000;
/** /**
* The duration of silence by which to extend non-silent sections, in microseconds. The value must * The default value for {@link #SilenceSkippingAudioProcessor(long, long, short)
* not exceed {@link #MINIMUM_SILENCE_DURATION_US}. * paddingSilenceUs}.
*/ */
private static final long PADDING_SILENCE_US = 20_000; public static final long DEFAULT_PADDING_SILENCE_US = 20_000;
/** /**
* The absolute level below which an individual PCM sample is classified as silent. Note: the * The default value for {@link #SilenceSkippingAudioProcessor(long, long, short)
* specified value will be rounded so that the threshold check only depends on the more * silenceThresholdLevel}.
* significant byte, for efficiency.
*/ */
private static final short SILENCE_THRESHOLD_LEVEL = 1024; public static final short DEFAULT_SILENCE_THRESHOLD_LEVEL = 1024;
/**
* Threshold for classifying an individual PCM sample as silent based on its more significant
* byte. This is {@link #SILENCE_THRESHOLD_LEVEL} divided by 256 with rounding.
*/
private static final byte SILENCE_THRESHOLD_LEVEL_MSB = (byte) ((SILENCE_THRESHOLD_LEVEL + 128) >> 8);
/** Trimming states. */ /** Trimming states. */
@Documented @Documented
...@@ -68,8 +63,10 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor { ...@@ -68,8 +63,10 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor {
/** State when the input is silent. */ /** State when the input is silent. */
private static final int STATE_SILENT = 2; private static final int STATE_SILENT = 2;
private final long minimumSilenceDurationUs;
private final long paddingSilenceUs;
private final short silenceThresholdLevel;
private int bytesPerFrame; private int bytesPerFrame;
private boolean enabled; private boolean enabled;
/** /**
...@@ -91,8 +88,31 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor { ...@@ -91,8 +88,31 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor {
private boolean hasOutputNoise; private boolean hasOutputNoise;
private long skippedFrames; private long skippedFrames;
/** Creates a new silence trimming audio processor. */ /** Creates a new silence skipping audio processor. */
public SilenceSkippingAudioProcessor() { public SilenceSkippingAudioProcessor() {
this(
DEFAULT_MINIMUM_SILENCE_DURATION_US,
DEFAULT_PADDING_SILENCE_US,
DEFAULT_SILENCE_THRESHOLD_LEVEL);
}
/**
* Creates a new silence skipping audio processor.
*
* @param minimumSilenceDurationUs The minimum duration of audio that must be below {@code
* silenceThresholdLevel} to classify that part of audio as silent, in microseconds.
* @param paddingSilenceUs The duration of silence by which to extend non-silent sections, in
* microseconds. The value must not exceed {@code minimumSilenceDurationUs}.
* @param silenceThresholdLevel The absolute level below which an individual PCM sample is
* classified as silent.
*/
public SilenceSkippingAudioProcessor(
long minimumSilenceDurationUs, long paddingSilenceUs, short silenceThresholdLevel) {
Assertions.checkArgument(paddingSilenceUs <= minimumSilenceDurationUs);
this.minimumSilenceDurationUs = minimumSilenceDurationUs;
this.paddingSilenceUs = paddingSilenceUs;
this.silenceThresholdLevel = silenceThresholdLevel;
maybeSilenceBuffer = Util.EMPTY_BYTE_ARRAY; maybeSilenceBuffer = Util.EMPTY_BYTE_ARRAY;
paddingBuffer = Util.EMPTY_BYTE_ARRAY; paddingBuffer = Util.EMPTY_BYTE_ARRAY;
} }
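The silence detection below this hunk now compares whole 16-bit samples via `getShort` rather than only the most significant byte. A simplified standalone re-implementation of the forward scan (names are illustrative; the real processor also handles padding and state transitions):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Scans little-endian 16-bit PCM for the first sample above the threshold,
// returning the byte position rounded down to the start of its frame, or the
// buffer limit if everything is classified as silent.
final class SilenceScan {
  static int findNoisePosition(ByteBuffer buffer, short silenceThresholdLevel,
      int bytesPerFrame) {
    for (int i = buffer.position(); i < buffer.limit(); i += 2) {
      if (Math.abs(buffer.getShort(i)) > silenceThresholdLevel) {
        // Round to the start of the frame.
        return bytesPerFrame * (i / bytesPerFrame);
      }
    }
    return buffer.limit();
  }
}
```

Comparing the full sample value makes the configurable `silenceThresholdLevel` exact, where the old MSB-only check rounded it to the nearest multiple of 256.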
...@@ -166,11 +186,11 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor { ...@@ -166,11 +186,11 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor {
protected void onFlush() { protected void onFlush() {
if (enabled) { if (enabled) {
bytesPerFrame = inputAudioFormat.bytesPerFrame; bytesPerFrame = inputAudioFormat.bytesPerFrame;
int maybeSilenceBufferSize = durationUsToFrames(MINIMUM_SILENCE_DURATION_US) * bytesPerFrame; int maybeSilenceBufferSize = durationUsToFrames(minimumSilenceDurationUs) * bytesPerFrame;
if (maybeSilenceBuffer.length != maybeSilenceBufferSize) { if (maybeSilenceBuffer.length != maybeSilenceBufferSize) {
maybeSilenceBuffer = new byte[maybeSilenceBufferSize]; maybeSilenceBuffer = new byte[maybeSilenceBufferSize];
} }
paddingSize = durationUsToFrames(PADDING_SILENCE_US) * bytesPerFrame; paddingSize = durationUsToFrames(paddingSilenceUs) * bytesPerFrame;
if (paddingBuffer.length != paddingSize) { if (paddingBuffer.length != paddingSize) {
paddingBuffer = new byte[paddingSize]; paddingBuffer = new byte[paddingSize];
} }
...@@ -325,9 +345,10 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor { ...@@ -325,9 +345,10 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor {
* classified as a noisy frame, or the limit of the buffer if no such frame exists. * classified as a noisy frame, or the limit of the buffer if no such frame exists.
*/ */
private int findNoisePosition(ByteBuffer buffer) { private int findNoisePosition(ByteBuffer buffer) {
Assertions.checkArgument(buffer.order() == ByteOrder.LITTLE_ENDIAN);
// The input is in ByteOrder.nativeOrder(), which is little endian on Android. // The input is in ByteOrder.nativeOrder(), which is little endian on Android.
for (int i = buffer.position() + 1; i < buffer.limit(); i += 2) { for (int i = buffer.position(); i < buffer.limit(); i += 2) {
if (Math.abs(buffer.get(i)) > SILENCE_THRESHOLD_LEVEL_MSB) { if (Math.abs(buffer.getShort(i)) > silenceThresholdLevel) {
// Round to the start of the frame. // Round to the start of the frame.
return bytesPerFrame * (i / bytesPerFrame); return bytesPerFrame * (i / bytesPerFrame);
} }
...@@ -340,9 +361,10 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor { ...@@ -340,9 +361,10 @@ public final class SilenceSkippingAudioProcessor extends BaseAudioProcessor {
* from the byte position to the limit are classified as silent. * from the byte position to the limit are classified as silent.
*/ */
private int findNoiseLimit(ByteBuffer buffer) { private int findNoiseLimit(ByteBuffer buffer) {
Assertions.checkArgument(buffer.order() == ByteOrder.LITTLE_ENDIAN);
// The input is in ByteOrder.nativeOrder(), which is little endian on Android. // The input is in ByteOrder.nativeOrder(), which is little endian on Android.
for (int i = buffer.limit() - 1; i >= buffer.position(); i -= 2) { for (int i = buffer.limit() - 2; i >= buffer.position(); i -= 2) {
if (Math.abs(buffer.get(i)) > SILENCE_THRESHOLD_LEVEL_MSB) { if (Math.abs(buffer.getShort(i)) > silenceThresholdLevel) {
// Return the start of the next frame. // Return the start of the next frame.
return bytesPerFrame * (i / bytesPerFrame) + bytesPerFrame; return bytesPerFrame * (i / bytesPerFrame) + bytesPerFrame;
} }
......
...@@ -80,6 +80,11 @@ public final class TeeAudioProcessor extends BaseAudioProcessor { ...@@ -80,6 +80,11 @@ public final class TeeAudioProcessor extends BaseAudioProcessor {
} }
@Override @Override
protected void onFlush() {
flushSinkIfActive();
}
@Override
protected void onQueueEndOfStream() { protected void onQueueEndOfStream() {
flushSinkIfActive(); flushSinkIfActive();
} }
...@@ -201,7 +206,7 @@ public final class TeeAudioProcessor extends BaseAudioProcessor { ...@@ -201,7 +206,7 @@ public final class TeeAudioProcessor extends BaseAudioProcessor {
} }
private void reset() throws IOException { private void reset() throws IOException {
RandomAccessFile randomAccessFile = this.randomAccessFile; @Nullable RandomAccessFile randomAccessFile = this.randomAccessFile;
if (randomAccessFile == null) { if (randomAccessFile == null) {
return; return;
} }
......
...@@ -155,18 +155,20 @@ import java.nio.ByteBuffer; ...@@ -155,18 +155,20 @@ import java.nio.ByteBuffer;
@Override @Override
protected void onFlush() { protected void onFlush() {
if (reconfigurationPending) { if (reconfigurationPending) {
// This is the initial flush after reconfiguration. Prepare to trim bytes from the start/end. // Flushing activates the new configuration, so prepare to trim bytes from the start/end.
reconfigurationPending = false; reconfigurationPending = false;
endBuffer = new byte[trimEndFrames * inputAudioFormat.bytesPerFrame]; endBuffer = new byte[trimEndFrames * inputAudioFormat.bytesPerFrame];
pendingTrimStartBytes = trimStartFrames * inputAudioFormat.bytesPerFrame; pendingTrimStartBytes = trimStartFrames * inputAudioFormat.bytesPerFrame;
} else {
// This is a flush during playback (after the initial flush). We assume this was caused by a
// seek to a non-zero position and clear pending start bytes. This assumption may be wrong (we
// may be seeking to zero), but playing data that should have been trimmed shouldn't be
// noticeable after a seek. Ideally we would check the timestamp of the first input buffer
// queued after flushing to decide whether to trim (see also [Internal: b/77292509]).
pendingTrimStartBytes = 0;
} }
// TODO(internal b/77292509): Flushing occurs to activate a configuration (handled above) but
// also when seeking within a stream. This implementation currently doesn't handle seek to start
// (where we need to trim at the start again), nor seeks to non-zero positions before start
// trimming has occurred (where we should set pendingTrimStartBytes to zero). These cases can be
// fixed by trimming in queueInput based on timestamp, once that information is available.
// Any data in the end buffer should no longer be output if we are playing from a different
// position, so discard it and refill the buffer using new input.
endBufferSize = 0; endBufferSize = 0;
} }
......
...@@ -60,7 +60,7 @@ import com.google.android.exoplayer2.util.Util; ...@@ -60,7 +60,7 @@ import com.google.android.exoplayer2.util.Util;
return new XingSeeker(position, mpegAudioHeader.frameSize, durationUs); return new XingSeeker(position, mpegAudioHeader.frameSize, durationUs);
} }
long dataSize = frame.readUnsignedIntToInt(); long dataSize = frame.readUnsignedInt();
long[] tableOfContents = new long[100]; long[] tableOfContents = new long[100];
for (int i = 0; i < 100; i++) { for (int i = 0; i < 100; i++) {
tableOfContents[i] = frame.readUnsignedByte(); tableOfContents[i] = frame.readUnsignedByte();
......
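Switching from `readUnsignedIntToInt` to `readUnsignedInt` is what fixes XING parsing for files larger than 2 GB: an unsigned 32-bit size must land in a `long`, since values above `Integer.MAX_VALUE` overflow a signed `int`. A minimal sketch of the read (helper name is illustrative):

```java
// Reads a big-endian unsigned 32-bit integer into a long, so sizes in the
// 2 GiB – 4 GiB range survive without overflowing a signed int.
final class UnsignedRead {
  static long readUnsignedInt(byte[] data, int offset) {
    return ((data[offset] & 0xFFL) << 24)
        | ((data[offset + 1] & 0xFFL) << 16)
        | ((data[offset + 2] & 0xFFL) << 8)
        | (data[offset + 3] & 0xFFL);
  }
}
```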
...@@ -664,9 +664,9 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -664,9 +664,9 @@ public class FragmentedMp4Extractor implements Extractor {
private static Pair<Integer, DefaultSampleValues> parseTrex(ParsableByteArray trex) { private static Pair<Integer, DefaultSampleValues> parseTrex(ParsableByteArray trex) {
trex.setPosition(Atom.FULL_HEADER_SIZE); trex.setPosition(Atom.FULL_HEADER_SIZE);
int trackId = trex.readInt(); int trackId = trex.readInt();
int defaultSampleDescriptionIndex = trex.readUnsignedIntToInt() - 1; int defaultSampleDescriptionIndex = trex.readInt() - 1;
int defaultSampleDuration = trex.readUnsignedIntToInt(); int defaultSampleDuration = trex.readInt();
int defaultSampleSize = trex.readUnsignedIntToInt(); int defaultSampleSize = trex.readInt();
int defaultSampleFlags = trex.readInt(); int defaultSampleFlags = trex.readInt();
return Pair.create(trackId, new DefaultSampleValues(defaultSampleDescriptionIndex, return Pair.create(trackId, new DefaultSampleValues(defaultSampleDescriptionIndex,
...@@ -751,8 +751,9 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -751,8 +751,9 @@ public class FragmentedMp4Extractor implements Extractor {
} }
} }
private static void parseTruns(ContainerAtom traf, TrackBundle trackBundle, long decodeTime, private static void parseTruns(
@Flags int flags) { ContainerAtom traf, TrackBundle trackBundle, long decodeTime, @Flags int flags)
throws ParserException {
int trunCount = 0; int trunCount = 0;
int totalSampleCount = 0; int totalSampleCount = 0;
List<LeafAtom> leafChildren = traf.leafChildren; List<LeafAtom> leafChildren = traf.leafChildren;
...@@ -871,13 +872,20 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -871,13 +872,20 @@ public class FragmentedMp4Extractor implements Extractor {
DefaultSampleValues defaultSampleValues = trackBundle.defaultSampleValues; DefaultSampleValues defaultSampleValues = trackBundle.defaultSampleValues;
int defaultSampleDescriptionIndex = int defaultSampleDescriptionIndex =
((atomFlags & 0x02 /* default_sample_description_index_present */) != 0) ((atomFlags & 0x02 /* default_sample_description_index_present */) != 0)
? tfhd.readUnsignedIntToInt() - 1 : defaultSampleValues.sampleDescriptionIndex; ? tfhd.readInt() - 1
int defaultSampleDuration = ((atomFlags & 0x08 /* default_sample_duration_present */) != 0) : defaultSampleValues.sampleDescriptionIndex;
? tfhd.readUnsignedIntToInt() : defaultSampleValues.duration; int defaultSampleDuration =
int defaultSampleSize = ((atomFlags & 0x10 /* default_sample_size_present */) != 0) ((atomFlags & 0x08 /* default_sample_duration_present */) != 0)
? tfhd.readUnsignedIntToInt() : defaultSampleValues.size; ? tfhd.readInt()
int defaultSampleFlags = ((atomFlags & 0x20 /* default_sample_flags_present */) != 0) : defaultSampleValues.duration;
? tfhd.readUnsignedIntToInt() : defaultSampleValues.flags; int defaultSampleSize =
((atomFlags & 0x10 /* default_sample_size_present */) != 0)
? tfhd.readInt()
: defaultSampleValues.size;
int defaultSampleFlags =
((atomFlags & 0x20 /* default_sample_flags_present */) != 0)
? tfhd.readInt()
: defaultSampleValues.flags;
trackBundle.fragment.header = new DefaultSampleValues(defaultSampleDescriptionIndex, trackBundle.fragment.header = new DefaultSampleValues(defaultSampleDescriptionIndex,
defaultSampleDuration, defaultSampleSize, defaultSampleFlags); defaultSampleDuration, defaultSampleSize, defaultSampleFlags);
return trackBundle; return trackBundle;
...@@ -910,16 +918,22 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -910,16 +918,22 @@ public class FragmentedMp4Extractor implements Extractor {
/** /**
* Parses a trun atom (defined in 14496-12). * Parses a trun atom (defined in 14496-12).
* *
* @param trackBundle The {@link TrackBundle} that contains the {@link TrackFragment} into * @param trackBundle The {@link TrackBundle} that contains the {@link TrackFragment} into which
* which parsed data should be placed. * parsed data should be placed.
* @param index Index of the track run in the fragment. * @param index Index of the track run in the fragment.
* @param decodeTime The decode time of the first sample in the fragment run. * @param decodeTime The decode time of the first sample in the fragment run.
* @param flags Flags to allow any required workaround to be executed. * @param flags Flags to allow any required workaround to be executed.
* @param trun The trun atom to decode. * @param trun The trun atom to decode.
* @return The starting position of samples for the next run. * @return The starting position of samples for the next run.
*/ */
private static int parseTrun(TrackBundle trackBundle, int index, long decodeTime, private static int parseTrun(
@Flags int flags, ParsableByteArray trun, int trackRunStart) { TrackBundle trackBundle,
int index,
long decodeTime,
@Flags int flags,
ParsableByteArray trun,
int trackRunStart)
throws ParserException {
trun.setPosition(Atom.HEADER_SIZE); trun.setPosition(Atom.HEADER_SIZE);
int fullAtom = trun.readInt(); int fullAtom = trun.readInt();
int atomFlags = Atom.parseFullAtomFlags(fullAtom); int atomFlags = Atom.parseFullAtomFlags(fullAtom);
...@@ -937,7 +951,7 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -937,7 +951,7 @@ public class FragmentedMp4Extractor implements Extractor {
boolean firstSampleFlagsPresent = (atomFlags & 0x04 /* first_sample_flags_present */) != 0; boolean firstSampleFlagsPresent = (atomFlags & 0x04 /* first_sample_flags_present */) != 0;
int firstSampleFlags = defaultSampleValues.flags; int firstSampleFlags = defaultSampleValues.flags;
if (firstSampleFlagsPresent) { if (firstSampleFlagsPresent) {
firstSampleFlags = trun.readUnsignedIntToInt(); firstSampleFlags = trun.readInt();
} }
boolean sampleDurationsPresent = (atomFlags & 0x100 /* sample_duration_present */) != 0; boolean sampleDurationsPresent = (atomFlags & 0x100 /* sample_duration_present */) != 0;
...@@ -948,20 +962,20 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -948,20 +962,20 @@ public class FragmentedMp4Extractor implements Extractor {
// Offset to the entire video timeline. In the presence of B-frames this is usually used to // Offset to the entire video timeline. In the presence of B-frames this is usually used to
// ensure that the first frame's presentation timestamp is zero. // ensure that the first frame's presentation timestamp is zero.
long edtsOffset = 0; long edtsOffsetUs = 0;
// Currently we only support a single edit that moves the entire media timeline (indicated by // Currently we only support a single edit that moves the entire media timeline (indicated by
// duration == 0). Other uses of edit lists are uncommon and unsupported. // duration == 0). Other uses of edit lists are uncommon and unsupported.
if (track.editListDurations != null && track.editListDurations.length == 1 if (track.editListDurations != null && track.editListDurations.length == 1
&& track.editListDurations[0] == 0) { && track.editListDurations[0] == 0) {
edtsOffset = edtsOffsetUs =
Util.scaleLargeTimestamp( Util.scaleLargeTimestamp(
track.editListMediaTimes[0], C.MILLIS_PER_SECOND, track.timescale); track.editListMediaTimes[0], C.MICROS_PER_SECOND, track.timescale);
} }
int[] sampleSizeTable = fragment.sampleSizeTable; int[] sampleSizeTable = fragment.sampleSizeTable;
int[] sampleCompositionTimeOffsetTable = fragment.sampleCompositionTimeOffsetTable; int[] sampleCompositionTimeOffsetUsTable = fragment.sampleCompositionTimeOffsetUsTable;
long[] sampleDecodingTimeTable = fragment.sampleDecodingTimeTable; long[] sampleDecodingTimeUsTable = fragment.sampleDecodingTimeUsTable;
boolean[] sampleIsSyncFrameTable = fragment.sampleIsSyncFrameTable; boolean[] sampleIsSyncFrameTable = fragment.sampleIsSyncFrameTable;
boolean workaroundEveryVideoFrameIsSyncFrame = track.type == C.TRACK_TYPE_VIDEO boolean workaroundEveryVideoFrameIsSyncFrame = track.type == C.TRACK_TYPE_VIDEO
...@@ -972,9 +986,10 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -972,9 +986,10 @@ public class FragmentedMp4Extractor implements Extractor {
long cumulativeTime = index > 0 ? fragment.nextFragmentDecodeTime : decodeTime; long cumulativeTime = index > 0 ? fragment.nextFragmentDecodeTime : decodeTime;
for (int i = trackRunStart; i < trackRunEnd; i++) { for (int i = trackRunStart; i < trackRunEnd; i++) {
// Use trun values if present, otherwise tfhd, otherwise trex. // Use trun values if present, otherwise tfhd, otherwise trex.
int sampleDuration = sampleDurationsPresent ? trun.readUnsignedIntToInt() int sampleDuration =
: defaultSampleValues.duration; checkNonNegative(sampleDurationsPresent ? trun.readInt() : defaultSampleValues.duration);
int sampleSize = sampleSizesPresent ? trun.readUnsignedIntToInt() : defaultSampleValues.size; int sampleSize =
checkNonNegative(sampleSizesPresent ? trun.readInt() : defaultSampleValues.size);
int sampleFlags = (i == 0 && firstSampleFlagsPresent) ? firstSampleFlags int sampleFlags = (i == 0 && firstSampleFlagsPresent) ? firstSampleFlags
: sampleFlagsPresent ? trun.readInt() : defaultSampleValues.flags; : sampleFlagsPresent ? trun.readInt() : defaultSampleValues.flags;
if (sampleCompositionTimeOffsetsPresent) { if (sampleCompositionTimeOffsetsPresent) {
...@@ -984,13 +999,13 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -984,13 +999,13 @@ public class FragmentedMp4Extractor implements Extractor {
// here, because unsigned integers will still be parsed correctly (unless their top bit is // here, because unsigned integers will still be parsed correctly (unless their top bit is
// set, which is never true in practice because sample offsets are always small). // set, which is never true in practice because sample offsets are always small).
int sampleOffset = trun.readInt(); int sampleOffset = trun.readInt();
sampleCompositionTimeOffsetTable[i] = sampleCompositionTimeOffsetUsTable[i] =
(int) ((sampleOffset * C.MILLIS_PER_SECOND) / timescale); (int) ((sampleOffset * C.MICROS_PER_SECOND) / timescale);
} else { } else {
sampleCompositionTimeOffsetTable[i] = 0; sampleCompositionTimeOffsetUsTable[i] = 0;
} }
sampleDecodingTimeTable[i] = sampleDecodingTimeUsTable[i] =
Util.scaleLargeTimestamp(cumulativeTime, C.MILLIS_PER_SECOND, timescale) - edtsOffset; Util.scaleLargeTimestamp(cumulativeTime, C.MICROS_PER_SECOND, timescale) - edtsOffsetUs;
sampleSizeTable[i] = sampleSize; sampleSizeTable[i] = sampleSize;
sampleIsSyncFrameTable[i] = ((sampleFlags >> 16) & 0x1) == 0 sampleIsSyncFrameTable[i] = ((sampleFlags >> 16) & 0x1) == 0
&& (!workaroundEveryVideoFrameIsSyncFrame || i == 0); && (!workaroundEveryVideoFrameIsSyncFrame || i == 0);
...@@ -1000,6 +1015,13 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -1000,6 +1015,13 @@ public class FragmentedMp4Extractor implements Extractor {
return trackRunEnd; return trackRunEnd;
} }
private static int checkNonNegative(int value) throws ParserException {
if (value < 0) {
throw new ParserException("Unexpected negative value: " + value);
}
return value;
}
private static void parseUuid(ParsableByteArray uuid, TrackFragment out, private static void parseUuid(ParsableByteArray uuid, TrackFragment out,
byte[] extendedTypeScratch) throws ParserException { byte[] extendedTypeScratch) throws ParserException {
uuid.setPosition(Atom.HEADER_SIZE); uuid.setPosition(Atom.HEADER_SIZE);
...@@ -1269,7 +1291,7 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -1269,7 +1291,7 @@ public class FragmentedMp4Extractor implements Extractor {
Track track = currentTrackBundle.track; Track track = currentTrackBundle.track;
TrackOutput output = currentTrackBundle.output; TrackOutput output = currentTrackBundle.output;
int sampleIndex = currentTrackBundle.currentSampleIndex; int sampleIndex = currentTrackBundle.currentSampleIndex;
long sampleTimeUs = fragment.getSamplePresentationTime(sampleIndex) * 1000L; long sampleTimeUs = fragment.getSamplePresentationTimeUs(sampleIndex);
if (timestampAdjuster != null) { if (timestampAdjuster != null) {
sampleTimeUs = timestampAdjuster.adjustSampleTimestamp(sampleTimeUs); sampleTimeUs = timestampAdjuster.adjustSampleTimestamp(sampleTimeUs);
} }
...@@ -1513,10 +1535,9 @@ public class FragmentedMp4Extractor implements Extractor { ...@@ -1513,10 +1535,9 @@ public class FragmentedMp4Extractor implements Extractor {
* @param timeUs The seek time, in microseconds. * @param timeUs The seek time, in microseconds.
*/ */
public void seek(long timeUs) { public void seek(long timeUs) {
long timeMs = C.usToMs(timeUs);
int searchIndex = currentSampleIndex; int searchIndex = currentSampleIndex;
while (searchIndex < fragment.sampleCount while (searchIndex < fragment.sampleCount
&& fragment.getSamplePresentationTime(searchIndex) < timeMs) { && fragment.getSamplePresentationTimeUs(searchIndex) < timeUs) {
if (fragment.sampleIsSyncFrameTable[searchIndex]) { if (fragment.sampleIsSyncFrameTable[searchIndex]) {
firstSampleToOutputIndex = searchIndex; firstSampleToOutputIndex = searchIndex;
} }
......
...@@ -28,7 +28,6 @@ import com.google.android.exoplayer2.metadata.id3.InternalFrame; ...@@ -28,7 +28,6 @@ import com.google.android.exoplayer2.metadata.id3.InternalFrame;
import com.google.android.exoplayer2.metadata.id3.TextInformationFrame; import com.google.android.exoplayer2.metadata.id3.TextInformationFrame;
import com.google.android.exoplayer2.util.Log; import com.google.android.exoplayer2.util.Log;
import com.google.android.exoplayer2.util.ParsableByteArray; import com.google.android.exoplayer2.util.ParsableByteArray;
import java.nio.ByteBuffer;
/** Utilities for handling metadata in MP4. */ /** Utilities for handling metadata in MP4. */
/* package */ final class MetadataUtil { /* package */ final class MetadataUtil {
...@@ -282,7 +281,6 @@ import java.nio.ByteBuffer; ...@@ -282,7 +281,6 @@ import java.nio.ByteBuffer;
private static final int TYPE_TOP_BYTE_REPLACEMENT = 0xFD; // Truncated value of \uFFFD. private static final int TYPE_TOP_BYTE_REPLACEMENT = 0xFD; // Truncated value of \uFFFD.
private static final String MDTA_KEY_ANDROID_CAPTURE_FPS = "com.android.capture.fps"; private static final String MDTA_KEY_ANDROID_CAPTURE_FPS = "com.android.capture.fps";
private static final int MDTA_TYPE_INDICATOR_FLOAT = 23;
private MetadataUtil() {} private MetadataUtil() {}
...@@ -312,15 +310,8 @@ import java.nio.ByteBuffer; ...@@ -312,15 +310,8 @@ import java.nio.ByteBuffer;
Metadata.Entry entry = mdtaMetadata.get(i); Metadata.Entry entry = mdtaMetadata.get(i);
if (entry instanceof MdtaMetadataEntry) { if (entry instanceof MdtaMetadataEntry) {
MdtaMetadataEntry mdtaMetadataEntry = (MdtaMetadataEntry) entry; MdtaMetadataEntry mdtaMetadataEntry = (MdtaMetadataEntry) entry;
if (MDTA_KEY_ANDROID_CAPTURE_FPS.equals(mdtaMetadataEntry.key) if (MDTA_KEY_ANDROID_CAPTURE_FPS.equals(mdtaMetadataEntry.key)) {
&& mdtaMetadataEntry.typeIndicator == MDTA_TYPE_INDICATOR_FLOAT) { format = format.copyWithMetadata(new Metadata(mdtaMetadataEntry));
try {
float fps = ByteBuffer.wrap(mdtaMetadataEntry.value).asFloatBuffer().get();
format = format.copyWithFrameRate(fps);
format = format.copyWithMetadata(new Metadata(mdtaMetadataEntry));
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring invalid framerate");
}
} }
} }
} }
......
...@@ -60,14 +60,10 @@ import java.io.IOException; ...@@ -60,14 +60,10 @@ import java.io.IOException;
* The size of each sample in the fragment. * The size of each sample in the fragment.
*/ */
public int[] sampleSizeTable; public int[] sampleSizeTable;
/** /** The composition time offset of each sample in the fragment, in microseconds. */
* The composition time offset of each sample in the fragment. public int[] sampleCompositionTimeOffsetUsTable;
*/ /** The decoding time of each sample in the fragment, in microseconds. */
public int[] sampleCompositionTimeOffsetTable; public long[] sampleDecodingTimeUsTable;
/**
* The decoding time of each sample in the fragment.
*/
public long[] sampleDecodingTimeTable;
/** /**
* Indicates which samples are sync frames. * Indicates which samples are sync frames.
*/ */
...@@ -139,8 +135,8 @@ import java.io.IOException; ...@@ -139,8 +135,8 @@ import java.io.IOException;
// likely. The choice of 25% is relatively arbitrary. // likely. The choice of 25% is relatively arbitrary.
int tableSize = (sampleCount * 125) / 100; int tableSize = (sampleCount * 125) / 100;
sampleSizeTable = new int[tableSize]; sampleSizeTable = new int[tableSize];
sampleCompositionTimeOffsetTable = new int[tableSize]; sampleCompositionTimeOffsetUsTable = new int[tableSize];
sampleDecodingTimeTable = new long[tableSize]; sampleDecodingTimeUsTable = new long[tableSize];
sampleIsSyncFrameTable = new boolean[tableSize]; sampleIsSyncFrameTable = new boolean[tableSize];
sampleHasSubsampleEncryptionTable = new boolean[tableSize]; sampleHasSubsampleEncryptionTable = new boolean[tableSize];
} }
...@@ -186,8 +182,14 @@ import java.io.IOException; ...@@ -186,8 +182,14 @@ import java.io.IOException;
sampleEncryptionDataNeedsFill = false; sampleEncryptionDataNeedsFill = false;
} }
public long getSamplePresentationTime(int index) { /**
return sampleDecodingTimeTable[index] + sampleCompositionTimeOffsetTable[index]; * Returns the sample presentation timestamp in microseconds.
*
* @param index The sample index.
* @return The presentation timestamp of this sample in microseconds.
*/
public long getSamplePresentationTimeUs(int index) {
return sampleDecodingTimeUsTable[index] + sampleCompositionTimeOffsetUsTable[index];
} }
/** Returns whether the sample at the given index has a subsample encryption table. */ /** Returns whether the sample at the given index has a subsample encryption table. */
......
...@@ -41,6 +41,7 @@ public final class H265Reader implements ElementaryStreamReader { ...@@ -41,6 +41,7 @@ public final class H265Reader implements ElementaryStreamReader {
private static final int VPS_NUT = 32; private static final int VPS_NUT = 32;
private static final int SPS_NUT = 33; private static final int SPS_NUT = 33;
private static final int PPS_NUT = 34; private static final int PPS_NUT = 34;
private static final int AUD_NUT = 35;
private static final int PREFIX_SEI_NUT = 39;
private static final int SUFFIX_SEI_NUT = 40;
...@@ -59,7 +60,7 @@ public final class H265Reader implements ElementaryStreamReader {
private final NalUnitTargetBuffer sps;
private final NalUnitTargetBuffer pps;
private final NalUnitTargetBuffer prefixSei;
private final NalUnitTargetBuffer suffixSei;
private long totalBytesWritten;
// Per packet state that gets reset at the start of each packet.
...@@ -161,9 +162,8 @@ public final class H265Reader implements ElementaryStreamReader {
}
private void startNalUnit(long position, int offset, int nalUnitType, long pesTimeUs) {
sampleReader.startNalUnit(position, offset, nalUnitType, pesTimeUs, hasOutputFormat);
if (!hasOutputFormat) {
vps.startNalUnit(nalUnitType);
sps.startNalUnit(nalUnitType);
pps.startNalUnit(nalUnitType);
...@@ -173,9 +173,8 @@ public final class H265Reader implements ElementaryStreamReader {
}
private void nalUnitData(byte[] dataArray, int offset, int limit) {
sampleReader.readNalUnitData(dataArray, offset, limit);
if (!hasOutputFormat) {
vps.appendToNalUnit(dataArray, offset, limit);
sps.appendToNalUnit(dataArray, offset, limit);
pps.appendToNalUnit(dataArray, offset, limit);
...@@ -185,9 +184,8 @@ public final class H265Reader implements ElementaryStreamReader {
}
private void endNalUnit(long position, int offset, int discardPadding, long pesTimeUs) {
sampleReader.endNalUnit(position, offset, hasOutputFormat);
if (!hasOutputFormat) {
vps.endNalUnit(discardPadding);
sps.endNalUnit(discardPadding);
pps.endNalUnit(discardPadding);
...@@ -400,17 +398,17 @@ public final class H265Reader implements ElementaryStreamReader {
private final TrackOutput output;
// Per NAL unit state. A sample consists of one or more NAL units.
private long nalUnitPosition;
private boolean nalUnitHasKeyframeData;
private int nalUnitBytesRead;
private long nalUnitTimeUs;
private boolean lookingForFirstSliceFlag;
private boolean isFirstSlice;
private boolean isFirstPrefixNalUnit;
// Per sample state that gets reset at the start of each sample.
private boolean readingSample;
private boolean readingPrefix;
private long samplePosition;
private long sampleTimeUs;
private boolean sampleIsKeyframe;
...@@ -422,32 +420,33 @@ public final class H265Reader implements ElementaryStreamReader {
public void reset() {
lookingForFirstSliceFlag = false;
isFirstSlice = false;
isFirstPrefixNalUnit = false;
readingSample = false;
readingPrefix = false;
}
public void startNalUnit(
long position, int offset, int nalUnitType, long pesTimeUs, boolean hasOutputFormat) {
isFirstSlice = false;
isFirstPrefixNalUnit = false;
nalUnitTimeUs = pesTimeUs;
nalUnitBytesRead = 0;
nalUnitPosition = position;
if (!isVclBodyNalUnit(nalUnitType)) {
// This NAL unit is not part of a sample's VCL body, so any in-progress sample is complete.
if (readingSample && !readingPrefix) {
if (hasOutputFormat) {
outputSample(offset);
}
readingSample = false;
}
if (isPrefixNalUnit(nalUnitType)) {
isFirstPrefixNalUnit = !readingPrefix;
readingPrefix = true;
}
}
// Look for the first slice flag if this NAL unit contains a slice_segment_layer_rbsp.
nalUnitHasKeyframeData = (nalUnitType >= BLA_W_LP && nalUnitType <= CRA_NUT);
lookingForFirstSliceFlag = nalUnitHasKeyframeData || nalUnitType <= RASL_R;
}
...@@ -464,31 +463,39 @@ public final class H265Reader implements ElementaryStreamReader {
}
}
public void endNalUnit(long position, int offset, boolean hasOutputFormat) {
if (readingPrefix && isFirstSlice) {
// This sample has parameter sets. Reset the key-frame flag based on the first slice.
sampleIsKeyframe = nalUnitHasKeyframeData;
readingPrefix = false;
} else if (isFirstPrefixNalUnit || isFirstSlice) {
// This NAL unit is at the start of a new sample (access unit).
if (hasOutputFormat && readingSample) {
// Output the sample ending before this NAL unit.
int nalUnitLength = (int) (position - nalUnitPosition);
outputSample(offset + nalUnitLength);
}
samplePosition = nalUnitPosition;
sampleTimeUs = nalUnitTimeUs;
sampleIsKeyframe = nalUnitHasKeyframeData;
readingSample = true;
}
}
private void outputSample(int offset) {
@C.BufferFlags int flags = sampleIsKeyframe ? C.BUFFER_FLAG_KEY_FRAME : 0;
int size = (int) (nalUnitPosition - samplePosition);
output.sampleMetadata(sampleTimeUs, flags, size, offset, null);
}
}
/** Returns whether a NAL unit type is one that occurs before any VCL NAL units in a sample. */
private static boolean isPrefixNalUnit(int nalUnitType) {
return (VPS_NUT <= nalUnitType && nalUnitType <= AUD_NUT) || nalUnitType == PREFIX_SEI_NUT;
}
/** Returns whether a NAL unit type is one that occurs in the VCL body of a sample. */
private static boolean isVclBodyNalUnit(int nalUnitType) {
return nalUnitType < VPS_NUT || nalUnitType == SUFFIX_SEI_NUT;
}
}
}
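The classification the sample reader now relies on can be demonstrated standalone. In the sketch below (a hypothetical class, with NAL unit type values taken from H.265 Table 7-1), types 32–35 (VPS, SPS, PPS, AUD) and prefix SEI may only appear before an access unit's VCL NAL units, while types below 32 and suffix SEI belong to the body of the current sample:

```java
// Hypothetical helper mirroring the prefix/VCL-body split used by the sample reader.
// Constants are H.265 nal_unit_type values (ITU-T H.265, Table 7-1).
final class HevcNalTypes {
  static final int VPS_NUT = 32;        // Video parameter set.
  static final int AUD_NUT = 35;        // Access unit delimiter.
  static final int PREFIX_SEI_NUT = 39; // SEI that precedes the VCL NAL units.
  static final int SUFFIX_SEI_NUT = 40; // SEI that follows the VCL NAL units.

  /** Returns whether the type occurs before any VCL NAL units in a sample. */
  static boolean isPrefixNalUnit(int nalUnitType) {
    return (VPS_NUT <= nalUnitType && nalUnitType <= AUD_NUT) || nalUnitType == PREFIX_SEI_NUT;
  }

  /** Returns whether the type occurs in the VCL body of a sample. */
  static boolean isVclBodyNalUnit(int nalUnitType) {
    return nalUnitType < VPS_NUT || nalUnitType == SUFFIX_SEI_NUT;
  }
}
```

A prefix NAL unit signals that the current sample is complete, which is what lets the reader emit sample boundaries without waiting for the next slice header.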
...@@ -460,10 +460,15 @@ public final class TsExtractor implements Extractor {
// See ISO/IEC 13818-1, section 2.4.4.4 for more information on table id assignment.
return;
}
// section_syntax_indicator(1), '0'(1), reserved(2), section_length(4)
int secondHeaderByte = sectionData.readUnsignedByte();
if ((secondHeaderByte & 0x80) == 0) {
// section_syntax_indicator must be 1. See ISO/IEC 13818-1, section 2.4.4.5.
return;
}
// section_length(8), transport_stream_id (16), reserved (2), version_number (5),
// current_next_indicator (1), section_number (8), last_section_number (8)
sectionData.skipBytes(6);
int programCount = sectionData.bytesLeft() / 4;
for (int i = 0; i < programCount; i++) {
...@@ -535,8 +540,14 @@ public final class TsExtractor implements Extractor {
timestampAdjusters.add(timestampAdjuster);
}
// section_syntax_indicator(1), '0'(1), reserved(2), section_length(4)
int secondHeaderByte = sectionData.readUnsignedByte();
if ((secondHeaderByte & 0x80) == 0) {
// section_syntax_indicator must be 1. See ISO/IEC 13818-1, section 2.4.4.9.
return;
}
// section_length(8)
sectionData.skipBytes(1);
int programNumber = sectionData.readUnsignedShort();
// Skip 3 bytes (24 bits), including:
...
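The byte being inspected in both the PAT and PMT changes is the one immediately after `table_id`: it packs `section_syntax_indicator` (1 bit), a fixed `'0'` bit, 2 reserved bits, and the top 4 bits of the 12-bit `section_length`. A minimal sketch of that bit layout (hypothetical helper, not the `TsExtractor` API):

```java
// Hypothetical sketch of the PSI section header bit layout (ISO/IEC 13818-1, 2.4.4.5).
final class TsSectionHeader {
  /** Returns whether section_syntax_indicator (MSB of the byte after table_id) is set. */
  static boolean hasSectionSyntax(int secondHeaderByte) {
    return (secondHeaderByte & 0x80) != 0;
  }

  /** Combines the low 4 bits of the second byte with the third byte into section_length. */
  static int sectionLength(int secondHeaderByte, int thirdHeaderByte) {
    return ((secondHeaderByte & 0x0F) << 8) | thirdHeaderByte;
  }
}
```

Rejecting sections whose syntax indicator is 0 (as the new code does) avoids misinterpreting the bytes that follow, since the long-form fields (`transport_stream_id`, `version_number`, etc.) are only present when the indicator is 1.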
...@@ -573,7 +573,9 @@ public final class MediaCodecInfo {
width = alignedSize.x;
height = alignedSize.y;
// VideoCapabilities.areSizeAndRateSupported incorrectly returns false if frameRate < 1 on some
// versions of Android, so we only check the size in this case [Internal ref: b/153940404].
if (frameRate == Format.NO_VALUE || frameRate < 1) {
return capabilities.isSizeSupported(width, height);
} else {
// The signaled frame rate may be slightly higher than the actual frame rate, so we take the
...
...@@ -1022,7 +1022,7 @@ public abstract class DownloadService extends Service {
try {
Intent intent = getIntent(context, serviceClass, DownloadService.ACTION_INIT);
context.startService(intent);
} catch (IllegalStateException e) {
// The process is classed as idle by the platform. Starting a background service is not
// allowed in this state.
Log.w(TAG, "Failed to restart DownloadService (process is idle).");
...
...@@ -129,8 +129,9 @@ public final class Requirements implements Parcelable {
}
ConnectivityManager connectivityManager =
(ConnectivityManager)
Assertions.checkNotNull(context.getSystemService(Context.CONNECTIVITY_SERVICE));
@Nullable NetworkInfo networkInfo = connectivityManager.getActiveNetworkInfo();
if (networkInfo == null
|| !networkInfo.isConnected()
|| !isInternetConnectivityValidated(connectivityManager)) {
...@@ -156,23 +157,27 @@ public final class Requirements implements Parcelable {
}
private boolean isDeviceIdle(Context context) {
PowerManager powerManager =
(PowerManager) Assertions.checkNotNull(context.getSystemService(Context.POWER_SERVICE));
return Util.SDK_INT >= 23
? powerManager.isDeviceIdleMode()
: Util.SDK_INT >= 20 ? !powerManager.isInteractive() : !powerManager.isScreenOn();
}
private static boolean isInternetConnectivityValidated(ConnectivityManager connectivityManager) {
// It's possible to check NetworkCapabilities.NET_CAPABILITY_VALIDATED from API level 23, but
// RequirementsWatcher only fires an event to re-check the requirements when NetworkCapabilities
// change from API level 24. To stay in sync with the watcher, we assume that network
// capabilities are validated on API level 23.
if (Util.SDK_INT < 24) {
return true;
}
@Nullable Network activeNetwork = connectivityManager.getActiveNetwork();
if (activeNetwork == null) {
return false;
}
@Nullable
NetworkCapabilities networkCapabilities =
connectivityManager.getNetworkCapabilities(activeNetwork);
return networkCapabilities != null
...
...@@ -150,6 +150,23 @@ public final class RequirementsWatcher {
}
}
/**
* Re-checks the requirements if there are network requirements that are currently not met.
*
* <p>When we receive an event that implies newly established network connectivity, we re-check
* the requirements by calling {@link #checkRequirements()}. This check sometimes sees that there
* is still no active network, meaning that any network requirements will remain not met. By
* calling this method when we receive other events that imply continued network connectivity, we
* can detect that the requirements are met once an active network does exist.
*/
private void recheckNotMetNetworkRequirements() {
if ((notMetRequirements & (Requirements.NETWORK | Requirements.NETWORK_UNMETERED)) == 0) {
// No unmet network requirements to recheck.
return;
}
checkRequirements();
}
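The gate at the top of `recheckNotMetNetworkRequirements` is a plain bitmask test: re-check only while a network-related requirement bit is currently unmet. A pure-logic sketch (flag values are illustrative stand-ins; the real constants live in `Requirements`):

```java
// Hypothetical mirror of the network-requirement gate. Flag values are illustrative.
final class NetworkRequirementGate {
  static final int NETWORK = 1;
  static final int NETWORK_UNMETERED = 1 << 1;
  static final int DEVICE_IDLE = 1 << 2;

  /** Returns whether any currently unmet requirement is network-related. */
  static boolean shouldRecheck(int notMetRequirements) {
    return (notMetRequirements & (NETWORK | NETWORK_UNMETERED)) != 0;
  }
}
```

This keeps the extra `checkRequirements()` calls triggered by "network still alive" events (like `onBlockedStatusChanged`) from running when only non-network requirements, such as device idle, are unmet.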
private class DeviceStatusChangeReceiver extends BroadcastReceiver {
@Override
public void onReceive(Context context, Intent intent) {
...@@ -161,17 +178,25 @@ public final class RequirementsWatcher {
@RequiresApi(24)
private final class NetworkCallback extends ConnectivityManager.NetworkCallback {
private boolean receivedCapabilitiesChange;
private boolean networkValidated;
@Override
public void onAvailable(Network network) {
postCheckRequirements();
}
@Override
public void onLost(Network network) {
postCheckRequirements();
}
@Override
public void onBlockedStatusChanged(Network network, boolean blocked) {
if (!blocked) {
postRecheckNotMetNetworkRequirements();
}
}
@Override
...@@ -181,11 +206,13 @@ public final class RequirementsWatcher {
if (!receivedCapabilitiesChange || this.networkValidated != networkValidated) {
receivedCapabilitiesChange = true;
this.networkValidated = networkValidated;
postCheckRequirements();
} else if (networkValidated) {
postRecheckNotMetNetworkRequirements();
}
}
private void postCheckRequirements() {
handler.post(
() -> {
if (networkCallback != null) {
...@@ -193,5 +220,14 @@ public final class RequirementsWatcher {
}
});
}
private void postRecheckNotMetNetworkRequirements() {
handler.post(
() -> {
if (networkCallback != null) {
recheckNotMetNetworkRequirements();
}
});
}
}
}
...@@ -293,7 +293,7 @@ public final class MaskingMediaSource extends CompositeMediaSource<Void> {
}
/** Dummy placeholder timeline with one dynamic window with a period of indeterminate duration. */
public static final class DummyTimeline extends Timeline {
@Nullable private final Object tag;
...
...@@ -679,7 +679,8 @@ import org.checkerframework.checker.nullness.compatqual.NullableType;
return sampleQueues[i];
}
}
SampleQueue trackOutput =
new SampleQueue(allocator, /* playbackLooper= */ handler.getLooper(), drmSessionManager);
trackOutput.setUpstreamFormatChangeListener(this);
@NullableType
TrackId[] sampleQueueTrackIds = Arrays.copyOf(this.sampleQueueTrackIds, trackCount + 1);
...@@ -729,6 +730,11 @@ import org.checkerframework.checker.nullness.compatqual.NullableType;
trackFormat = trackFormat.copyWithBitrate(icyHeaders.bitrate);
}
}
if (trackFormat.drmInitData != null) {
trackFormat =
trackFormat.copyWithExoMediaCryptoType(
drmSessionManager.getExoMediaCryptoType(trackFormat.drmInitData));
}
trackArray[i] = new TrackGroup(trackFormat);
}
isLive = length == C.LENGTH_UNSET && seekMap.getDurationUs() == C.TIME_UNSET;
...
...@@ -174,7 +174,10 @@ public final class ProgressiveMediaSource extends BaseMediaSource
@Override
public Factory setDrmSessionManager(DrmSessionManager<?> drmSessionManager) {
Assertions.checkState(!isCreateCalled);
this.drmSessionManager =
drmSessionManager != null
? drmSessionManager
: DrmSessionManager.getDummyDrmSessionManager();
return this;
}
...
...@@ -55,6 +55,7 @@ public class SampleQueue implements TrackOutput {
private final SampleExtrasHolder extrasHolder;
private final DrmSessionManager<?> drmSessionManager;
private UpstreamFormatChangedListener upstreamFormatChangeListener;
private final Looper playbackLooper;
@Nullable private Format downstreamFormat;
@Nullable private DrmSession<?> currentDrmSession;
...@@ -91,11 +92,13 @@ public class SampleQueue implements TrackOutput {
* Creates a sample queue.
*
* @param allocator An {@link Allocator} from which allocations for sample data can be obtained.
* @param playbackLooper The looper associated with the media playback thread.
* @param drmSessionManager The {@link DrmSessionManager} to obtain {@link DrmSession DrmSessions}
* from. The created instance does not take ownership of this {@link DrmSessionManager}.
*/
public SampleQueue(
Allocator allocator, Looper playbackLooper, DrmSessionManager<?> drmSessionManager) {
sampleDataQueue = new SampleDataQueue(allocator);
this.playbackLooper = playbackLooper;
this.drmSessionManager = drmSessionManager;
extrasHolder = new SampleExtrasHolder();
capacity = SAMPLE_CAPACITY_INCREMENT;
...@@ -789,8 +792,7 @@ public class SampleQueue implements TrackOutput {
}
// Ensure we acquire the new session before releasing the previous one in case the same session
// is being used for both DrmInitData.
@Nullable DrmSession<?> previousSession = currentDrmSession;
currentDrmSession =
newDrmInitData != null
? drmSessionManager.acquireSession(playbackLooper, newDrmInitData)
...
...@@ -33,6 +33,42 @@ import org.checkerframework.checker.nullness.compatqual.NullableType;
/** Media source with a single period consisting of silent raw audio of a given duration. */
public final class SilenceMediaSource extends BaseMediaSource {
/** Factory for {@link SilenceMediaSource SilenceMediaSources}. */
public static final class Factory {
private long durationUs;
@Nullable private Object tag;
/**
* Sets the duration of the silent audio.
*
* @param durationUs The duration of silent audio to output, in microseconds.
* @return This factory, for convenience.
*/
public Factory setDurationUs(long durationUs) {
this.durationUs = durationUs;
return this;
}
/**
* Sets a tag for the media source which will be published in the {@link
* com.google.android.exoplayer2.Timeline} of the source as {@link
* com.google.android.exoplayer2.Timeline.Window#tag}.
*
* @param tag A tag for the media source.
* @return This factory, for convenience.
*/
public Factory setTag(@Nullable Object tag) {
this.tag = tag;
return this;
}
/** Creates a new {@link SilenceMediaSource}. */
public SilenceMediaSource createMediaSource() {
return new SilenceMediaSource(durationUs, tag);
}
}
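The new `Factory` follows the standard builder pattern used throughout ExoPlayer's media sources. Since constructing the real class needs the full library, the sketch below uses a hypothetical `SilentSource` stand-in to show the shape of the pattern, including the non-negative duration check the private constructor performs:

```java
// Hypothetical stand-in for SilenceMediaSource and its new Factory.
final class SilentSource {
  final long durationUs;
  final Object tag;

  private SilentSource(long durationUs, Object tag) {
    if (durationUs < 0) {
      throw new IllegalArgumentException("durationUs must be >= 0");
    }
    this.durationUs = durationUs;
    this.tag = tag;
  }

  static final class Factory {
    private long durationUs;
    private Object tag;

    Factory setDurationUs(long durationUs) {
      this.durationUs = durationUs;
      return this;
    }

    Factory setTag(Object tag) {
      this.tag = tag;
      return this;
    }

    SilentSource createMediaSource() {
      return new SilentSource(durationUs, tag);
    }
  }
}
```

Routing the tag through a factory rather than a constructor overload keeps the public surface small as optional parameters accumulate, which is why the direct `SilenceMediaSource(long, Object)` constructor is private.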
private static final int SAMPLE_RATE_HZ = 44100;
@C.PcmEncoding private static final int ENCODING = C.ENCODING_PCM_16BIT;
private static final int CHANNEL_COUNT = 2;
...@@ -54,6 +90,7 @@ public final class SilenceMediaSource extends BaseMediaSource {
new byte[Util.getPcmFrameSize(ENCODING, CHANNEL_COUNT) * 1024];
private final long durationUs;
@Nullable private final Object tag;
/**
* Creates a new media source providing silent audio of the given duration.
...@@ -61,15 +98,25 @@ public final class SilenceMediaSource extends BaseMediaSource {
* @param durationUs The duration of silent audio to output, in microseconds.
*/
public SilenceMediaSource(long durationUs) {
this(durationUs, /* tag= */ null);
}
private SilenceMediaSource(long durationUs, @Nullable Object tag) {
Assertions.checkArgument(durationUs >= 0);
this.durationUs = durationUs;
this.tag = tag;
}
@Override
protected void prepareSourceInternal(@Nullable TransferListener mediaTransferListener) {
refreshSourceInfo(
new SinglePeriodTimeline(
durationUs,
/* isSeekable= */ true,
/* isDynamic= */ false,
/* isLive= */ false,
/* manifest= */ null,
tag));
}
@Override
...
...@@ -29,8 +29,7 @@ import java.util.Arrays;
import org.checkerframework.checker.nullness.compatqual.NullableType;
/**
* Represents ad group times and information on the state and URIs of ads within each ad group.
*
* <p>Instances are immutable. Call the {@code with*} methods to get new instances that have the
* required changes.
...@@ -272,8 +271,9 @@ public final class AdPlaybackState {
/** The number of ad groups. */
public final int adGroupCount;
/**
* The times of ad groups, in microseconds, relative to the start of the {@link
* com.google.android.exoplayer2.Timeline.Period} they belong to. A final element with the value
* {@link C#TIME_END_OF_SOURCE} indicates a postroll ad.
*/
public final long[] adGroupTimesUs;
/** The ad groups. */
...@@ -286,8 +286,9 @@ public final class AdPlaybackState {
/**
* Creates a new ad playback state with the specified ad group times.
*
* @param adGroupTimesUs The times of ad groups in microseconds, relative to the start of the
* {@link com.google.android.exoplayer2.Timeline.Period} they belong to. A final element with
* the value {@link C#TIME_END_OF_SOURCE} indicates that there is a postroll ad.
*/
public AdPlaybackState(long... adGroupTimesUs) {
int count = adGroupTimesUs.length;
...@@ -315,16 +316,18 @@ public final class AdPlaybackState {
* unplayed. Returns {@link C#INDEX_UNSET} if the ad group at or before {@code positionUs} has no
* ads remaining to be played, or if there is no such ad group.
*
* @param positionUs The period position at or before which to find an ad group, in microseconds,
* or {@link C#TIME_END_OF_SOURCE} for the end of the stream (in which case the index of any
* unplayed postroll ad group will be returned).
* @param periodDurationUs The duration of the containing timeline period, in microseconds, or
* {@link C#TIME_UNSET} if not known.
* @return The index of the ad group, or {@link C#INDEX_UNSET}.
*/
public int getAdGroupIndexForPositionUs(long positionUs, long periodDurationUs) {
// Use a linear search as the array elements may not be increasing due to TIME_END_OF_SOURCE.
// In practice we expect there to be few ad groups so the search shouldn't be expensive.
int index = adGroupTimesUs.length - 1;
while (index >= 0 && isPositionBeforeAdGroup(positionUs, periodDurationUs, index)) {
index--;
}
return index >= 0 && adGroups[index].hasUnplayedAds() ? index : C.INDEX_UNSET;
...@@ -334,11 +337,11 @@ public final class AdPlaybackState { ...@@ -334,11 +337,11 @@ public final class AdPlaybackState {
* Returns the index of the next ad group after {@code positionUs} that has ads remaining to be * Returns the index of the next ad group after {@code positionUs} that has ads remaining to be
* played. Returns {@link C#INDEX_UNSET} if there is no such ad group. * played. Returns {@link C#INDEX_UNSET} if there is no such ad group.
* *
* @param positionUs The position after which to find an ad group, in microseconds, or {@link * @param positionUs The period position after which to find an ad group, in microseconds, or
* C#TIME_END_OF_SOURCE} for the end of the stream (in which case there can be no ad group * {@link C#TIME_END_OF_SOURCE} for the end of the stream (in which case there can be no ad
* after the position). * group after the position).
* @param periodDurationUs The duration of the containing period in microseconds, or {@link * @param periodDurationUs The duration of the containing timeline period, in microseconds, or
* C#TIME_UNSET} if not known. * {@link C#TIME_UNSET} if not known.
* @return The index of the ad group, or {@link C#INDEX_UNSET}. * @return The index of the ad group, or {@link C#INDEX_UNSET}.
*/ */
public int getAdGroupIndexAfterPositionUs(long positionUs, long periodDurationUs) { public int getAdGroupIndexAfterPositionUs(long positionUs, long periodDurationUs) {
...@@ -357,6 +360,18 @@ public final class AdPlaybackState { ...@@ -357,6 +360,18 @@ public final class AdPlaybackState {
return index < adGroupTimesUs.length ? index : C.INDEX_UNSET; return index < adGroupTimesUs.length ? index : C.INDEX_UNSET;
} }
/** Returns whether the specified ad has been marked as in {@link #AD_STATE_ERROR}. */
public boolean isAdInErrorState(int adGroupIndex, int adIndexInAdGroup) {
if (adGroupIndex >= adGroups.length) {
return false;
}
AdGroup adGroup = adGroups[adGroupIndex];
if (adGroup.count == C.LENGTH_UNSET || adIndexInAdGroup >= adGroup.count) {
return false;
}
return adGroup.states[adIndexInAdGroup] == AdPlaybackState.AD_STATE_ERROR;
}
/** /**
* Returns an instance with the number of ads in {@code adGroupIndex} resolved to {@code adCount}. * Returns an instance with the number of ads in {@code adGroupIndex} resolved to {@code adCount}.
* The ad count must be greater than zero. * The ad count must be greater than zero.
...@@ -425,7 +440,10 @@ public final class AdPlaybackState { ...@@ -425,7 +440,10 @@ public final class AdPlaybackState {
return new AdPlaybackState(adGroupTimesUs, adGroups, adResumePositionUs, contentDurationUs); return new AdPlaybackState(adGroupTimesUs, adGroups, adResumePositionUs, contentDurationUs);
} }
/** Returns an instance with the specified ad resume position, in microseconds. */ /**
* Returns an instance with the specified ad resume position, in microseconds, relative to the
* start of the current ad.
*/
@CheckResult @CheckResult
public AdPlaybackState withAdResumePositionUs(long adResumePositionUs) { public AdPlaybackState withAdResumePositionUs(long adResumePositionUs) {
if (this.adResumePositionUs == adResumePositionUs) { if (this.adResumePositionUs == adResumePositionUs) {
...@@ -471,14 +489,15 @@ public final class AdPlaybackState { ...@@ -471,14 +489,15 @@ public final class AdPlaybackState {
return result; return result;
} }
private boolean isPositionBeforeAdGroup(long positionUs, int adGroupIndex) { private boolean isPositionBeforeAdGroup(
long positionUs, long periodDurationUs, int adGroupIndex) {
if (positionUs == C.TIME_END_OF_SOURCE) { if (positionUs == C.TIME_END_OF_SOURCE) {
// The end of the content is at (but not before) any postroll ad, and after any other ads. // The end of the content is at (but not before) any postroll ad, and after any other ads.
return false; return false;
} }
long adGroupPositionUs = adGroupTimesUs[adGroupIndex]; long adGroupPositionUs = adGroupTimesUs[adGroupIndex];
if (adGroupPositionUs == C.TIME_END_OF_SOURCE) { if (adGroupPositionUs == C.TIME_END_OF_SOURCE) {
return contentDurationUs == C.TIME_UNSET || positionUs < contentDurationUs; return periodDurationUs == C.TIME_UNSET || positionUs < periodDurationUs;
} else { } else {
return positionUs < adGroupPositionUs; return positionUs < adGroupPositionUs;
} }
......
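The backward linear search over ad group times can be sketched in isolation. This is a simplified stand-in, not ExoPlayer's actual class: the constants mirror `C.TIME_END_OF_SOURCE`, `C.TIME_UNSET`, and `C.INDEX_UNSET`, and the unplayed-ads check is omitted.

```java
// Sketch of the backward linear search in getAdGroupIndexForPositionUs: the
// array may not be increasing because TIME_END_OF_SOURCE marks a postroll,
// so the groups are scanned from the end.
final class AdGroupSearchSketch {
  static final long TIME_END_OF_SOURCE = Long.MIN_VALUE; // Mirrors C.TIME_END_OF_SOURCE.
  static final long TIME_UNSET = Long.MIN_VALUE + 1;     // Mirrors C.TIME_UNSET.
  static final int INDEX_UNSET = -1;                     // Mirrors C.INDEX_UNSET.

  static int adGroupIndexForPositionUs(
      long[] adGroupTimesUs, long positionUs, long periodDurationUs) {
    int index = adGroupTimesUs.length - 1;
    while (index >= 0 && isBefore(adGroupTimesUs, positionUs, periodDurationUs, index)) {
      index--;
    }
    // The real method additionally requires the group to have unplayed ads.
    return index >= 0 ? index : INDEX_UNSET;
  }

  private static boolean isBefore(
      long[] adGroupTimesUs, long positionUs, long periodDurationUs, int index) {
    if (positionUs == TIME_END_OF_SOURCE) {
      // The end of the content is at (but not before) any postroll ad.
      return false;
    }
    long adGroupTimeUs = adGroupTimesUs[index];
    if (adGroupTimeUs == TIME_END_OF_SOURCE) {
      return periodDurationUs == TIME_UNSET || positionUs < periodDurationUs;
    }
    return positionUs < adGroupTimeUs;
  }

  public static void main(String[] args) {
    long[] groups = {0L, 10_000_000L, TIME_END_OF_SOURCE}; // preroll, midroll, postroll
    System.out.println(adGroupIndexForPositionUs(groups, 5_000_000L, 20_000_000L));        // 0
    System.out.println(adGroupIndexForPositionUs(groups, TIME_END_OF_SOURCE, 20_000_000L)); // 2
  }
}
```

Passing the period duration (rather than the content duration, as before this change) is what lets the postroll comparison work for individual periods in a multi-period timeline.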
@@ -15,6 +15,7 @@
  */
 package com.google.android.exoplayer2.source.chunk;

+import android.os.Looper;
 import androidx.annotation.Nullable;
 import com.google.android.exoplayer2.C;
 import com.google.android.exoplayer2.Format;
@@ -130,13 +131,19 @@ public class ChunkSampleStream<T extends ChunkSource> implements SampleStream, S
    int[] trackTypes = new int[1 + embeddedTrackCount];
    SampleQueue[] sampleQueues = new SampleQueue[1 + embeddedTrackCount];
-   primarySampleQueue = new SampleQueue(allocator, drmSessionManager);
+   primarySampleQueue = new SampleQueue(
+       allocator,
+       /* playbackLooper= */ Assertions.checkNotNull(Looper.myLooper()),
+       drmSessionManager);
    trackTypes[0] = primaryTrackType;
    sampleQueues[0] = primarySampleQueue;
    for (int i = 0; i < embeddedTrackCount; i++) {
      SampleQueue sampleQueue =
-         new SampleQueue(allocator, DrmSessionManager.getDummyDrmSessionManager());
+         new SampleQueue(
+             allocator,
+             /* playbackLooper= */ Assertions.checkNotNull(Looper.myLooper()),
+             DrmSessionManager.getDummyDrmSessionManager());
      embeddedSampleQueues[i] = sampleQueue;
      sampleQueues[i + 1] = sampleQueue;
      trackTypes[i + 1] = embeddedTrackTypes[i];
...
@@ -1990,6 +1990,10 @@ public class DefaultTrackSelector extends MappingTrackSelector {
      int maxVideoHeight,
      int maxVideoFrameRate,
      int maxVideoBitrate) {
+   if ((format.roleFlags & C.ROLE_FLAG_TRICK_PLAY) != 0) {
+     // Ignore trick-play tracks for now.
+     return false;
+   }
    return isSupported(formatSupport, false)
        && ((formatSupport & requiredAdaptiveSupport) != 0)
        && (mimeType == null || Util.areEqual(format.sampleMimeType, mimeType))
@@ -2013,9 +2017,13 @@ public class DefaultTrackSelector extends MappingTrackSelector {
        params.viewportWidth, params.viewportHeight, params.viewportOrientationMayChange);
    @Capabilities int[] trackFormatSupport = formatSupports[groupIndex];
    for (int trackIndex = 0; trackIndex < trackGroup.length; trackIndex++) {
+     Format format = trackGroup.getFormat(trackIndex);
+     if ((format.roleFlags & C.ROLE_FLAG_TRICK_PLAY) != 0) {
+       // Ignore trick-play tracks for now.
+       continue;
+     }
      if (isSupported(trackFormatSupport[trackIndex],
          params.exceedRendererCapabilitiesIfNecessary)) {
-       Format format = trackGroup.getFormat(trackIndex);
        boolean isWithinConstraints =
            selectedTrackIndices.contains(trackIndex)
                && (format.width == Format.NO_VALUE || format.width <= params.maxVideoWidth)
...
@@ -32,6 +32,7 @@ import java.lang.annotation.Documented;
 import java.lang.annotation.Retention;
 import java.lang.annotation.RetentionPolicy;
 import java.util.concurrent.ExecutorService;
+import java.util.concurrent.atomic.AtomicBoolean;

 /**
  * Manages the background loading of {@link Loadable}s.
@@ -56,6 +57,21 @@ public final class Loader implements LoaderErrorThrower {
    /**
     * Cancels the load.
+    *
+    * <p>Loadable implementations should ensure that a currently executing {@link #load()} call
+    * will exit reasonably quickly after this method is called. The {@link #load()} call may exit
+    * either by returning or by throwing an {@link IOException}.
+    *
+    * <p>If there is a currently executing {@link #load()} call, then the thread on which that call
+    * is being made will be interrupted immediately after the call to this method. Hence
+    * implementations do not need to (and should not attempt to) interrupt the loading thread
+    * themselves.
+    *
+    * <p>Although the loading thread will be interrupted, Loadable implementations should not use
+    * the interrupted status of the loading thread in {@link #load()} to determine whether the load
+    * has been canceled. This approach is not robust [Internal ref: b/79223737]. Instead,
+    * implementations should use their own flag to signal cancelation (for example, using {@link
+    * AtomicBoolean}).
     */
    void cancelLoad();
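The AtomicBoolean cancelation pattern described in the javadoc above can be sketched with a minimal, hypothetical Loadable-style class (the names and loop body below are simplified stand-ins, not ExoPlayer's actual implementation):

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicBoolean;

// Minimal sketch of the documented pattern: the load loop polls its own
// cancelation flag rather than the loading thread's interrupted status.
final class SketchLoadable {
  private final AtomicBoolean loadCanceled = new AtomicBoolean();

  public void cancelLoad() {
    loadCanceled.set(true);
  }

  // Returns the number of chunks processed before exiting. The fixed bound and
  // the increment stand in for reading chunks of data from an upstream source.
  public int load() throws IOException {
    int chunksProcessed = 0;
    while (!loadCanceled.get() && chunksProcessed < 1000) {
      chunksProcessed++; // Stand-in for reading one chunk.
    }
    return chunksProcessed;
  }

  public static void main(String[] args) throws IOException {
    SketchLoadable loadable = new SketchLoadable();
    loadable.cancelLoad(); // Cancel before loading: the loop exits immediately.
    System.out.println(loadable.load()); // 0
  }
}
```

Checking a dedicated flag avoids the pitfall the javadoc warns about: the interrupted status can be cleared by intervening calls (or, as here, by the Loader itself between tasks), so it is not a reliable cancelation signal.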
@@ -309,10 +325,9 @@ public final class Loader implements LoaderErrorThrower {
    private static final String TAG = "LoadTask";

    private static final int MSG_START = 0;
-   private static final int MSG_CANCEL = 1;
-   private static final int MSG_END_OF_SOURCE = 2;
-   private static final int MSG_IO_EXCEPTION = 3;
-   private static final int MSG_FATAL_ERROR = 4;
+   private static final int MSG_FINISH = 1;
+   private static final int MSG_IO_EXCEPTION = 2;
+   private static final int MSG_FATAL_ERROR = 3;

    public final int defaultMinRetryCount;
@@ -323,8 +338,8 @@ public final class Loader implements LoaderErrorThrower {
    @Nullable private IOException currentError;
    private int errorCount;

-   @Nullable private volatile Thread executorThread;
-   private volatile boolean canceled;
+   @Nullable private Thread executorThread;
+   private boolean canceled;
    private volatile boolean released;

    public LoadTask(Looper looper, T loadable, Loader.Callback<T> callback,
@@ -356,16 +371,21 @@ public final class Loader implements LoaderErrorThrower {
      this.released = released;
      currentError = null;
      if (hasMessages(MSG_START)) {
+       // The task has not been given to the executor yet.
+       canceled = true;
        removeMessages(MSG_START);
        if (!released) {
-         sendEmptyMessage(MSG_CANCEL);
+         sendEmptyMessage(MSG_FINISH);
        }
      } else {
-       canceled = true;
-       loadable.cancelLoad();
-       Thread executorThread = this.executorThread;
-       if (executorThread != null) {
-         executorThread.interrupt();
-       }
+       // The task has been given to the executor.
+       synchronized (this) {
+         canceled = true;
+         loadable.cancelLoad();
+         @Nullable Thread executorThread = this.executorThread;
+         if (executorThread != null) {
+           executorThread.interrupt();
+         }
+       }
      }
      if (released) {
@@ -384,8 +404,12 @@ public final class Loader implements LoaderErrorThrower {
    @Override
    public void run() {
      try {
-       executorThread = Thread.currentThread();
-       if (!canceled) {
+       boolean shouldLoad;
+       synchronized (this) {
+         shouldLoad = !canceled;
+         executorThread = Thread.currentThread();
+       }
+       if (shouldLoad) {
          TraceUtil.beginSection("load:" + loadable.getClass().getSimpleName());
          try {
            loadable.load();
@@ -393,8 +417,13 @@ public final class Loader implements LoaderErrorThrower {
            TraceUtil.endSection();
          }
        }
+       synchronized (this) {
+         executorThread = null;
+         // Clear the interrupted flag if set, to avoid it leaking into a subsequent task.
+         Thread.interrupted();
+       }
        if (!released) {
-         sendEmptyMessage(MSG_END_OF_SOURCE);
+         sendEmptyMessage(MSG_FINISH);
        }
      } catch (IOException e) {
        if (!released) {
@@ -404,7 +433,7 @@ public final class Loader implements LoaderErrorThrower {
        // The load was canceled.
        Assertions.checkState(canceled);
        if (!released) {
-         sendEmptyMessage(MSG_END_OF_SOURCE);
+         sendEmptyMessage(MSG_FINISH);
        }
      } catch (Exception e) {
        // This should never happen, but handle it anyway.
@@ -453,10 +482,7 @@ public final class Loader implements LoaderErrorThrower {
        return;
      }
      switch (msg.what) {
-       case MSG_CANCEL:
-         callback.onLoadCanceled(loadable, nowMs, durationMs, false);
-         break;
-       case MSG_END_OF_SOURCE:
+       case MSG_FINISH:
          try {
            callback.onLoadCompleted(loadable, nowMs, durationMs);
          } catch (RuntimeException e) {
...
@@ -109,7 +109,6 @@ public final class CacheUtil {
   *
   * @param dataSpec Defines the data to be cached.
   * @param cache A {@link Cache} to store the data.
-  * @param cacheKeyFactory An optional factory for cache keys.
   * @param upstream A {@link DataSource} for reading data not in the cache.
   * @param progressListener A listener to receive progress updates, or {@code null}.
   * @param isCanceled An optional flag that will interrupt caching if set to true.
@@ -120,7 +119,6 @@ public final class CacheUtil {
  public static void cache(
      DataSpec dataSpec,
      Cache cache,
-     @Nullable CacheKeyFactory cacheKeyFactory,
      DataSource upstream,
      @Nullable ProgressListener progressListener,
      @Nullable AtomicBoolean isCanceled)
@@ -128,7 +126,7 @@ public final class CacheUtil {
    cache(
        dataSpec,
        cache,
-       cacheKeyFactory,
+       /* cacheKeyFactory= */ null,
        new CacheDataSource(cache, upstream),
        new byte[DEFAULT_BUFFER_SIZE_BYTES],
        /* priorityTaskManager= */ null,
@@ -139,14 +137,14 @@ public final class CacheUtil {
  }

  /**
-  * Caches the data defined by {@code dataSpec} while skipping already cached data. Caching stops
-  * early if end of input is reached and {@code enableEOFException} is false.
+  * Caches the data defined by {@code dataSpec}, skipping already cached data. Caching stops early
+  * if end of input is reached and {@code enableEOFException} is false.
   *
-  * <p>If a {@link PriorityTaskManager} is given, it's used to pause and resume caching depending
-  * on {@code priority} and the priority of other tasks registered to the PriorityTaskManager.
-  * Please note that it's the responsibility of the calling code to call {@link
-  * PriorityTaskManager#add} to register with the manager before calling this method, and to call
-  * {@link PriorityTaskManager#remove} afterwards to unregister.
+  * <p>If a {@link PriorityTaskManager} is provided, it's used to pause and resume caching
+  * depending on {@code priority} and the priority of other tasks registered to the
+  * PriorityTaskManager. Please note that it's the responsibility of the calling code to call
+  * {@link PriorityTaskManager#add} to register with the manager before calling this method, and to
+  * call {@link PriorityTaskManager#remove} afterwards to unregister.
   *
   * <p>This method may be slow and shouldn't normally be called on the main thread.
   *
...
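The register-before/unregister-after contract in the javadoc above can be sketched schematically. The `Manager` class below is a stand-in for ExoPlayer's `PriorityTaskManager`, reduced to a registration set, and `cacheWithPriority` stands in for a caller of `CacheUtil.cache`:

```java
import java.util.HashSet;
import java.util.Set;

// Schematic sketch of the caller's responsibility described in the javadoc:
// register with the priority manager before caching, unregister afterwards.
final class PrioritySketch {
  static final int PRIORITY_DOWNLOAD = 0; // Hypothetical priority constant.

  // Stand-in for PriorityTaskManager: just tracks registered priorities.
  static final class Manager {
    private final Set<Integer> registered = new HashSet<>();
    void add(int priority) { registered.add(priority); }
    void remove(int priority) { registered.remove(priority); }
    boolean isRegistered(int priority) { return registered.contains(priority); }
  }

  static boolean cacheWithPriority(Manager manager) {
    manager.add(PRIORITY_DOWNLOAD); // Register before the caching call.
    try {
      // Stand-in for the caching work; returns whether we were registered while working.
      return manager.isRegistered(PRIORITY_DOWNLOAD);
    } finally {
      manager.remove(PRIORITY_DOWNLOAD); // Always unregister afterwards.
    }
  }

  public static void main(String[] args) {
    Manager manager = new Manager();
    System.out.println(cacheWithPriority(manager));             // true
    System.out.println(manager.isRegistered(PRIORITY_DOWNLOAD)); // false
  }
}
```

The try/finally shape matters: unregistering must happen even if caching throws, otherwise the stale registration would keep blocking lower-priority tasks.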
@@ -15,8 +15,10 @@
  */
 package com.google.android.exoplayer2.upstream.cache;

+import static com.google.android.exoplayer2.util.Assertions.checkArgument;
+import static com.google.android.exoplayer2.util.Assertions.checkState;
+
 import androidx.annotation.Nullable;
-import com.google.android.exoplayer2.util.Assertions;
 import com.google.android.exoplayer2.util.Log;
 import java.io.File;
 import java.util.TreeSet;
@@ -115,12 +117,18 @@ import java.util.TreeSet;
   * @return the length of the cached or not cached data block length.
   */
  public long getCachedBytesLength(long position, long length) {
+   checkArgument(position >= 0);
+   checkArgument(length >= 0);
    SimpleCacheSpan span = getSpan(position);
    if (span.isHoleSpan()) {
      // We don't have a span covering the start of the queried region.
      return -Math.min(span.isOpenEnded() ? Long.MAX_VALUE : span.length, length);
    }
    long queryEndPosition = position + length;
+   if (queryEndPosition < 0) {
+     // The calculation rolled over (length is probably Long.MAX_VALUE).
+     queryEndPosition = Long.MAX_VALUE;
+   }
    long currentEndPosition = span.position + span.length;
    if (currentEndPosition < queryEndPosition) {
      for (SimpleCacheSpan next : cachedSpans.tailSet(span, false)) {
@@ -151,7 +159,7 @@ import java.util.TreeSet;
   */
  public SimpleCacheSpan setLastTouchTimestamp(
      SimpleCacheSpan cacheSpan, long lastTouchTimestamp, boolean updateFile) {
-   Assertions.checkState(cachedSpans.remove(cacheSpan));
+   checkState(cachedSpans.remove(cacheSpan));
    File file = cacheSpan.file;
    if (updateFile) {
      File directory = file.getParentFile();
...
@@ -30,6 +30,13 @@ public interface Clock {
   */
  Clock DEFAULT = new SystemClock();

+ /**
+  * Returns the current time in milliseconds since the Unix Epoch.
+  *
+  * @see System#currentTimeMillis()
+  */
+ long currentTimeMillis();
+
  /** @see android.os.SystemClock#elapsedRealtime() */
  long elapsedRealtime();
...
@@ -16,13 +16,39 @@
 package com.google.android.exoplayer2.util;

 /**
- * An interruptible condition variable whose {@link #open()} and {@link #close()} methods return
- * whether they resulted in a change of state.
+ * An interruptible condition variable. This class provides a number of benefits over {@link
+ * android.os.ConditionVariable}:
+ *
+ * <ul>
+ *   <li>Consistent use of {@link Clock#elapsedRealtime()} for timing {@link #block(long)} timeout
+ *       intervals. {@link android.os.ConditionVariable} used {@link System#currentTimeMillis()}
+ *       prior to Android 10, which is not a correct clock to use for interval timing because it's
+ *       not guaranteed to be monotonic.
+ *   <li>Support for injecting a custom {@link Clock}.
+ *   <li>The ability to query the variable's current state, by calling {@link #isOpen()}.
+ *   <li>{@link #open()} and {@link #close()} return whether they changed the variable's state.
+ * </ul>
  */
 public final class ConditionVariable {

+  private final Clock clock;
   private boolean isOpen;

+  /** Creates an instance using {@link Clock#DEFAULT}. */
+  public ConditionVariable() {
+    this(Clock.DEFAULT);
+  }
+
+  /**
+   * Creates an instance.
+   *
+   * @param clock The {@link Clock} whose {@link Clock#elapsedRealtime()} method is used to
+   *     determine when {@link #block(long)} should time out.
+   */
+  public ConditionVariable(Clock clock) {
+    this.clock = clock;
+  }
+
  /**
   * Opens the condition and releases all threads that are blocked.
   *
@@ -60,18 +86,27 @@ public final class ConditionVariable {
  }

  /**
-  * Blocks until the condition is opened or until {@code timeout} milliseconds have passed.
+  * Blocks until the condition is opened or until {@code timeoutMs} milliseconds have passed.
   *
-  * @param timeout The maximum time to wait in milliseconds.
+  * @param timeoutMs The maximum time to wait in milliseconds. If {@code timeoutMs <= 0} then the
+  *     call will return immediately without blocking.
   * @return True if the condition was opened, false if the call returns because of the timeout.
   * @throws InterruptedException If the thread is interrupted.
   */
- public synchronized boolean block(long timeout) throws InterruptedException {
-   long now = android.os.SystemClock.elapsedRealtime();
-   long end = now + timeout;
-   while (!isOpen && now < end) {
-     wait(end - now);
-     now = android.os.SystemClock.elapsedRealtime();
-   }
+ public synchronized boolean block(long timeoutMs) throws InterruptedException {
+   if (timeoutMs <= 0) {
+     return isOpen;
+   }
+   long nowMs = clock.elapsedRealtime();
+   long endMs = nowMs + timeoutMs;
+   if (endMs < nowMs) {
+     // timeoutMs is large enough for (nowMs + timeoutMs) to rollover. Block indefinitely.
+     block();
+   } else {
+     while (!isOpen && nowMs < endMs) {
+       wait(endMs - nowMs);
+       nowMs = clock.elapsedRealtime();
+     }
+   }
    return isOpen;
  }
...
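The rollover guard added to `block(long)` is worth seeing in isolation: adding a very large timeout to the current time can overflow a `long`, and the overflow shows up as the computed deadline being earlier than "now". This sketch reproduces just that arithmetic (the method name is illustrative, not ExoPlayer's):

```java
// Sketch of the timeout arithmetic in ConditionVariable.block(long): a huge
// timeoutMs can overflow (nowMs + timeoutMs), which the code detects via
// endMs < nowMs and handles by falling back to an untimed block().
final class TimeoutRolloverSketch {
  static boolean wouldRollOver(long nowMs, long timeoutMs) {
    long endMs = nowMs + timeoutMs; // May overflow for very large timeouts.
    return endMs < nowMs;
  }

  public static void main(String[] args) {
    System.out.println(wouldRollOver(1000L, 5000L));          // false: no overflow
    System.out.println(wouldRollOver(1000L, Long.MAX_VALUE)); // true: rolls over
  }
}
```

Treating a rolled-over deadline as "block indefinitely" is the right semantics here, since a timeout that large effectively means "no timeout". (The `timeoutMs <= 0` case is handled separately, before the addition.)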
@@ -21,9 +21,17 @@ import android.os.Looper;
 import androidx.annotation.Nullable;

 /**
- * The standard implementation of {@link Clock}.
+ * The standard implementation of {@link Clock}, an instance of which is available via {@link
+ * SystemClock#DEFAULT}.
  */
-/* package */ final class SystemClock implements Clock {
+public class SystemClock implements Clock {
+
+  protected SystemClock() {}
+
+  @Override
+  public long currentTimeMillis() {
+    return System.currentTimeMillis();
+  }

   @Override
   public long elapsedRealtime() {
...
@@ -727,6 +727,24 @@ public final class Util {
  }

  /**
+  * Returns the index of the first occurrence of {@code value} in {@code array}, or {@link
+  * C#INDEX_UNSET} if {@code value} is not contained in {@code array}.
+  *
+  * @param array The array to search.
+  * @param value The value to search for.
+  * @return The index of the first occurrence of value in {@code array}, or {@link C#INDEX_UNSET}
+  *     if {@code value} is not contained in {@code array}.
+  */
+ public static int linearSearch(long[] array, long value) {
+   for (int i = 0; i < array.length; i++) {
+     if (array[i] == value) {
+       return i;
+     }
+   }
+   return C.INDEX_UNSET;
+ }
+
+ /**
   * Returns the index of the largest element in {@code array} that is less than (or optionally
   * equal to) a specified {@code value}.
   *
...
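The new `Util.linearSearch` helper is small enough to demonstrate end to end. Below is a self-contained copy of the same logic, with a local `INDEX_UNSET` mirroring `C.INDEX_UNSET` so the snippet compiles without ExoPlayer on the classpath:

```java
// Self-contained copy of the linearSearch logic above, showing its contract:
// the index of the FIRST occurrence on a hit, INDEX_UNSET (-1) on a miss.
final class LinearSearchSketch {
  static final int INDEX_UNSET = -1; // Mirrors C.INDEX_UNSET.

  static int linearSearch(long[] array, long value) {
    for (int i = 0; i < array.length; i++) {
      if (array[i] == value) {
        return i;
      }
    }
    return INDEX_UNSET;
  }

  public static void main(String[] args) {
    long[] adGroupTimesUs = {0L, 15_000_000L, 30_000_000L, 15_000_000L};
    System.out.println(linearSearch(adGroupTimesUs, 15_000_000L)); // 1 (first occurrence)
    System.out.println(linearSearch(adGroupTimesUs, 99L));         // -1 (not found)
  }
}
```

Unlike `Util`'s binary-search helpers just below it, this one makes no ordering assumption, which is why it suits arrays like ad group times that need not be sorted.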
@@ -81,7 +81,7 @@ track 0:
   flags = 1
   data = length 520, hash FEE56928
 sample 13:
-  time = 520000
+  time = 519999
   flags = 1
   data = length 599, hash 41F496C5
 sample 14:
...
@@ -57,7 +57,7 @@ track 0:
   flags = 1
   data = length 520, hash FEE56928
 sample 7:
-  time = 520000
+  time = 519999
   flags = 1
   data = length 599, hash 41F496C5
 sample 8:
...
@@ -33,7 +33,7 @@ track 0:
   flags = 1
   data = length 520, hash FEE56928
 sample 1:
-  time = 520000
+  time = 519999
   flags = 1
   data = length 599, hash 41F496C5
 sample 2:
...
@@ -107,7 +107,7 @@ track 0:
   crypto mode = 1
   encryption key = length 16, hash 9FDDEA52
 sample 13:
-  time = 520000
+  time = 519999
   flags = 1073741825
   data = length 616, hash 3F657E23
   crypto mode = 1
...
@@ -71,7 +71,7 @@ track 0:
   crypto mode = 1
   encryption key = length 16, hash 9FDDEA52
 sample 7:
-  time = 520000
+  time = 519999
   flags = 1073741825
   data = length 616, hash 3F657E23
   crypto mode = 1
...
@@ -35,7 +35,7 @@ track 0:
   crypto mode = 1
   encryption key = length 16, hash 9FDDEA52
 sample 1:
-  time = 520000
+  time = 519999
   flags = 1073741825
   data = length 616, hash 3F657E23
   crypto mode = 1
...
seekMap:
isSeekable = true
duration = 526000
getPosition(0) = [[timeUs=0, position=1161]]
numberOfTracks = 1
track 0:
format:
bitrate = -1
id = 1
containerMimeType = null
sampleMimeType = video/avc
maxInputSize = 34686
width = 1280
height = 720
frameRate = 13.307984
rotationDegrees = 0
pixelWidthHeightRatio = 1.0
channelCount = -1
sampleRate = -1
pcmEncoding = -1
encoderDelay = 0
encoderPadding = 0
subsampleOffsetUs = 9223372036854775807
selectionFlags = 0
language = null
drmInitData = -
metadata = entries=[mdta: key=com.android.capture.fps]
initializationData:
data = length 22, hash 4CF81805
data = length 9, hash FBAFBA1C
total output bytes = 42320
sample count = 7
sample 0:
time = 0
flags = 1
data = length 34656, hash D92B66FF
sample 1:
time = 325344
flags = 0
data = length 768, hash D0C3B229
sample 2:
time = 358677
flags = 0
data = length 1184, hash C598EFC0
sample 3:
time = 392011
flags = 0
data = length 576, hash 667AEC2C
sample 4:
time = 425344
flags = 0
data = length 1456, hash 430D1498
sample 5:
time = 458677
flags = 0
data = length 1280, hash 12267E0E
sample 6:
time = 492011
flags = 536870912
data = length 2400, hash FBCB42C
tracksEnded = true
seekMap:
isSeekable = true
duration = 526000
getPosition(0) = [[timeUs=0, position=1161]]
numberOfTracks = 1
track 0:
format:
bitrate = -1
id = 1
containerMimeType = null
sampleMimeType = video/avc
maxInputSize = 34686
width = 1280
height = 720
frameRate = 13.307984
rotationDegrees = 0
pixelWidthHeightRatio = 1.0
channelCount = -1
sampleRate = -1
pcmEncoding = -1
encoderDelay = 0
encoderPadding = 0
subsampleOffsetUs = 9223372036854775807
selectionFlags = 0
language = null
drmInitData = -
metadata = entries=[mdta: key=com.android.capture.fps]
initializationData:
data = length 22, hash 4CF81805
data = length 9, hash FBAFBA1C
total output bytes = 42320
sample count = 7
sample 0:
time = 0
flags = 1
data = length 34656, hash D92B66FF
sample 1:
time = 325344
flags = 0
data = length 768, hash D0C3B229
sample 2:
time = 358677
flags = 0
data = length 1184, hash C598EFC0
sample 3:
time = 392011
flags = 0
data = length 576, hash 667AEC2C
sample 4:
time = 425344
flags = 0
data = length 1456, hash 430D1498
sample 5:
time = 458677
flags = 0
data = length 1280, hash 12267E0E
sample 6:
time = 492011
flags = 536870912
data = length 2400, hash FBCB42C
tracksEnded = true
seekMap:
isSeekable = true
duration = 526000
getPosition(0) = [[timeUs=0, position=1161]]
numberOfTracks = 1
track 0:
format:
bitrate = -1
id = 1
containerMimeType = null
sampleMimeType = video/avc
maxInputSize = 34686
width = 1280
height = 720
frameRate = 13.307984
rotationDegrees = 0
pixelWidthHeightRatio = 1.0
channelCount = -1
sampleRate = -1
pcmEncoding = -1
encoderDelay = 0
encoderPadding = 0
subsampleOffsetUs = 9223372036854775807
selectionFlags = 0
language = null
drmInitData = -
metadata = entries=[mdta: key=com.android.capture.fps]
initializationData:
data = length 22, hash 4CF81805
data = length 9, hash FBAFBA1C
total output bytes = 42320
sample count = 7
sample 0:
time = 0
flags = 1
data = length 34656, hash D92B66FF
sample 1:
time = 325344
flags = 0
data = length 768, hash D0C3B229
sample 2:
time = 358677
flags = 0
data = length 1184, hash C598EFC0
sample 3:
time = 392011
flags = 0
data = length 576, hash 667AEC2C
sample 4:
time = 425344
flags = 0
data = length 1456, hash 430D1498
sample 5:
time = 458677
flags = 0
data = length 1280, hash 12267E0E
sample 6:
time = 492011
flags = 536870912
data = length 2400, hash FBCB42C
tracksEnded = true
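The `frameRate = 13.307984` in the format block above is consistent with a rate derived from the dump's own figures (7 samples over a 526000 µs duration); per the r2.11.5 release notes, `Format.frameRate` now stores the calculated frame rate, while the Android capture frame rate is kept only in `Format.metadata`. A minimal sketch of that arithmetic, assuming the simple samples-per-duration calculation (the extractor's exact derivation may differ):

```python
# Sketch: derive the calculated frame rate from figures in the dump above.
# Assumption: frame rate = sample count / duration in seconds.
sample_count = 7        # "sample count = 7"
duration_us = 526_000   # "duration = 526000" (microseconds)

frame_rate = sample_count / (duration_us / 1_000_000)
# Agrees with the reported frameRate = 13.307984 to within float precision.
print(f"{frame_rate:.6f}")
```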
@@ -31,123 +31,123 @@ track 0:
total output bytes = 85933
sample count = 30
sample 0:
-time = 66000
+time = 66733
flags = 1
data = length 38070, hash B58E1AEE
sample 1:
-time = 199000
+time = 200199
flags = 0
data = length 8340, hash 8AC449FF
sample 2:
-time = 132000
+time = 133466
flags = 0
data = length 1295, hash C0DA5090
sample 3:
-time = 100000
+time = 100100
flags = 0
data = length 469, hash D6E0A200
sample 4:
-time = 166000
+time = 166832
flags = 0
data = length 564, hash E5F56C5B
sample 5:
-time = 332000
+time = 333666
flags = 0
data = length 6075, hash 8756E49E
sample 6:
-time = 266000
+time = 266933
flags = 0
data = length 847, hash DCC2B618
sample 7:
-time = 233000
+time = 233566
flags = 0
data = length 455, hash B9CCE047
sample 8:
-time = 299000
+time = 300299
flags = 0
data = length 467, hash 69806D94
sample 9:
-time = 466000
+time = 467133
flags = 0
data = length 4549, hash 3944F501
sample 10:
-time = 399000
+time = 400399
flags = 0
data = length 1087, hash 491BF106
sample 11:
-time = 367000
+time = 367033
flags = 0
data = length 380, hash 5FED016A
sample 12:
-time = 433000
+time = 433766
flags = 0
data = length 455, hash 8A0610
sample 13:
-time = 599000
+time = 600599
flags = 0
data = length 5190, hash B9031D8
sample 14:
-time = 533000
+time = 533866
flags = 0
data = length 1071, hash 684E7DC8
sample 15:
-time = 500000
+time = 500500
flags = 0
data = length 653, hash 8494F326
sample 16:
-time = 566000
+time = 567232
flags = 0
data = length 485, hash 2CCC85F4
sample 17:
-time = 733000
+time = 734066
flags = 0
data = length 4884, hash D16B6A96
sample 18:
-time = 666000
+time = 667333
flags = 0
data = length 997, hash 164FF210
sample 19:
-time = 633000
+time = 633966
flags = 0
data = length 640, hash F664125B
sample 20:
-time = 700000
+time = 700699
flags = 0
data = length 491, hash B5930C7C
sample 21:
-time = 866000
+time = 867533
flags = 0
data = length 2989, hash 92CF4FCF
sample 22:
-time = 800000
+time = 800799
flags = 0
data = length 838, hash 294A3451
sample 23:
-time = 767000
+time = 767433
flags = 0
data = length 544, hash FCCE2DE6
sample 24:
-time = 833000
+time = 834166
flags = 0
data = length 329, hash A654FFA1
sample 25:
-time = 1000000
+time = 1000999
flags = 0
data = length 1517, hash 5F7EBF8B
sample 26:
-time = 933000
+time = 934266
flags = 0
data = length 803, hash 7A5C4C1D
sample 27:
-time = 900000
+time = 900900
flags = 0
data = length 415, hash B31BBC3B
sample 28:
-time = 967000
+time = 967632
flags = 0
data = length 415, hash 850DFEA3
sample 29:
-time = 1033000
+time = 1034366
flags = 0
data = length 619, hash AB5E56CA
track 1:
@@ -181,183 +181,183 @@ track 1:
flags = 1
data = length 18, hash 96519432
sample 1:
-time = 23000
+time = 23219
flags = 1
data = length 4, hash EE9DF
sample 2:
-time = 46000
+time = 46439
flags = 1
data = length 4, hash EEDBF
sample 3:
-time = 69000
+time = 69659
flags = 1
data = length 157, hash E2F078F4
sample 4:
-time = 92000
+time = 92879
flags = 1
data = length 371, hash B9471F94
sample 5:
-time = 116000
+time = 116099
flags = 1
data = length 373, hash 2AB265CB
sample 6:
-time = 139000
+time = 139319
flags = 1
data = length 402, hash 1295477C
sample 7:
-time = 162000
+time = 162539
flags = 1
data = length 455, hash 2D8146C8
sample 8:
-time = 185000
+time = 185759
flags = 1
data = length 434, hash F2C5D287
sample 9:
-time = 208000
+time = 208979
flags = 1
data = length 450, hash 84143FCD
sample 10:
-time = 232000
+time = 232199
flags = 1
data = length 429, hash EF769D50
sample 11:
-time = 255000
+time = 255419
flags = 1
data = length 450, hash EC3DE692
sample 12:
-time = 278000
+time = 278639
flags = 1
data = length 447, hash 3E519E13
sample 13:
-time = 301000
+time = 301859
flags = 1
data = length 457, hash 1E4F23A0
sample 14:
-time = 325000
+time = 325079
flags = 1
data = length 447, hash A439EA97
sample 15:
-time = 348000
+time = 348299
flags = 1
data = length 456, hash 1E9034C6
sample 16:
-time = 371000
+time = 371519
flags = 1
data = length 398, hash 99DB7345
sample 17:
-time = 394000
+time = 394739
flags = 1
data = length 474, hash 3F05F10A
sample 18:
-time = 417000
+time = 417959
flags = 1
data = length 416, hash C105EE09
sample 19:
-time = 441000
+time = 441179
flags = 1
data = length 454, hash 5FDBE458
sample 20:
-time = 464000
+time = 464399
flags = 1
data = length 438, hash 41A93AC3
sample 21:
-time = 487000
+time = 487619
flags = 1
data = length 443, hash 10FDA652
sample 22:
-time = 510000
+time = 510839
flags = 1
data = length 412, hash 1F791E25
sample 23:
-time = 534000
+time = 534058
flags = 1
data = length 482, hash A6D983D
sample 24:
-time = 557000
+time = 557278
flags = 1
data = length 386, hash BED7392F
sample 25:
-time = 580000
+time = 580498
flags = 1
data = length 463, hash 5309F8C9
sample 26:
-time = 603000
+time = 603718
flags = 1
data = length 394, hash 21C7321F
sample 27:
-time = 626000
+time = 626938
flags = 1
data = length 489, hash 71B4730D
sample 28:
-time = 650000
+time = 650158
flags = 1
data = length 403, hash D9C6DE89
sample 29:
-time = 673000
+time = 673378
flags = 1
data = length 447, hash 9B14B73B
sample 30:
-time = 696000
+time = 696598
flags = 1
data = length 439, hash 4760D35B
sample 31:
-time = 719000
+time = 719818
flags = 1
data = length 463, hash 1601F88D
sample 32:
-time = 743000
+time = 743038
flags = 1
data = length 423, hash D4AE6773
sample 33:
-time = 766000
+time = 766258
flags = 1
data = length 497, hash A3C674D3
sample 34:
-time = 789000
+time = 789478
flags = 1
data = length 419, hash D3734A1F
sample 35:
-time = 812000
+time = 812698
flags = 1
data = length 474, hash DFB41F9
sample 36:
-time = 835000
+time = 835918
flags = 1
data = length 413, hash 53E7CB9F
sample 37:
-time = 859000
+time = 859138
flags = 1
data = length 445, hash D15B0E39
sample 38:
-time = 882000
+time = 882358
flags = 1
data = length 453, hash 77ED81E4
sample 39:
-time = 905000
+time = 905578
flags = 1
data = length 545, hash 3321AEB9
sample 40:
-time = 928000
+time = 928798
flags = 1
data = length 317, hash F557D0E
sample 41:
-time = 952000
+time = 952018
flags = 1
data = length 537, hash ED58CF7B
sample 42:
-time = 975000
+time = 975238
flags = 1
data = length 458, hash 51CDAA10
sample 43:
-time = 998000
+time = 998458
flags = 1
data = length 465, hash CBA1EFD7
sample 44:
-time = 1021000
+time = 1021678
flags = 1
data = length 446, hash D6735B8A
sample 45:
-time = 1044000
+time = 1044897
flags = 1
data = length 10, hash A453EEBE
tracksEnded = true
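The updated track 1 timestamps above land on an exact audio frame grid rather than on millisecond-rounded values: each new time equals n * 1024 / 44100 seconds truncated to microseconds, assuming track 1 is 44.1 kHz AAC with 1024-sample access units (track 1's format block lies outside this hunk, so both figures are inferred from the arithmetic, not stated in the dump). A quick check:

```python
# Sketch: reproduce the updated track 1 timestamps from an assumed
# 1024-sample frame size at an assumed 44100 Hz sample rate.
FRAME_SIZE = 1024     # samples per access unit (assumption)
SAMPLE_RATE = 44100   # Hz (assumption, inferred from the timestamps)

def frame_time_us(n: int) -> int:
    """Presentation time of frame n, truncated to whole microseconds."""
    return n * FRAME_SIZE * 1_000_000 // SAMPLE_RATE

# Reproduces the new "time" values for samples 1, 2 and 45 in the diff above.
print(frame_time_us(1), frame_time_us(2), frame_time_us(45))  # 23219 46439 1044897
```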