- 12 Apr, 2023 11 commits
PiperOrigin-RevId: 523675327
sheenachhabra committed
The setter command is only used for setPlaylistMetadata and can be named COMMAND_SET_PLAYLIST_METADATA. The getter command is used to access getMediaMetadata and getPlaylistMetadata and can be better named COMMAND_GET_METADATA to reflect this usage. PiperOrigin-RevId: 523673286
tonihei committed
Previously `ChannelMixingAudioProcessor` output float because it was implemented using the audio mixer's float mixing support. Move the implementation over to just using the `ChannelMixingMatrix` and make it publicly visible in the common module so it can be used by apps for both playback and export. Also resolve a TODO that no longer had a bug attached by implementing support for putting multiple mixing matrices to handle different input audio channel counts, and fix some nits in the test code. Tested via unit tests and manually configuring a `ChannelMixingAudioProcessor` in the transformer demo app and playing an audio stream that identifies channels, and verifying that they are remapped as expected. PiperOrigin-RevId: 523653901
andrewlewis committed
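The `ChannelMixingMatrix` mentioned above is, at its core, a gain matrix applied to each interleaved audio frame. A minimal standalone sketch of that underlying operation (this is not the media3 API; the class and method names here are hypothetical):

```java
import java.util.Arrays;

// Hypothetical sketch: apply an [outputChannels x inputChannels] gain matrix
// to interleaved PCM float samples, the operation ChannelMixingAudioProcessor
// performs internally with a ChannelMixingMatrix.
public final class ChannelMixDemo {

  /** Mixes interleaved input frames through a gain matrix. */
  static float[] mix(float[] input, int inputChannels, float[][] matrix) {
    int outputChannels = matrix.length;
    int frames = input.length / inputChannels;
    float[] output = new float[frames * outputChannels];
    for (int f = 0; f < frames; f++) {
      for (int out = 0; out < outputChannels; out++) {
        float sum = 0f;
        for (int in = 0; in < inputChannels; in++) {
          sum += matrix[out][in] * input[f * inputChannels + in];
        }
        output[f * outputChannels + out] = sum;
      }
    }
    return output;
  }

  public static void main(String[] args) {
    // Stereo -> mono downmix: average left and right channels.
    float[][] stereoToMono = {{0.5f, 0.5f}};
    float[] stereo = {1f, 0f, 0.5f, 0.5f}; // two frames of (L, R)
    System.out.println(Arrays.toString(mix(stereo, 2, stereoToMono)));
  }
}
```

Supporting several input channel counts, as the commit describes, then amounts to keeping one such matrix per input channel count and selecting it from the incoming format.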
It is not possible to provide a safe deprecation path because BaseTrackSelection can't easily know which of the methods is implemented by subclasses. PiperOrigin-RevId: 523471578
tonihei committed
Change what format is logged from MediaCodecAudioRenderer when AudioSink throws InitializationException. We printed the AudioSink's format, which most of the time is audio/raw (PCM) and not the renderer's format. With this change both formats are logged. #minor-release Issue: google/ExoPlayer#11066 PiperOrigin-RevId: 523456840
christosts committed
In addition to the changes in https://github.com/google/ExoPlayer/commit/b18fb368cca9843aeca2cc4d5a01aa4fa41b4bd7, this change essentially reverts https://github.com/google/ExoPlayer/commit/30e5bc9837e2423cd2bb426c5797211e0f6ad76b (merged Jul 2022). From this CL on, `VideoFrameProcessor` takes in non-offset, monotonically increasing timestamps. For example, with one 5s and one 10s video:
- `VideoFrameProcessor`'s input should start from 0
- On switching to the second video (10s), the timestamp of the first frame in the second video should be at 5s
In ExoPlayer, however, `streamOffset` is managed differently and thus needs correction before sending the frames to `VideoFrameProcessor`:
- The timestamp of the first video is offset by a large int, so the first frame of the first media item has timestamp (assuming) 10000000000000000
- The last frame of the first media item has 10000005000000000
- At this point the stream offset is updated to 10000005000000000
- The pts of the first frame of the second video starts from 0 again
PiperOrigin-RevId: 523444236
claincly committed
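A simplified, hypothetical sketch of the correction described above (all names here are illustrative, not ExoPlayer API): map per-item buffer timestamps, which carry a stream offset and restart at item boundaries, onto the zero-based monotonically increasing timeline that `VideoFrameProcessor` expects.

```java
// Hypothetical sketch: convert offset, per-item buffer timestamps into
// zero-based timestamps that increase monotonically across media items.
public final class TimestampCorrection {
  private long streamOffsetUs;      // offset applied to the current item's buffers
  private long elapsedBeforeItemUs; // total duration of all previous items

  /** Call when playback moves to a new media item. */
  void onMediaItemChanged(long newStreamOffsetUs, long previousItemDurationUs) {
    streamOffsetUs = newStreamOffsetUs;
    elapsedBeforeItemUs += previousItemDurationUs;
  }

  /** Returns the timestamp to hand to the frame processor. */
  long toProcessorTimestampUs(long bufferTimestampUs) {
    return bufferTimestampUs - streamOffsetUs + elapsedBeforeItemUs;
  }
}
```

With a 5s first item offset by a large constant, its first frame maps to 0 and its last to 5s; after the item change, the second item's first frame maps to 5s, as the commit requires.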
Implement HDR input support for texture output, and add HDR pixel tests. PiperOrigin-RevId: 523417701
huangdarwin committed
PiperOrigin-RevId: 523413988
Googler committed
Simplify the audio encoder input timestamp calculation. The new calculation avoids drifting by tracking the total number of bytes encoded rather than tracking the timestamp and remainder separately, and also makes the timestamps match the decoder output buffer timestamps. Also switch one of the export tests that was passing through AMR samples over to using WAVE audio. The problem with using AMR is that the compressed samples are not necessarily an integer number of audio frames and the shadow decoder would pass them from input to output, so the audio encoder was receiving non-integer numbers of audio frames. Tested by logging the timestamps at the decoder output and encoder input while forcing audio transcoding, and verifying that after this change the audio timestamps are no longer off by one. PiperOrigin-RevId: 523409869
andrewlewis committed
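The drift-free calculation the commit describes can be sketched as follows (a standalone illustration with hypothetical names, not the Transformer code): derive each buffer's timestamp from the running total of PCM bytes seen so far, instead of accumulating per-buffer durations and carrying a remainder.

```java
// Hypothetical sketch: timestamps derived from a running byte count never
// drift, because each timestamp is computed exactly from the total number
// of whole audio frames that preceded the buffer.
public final class PcmTimestampTracker {
  private final int sampleRate;
  private final int bytesPerFrame; // channel count * bytes per sample
  private long totalBytes;

  PcmTimestampTracker(int sampleRate, int bytesPerFrame) {
    this.sampleRate = sampleRate;
    this.bytesPerFrame = bytesPerFrame;
  }

  /** Returns the presentation time of the next buffer, then advances past it. */
  long registerBuffer(int sizeBytes) {
    long timestampUs = totalBytes * 1_000_000L / (bytesPerFrame * (long) sampleRate);
    totalBytes += sizeBytes;
    return timestampUs;
  }
}
```

For 44.1 kHz stereo 16-bit audio (4 bytes per frame), a buffer of 17640 bytes is exactly 4410 frames, so the buffer after it lands at precisely 100000 microseconds with no accumulated rounding error.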
The video asset loader renders decoder output to a surface texture, and if the video sample pipeline is in the process of updating the surface texture image at the moment when the asset loader video decoder is released this seems to cause `MediaCodec.release` to get stuck. Swap the release order so that we stop updating the texture before trying to release the codec. PiperOrigin-RevId: 523401619
andrewlewis committed
Log at debug level immediately when MediaCodec throws. This logging will be output closer to the time when the error actually happened so should make it easier to identify the order of components failing. Downgrade logging of errors after export ends to warning level, as output may still be fine if there was a problem after exporting completed (though it's still worth logging a warning as the device may not be in a good state). PiperOrigin-RevId: 523370457
andrewlewis committed
- 11 Apr, 2023 8 commits
Turns out these could have been private, so not sure why they were public. PiperOrigin-RevId: 522545698
huangdarwin committed
This javadoc is also clear from the Builder.setEnableColorTransfers javadoc, so omit it here to avoid duplication. PiperOrigin-RevId: 522404884
huangdarwin committed
Before this CL, SurfaceTexture.onFrameAvailable was used to tell whether a frame was available in the VideoFrameProcessor's output texture. This was incorrect, as it relied on the texture having been written to before the SurfaceTexture.onFrameAvailableListener is invoked, leading to null-pointer exceptions on timeouts. Instead of having DefaultVideoFrameProcessor use different interfaces to request output to a texture and to retrieve that output texture, use one interface that sets a listener, and render to a texture iff that listener is set. As this listener is executed on the GL thread, this also allows us to no longer need to expand visibility for the GL task executor and tasks. PiperOrigin-RevId: 522362101
huangdarwin committed
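The listener pattern described above can be sketched like this (a hypothetical simplification, not the DefaultVideoFrameProcessor API): the processor renders to a texture iff a texture-output listener is set, and fires the listener on the GL thread only after the texture has actually been written, which removes the race inherent in SurfaceTexture.onFrameAvailable.

```java
// Hypothetical sketch: texture output is driven by the presence of a
// listener, and the listener only fires after the texture is written.
interface TextureOutputListener {
  void onTextureRendered(int texId, long presentationTimeUs);
}

final class FrameRenderer {
  private TextureOutputListener textureOutputListener; // null => render to surface
  private int nextTexId = 1;

  void setTextureOutputListener(TextureOutputListener listener) {
    this.textureOutputListener = listener;
  }

  /** Called on the GL thread per frame. Returns the texture id used, or -1. */
  int renderFrame(long presentationTimeUs) {
    if (textureOutputListener == null) {
      // No listener set: render to the output surface instead of a texture.
      return -1;
    }
    int texId = nextTexId++; // stand-in for drawing the frame into a GL texture
    // The texture is fully written by this point, so the callback can't
    // observe a half-written or empty texture.
    textureOutputListener.onTextureRendered(texId, presentationTimeUs);
    return texId;
  }
}
```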
PiperOrigin-RevId: 522347729
tofunmi committed
There is a race between the ad period preparation having completed and `onDownstreamFormatChanged` being called when a live stream is joined in an ad period. In this case the stream event metadata of the period is immediately emitted, causing an ad media period to be created that is selected in `getMediaPeriodForEvent` before being prepared (1 out of 4 runs). Using an `isPrepared` flag makes sure we don't hand out the media period too early in `getMediaPeriodForEvent`. PiperOrigin-RevId: 522340046
bachinger committed
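The `isPrepared` guard can be sketched as follows (hypothetical classes, not the actual ImaServerSideAdInsertionMediaSource internals): the store only hands out an ad media period once preparation has completed, so a period created by an early metadata event is never selected prematurely.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of guarding period lookup behind an isPrepared flag.
final class AdMediaPeriod {
  private boolean isPrepared;

  void onPrepared() { isPrepared = true; }

  boolean isPrepared() { return isPrepared; }
}

final class AdPeriodStore {
  private final Map<Long, AdMediaPeriod> periods = new HashMap<>();

  void addPeriod(long positionUs, AdMediaPeriod period) {
    periods.put(positionUs, period);
  }

  /** Returns the period for the event, or null if it is not yet prepared. */
  AdMediaPeriod getMediaPeriodForEvent(long positionUs) {
    AdMediaPeriod period = periods.get(positionUs);
    return (period != null && period.isPrepared()) ? period : null;
  }
}
```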
To set the chroma format and depth information for the H265 format, the csd-0 data needs to be parsed. The previous implementation skipped parsing the csd-0 data and hard-coded values based on the "profile" field in MediaFormat. Along with the above-mentioned changes, corrected some of the comments as per the spec. PiperOrigin-RevId: 522335595
sheenachhabra committed
This change improves `ImaUtil.maybeCorrectPreviouslyUnknownAdDuration` to handle the case where the timeline moves forward by more than a single period while an ad group with an unknown period duration is being played. PiperOrigin-RevId: 522292612
bachinger committed
PiperOrigin-RevId: 522058915
tonihei committed
- 05 Apr, 2023 20 commits
PiperOrigin-RevId: 522046876
bachinger committed
`sed` has a different in-place command-line syntax on FreeBSD than the GNU tools. This change makes the `sed` commands work with FreeBSD `sed` on OSX in bash and zsh shells. Issue: androidx/media#217 PiperOrigin-RevId: 522043938
bachinger committed
PiperOrigin-RevId: 522039856
christosts committed
PiperOrigin-RevId: 522010318
tofunmi committed
The NPE in toneMap_hlgFrame_matchesGoldenFile and toneMap_pqFrame_matchesGoldenFile occurred because a uEnableColorTransfer uniform was being created on the HDR path, while the HDR shader files don't have this uniform (they don't support disabling color transfers right now). Fix: only create the uniform when the input is SDR. Manually tested on the failing tests. PiperOrigin-RevId: 522002603
tofunmi committed
PiperOrigin-RevId: 521805477
tofunmi committed
PiperOrigin-RevId: 521790733
samrobinson committed
Need to use NAL unit util methods in muxer module. PiperOrigin-RevId: 521772831
sheenachhabra committed
This ensures that anybody implementing `Player` (which is relatively unusual) must override at least one `@UnstableApi` method, and therefore opt-in to the unstable API. PiperOrigin-RevId: 521769675
ibaker committed
PiperOrigin-RevId: 521731485
sheenachhabra committed
PiperOrigin-RevId: 521476767
bachinger committed
* Add a new event `onAudioCapabilitiesChanged` to the `AudioSink.Listener` interface.
* Add an interface `RendererCapabilities.Listener`, which listens to `onRendererCapabilitiesChanged` events from the renderer.
* Add a `getRendererCapabilitiesReceiver` method to `TrackSelector`, and register/unregister the `TrackSelector` as the `RendererCapabilitiesReceiver` (if implemented) when the `ExoPlayer` is initialized/released.
* Trigger the `AudioSink.Listener.onAudioCapabilitiesChanged` and further `RendererCapabilities.Listener.onRendererCapabilitiesChanged` events when audio capabilities changes are detected in `DefaultAudioSink`.
PiperOrigin-RevId: 521427567
tianyifeng committed
PiperOrigin-RevId: 521427239
huangdarwin committed
This was broken and has been fixed in <unknown commit>. PiperOrigin-RevId: 521380415
kimvde committed
This includes:
- Add an ad for each LOADED event of the SDK by taking the duration of the ad from the media structure, to exactly match the start position of ads, and then use `addLiveAdBreak()` that is already used for HLS live.
- When the refreshed content timeline arrives, possibly correct the duration of an ad that has been inserted while the period duration was still unknown (last period of the live timeline).
- When an ad period is removed, the ad group needs to be put into a condition that allows continuing playback.
PiperOrigin-RevId: 520919236
bachinger committed
Handling of the stream offset and start position was unnecessarily complex and even incorrect. It was going to be an issue for concatenation of video and image input. The stream offset is the offset added before decoding/encoding to make sure it doesn’t fail in case of negative timestamps (which rarely occur). The start position is equal to the stream offset, plus the clipping start time if the media is clipped.
Before this change:
- Samples were offset by the start position before decoding, and this offset was removed before muxing.
- The startPosition of the first MediaItem in a sequence was used for all the MediaItems in this sequence (which is incorrect).
- The stream offset was removed before applying the GL effects and added back before encoding, so that it was not visible to the OpenGL processing.
After this change:
- The start position is subtracted in the AssetLoader, so that the downstream components don’t have to deal with the stream offsets and start positions.
- Decoded samples with negative timestamps are not passed to the SamplePipelines. The MediaMuxer doesn’t handle negative timestamps well. If a stream is 10 seconds long and starts at timestamp -2 seconds, the output will only contain the samples corresponding to the first 8 (10 - 2) seconds; it won’t contain the last 2 seconds of the stream. It seems acceptable to remove the first 2 seconds instead.
PiperOrigin-RevId: 520916464
kimvde committed
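The AssetLoader-side adjustment described above reduces to a small rule, sketched here with hypothetical names (not the Transformer API): subtract the start position from each decoded sample's timestamp, and drop samples whose adjusted timestamp is negative so that downstream components and the muxer never see offset or negative timestamps.

```java
// Hypothetical sketch of the timestamp adjustment done in the AssetLoader.
final class SampleTimestampAdjuster {
  private static final long DROP = -1; // stand-in for a "drop this sample" signal

  private final long startPositionUs;

  SampleTimestampAdjuster(long startPositionUs) {
    this.startPositionUs = startPositionUs;
  }

  /** Returns the adjusted timestamp, or DROP if the sample should be discarded. */
  long adjustOrDrop(long decodedTimestampUs) {
    long adjustedUs = decodedTimestampUs - startPositionUs;
    // Negative timestamps would confuse the muxer; drop the sample instead.
    return adjustedUs < 0 ? DROP : adjustedUs;
  }
}
```

For clipped media with a 2s start position, a sample decoded at 1s is dropped while a sample decoded at 3s is emitted at 1s, matching the behavior the commit describes.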
PiperOrigin-RevId: 520886975
sheenachhabra committed
Releasing the player once a sequence has ended seems to make our emulator tests flaky. Comment out until we find the cause. The player will still be released from TransformerInternal, when the export ends. PiperOrigin-RevId: 520886181
kimvde committed
Issue: google/ExoPlayer#11008 PiperOrigin-RevId: 520864579
ibaker committed
The `DashMediaSource` wrongly added an offset to the media times set on the `MediaLoadData`. With this, `startTimeMs` and `endTimeMs` didn't represent the positions in the period but in the stream. `DashMediaSource` was the only call site that was setting the offset to a non-zero value, so if we use 0 for the `DashMediaSource` as well, the offset is redundant and we can remove it everywhere. PiperOrigin-RevId: 520682026
bachinger committed
- 30 Mar, 2023 1 commit
PiperOrigin-RevId: 520663415
kimvde committed