On Android, we would like to use MediaCodec output buffers to hold decoded frames until they can be rendered to a texture. Only one texture buffer can be in use at a time, so the decode time calculated in VCMTiming would be wrong: it would also include the time the decoder spent waiting for the upper layers (which depend on network jitter and the actual render time) to release the frame.
This new method will be used in
https://codereview.webrtc.org/1422963003/
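For reference, a rough sketch of the kind of callback overload this adds. The stand-in VideoFrame type, the default fallback body, and the exact signature here are illustrative assumptions, not the verbatim webrtc/video_decoder.h contents:

  #include <cstdint>

  struct VideoFrame {};  // Stand-in for webrtc::VideoFrame.

  class DecodedImageCallback {
   public:
    virtual ~DecodedImageCallback() = default;

    // Existing callback: decode time is estimated by VCMTiming, which on
    // Android also counts the time a frame sits in a MediaCodec output
    // buffer waiting for the texture to be released.
    virtual int32_t Decoded(VideoFrame& decoded_image) = 0;

    // New overload: the decoder reports the decode time it measured
    // itself, excluding the wait for upper layers to release the frame.
    // The default falls back to the old path (assumed behavior) so
    // existing implementations keep working.
    virtual int32_t Decoded(VideoFrame& decoded_image,
                            int64_t decode_time_ms) {
      (void)decode_time_ms;  // Ignored by implementations on the old path.
      return Decoded(decoded_image);
    }
  };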
BUG=webrtc:4993
R=stefan@webrtc.org
TBR=mflodman@webrtc.org
Review URL: https://codereview.webrtc.org/1414693006 .
Cr-Commit-Position: refs/heads/master@{#10576}

Removes direct VideoCodec use from the new API and exposes VideoDecoders
through webrtc/video_decoder.h, in the same way as VideoEncoders.
Also includes some preparation for wiring up external decoders in
WebRtcVideoEngine2 by adding AllocatedDecoders that record whether each
decoder was allocated internally or externally.
Additionally addresses a data race in VideoReceiver that was exposed by
this change.
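For reference, a rough sketch of the AllocatedDecoder bookkeeping. The field names and the DeallocateDecoders helper are illustrative assumptions, not the actual WebRtcVideoEngine2 code:

  #include <vector>

  class VideoDecoder {  // Stand-in for webrtc::VideoDecoder.
   public:
    virtual ~VideoDecoder() = default;
  };

  // Records how each decoder was obtained so teardown can tell the
  // difference: internally created decoders are owned and deleted by the
  // engine, externally injected ones stay owned by the application.
  struct AllocatedDecoder {
    VideoDecoder* decoder = nullptr;
    bool external = false;  // Illustrative: true if injected externally.
  };

  void DeallocateDecoders(std::vector<AllocatedDecoder>* decoders) {
    for (AllocatedDecoder& allocated : *decoders) {
      if (!allocated.external)
        delete allocated.decoder;
    }
    decoders->clear();
  }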
R=mflodman@webrtc.org, stefan@webrtc.org
TBR=pthatcher@webrtc.org
BUG=2854,1667
Review URL: https://webrtc-codereview.appspot.com/27829004
git-svn-id: http://webrtc.googlecode.com/svn/trunk@7560 4adac7df-926f-26a2-2b94-8c16560cd09d