H.264 video codec support using OpenH264 (http://www.openh264.org/) for encoding and FFmpeg (https://www.ffmpeg.org/) for decoding.
It works on all platforms except Android and iOS (FFmpeg limitation).

Implemented behind compile time flags, off by default. The plan is to have it enabled in Chrome (see bug), but not in Chromium/webrtc by default.

Flags to turn it on:
- rtc_use_h264 = true
- ffmpeg_branding = "Chrome" (or other brand that includes H.264 decoder)

Tests using H264:
- video_loopback --codec=H264
- screenshare_loopback --codec=H264
- video_engine_tests (EndToEndTest.SendsAndReceivesH264)

NOTRY=True
BUG=500605, 468365
BUG=https://bugs.chromium.org/p/webrtc/issues/detail?id=5424

Review URL: https://codereview.webrtc.org/1306813009

Cr-Commit-Position: refs/heads/master@{#11390}
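A minimal sketch of how the two flags listed above might be set in a GN build; the out/Default directory name and the gn invocation are assumptions for illustration, not part of this CL:

    # out/Default/args.gn (hypothetical build directory)
    rtc_use_h264 = true           # build the OpenH264 encoder and FFmpeg H.264 decoder wrappers
    ffmpeg_branding = "Chrome"    # or another brand whose FFmpeg build includes the H.264 decoder

After regenerating the build (e.g. gn gen out/Default), the loopback and end-to-end tests listed above can be run with --codec=H264.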
@@ -173,7 +173,8 @@ class VideoProcessorImpl : public VideoProcessor {
  private:
   // Invoked by the callback when a frame has completed encoding.
   void FrameEncoded(webrtc::VideoCodecType codec,
-                    const webrtc::EncodedImage& encodedImage);
+                    const webrtc::EncodedImage& encodedImage,
+                    const webrtc::RTPFragmentationHeader* fragmentation);
   // Invoked by the callback when a frame has completed decoding.
   void FrameDecoded(const webrtc::VideoFrame& image);
   // Used for getting a 32-bit integer representing time