Fixing a bug regarding VOE packet loss rate feedback to ACM
Phenomenon: When a packet loss rate was fed to a codec that does not implement packet-loss-adaptive encoding, VoE logged an error.

Reason: The ACM function SetPacketLossRate(int rate) returned -1 unnecessarily often. The -1 return value was intended for more severe errors, such as:
1. the codec is not ready, or
2. the input rate is out of range.

BUG=webrtc:3413
R=henrik.lundin@webrtc.org

Review URL: https://webrtc-codereview.appspot.com/16599004

git-svn-id: http://webrtc.googlecode.com/svn/trunk@6283 4adac7df-926f-26a2-2b94-8c16560cd09d
@@ -595,10 +595,10 @@ class ACMGenericCodec {
   // -loss_rate : expected packet loss rate (0 -- 100 inclusive).
   //
   // Return value:
-  // -1 if failed, or codec does not support packet loss gnostic encoding,
-  // 0 if succeeded.
+  // -1 if failed,
+  // 0 if succeeded or packet loss rate is ignored.
   //
-  virtual int SetPacketLossRate(int /* loss_rate */) { return -1; }
+  virtual int SetPacketLossRate(int /* loss_rate */) { return 0; }
 
  protected:
   ///////////////////////////////////////////////////////////////////////////
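With this change, the base-class default accepts and silently ignores the loss-rate hint instead of reporting an error. A codec that does implement packet-loss-adaptive encoding would override SetPacketLossRate() and reserve -1 for the severe errors listed in the commit message. The following is a minimal, self-contained sketch of that convention; the class name and members are hypothetical and not taken from the WebRTC source:

// Hypothetical sketch of the revised return-value convention for
// SetPacketLossRate(): return 0 when the rate is accepted (or ignored),
// and -1 only for severe errors such as "codec not ready" or an
// out-of-range rate. Not part of the actual WebRTC code.
class LossAdaptiveEncoderSketch {
 public:
  LossAdaptiveEncoderSketch() : ready_(false), packet_loss_rate_(0) {}

  void Init() { ready_ = true; }  // hypothetical initialization step

  int SetPacketLossRate(int loss_rate) {
    if (!ready_)
      return -1;  // codec is not ready
    if (loss_rate < 0 || loss_rate > 100)
      return -1;  // input rate is out of range
    packet_loss_rate_ = loss_rate;  // feed the hint to the encoder
    return 0;
  }

 private:
  bool ready_;
  int packet_loss_rate_;
};

Under this convention, ACM can forward the loss rate to any codec and treat a 0 return as "accepted or ignored", so codecs without packet-loss-adaptive encoding no longer trigger VoE error logs.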