This CL:
* Abstracts the functions in GlRectDrawer to an interface.
* Adds viewport location as argument to the draw() functions, because this information may be needed by some shaders. This also moves the responsibility of calling GLES20.glViewport() to the drawer.
* Moves uploadYuvData() into a separate helper class.
* Adds new SurfaceViewRenderer.init() function and new VideoRendererGui.create() function that takes a custom drawer as argument. Each YuvImageRenderer in VideoRendererGui now has its own drawer instead of a common one.
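A minimal sketch of what such a drawer interface could look like (the interface name, method names and parameter lists here are illustrative, not necessarily the exact ones introduced in this CL):

  // Hypothetical drawer interface; names and parameters are illustrative.
  public interface GlDrawer {
    // Draw an OES (external) texture, e.g. from a SurfaceTexture. The viewport
    // rectangle is passed in so the drawer itself calls GLES20.glViewport().
    void drawOes(int oesTextureId, float[] texMatrix,
        int viewportX, int viewportY, int viewportWidth, int viewportHeight);

    // Draw a regular 2D RGB texture.
    void drawRgb(int textureId, float[] texMatrix,
        int viewportX, int viewportY, int viewportWidth, int viewportHeight);

    // Draw three Y/U/V plane textures.
    void drawYuv(int[] yuvTextureIds, float[] texMatrix,
        int viewportX, int viewportY, int viewportWidth, int viewportHeight);

    // Release all GL resources held by the drawer.
    void release();
  }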
BUG=b/25694445
R=nisse@webrtc.org, perkj@webrtc.org
Review URL: https://codereview.webrtc.org/1520243003 .
Cr-Commit-Position: refs/heads/master@{#11031}
I.e., rotation is applied in C++ in the VideoFrameFactory if apply_rotation_ is set. If not, rotation is sent in RTP.
BUG=webrtc:4993
R=nisse@chromium.org
Review URL: https://codereview.webrtc.org/1493913007 .
Cr-Commit-Position: refs/heads/master@{#10986}
do the conversion using an OpenGL fragment shader.
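For illustration only, colorspace conversion in a fragment shader typically looks something like the following YUV-to-RGB sketch, written as a Java string constant with approximate BT.601 coefficients; the direction of the conversion and the exact shader used in this CL may differ.

  // Illustrative YUV-to-RGB fragment shader (approximate BT.601 coefficients).
  private static final String YUV_FRAGMENT_SHADER =
        "precision mediump float;\n"
      + "varying vec2 tc;\n"
      + "uniform sampler2D y_tex;\n"
      + "uniform sampler2D u_tex;\n"
      + "uniform sampler2D v_tex;\n"
      + "void main() {\n"
      + "  float y = texture2D(y_tex, tc).r;\n"
      + "  float u = texture2D(u_tex, tc).r - 0.5;\n"
      + "  float v = texture2D(v_tex, tc).r - 0.5;\n"
      + "  gl_FragColor = vec4(y + 1.403 * v,\n"
      + "                      y - 0.344 * u - 0.714 * v,\n"
      + "                      y + 1.770 * u,\n"
      + "                      1.0);\n"
      + "}\n";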
BUG=webrtc:4993
Review URL: https://codereview.webrtc.org/1460703002
Cr-Commit-Position: refs/heads/master@{#10972}
This fixes an issue seen on Huawei Y300 where the camera feed is black and white if we capture to textures and setPreviewFormat is called.
BUG=webrtc:4993
Review URL: https://codereview.webrtc.org/1502223002
Cr-Commit-Position: refs/heads/master@{#10941}
Delete EglBase.ConfigType, instead pass arrays of attributes, and define
constant arrays for the common cases.
Both the in-progress NativeToI420 work and extending GlRectDrawer to other shapes (with alpha) need this.
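A hedged sketch of such constant attribute arrays, assuming the EGL10 API; the names and exact attribute sets defined in the CL may differ.

  import javax.microedition.khronos.egl.EGL10;

  // Illustrative constant attribute arrays replacing a ConfigType enum.
  public class EglConfigSketch {
    // EGL10 does not define this constant by name; the standard value is 4.
    private static final int EGL_OPENGL_ES2_BIT = 4;

    public static final int[] CONFIG_PLAIN = {
        EGL10.EGL_RED_SIZE, 8, EGL10.EGL_GREEN_SIZE, 8, EGL10.EGL_BLUE_SIZE, 8,
        EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL10.EGL_NONE};

    public static final int[] CONFIG_RGBA = {
        EGL10.EGL_RED_SIZE, 8, EGL10.EGL_GREEN_SIZE, 8, EGL10.EGL_BLUE_SIZE, 8,
        EGL10.EGL_ALPHA_SIZE, 8,
        EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL10.EGL_NONE};
  }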
BUG=b/25694445
Review URL: https://codereview.webrtc.org/1498003002
Cr-Commit-Position: refs/heads/master@{#10908}
Dropping the first frame was intended to fix a problem when switching cameras on N6 while capturing to textures, but due to a silly bug, fixed in this CL, the frame was not actually dropped...
BUG=webrtc:5262
TBR=magjed@webrtc.org
Review URL: https://codereview.webrtc.org/1489363002
Cr-Commit-Position: refs/heads/master@{#10867}
The reason we want to use EGL14 is to be able to use EGLExt.eglPresentationTimeANDROID when writing textures to MediaEncoder.
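A hedged sketch of how the presentation timestamp could be attached before swapping buffers; the method and variable names are illustrative, and EGLExt.eglPresentationTimeANDROID requires API level 18.

  import android.opengl.EGL14;
  import android.opengl.EGLDisplay;
  import android.opengl.EGLExt;
  import android.opengl.EGLSurface;

  // Illustrative: tag the frame on the encoder input surface with a
  // presentation timestamp (in nanoseconds) before swapping buffers.
  void swapBuffersWithTimestamp(EGLDisplay display, EGLSurface surface, long timestampNs) {
    EGLExt.eglPresentationTimeANDROID(display, surface, timestampNs);
    EGL14.eglSwapBuffers(display, surface);
  }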
BUG=webrtc:4993
TBR=glaznev@webrtc.org
Review URL: https://codereview.webrtc.org/1461083002
Cr-Commit-Position: refs/heads/master@{#10864}
Instead of using a fixed frame rate, we allow the camera to use a lower frame rate. The camera will choose depending on lighting conditions.
TESTED= In a room with low light on N5, N6, N7, Galaxy 4.
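A hedged sketch of how such an fps range could be selected; the actual selection logic in VideoCapturerAndroid may differ.

  import android.hardware.Camera;
  import java.util.List;

  // Illustrative heuristic: pick a supported range whose maximum covers the
  // requested frame rate but whose minimum is as low as possible, so the
  // camera may lower the frame rate in poor lighting. Values are fps * 1000.
  static int[] selectFpsRange(Camera.Parameters parameters, int requestedFps) {
    List<int[]> ranges = parameters.getSupportedPreviewFpsRange();
    int[] best = ranges.get(0);
    for (int[] range : ranges) {
      if (range[Camera.Parameters.PREVIEW_FPS_MAX_INDEX] >= requestedFps * 1000
          && range[Camera.Parameters.PREVIEW_FPS_MIN_INDEX]
              < best[Camera.Parameters.PREVIEW_FPS_MIN_INDEX]) {
        best = range;
      }
    }
    // Apply with parameters.setPreviewFpsRange(best[0], best[1]).
    return best;
  }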
BUG=webrtc:5262
R=magjed@webrtc.org
Review URL: https://codereview.webrtc.org/1479563004 .
Cr-Commit-Position: refs/heads/master@{#10807}
SurfaceViewRenderer currently stores widthSpec/heightSpec internally, and triggers requestLayout() from renderFrameOnRenderThread()->checkConsistentLayout() when it detects a change using widthSpec/heightSpec. This is not reliable, because onMeasure() might be called several times during the layout negotiation process. For example, it might look like this:
-> onMeasure(at most 1920, at most 1080)
<- setMeasuredDimension(1080, 1080)
-> onMeasure(exactly 1080, exactly 1080)
<- setMeasuredDimension(1080, 1080)
Then we store (exactly 1080, exactly 1080) even though we are allowed to be bigger than this, and requestLayout() will never be triggered.
This CL moves the requestLayout() trigger to updateFrameDimensionsAndReportEvents() when the frame size changes.
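A hedged sketch of what the new trigger could look like; the field names, the frame accessors and the locking are illustrative, only the idea of posting requestLayout() when the rotated frame size changes is taken from this CL.

  import android.graphics.Point;

  // Fields and method inside a SurfaceView subclass such as SurfaceViewRenderer.
  private final Object layoutLock = new Object();
  private final Point frameSize = new Point();

  private void updateFrameDimensionsAndReportEvents(VideoRenderer.I420Frame frame) {
    synchronized (layoutLock) {
      if (frameSize.x != frame.rotatedWidth() || frameSize.y != frame.rotatedHeight()) {
        frameSize.set(frame.rotatedWidth(), frame.rotatedHeight());
        // requestLayout() must run on the UI thread.
        post(new Runnable() {
          @Override
          public void run() {
            requestLayout();
          }
        });
      }
    }
  }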
Other small changes in this CL are:
* Replace width/height variables with Point.
* Add logging in updateFrameDimensionsAndReportEvents() even when rendererEvents is null.
* Use Math.round() in RendererCommon.getDisplaySize() instead of integer cast.
R=hbos@webrtc.org
Review URL: https://codereview.webrtc.org/1453413005 .
Cr-Commit-Position: refs/heads/master@{#10774}
eglCreateSurface() calls are posted to the render thread from both init() and surfaceCreated(). If the render thread does not process the eglCreateSurface() message from init() before surfaceCreated() is called, eglCreateSurface() will be called twice resulting in a crash.
This CL makes sure eglCreateSurface() is only called once.
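A hedged sketch of the kind of guard that achieves this; the eglBase and surface names and the helper method are illustrative, not the exact code.

  // Illustrative: both init() and surfaceCreated() may request surface
  // creation on the render thread; only create the EGL surface if none exists.
  private void createEglSurfaceIfNeeded(android.view.Surface surface) {
    if (surface != null && !eglBase.hasSurface()) {
      eglBase.createSurface(surface);
    }
  }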
BUG=b/25815604
R=hbos@webrtc.org
Review URL: https://codereview.webrtc.org/1466133002 .
Cr-Commit-Position: refs/heads/master@{#10769}
This method should be used when the SurfaceTextureHelper is created to use a specific handler.
This now guarantees that the looper used by the handler is destroyed after a frame has been returned.
Review URL: https://codereview.webrtc.org/1465163003
Cr-Commit-Position: refs/heads/master@{#10767}
It does the following:
The SurfaceTexture.updateTexImage() calls are moved from the video renderers into MediaCodecVideoDecoder, and the destructor of the texture frames will signal MediaCodecVideoDecoder that the frame has returned. This CL also removes the SurfaceTexture from the native handle and only exposes the texture matrix instead, because only the video source should access the SurfaceTexture.
It moves the responsibility of calculating the decode time to Java.
Patchset 2 refactors MediaCodecVideoDecoder to drop frames if a texture is not released.
R=magjed@webrtc.org
Review URL: https://codereview.webrtc.org/1440343002 .
Cr-Commit-Position: refs/heads/master@{#10706}
The original purpose with keeping one pending frame in SurfaceViewRenderer was to reduce latency for the first rendered frame when we are waiting for the Surface to be created. However, it is very dangerous to hold a pending frame indefinitely when used with a SurfaceTexture, because the SurfaceTexture only has one frame and thus holding a frame in the renderer will freeze everything and typically cause timeout crashes.
Review URL: https://codereview.webrtc.org/1435413006
Cr-Commit-Position: refs/heads/master@{#10638}
Reason for revert:
Causes fallback to SW decoder if a renderer is put in the background.
Original issue's description:
> Patchset 1 is a pure
> revert of "Revert of "Android MediaCodecVideoDecoder: Manage lifetime of texture frames" https://codereview.webrtc.org/1378033003/
>
> Following patchsets move the responsibility of calculating the decode time to Java.
>
> TESTED= Apprtc loopback using H264 and VP8 on N5, N6, N7, S5
>
> Committed: https://crrev.com/9cb8982e64f08d3d630bf7c3d2bcc78c10db88e2
> Cr-Commit-Position: refs/heads/master@{#10597}
TBR=magjed@webrtc.org,glaznev@webrtc.org
NOPRESUBMIT=true
NOTREECHECKS=true
Review URL: https://codereview.webrtc.org/1441363002 .
Cr-Commit-Position: refs/heads/master@{#10637}
This CL attempts to annotate accesses to API levels >16 using as
small scopes as possible. The TargetApi annotations mean "yes, I know
I'm accessing a higher API and I take responsibility for gating the
call on the Android API level". The Encoder/Decoder classes are annotated
on the whole class, but they're only accessed through JNI; otherwise we
should annotate at method level, preferably on private methods.
This patch also fixes some compiler-level deprecation warnings (i.e.
-Xlint:deprecation), but probably not all of them.
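As a generic illustration of the pattern (not code from this patch), the annotation is kept on the smallest possible private helper while the call site gates on the runtime API level:

  import android.annotation.TargetApi;
  import android.os.Build;

  class ApiGatingExample {
    void configure() {
      // Gate the call at runtime on the Android API level...
      if (Build.VERSION.SDK_INT >= 17) {
        configureApi17();
      }
    }

    // ...and keep the TargetApi annotation on the smallest possible scope:
    // the private helper that actually touches the API level 17 symbols.
    @TargetApi(17)
    private void configureApi17() {
      // Calls requiring API level 17+ go here.
    }
  }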
BUG=webrtc:5063
R=henrika@webrtc.org, kjellander@webrtc.org, magjed@webrtc.org
Review URL: https://codereview.webrtc.org/1412673008 .
Cr-Commit-Position: refs/heads/master@{#10624}
This makes sure that the texture copy is synchronized.
To reproduce the problem I:
Reverted "Revert of "Android MediaCodecVideoDecoder: Manage lifetime of texture frames" https://codereview.webrtc.org/1378033003/"
commit 543b6ca30a43eeb069c699291460ce6bacc7959d.
Reverted "Enable SurfaceViewRenderer for AppRTCDemo"
commit 7076729c57c27aa813760d2038be02c36f4d7649.
and ran AppRTCDemo in loopback and changed the orientation a couple of times.
TBR=glaznev@webrtc.org
Review URL: https://codereview.webrtc.org/1437823002
Cr-Commit-Position: refs/heads/master@{#10598}
revert of "Revert of "Android MediaCodecVideoDecoder: Manage lifetime of texture frames" https://codereview.webrtc.org/1378033003/
Following patchsets move the responsibility of calculating the decode time to Java.
TESTED= Apprtc loopback using H264 and VP8 on N5, N6, N7, S5
Review URL: https://codereview.webrtc.org/1422963003
Cr-Commit-Position: refs/heads/master@{#10597}
Add resource name to log outputs to distinguish local renderer from remote renderer.
This CL also adds some thread checks and factors out a small helper function makeBlack().
Review URL: https://codereview.webrtc.org/1420203003
Cr-Commit-Position: refs/heads/master@{#10596}
This changes VideoCapturerAndroid to attempt to open the camera 3 times, with a 300ms delay between attempts, if opening fails.
This is so that if another application already has the camera open, that application has more time to release it.
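A hedged sketch of the retry loop; the constants and the helper name are illustrative.

  import android.hardware.Camera;
  import android.os.SystemClock;

  // Illustrative sketch: retry Camera.open() a few times with a short delay,
  // giving another application time to release the camera first.
  class CameraOpenRetrySketch {
    private static final int OPEN_CAMERA_ATTEMPTS = 3;
    private static final int OPEN_CAMERA_DELAY_MS = 300;

    static Camera openCameraWithRetry(int cameraId) {
      RuntimeException lastError = null;
      for (int i = 0; i < OPEN_CAMERA_ATTEMPTS; ++i) {
        try {
          return Camera.open(cameraId);
        } catch (RuntimeException e) {
          lastError = e;
          SystemClock.sleep(OPEN_CAMERA_DELAY_MS);
        }
      }
      throw lastError;
    }
  }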
BUG=b/25190234
Review URL: https://codereview.webrtc.org/1422023007
Cr-Commit-Position: refs/heads/master@{#10559}
If SurfaceViewRenderer can't keep up with the stream of incoming frames it has to drop frames. Currently, new frames are dropped until the old pending frame is rendered. This CL drops the old pending frame instead.
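A hedged sketch of the changed drop policy; pendingFrame, frameLock, the render-thread handler and the use of VideoRenderer.renderFrameDone() to return the old frame are illustrative assumptions.

  // Illustrative: if a frame is still pending when a new one arrives, release
  // the old pending frame and keep the new one, instead of dropping the new one.
  public void renderFrame(VideoRenderer.I420Frame frame) {
    synchronized (frameLock) {
      if (pendingFrame != null) {
        VideoRenderer.renderFrameDone(pendingFrame);
      }
      pendingFrame = frame;
    }
    renderThreadHandler.post(renderFrameRunnable);
  }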
Review URL: https://codereview.webrtc.org/1417063005
Cr-Commit-Position: refs/heads/master@{#10558}
Also distinguish between camera failures and failures due to buffers not being returned.
Adds unit tests to make sure CameraEventHandler.onError is triggered if frames are not returned.
BUG=b/25514149
Review URL: https://codereview.webrtc.org/1415013006
Cr-Commit-Position: refs/heads/master@{#10555}
The MediaCodec.stop() call may hang in some rare cases. To avoid
hanging the application, this call needs to be done on a separate thread and
a possible error reported back to the application.
The application may elect to continue executing and use another codec
instance for encoding/decoding, or stop the call and exit.
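A hedged sketch of stopping the codec on a separate thread with a timeout; the timeout value, class name and error handling are illustrative.

  import android.media.MediaCodec;
  import java.util.concurrent.CountDownLatch;
  import java.util.concurrent.TimeUnit;

  // Illustrative sketch: run MediaCodec.stop() on its own thread and wait with
  // a timeout, so a hanging stop() cannot hang the application.
  class MediaCodecStopSketch {
    static boolean stopCodecOnSeparateThread(final MediaCodec codec) {
      final CountDownLatch done = new CountDownLatch(1);
      new Thread(new Runnable() {
        @Override
        public void run() {
          try {
            codec.stop();
          } catch (Exception e) {
            // The codec is being abandoned; just log and ignore here.
          }
          done.countDown();
        }
      }).start();
      try {
        // Report an error to the application if stop() did not finish in time.
        return done.await(5000, TimeUnit.MILLISECONDS);
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        return false;
      }
    }
  }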
BUG=b/24339249
R=magjed@webrtc.org
Review URL: https://codereview.webrtc.org/1425143005 .
Cr-Commit-Position: refs/heads/master@{#10467}
I replaced quitSafely() with a CountDownLatch. The reason for not using ThreadUtils.invokeUninterruptibly() is that I want to stop accepting frames asap, and invokeUninterruptibly() would still accept frames during the waiting time.
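A hedged sketch of the pattern; the running flag, handler and method name are illustrative.

  import android.os.Handler;
  import java.util.concurrent.CountDownLatch;

  // Illustrative: stop accepting new frames immediately by clearing a flag,
  // then wait for a barrier message so all already-posted frames are processed
  // before tearing down. Unlike quitSafely(), no new frames are accepted while
  // waiting.
  void stopAcceptingFramesAndWait(Handler handler) {
    running = false;  // Volatile flag checked before each frame is accepted.
    final CountDownLatch barrier = new CountDownLatch(1);
    handler.post(new Runnable() {
      @Override
      public void run() {
        barrier.countDown();
      }
    });
    boolean interrupted = false;
    while (true) {
      try {
        barrier.await();
        break;
      } catch (InterruptedException e) {
        interrupted = true;  // Keep waiting; restore the interrupt flag after.
      }
    }
    if (interrupted) {
      Thread.currentThread().interrupt();
    }
  }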
BUG=webrtc:4742
Review URL: https://codereview.webrtc.org/1418223002
Cr-Commit-Position: refs/heads/master@{#10393}
The purpose of this change is to support older API levels by replacing EGL14 (API lvl 17) with EGL10 (API lvl 1). The main goal is to lower the API lvl requirement for SurfaceViewRenderer from API lvl 17 to API lvl 15. Also, camera texture capture will work on API lvl < 17 (and texture encode/decode in MediaCodec, but we don't use MediaCodec below API lvl 18?).
GLSurfaceView/VideoRendererGui is already using EGL10.
EGL 1.1 - 1.4 added new functionality, but won't affect performance. We don't need the functionality, so there should be no reason to not use EGL 1.0.
I have profiled AppRTCDemo with Qualcomm Trepn Profiler on a Nexus 5 and Nexus 6 and couldn't see any difference.
Specifically, this CL:
* Update EglBase to use EGL10 instead of EGL14.
* Update imports from EGL14 to EGL10 in a lot of files (plus changing import order in some cases).
* Update VideoCapturerAndroid to always support texture capture.
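A hedged sketch of OpenGL ES 2.0 context setup through the EGL10 API; the class name and exact attribute lists are illustrative.

  import javax.microedition.khronos.egl.EGL10;
  import javax.microedition.khronos.egl.EGLConfig;
  import javax.microedition.khronos.egl.EGLContext;
  import javax.microedition.khronos.egl.EGLDisplay;

  // Illustrative: create an OpenGL ES 2.0 context through EGL10 (API level 1)
  // instead of EGL14 (API level 17). The constants below are standard EGL
  // values that EGL10 does not define by name.
  class Egl10ContextSketch {
    private static final int EGL_OPENGL_ES2_BIT = 4;
    private static final int EGL_CONTEXT_CLIENT_VERSION = 0x3098;

    static EGLContext createContext() {
      EGL10 egl = (EGL10) EGLContext.getEGL();
      EGLDisplay display = egl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);
      egl.eglInitialize(display, new int[2]);

      int[] configAttributes = {
          EGL10.EGL_RED_SIZE, 8, EGL10.EGL_GREEN_SIZE, 8, EGL10.EGL_BLUE_SIZE, 8,
          EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
          EGL10.EGL_NONE};
      EGLConfig[] configs = new EGLConfig[1];
      int[] numConfigs = new int[1];
      egl.eglChooseConfig(display, configAttributes, configs, 1, numConfigs);

      int[] contextAttributes = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE};
      return egl.eglCreateContext(
          display, configs[0], EGL10.EGL_NO_CONTEXT, contextAttributes);
    }
  }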
Review URL: https://codereview.webrtc.org/1396013004
Cr-Commit-Position: refs/heads/master@{#10378}
Add events to track when camera is requested to open,
when first camera frame is available and when camera is
closed.
BUG=b/24271359
R=perkj@webrtc.org
Review URL: https://codereview.webrtc.org/1398793005 .
Cr-Commit-Position: refs/heads/master@{#10306}
The code that depends on the reverted CL is disabled but not removed. NativeHandleImpl is reverted to the previous implementation, and the new implementation is renamed to NativeTextureHandleImpl. Texture capture can not be used anymore, because it will crash in peerconnection_jni.cc.
Reason for revert:
Increased HW decoder latency and crashes related to that. Also suspected cause of video tearing.
Original issue's description:
> This CL should be the last one in a series to finally
> unblock camera texture capture.
>
> The SurfaceTexture.updateTexImage() calls are moved from
> the video renderers into MediaCodecVideoDecoder, and the
> destructor of the texture frames will signal
> MediaCodecVideoDecoder that the frame has returned. This
> CL also removes the SurfaceTexture from the native handle
> and only exposes the texture matrix instead, because only
> the video source should access the SurfaceTexture.
>
> BUG=webrtc:4993
> R=glaznev@webrtc.org, perkj@webrtc.org
>
> Committed: https://crrev.com/91b348c7029d843e06868ed12b728a809c53176c
> Cr-Commit-Position: refs/heads/master@{#10203}
TBR=glaznev
BUG=webrtc:4993
Review URL: https://codereview.webrtc.org/1394103005
Cr-Commit-Position: refs/heads/master@{#10288}
SurfaceTextureHelper requires EGL14, which was added in API level 17. Since the SurfaceTextureHelper is only needed when we capture to textures, this CL changes back to not using it when we are capturing to byte buffers.
Also, quitSafely() was added in API level 18. Instead, a new ThreadUtils method has been added for this.
BUG=b/24782220
TEST = run
ninja -C out/Debug libjingle_peerconnection_android_unittest && CHECKOUT_SOURCE_ROOT=`pwd` build/android/adb_install_apk.py --debug out/Debug/apks/libjingle_peerconnection_android_unittest.apk && ./third_party/android_tools/sdk/platform-tools/adb shell am instrument -w -e class org.webrtc.VideoCapturerAndroidTest org.webrtc.test/android.test.InstrumentationTestRunner on a device running Android 4.1 (I tried Nexus 7, the first version)
Review URL: https://codereview.webrtc.org/1401023003
Cr-Commit-Position: refs/heads/master@{#10265}