This application is composed of two parts:

- the server (`scrcpy-server.jar`), to be executed on the device,
- the client (the `scrcpy` binary), executed on the host computer.

The client is responsible for pushing the server to the device and starting its execution.
Once the client and the server are connected to each other, the server initially sends device information (name and initial screen dimensions), then starts to send a raw H.264 video stream of the device screen. The client decodes the video frames and displays them as soon as possible, without buffering, to minimize latency. The client is not aware of the device rotation (which is handled by the server); it just knows the dimensions of the video frames.
The client captures relevant keyboard and mouse events, which it transmits to the server, which in turn injects them into the device.
Capturing the screen requires some privileges, which are granted to `shell`.
The server is a Java application (with a `public static void main(String... args)` method), compiled against the Android framework, and executed as `shell` on the Android device.
To run such a Java application, the classes must be dexed (typically, to `classes.dex`). If `my.package.MainClass` is the main class, compiled to `classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run with:

```
adb shell CLASSPATH=/data/local/tmp/classes.dex \
    app_process / my.package.MainClass
```
The path `/data/local/tmp` is a good candidate to push the server, since it's readable and writable by `shell`, but not world-writable, so a malicious application may not replace the server just before the client executes it.
Instead of a raw dex file, `app_process` accepts a jar containing `classes.dex` (e.g. an APK). For simplicity, and to benefit from the gradle build system, the server is built to an (unsigned) APK (renamed to `scrcpy-server.jar`).
Although compiled against the Android framework, hidden methods and classes are not directly accessible (and they may differ from one Android version to another). They can be called using reflection though. The communication with hidden components is provided by wrapper classes and AIDL.
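For instance, a hidden static method can be resolved and invoked like this (a minimal sketch; `ServiceManager.getService` is just one example of a hidden method, not necessarily one of the methods scrcpy wraps):

```java
import java.lang.reflect.Method;

// Minimal sketch: resolving and invoking a hidden framework method via
// reflection. ServiceManager.getService(String) is a hidden static method;
// this only runs on an Android device (e.g. under app_process).
public final class HiddenApiDemo {
    public static void main(String... args) throws Exception {
        Class<?> serviceManager = Class.forName("android.os.ServiceManager");
        Method getService = serviceManager.getDeclaredMethod("getService", String.class);
        getService.setAccessible(true);
        Object inputBinder = getService.invoke(null, "input"); // null receiver: static method
        System.out.println("input service binder: " + inputBinder);
    }
}
```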
The server uses 2 threads:

- the main thread, which encodes and streams the video to the client,
- the controller thread, which listens for control events (typically, keyboard and mouse events) from the client.

Since the video encoding is typically performed in hardware, there would be no benefit in encoding and streaming in two different threads.
The encoding is managed by `ScreenEncoder`.
The video is encoded using the `MediaCodec` API. The codec takes its input from a surface associated to the display, and writes the resulting H.264 stream to the provided output stream (the socket connected to the client).
On device rotation, the codec, surface and display are reinitialized, and a new video stream is produced.
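A minimal sketch of this pipeline, assuming illustrative encoder parameters (the real `ScreenEncoder` also handles rotation and errors, and binding the input surface to the display requires hidden APIs, omitted here):

```java
import java.io.OutputStream;
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Sketch: encode whatever is drawn on the codec's input surface to H.264
// and write the raw stream to an output stream (e.g. the client socket).
public final class EncoderSketch {
    public static void streamH264(OutputStream out, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000); // illustrative values
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);

        MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface surface = codec.createInputSurface(); // to be bound to the display (hidden APIs)
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int index = codec.dequeueOutputBuffer(info, -1); // block until output is ready
            if (index >= 0) {
                ByteBuffer buffer = codec.getOutputBuffer(index);
                byte[] chunk = new byte[info.size];
                buffer.position(info.offset);
                buffer.get(chunk);
                out.write(chunk); // raw H.264, straight to the socket
                codec.releaseOutputBuffer(index, false);
            }
        }
    }
}
```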
New frames are produced only when changes occur on the surface. This is desirable because it avoids sending unnecessary frames, but there are drawbacks:

- no frame is sent on start if the device screen does not change,
- after fast motion changes, the last frame may have a low quality.

Both problems are solved by the flag `KEY_REPEAT_PREVIOUS_FRAME_AFTER`.
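This flag is a standard `MediaFormat` key, set before configuring the codec; the 100 ms delay below is an illustrative value:

```java
import android.media.MediaFormat;

final class RepeatFrameConfig {
    // KEY_REPEAT_PREVIOUS_FRAME_AFTER makes the encoder resubmit the previous
    // frame if no new frame arrives within the given delay (in microseconds),
    // so a stream keeps flowing even when the screen content is static.
    static void enableRepeat(MediaFormat format) {
        format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 100_000); // 100 ms, illustrative
    }
}
```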
Control events are received from the client by the `EventController` (run in a separate thread). There are 5 types of input events:

- keycode (cf. `KeyEvent`),
- text (special characters may not be handled by keycodes directly),
- mouse motion/click,
- mouse scroll,
- command (e.g. to turn the screen on).

All of them may need to inject input events to the system. To do so, they use the hidden method `InputManager.injectInputEvent` (exposed by our `InputManager` wrapper).
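A hedged sketch of such a wrapper (the reflective lookup is simplified, and hidden APIs may change across Android versions):

```java
import java.lang.reflect.Method;

import android.view.InputEvent;

// Sketch of a wrapper exposing the hidden InputManager.injectInputEvent.
// Both getInstance() and injectInputEvent() are hidden, hence reflection.
public final class InputManagerWrapper {
    private final Object manager;
    private final Method injectInputEvent;

    public InputManagerWrapper() throws Exception {
        Class<?> cls = Class.forName("android.hardware.input.InputManager");
        manager = cls.getDeclaredMethod("getInstance").invoke(null);
        injectInputEvent = cls.getMethod("injectInputEvent", InputEvent.class, int.class);
    }

    public boolean inject(InputEvent event, int mode) throws Exception {
        return (boolean) injectInputEvent.invoke(manager, event, mode);
    }
}
```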
The client relies on SDL, which provides a cross-platform API for UI, input events, threading, etc.
The video stream is decoded by libav (FFmpeg).
On startup, in addition to libav and SDL initialization, the client must push and start the server on the device, and open a socket so that they may communicate.
Note that the client-server roles are expressed at the application level:

- the server serves the video stream and handles requests from the client,
- the client controls the device through the server.

However, the roles are inverted at the network level:

- the client opens a server socket and listens on a port before starting the server,
- the server connects to the client.

This role inversion guarantees that the connection will not fail due to a race condition, and avoids polling.
Once the server is connected, it sends the device information (name and initial screen dimensions). Thus, the client may initialize the window and the renderer before the first frame is available.
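As a sketch (written in Java for consistency with the other examples, though the real client is in C; the port and the header layout shown are assumptions for illustration, not a protocol specification):

```java
import java.io.DataInputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Sketch of the network-level role inversion: the "client" listens first,
// then reads a device-info header once the device-side server connects.
public final class ClientSocketSketch {
    public static void main(String... args) throws Exception {
        try (ServerSocket listener = new ServerSocket(27183)) { // port forwarded by adb, illustrative
            Socket server = listener.accept(); // the server connects to us
            DataInputStream in = new DataInputStream(server.getInputStream());
            byte[] name = new byte[64];        // assumed fixed-size device name field
            in.readFully(name);
            int width = in.readUnsignedShort();  // assumed 16-bit dimensions
            int height = in.readUnsignedShort();
            System.out.println(new String(name, StandardCharsets.UTF_8).trim()
                    + " " + width + "x" + height);
        }
    }
}
```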
To minimize startup time, SDL initialization is performed while listening for the connection from the server (see commit 90a46b4).
The client uses 3 threads:

- the main thread, executing the SDL event loop,
- the decoder thread,
- the controller thread.
The decoder runs in a separate thread. It uses libav to decode the H.264 stream from the socket, and notifies the main thread when a new frame is available.
There are two frames simultaneously in memory:

- the decoding frame, written by the decoder,
- the rendering frame, rendered in a texture by the main thread.

When a new decoded frame is available, the decoder swaps the decoding and rendering frames (with proper synchronization). Thus, it immediately starts to decode a new frame while the main thread renders the last one.
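The real client implements this in C with SDL mutexes; the idea can be sketched as follows (names are illustrative):

```java
// Sketch of the double-buffering swap between the decoding frame and the
// rendering frame: the decoder never waits for rendering, and the main
// thread always reads a complete frame.
public final class FramePair {
    private Object decodingFrame = new Object();  // stand-ins for AVFrame buffers
    private Object renderingFrame = new Object();

    // Called by the decoder thread once a frame is fully decoded.
    public synchronized void swap() {
        Object tmp = decodingFrame;
        decodingFrame = renderingFrame;
        renderingFrame = tmp;
        // then notify the main thread that a new rendering frame is available
    }

    // Called by the main thread to render the latest complete frame.
    public synchronized Object getRenderingFrame() {
        return renderingFrame;
    }
}
```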
The controller is responsible for sending control events to the device. It runs in a separate thread, to avoid I/O on the main thread.

On every SDL event received on the main thread, the input manager creates the appropriate control events. It is responsible for converting SDL events to Android events (using `convert`). It pushes the control events to a blocking queue held by the controller. On its own thread, the controller takes events from the queue, serializes them, and sends them to the device.
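A sketch of this producer/consumer pattern (the `Controller` type below is illustrative, and the real client implements it in C):

```java
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the controller pattern: the main thread enqueues serialized
// control events; a dedicated thread dequeues and writes them, so socket
// I/O never blocks the UI event loop.
public final class Controller implements Runnable {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    private final OutputStream out; // socket to the device

    public Controller(OutputStream out) {
        this.out = out;
    }

    // Called from the main (SDL event loop) thread.
    public void push(byte[] serializedEvent) {
        queue.offer(serializedEvent);
    }

    @Override
    public void run() {
        try {
            while (true) {
                byte[] event = queue.take(); // blocks until an event is available
                out.write(event);
                out.flush();
            }
        } catch (Exception e) {
            // socket closed or thread interrupted: terminate the controller
        }
    }
}
```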
Initialization, input events and rendering are all managed in the main thread.
Events are handled in the event loop, which either updates the screen or delegates to the input manager.
For more details, go read the code!
If you find a bug, or have an awesome idea to implement, please discuss and contribute ;-)