Connecting to scrcpy-server from Qt

Contents:
- Test environment
- How to start scrcpy-server
  1. Connect the device
  2. Push scrcpy-server to the phone
  3. Set up the adb tunnel
  4. Start the service
  5. Stop the service
- Connecting to scrcpy-server with QTcpServer
- The complete connect-and-stream flow
  1. Starting the video stream
  2. Stopping the video stream
- Decoding the video stream
  1. Packet protocol parsing
  2. Decoding flow
  3. Converting a video frame to QImage
- Rendering the video stream with OpenGL
- Sending control commands

Test environment

First, the test environment; there is no guarantee that other environments behave identically:

- Qt: 5.12.2 (msvc2019_64)
- scrcpy: 2.3.1
- FFmpeg: ffmpeg-n5.1.4-1-gae14d9c06b-win64-gpl-shared-5.1
- adb: 34.0.5
- Android environment: MuMu emulator 12

How to start scrcpy-server

First, a clarification: we connect to scrcpy-server directly. If you only want to mirror and control a phone screen, the scrcpy.exe published on the author's GitHub releases can be run as-is; here we are effectively building another scrcpy client in order to add our own custom control. How to talk to scrcpy-server is outlined in the development docs on GitHub (https://github.com/Genymobile/scrcpy/blob/master/doc/develop.md); this article walks through the concrete details of establishing that connection. For clarity, every step below is demonstrated with Qt code.

1. Connect the device

Every step of launching scrcpy-server goes through adb, so if you are not familiar with adb it is worth learning the relevant commands first. Before connecting, make sure "USB debugging" is enabled on the phone. A device is connected with `adb connect`. In Qt, external commands are run through QProcess, so we first wrap adb in a small helper class for convenience:

```cpp
// header
#pragma once

#include <qobject.h>
#include <qprocess.h>

/**
 * @brief Helper class for executing adb commands
 */
class AdbCommandRunner {
public:
    explicit AdbCommandRunner(const QString& deviceName = QString());
    ~AdbCommandRunner();

    /**
     * @brief Run an adb command
     * @param cmds argument list
     * @param waitForFinished whether to block until the command finishes
     */
    void runAdb(const QStringList& cmds, bool waitForFinished = true);

    /**
     * @brief Get the error output of the last command
     */
    QString getLastErr();

    QString lastFeedback; // stdout of the last command

private:
    QProcess process;
    QString deviceName;
};
```

```cpp
// cpp
#include "adbcommandrunner.h"

#include <qdebug.h>

AdbCommandRunner::AdbCommandRunner(const QString& deviceName)
    : deviceName(deviceName)
{}

AdbCommandRunner::~AdbCommandRunner() {
    if (process.isOpen()) {
        process.kill();
        process.waitForFinished();
    }
}

void AdbCommandRunner::runAdb(const QStringList& cmds, bool waitForFinished) {
    if (deviceName.isEmpty()) {
        process.start("adb/adb", cmds);
    } else {
        process.start("adb/adb", QStringList({"-s", deviceName}) + cmds);
    }
    qDebug() << "do adb execute command: adb" << cmds.join(" ");
    if (waitForFinished) {
        process.waitForFinished();
    }
    lastFeedback = process.readAllStandardOutput();
}

QString AdbCommandRunner::getLastErr() {
    QString failReason = process.readAllStandardError();
    if (failReason.isEmpty()) {
        failReason = lastFeedback;
    }
    return failReason;
}
```

Note that adb runs as a background daemon. You can execute `adb connect` directly and adb will spawn the daemon on demand, but spawning takes a few seconds, so a bare QProcess call stalls for that long. The better approach is to start the daemon explicitly with `adb start-server` first, which can be done on a worker thread:

```cpp
QThread::create([] {
    QProcess process;
    process.start("adb/adb", {"start-server"});
    process.waitForFinished();
    if (process.exitCode() == 0 && process.exitStatus() == QProcess::NormalExit) {
        qDebug() << "adb server start finished!";
    } else {
        qDebug() << "adb server start failed:" << process.readAll();
    }
})->start();
```

Once the daemon is running and the device exists, connecting takes almost no time:

```cpp
bool connectToDevice() {
    AdbCommandRunner runner;
    runner.runAdb({"connect", deviceAddress});
    if (runner.lastFeedback.contains("cannot connect to")) {
        qDebug() << "connect device:" << deviceAddress << "failed, error:" << runner.getLastErr();
        return false;
    }
    qInfo() << "connect device:" << deviceAddress << "success!";
    return true;
}
```

2. Push scrcpy-server to the phone

Files are pushed with `adb push`, preferably into the temporary directory /data/local/tmp:

```cpp
bool pushServiceToDevice() {
    auto scrcpyFilePath = QDir::currentPath() + "/scrcpy/scrcpy-server";
    qDebug() << "scrcpy path:" << scrcpyFilePath;

    AdbCommandRunner runner;
    runner.runAdb({"-s", deviceAddress, "push", scrcpyFilePath, "/data/local/tmp/scrcpy-server.jar"});
    if (!runner.lastFeedback.contains("1 file pushed")) {
        qDebug() << runner.getLastErr();
        return false;
    }
    return true;
}
```

3. Set up the adb tunnel

By default scrcpy-server acts as a client that connects through an adb tunnel to a local TCP server on the PC. As the development docs describe, the roles can also be inverted by adding tunnel_forward=true to the service start command (note: a server argument, not a command-line option of scrcpy.exe). With the default roles, the tunnel is opened with `adb reverse`. The socket name must carry an 8-character identifier, the scid; a timestamp works fine here:

```cpp
scid = QString::asprintf("%08x", (uint)QDateTime::currentSecsSinceEpoch());

AdbCommandRunner runner;
runner.runAdb({"-s", deviceAddress, "reverse", "localabstract:scrcpy_" + scid, "tcp:27183"});
```

Remember this port 27183: it is exactly the port the QTcpServer below listens on for the server's connections.
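For completeness, here is a minimal sketch of the inverted setup, assumed from the development docs rather than taken from the demo project: with tunnel_forward=true passed at service start, the PC side becomes the client, so instead of a QTcpServer you would open QTcpSockets against the forwarded local port.

```cpp
// Sketch of the inverted (tunnel_forward=true) setup -- an assumption based
// on the development docs, not part of the demo project.
AdbCommandRunner runner;
runner.runAdb({"-s", deviceAddress, "forward", "tcp:27183", "localabstract:scrcpy_" + scid});

// The PC now acts as the client: open one socket per stream, video first.
auto videoSocket = new QTcpSocket;
videoSocket->connectToHost(QHostAddress::LocalHost, 27183);
// In forward mode the server writes a dummy byte on the first socket so the
// client can detect that the connection actually reached the device.
```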
4. Start the service

scrcpy-server itself is an executable jar; it is launched with `adb shell`:

```cpp
serverRunner = new AdbCommandRunner;
QStringList scrcpyServiceOpt;
scrcpyServiceOpt << "-s" << deviceAddress << "shell";
scrcpyServiceOpt << "CLASSPATH=/data/local/tmp/scrcpy-server.jar";
scrcpyServiceOpt << "app_process";
scrcpyServiceOpt << "/";
scrcpyServiceOpt << "com.genymobile.scrcpy.Server";
scrcpyServiceOpt << SCRCPY_VERSION;
scrcpyServiceOpt << "scid=" + scid;
scrcpyServiceOpt << "audio=false";                              // no audio stream
scrcpyServiceOpt << "max_fps=" + QString::number(maxFrameRate); // maximum frame rate
scrcpyServiceOpt << "max_size=1920";                            // maximum frame dimension
serverRunner->runAdb(scrcpyServiceOpt, false);
```

Note that this QProcess wrapper must be kept around: stopping the service later requires killing the corresponding adb shell child process. In the argument list, everything up to and including scid is mandatory; if the version number or the scid does not match, the service refuses to start. More options can be found in the sources at scrcpy\app\src\server.c (starting at line 212), with their defaults in scrcpy\app\src\options.c. Once started, the server immediately connects back through adb to the local TCP server on the PC.
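The option list can be extended the same way. As a sketch, a few options that exist in the scrcpy 2.x sources; verify the names in server.c/options.c against the exact version you ship:

```cpp
// Additional server options -- names taken from the scrcpy 2.x sources;
// check server.c for your version before relying on them.
scrcpyServiceOpt << "video_codec=h265";       // h264 (default), h265 or av1
scrcpyServiceOpt << "video_bit_rate=4000000"; // 4 Mbps instead of the 8 Mbps default
scrcpyServiceOpt << "log_level=info";         // server-side log verbosity
```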
5. Stop the service

To stop the service, first kill the shell process, then remove the tunnel:

```cpp
if (serverRunner) {
    delete serverRunner;
    serverRunner = nullptr;
}

AdbCommandRunner runner;
runner.runAdb({"-s", deviceAddress, "reverse", "--remove", "localabstract:scrcpy_" + scid});
```

After the service stops, scrcpy-server deletes itself on the device, so restarting the service means going back to step 2 and pushing the file again.

Connecting to scrcpy-server with QTcpServer

As explained above, by default the PC acts as the TCP server and scrcpy-server as the client, so a QTcpServer listening on the local adb tunnel port is all that is needed:

```cpp
ScrcpyServer::ScrcpyServer(QObject *parent)
    : QObject(parent)
{
    // tcp server
    tcpServer = new QTcpServer(this);
    connect(tcpServer, &QTcpServer::acceptError, this, [] (QAbstractSocket::SocketError socketError) {
        qCritical() << "scrcpy server accept error:" << socketError;
    });
    connect(tcpServer, &QTcpServer::newConnection, this, &ScrcpyServer::handleNewConnection);
}

void ScrcpyServer::handleNewConnection() {
    auto socket = tcpServer->nextPendingConnection();
    // the first socket carries the video stream
    if (!videoSocket) {
        videoSocket = socket;
        connect(socket, &QTcpSocket::readyRead, this, &ScrcpyServer::receiveVideoBuffer);
        qInfo() << "video socket pending connect...";
    } else if (!controlSocket) {
        controlSocket = socket;
        connect(socket, &QTcpSocket::readyRead, this, &ScrcpyServer::receiveControlBuffer);
        qInfo() << "control socket pending connect...";
    } else {
        qWarning() << "unexpect socket appending...";
    }
    connect(socket, &QTcpSocket::stateChanged, this, [socket] (QAbstractSocket::SocketState state) {
        qDebug() << "socket state changed:" << state;
        if (state == QAbstractSocket::UnconnectedState) {
            socket->deleteLater();
        }
    });
}

bool ScrcpyServer::start() {
    if (!tcpServer->isListening()) {
        bool success = tcpServer->listen(QHostAddress::AnyIPv4, 27183);
        if (!success) {
            qDebug() << "tcp server listen failed:" << tcpServer->errorString();
        }
        return success;
    }
    return true;
}
```

According to the development docs, scrcpy-server opens up to three TCP connections after connecting, carrying video, audio and control commands respectively. Since we started the service with audio=false, audio transmission is disabled and the second connection here is the control socket.

The complete connect-and-stream flow

The sections above covered starting scrcpy-server and accepting its connection with QTcpServer; in practice, establishing the connection and starting the TCP server must happen in a specific order:
1. Starting the video stream (a sketch tying the steps together follows this list):

1. Start the QTcpServer, listening on a chosen port such as 27183
2. Push scrcpy-server to the phone
3. Set up the adb tunnel, using the TCP server's listening port and an 8-character random string as the scid
4. Start the scrcpy-server service with adb shell
5. Let the QTcpServer wait for the video and control sockets to connect
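As promised, a minimal orchestration sketch. connectToDevice and pushServiceToDevice are the helpers shown earlier; setupReverseTunnel and launchServer are assumed names standing in for the tunnel and adb shell snippets:

```cpp
// Orchestration sketch: the order matters -- listen first, then push, then
// tunnel, then launch. setupReverseTunnel/launchServer are assumed names.
bool ScrcpyServer::startStreaming() {
    if (!start()) return false;               // 1. QTcpServer listens on 27183
    if (!connectToDevice()) return false;     //    device reachable over adb
    if (!pushServiceToDevice()) return false; // 2. push scrcpy-server.jar
    if (!setupReverseTunnel()) return false;  // 3. adb reverse with the scid
    launchServer();                           // 4. adb shell app_process ... (async)
    return true;                              // 5. wait for newConnection signals
}
```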
2. Stopping the video stream:

1. Kill the adb shell child process
2. Remove the adb tunnel
3. Stop the TCP server
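And the mirror image for shutdown, with the same caveat that removeReverseTunnel is an assumed name for the `adb reverse --remove` snippet shown earlier:

```cpp
// Shutdown sketch, mirroring startStreaming() above.
void ScrcpyServer::stopStreaming() {
    delete serverRunner;   // 1. kills the adb shell child process
    serverRunner = nullptr;
    removeReverseTunnel(); // 2. adb reverse --remove (assumed helper name)
    tcpServer->close();    // 3. stop listening
}
```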
Decoding the video stream

1. Packet protocol parsing

The docs describe the composition of the video stream in detail. The stream begins with 64 bytes holding the device name, followed by 4 bytes for the codec id, 4 bytes for the frame width and 4 bytes for the frame height. Then the video frames follow, each made of a header and a payload: the header contains an 8-byte PTS/flags field and a 4-byte payload length; read that many bytes of frame data, then wait for the next frame. The default encoding is H.264 (it can be changed through a server start option); we use FFmpeg to decode the frames.

Decoding is time-consuming and has to run on its own thread, which means synchronizing with the data arriving on the QTcpSocket. To let the decoder thread read the stream as if it were synchronous, we write a small utility class that buffers whatever the QTcpSocket delivers (ByteUtil::swapBits is the byte-order helper shown in the control-command section at the end):

```cpp
// header
#pragma once

#include <qobject.h>
#include <qmutex.h>
#include <qwaitcondition.h>

#include "byteutil.h"

class BufferReceiver : public QObject {
public:
    explicit BufferReceiver(QObject* parent = nullptr);

    void sendBuffer(const QByteArray& data);
    void endCache();

    template<typename T>
    T receive() {
        enum { T_Size = sizeof(T) };
        T value = T();
        receive((void*)&value, T_Size);
        ByteUtil::swapBits(value); // stream values are big-endian
        return value;
    }

    void receive(void* data, int len);

    bool isEndReceive() const {
        return endBufferCache;
    }

private:
    QByteArray receiveBuffer;
    QMutex mutex;
    QWaitCondition receiveWait;
    bool endBufferCache;
};
```

```cpp
// cpp
#include "bufferreceiver.h"

BufferReceiver::BufferReceiver(QObject *parent)
    : QObject(parent)
    , endBufferCache(false)
{}

void BufferReceiver::sendBuffer(const QByteArray& data) {
    QMutexLocker locker(&mutex);
    receiveBuffer.append(data);
    receiveWait.notify_all();
}

void BufferReceiver::endCache() {
    QMutexLocker locker(&mutex);
    endBufferCache = true;
    receiveWait.notify_all();
}

void BufferReceiver::receive(void* data, int len) {
    mutex.lock();
    if (endBufferCache) {
        mutex.unlock();
        return;
    }
    // block until enough bytes are buffered or the stream is closed
    while (receiveBuffer.size() < len && !endBufferCache) {
        receiveWait.wait(&mutex);
    }
    if (!endBufferCache) {
        memcpy(data, receiveBuffer.data(), len);
        receiveBuffer = receiveBuffer.mid(len);
    }
    mutex.unlock();
}
```

On the main thread, incoming video data is simply cached into the BufferReceiver:

```cpp
void ScrcpyServer::receiveVideoBuffer() {
    if (videoDecoder) {
        videoDecoder->appendBuffer(videoSocket->readAll());
    }
}
```
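appendBuffer itself is not shown in the original listings; assuming VideoDecoder owns the BufferReceiver as a member called bufferReceiver, the obvious implementation is a one-line forward:

```cpp
// Assumed implementation -- VideoDecoder::appendBuffer is not in the listings
// above, but given BufferReceiver it can only reasonably look like this.
void VideoDecoder::appendBuffer(const QByteArray& data) {
    bufferReceiver.sendBuffer(data); // thread-safe: locks, appends, wakes the decoder
}
```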
The decoder thread then consumes the stream field by field, following the protocol:

```cpp
void VideoDecoder::run() {
    // 64 bytes: device name
    QByteArray remoteDeviceName(64, '\0');
    bufferReceiver.receive(remoteDeviceName.data(), remoteDeviceName.size());
    auto name = QString::fromUtf8(remoteDeviceName);
    if (!name.isEmpty()) {
        qInfo() << "device name received:" << name;
    }
    if (bufferReceiver.isEndReceive()) {
        return;
    }
    // 4 bytes codec id, 4 bytes width, 4 bytes height
    if (codecCtx == nullptr) {
        auto codecId = bufferReceiver.receive<uint32_t>();
        auto width = bufferReceiver.receive<int>();
        auto height = bufferReceiver.receive<int>();
        if (!codecInit(codecId, width, height)) {
            codecRelease();
            qCritical() << "video decode init failed!";
            return;
        }
    }
    qInfo() << "video decode is running...";
    for (;;) {
        if (!frameReceive()) {
            break;
        }
        if (!frameMerge()) {
            av_packet_unref(packet);
            break;
        }
        frameUnpack();
        av_packet_unref(packet);
    }
    // release resources
    codecRelease();
    qInfo() << "video decoder exit...";
}
```

2. Decoding flow

As the reading loop above shows, the decoder can be initialized as soon as the codec id and frame size have been read:

```cpp
// initialize the decoder
auto codec = avcodec_find_decoder(AV_CODEC_ID_H264);
if (!codec) {
    qDebug() << "find codec h264 fail!";
    return false;
}
// initialize the decoder context
codecCtx = avcodec_alloc_context3(codec);
if (!codecCtx) {
    qDebug() << "allocate codec context fail!";
    return false;
}
codecCtx->width = width;
codecCtx->height = height;
codecCtx->pix_fmt = AV_PIX_FMT_YUV420P;

int ret = avcodec_open2(codecCtx, codec, nullptr);
if (ret < 0) {
    qDebug() << "open codec fail!";
    return false;
}
packet = av_packet_alloc();
if (!packet) {
    qDebug() << "alloc packet fail!";
    return false;
}
decodeFrame = av_frame_alloc();
if (!decodeFrame) {
    qDebug() << "alloc frame fail!";
    return false;
}
```

When a frame arrives, read the PTS and the payload length in turn and fill them into the AVPacket:

```cpp
// Flag layout of the 8-byte PTS field (values from scrcpy's demuxer sources):
// bit 63 marks a config packet, bit 62 a key frame, the rest is the PTS.
#define SC_PACKET_FLAG_CONFIG    (UINT64_C(1) << 63)
#define SC_PACKET_FLAG_KEY_FRAME (UINT64_C(1) << 62)
#define SC_PACKET_PTS_MASK       (SC_PACKET_FLAG_KEY_FRAME - 1)

bool VideoDecoder::frameReceive() {
    auto ptsFlags = bufferReceiver.receive<uint64_t>();
    auto frameLen = bufferReceiver.receive<int32_t>();
    if (bufferReceiver.isEndReceive()) {
        return false;
    }
    Q_ASSERT(frameLen != 0);
    if (av_new_packet(packet, frameLen)) {
        qDebug() << "av new packet failed!";
        return false;
    }
    bufferReceiver.receive(packet->data, frameLen);
    if (bufferReceiver.isEndReceive()) {
        return false;
    }
    if (ptsFlags & SC_PACKET_FLAG_CONFIG) {
        packet->pts = AV_NOPTS_VALUE;
    } else {
        packet->pts = ptsFlags & SC_PACKET_PTS_MASK;
    }
    if (ptsFlags & SC_PACKET_FLAG_KEY_FRAME) {
        packet->flags |= AV_PKT_FLAG_KEY;
    }
    packet->dts = packet->pts;
    return true;
}
```

The PTS field tells us whether this is a config packet that has to be merged with the next media packet:

```cpp
bool VideoDecoder::frameMerge() {
    bool isConfig = packet->pts == AV_NOPTS_VALUE;
    if (isConfig) {
        // cache the config packet (SPS/PPS) until the next media packet arrives
        free(mergeBuffer);
        mergeBuffer = (uint8_t*)malloc(packet->size);
        if (!mergeBuffer) {
            qDebug() << "merge buffer malloc failed! required size:" << packet->size;
            return false;
        }
        memcpy(mergeBuffer, packet->data, packet->size);
        mergedSize = packet->size;
    } else if (mergeBuffer) {
        // prepend the cached config data to this media packet
        if (av_grow_packet(packet, mergedSize)) {
            qDebug() << "av grow packet failed!";
            return false;
        }
        // move only the original payload; packet->size already includes
        // mergedSize after the grow
        memmove(packet->data + mergedSize, packet->data, packet->size - mergedSize);
        memcpy(packet->data, mergeBuffer, mergedSize);
        free(mergeBuffer);
        mergeBuffer = nullptr;
    }
    return true;
}
```

Unpacking the frames uses avcodec_send_packet and avcodec_receive_frame. The code below shows how to drain the decoder in a loop and convert each frame to a QVideoFrame for the later rendering stage; note that the image format is YUV420P:

```cpp
void VideoDecoder::frameUnpack() {
    if (packet->pts == AV_NOPTS_VALUE) {
        return; // config packet: it gets merged into the next media packet
    }
    int ret = avcodec_send_packet(codecCtx, packet);
    if (ret < 0 && ret != AVERROR(EAGAIN)) {
        qCritical() << "send packet error:" << ret;
    } else {
        // drain all frames produced by this packet
        for (;;) {
            ret = avcodec_receive_frame(codecCtx, decodeFrame);
            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
                break;
            }
            if (ret) {
                qCritical() << "could not receive video frame:" << ret;
                break;
            }
            QVideoFrame cachedFrame(codecCtx->width * codecCtx->height * 3 / 2,
                                    QSize(codecCtx->width, codecCtx->height),
                                    codecCtx->width, QVideoFrame::Format_YUV420P);
            int imageSize = av_image_get_buffer_size(codecCtx->pix_fmt, codecCtx->width, codecCtx->height, 1);
            if (cachedFrame.map(QAbstractVideoBuffer::WriteOnly)) {
                uchar *dstData = cachedFrame.bits();
                av_image_copy_to_buffer(dstData, imageSize, decodeFrame->data, decodeFrame->linesize,
                                        codecCtx->pix_fmt, codecCtx->width, codecCtx->height, 1);
                cachedFrame.unmap();
                emit frameDecoded(cachedFrame);
            }
            av_frame_unref(decodeFrame);
        }
    }
}
```
3. Converting a video frame to QImage

Sometimes we need to extract a single frame as an RGB image, for example for a screenshot. There are two ways. The first converts the AVFrame directly (the decodeFrame above) with sws_scale, which requires a SwsContext; it can be initialized right after the codec context:

```cpp
swsContext = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,
                            codecCtx->width, codecCtx->height, AV_PIX_FMT_RGB24, SWS_BILINEAR,
                            nullptr, nullptr, nullptr);
```

To convert, construct a QImage from the codecCtx dimensions and call sws_scale:

```cpp
QImage image = QImage(codecCtx->width, codecCtx->height, QImage::Format_RGB888);
auto imagePtr = image.bits();
// convert YUV420P to RGB24
const int lineSize[4] = {3 * codecCtx->width, 0, 0, 0};
sws_scale(swsContext, (const uint8_t* const*)decodeFrame->data, decodeFrame->linesize,
          0, codecCtx->height, (uint8_t**)&imagePtr, lineSize);
```

The second way is much simpler: use Qt's built-in conversion. As shown above, av_image_copy_to_buffer already turned the AVFrame into a QVideoFrame before it was emitted; to get a QImage just call its image function, which yields ARGB32:

```cpp
QImage image = cachedFrame.image();
```

Rendering the video stream with OpenGL

The best way to display the video is OpenGL rendering: it avoids burning CPU, and the decoded YUV420P frames can be converted on the GPU. Using OpenGL in Qt naturally means subclassing QOpenGLWidget. Qt actually ships a video display widget, QVideoWidget, but it offers no way to feed it a raw stream directly. Reading the QVideoWidget sources in the Multimedia module shows that when GLSL is available, rendering ultimately goes through a QPainterVideoSurface instance using QVideoSurfaceGlslPainter, which supports rendering many frame formats, YUV420P among them (the YUV420P-to-RGB conversion uses the BT.709 matrix). So: copy the three source files multimediawidgets/qmediaopenglhelper_p.h, multimediawidgets/qpaintervideosurface_p.h and multimediawidgets/qpaintervideosurface.cpp, then write a custom VideoWidget that instantiates a QPainterVideoSurface; new frames are displayed simply by calling QPainterVideoSurface::present:

```cpp
//.h
#pragma once

#include <qwidget.h>
#include <qopenglwidget.h>

#include "qpaintervideosurface_p.h"

class VideoWidget : public QOpenGLWidget {
public:
    explicit VideoWidget(QWidget *parent = nullptr);
    ~VideoWidget();

    QPainterVideoSurface *videoSurface() const;
    QSize sizeHint() const override;

public:
    void setAspectRatioMode(Qt::AspectRatioMode mode);

protected:
    void hideEvent(QHideEvent *event) override;
    void resizeEvent(QResizeEvent *event) override;
    void paintEvent(QPaintEvent *event) override;

private slots:
    void formatChanged(const QVideoSurfaceFormat &format);
    void frameChanged();

private:
    void updateRects();

private:
    QPainterVideoSurface *m_surface;
    Qt::AspectRatioMode m_aspectRatioMode;
    QRect m_boundingRect;
    QRectF m_sourceRect;
    QSize m_nativeSize;
    bool m_updatePaintDevice;
};
```
```cpp
//.cpp
#include "videowidget.h"

#include <qevent.h>
#include <qvideosurfaceformat.h>

VideoWidget::VideoWidget(QWidget *parent)
    : QOpenGLWidget(parent)
    , m_aspectRatioMode(Qt::KeepAspectRatio)
    , m_updatePaintDevice(true)
{
    m_surface = new QPainterVideoSurface(this);
    connect(m_surface, &QPainterVideoSurface::frameChanged, this, &VideoWidget::frameChanged);
    connect(m_surface, &QPainterVideoSurface::surfaceFormatChanged, this, &VideoWidget::formatChanged);
}

QPainterVideoSurface *VideoWidget::videoSurface() const {
    return m_surface;
}

VideoWidget::~VideoWidget() {
    delete m_surface;
}

void VideoWidget::setAspectRatioMode(Qt::AspectRatioMode mode) {
    m_aspectRatioMode = mode;
    updateGeometry();
}

QSize VideoWidget::sizeHint() const {
    return m_surface->surfaceFormat().sizeHint();
}

void VideoWidget::hideEvent(QHideEvent *event) {
    m_updatePaintDevice = true;
}

void VideoWidget::resizeEvent(QResizeEvent *event) {
    updateRects();
}

void VideoWidget::paintEvent(QPaintEvent *event) {
    QPainter painter(this);

    if (testAttribute(Qt::WA_OpaquePaintEvent)) {
        QRegion borderRegion = event->region();
        borderRegion = borderRegion.subtracted(m_boundingRect);

        QBrush brush = palette().window();
        for (const QRect &r : borderRegion)
            painter.fillRect(r, brush);
    }

    if (m_surface->isActive() && m_boundingRect.intersects(event->rect())) {
        m_surface->paint(&painter, m_boundingRect, m_sourceRect);
        m_surface->setReady(true);
    } else {
        if (m_updatePaintDevice && (painter.paintEngine()->type() == QPaintEngine::OpenGL
                || painter.paintEngine()->type() == QPaintEngine::OpenGL2)) {
            m_updatePaintDevice = false;
            m_surface->updateGLContext();
            if (m_surface->supportedShaderTypes() & QPainterVideoSurface::GlslShader) {
                m_surface->setShaderType(QPainterVideoSurface::GlslShader);
            } else {
                m_surface->setShaderType(QPainterVideoSurface::FragmentProgramShader);
            }
        }
    }
}

void VideoWidget::formatChanged(const QVideoSurfaceFormat &format) {
    m_nativeSize = format.sizeHint();
    updateRects();
    updateGeometry();
    update();
}

void VideoWidget::frameChanged() {
    update(m_boundingRect);
}

void VideoWidget::updateRects() {
    QRect rect = this->rect();
    if (m_nativeSize.isEmpty()) {
        m_boundingRect = QRect();
    } else if (m_aspectRatioMode == Qt::IgnoreAspectRatio) {
        m_boundingRect = rect;
        m_sourceRect = QRectF(0, 0, 1, 1);
    } else if (m_aspectRatioMode == Qt::KeepAspectRatio) {
        QSize size = m_nativeSize;
        size.scale(rect.size(), Qt::KeepAspectRatio);
        m_boundingRect = QRect(0, 0, size.width(), size.height());
        m_boundingRect.moveCenter(rect.center());
        m_sourceRect = QRectF(0, 0, 1, 1);
    } else if (m_aspectRatioMode == Qt::KeepAspectRatioByExpanding) {
        m_boundingRect = rect;
        QSizeF size = rect.size();
        size.scale(m_nativeSize, Qt::KeepAspectRatio);
        m_sourceRect = QRectF(0, 0, size.width() / m_nativeSize.width(), size.height() / m_nativeSize.height());
        m_sourceRect.moveCenter(QPointF(0.5, 0.5));
    }
}
```

Before streaming starts, initialize the surface: select OpenGL (GLSL) rendering and declare the incoming video format as YUV420P:

```cpp
videoWidget->videoSurface()->setShaderType(QPainterVideoSurface::GlslShader);
videoWidget->videoSurface()->start(QVideoSurfaceFormat(QSize(1920, 1080), QVideoFrame::Format_YUV420P));
```

Frames coming out of the VideoDecoder are pushed straight to the surface:

```cpp
connect(decoder, &VideoDecoder::frameDecoded, this, [=] (const QVideoFrame &frame) {
    videoWidget->videoSurface()->present(frame);
});
```

When the stream stops, stop the surface rendering as well:

```cpp
videoWidget->videoSurface()->stop();
```
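The 1920×1080 above matches the max_size passed when the service was started; if the device reports different dimensions, the surface can be restarted with the size the decoder actually read. A sketch, assuming a sizeDecoded signal added to VideoDecoder (not in the original listings):

```cpp
// Hypothetical: restart the surface once the real stream size is known.
// sizeDecoded is an assumed signal, emitted after the 4-byte width/height
// fields are read in VideoDecoder::run().
connect(decoder, &VideoDecoder::sizeDecoded, this, [=] (const QSize &size) {
    videoWidget->videoSurface()->stop();
    videoWidget->videoSurface()->start(QVideoSurfaceFormat(size, QVideoFrame::Format_YUV420P));
});
```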
Sending control commands

Commands are sent over the second socket. The encoding protocol is defined and implemented in the sources, in the two files scrcpy\app\src\control_msg.h and scrcpy\app\src\control_msg.c. For example, to send a touch event:

```cpp
namespace ByteUtil {
    /**
     * @brief Swap the byte order of a value in place
     * @tparam T numeric type
     * @param data value to swap
     * @param size number of bytes to swap
     */
    template<typename T>
    static void swapBits(T& data, size_t size = sizeof(T)) {
        for (size_t i = 0; i < size / 2; i++) {
            char* pl = (char*)&data + i;
            char* pr = (char*)&data + (size - i - 1);
            if (*pl != *pr) {
                *pl ^= *pr;
                *pr ^= *pl;
                *pl ^= *pr;
            }
        }
    }

    /**
     * @brief Copy bytes into the destination and swap to big-endian
     * @tparam T numeric type
     * @param data destination value
     * @param src source bytes
     * @param srcSize number of bytes to copy
     */
    template<typename T>
    static void bitConvert(T& data, const void* src, int srcSize = sizeof(T)) {
        memcpy(&data, src, srcSize);
        swapBits(data, srcSize);
    }
}

class ControlMsg {
public:
    static QByteArray injectTouchEvent(android_motionevent_action action, android_motionevent_buttons actionButton,
                                       android_motionevent_buttons buttons, uint64_t pointerId,
                                       const QSize& screenSize, const QPoint& point, float pressure)
    {
        char bytes[32];
        bytes[0] = SC_CONTROL_MSG_TYPE_INJECT_TOUCH_EVENT;
        bytes[1] = action;
        ByteUtil::bitConvert(*(uint64_t*)(bytes + 2), &pointerId);
        uint32_t x = point.x();
        ByteUtil::bitConvert(*(uint32_t*)(bytes + 10), &x);
        uint32_t y = point.y();
        ByteUtil::bitConvert(*(uint32_t*)(bytes + 14), &y);
        uint16_t w = screenSize.width();
        ByteUtil::bitConvert(*(uint16_t*)(bytes + 18), &w);
        uint16_t h = screenSize.height();
        ByteUtil::bitConvert(*(uint16_t*)(bytes + 20), &h);
        uint16_t pressureValue = sc_float_to_u16fp(pressure);
        ByteUtil::bitConvert(*(uint16_t*)(bytes + 22), &pressureValue);
        ByteUtil::bitConvert(*(uint32_t*)(bytes + 24), &actionButton);
        ByteUtil::bitConvert(*(uint32_t*)(bytes + 28), &buttons);
        return { bytes, 32 };
    }
};
```

Install an event filter on videoWidget to translate mouse events into touch events:

```cpp
bool App::eventFilter(QObject *watched, QEvent *event) {
    if (watched == videoWidget) {
        if (auto mouseEvent = dynamic_cast<QMouseEvent*>(event)) {
            // map the widget position back to the original frame position
            auto dstPos = QPoint(qRound(mouseEvent->x() * framePixmapRatio.width()),
                                 qRound(mouseEvent->y() * framePixmapRatio.height()));
            if (mouseEvent->type() == QEvent::MouseButtonPress) {
                scrcpyServer->sendControl(ControlMsg::injectTouchEvent(AMOTION_EVENT_ACTION_DOWN, AMOTION_EVENT_BUTTON_PRIMARY,
                                                                       AMOTION_EVENT_BUTTON_PRIMARY, 0,
                                                                       frameSrcSize, dstPos, 1.0));
            } else if (mouseEvent->type() == QEvent::MouseButtonRelease) {
                scrcpyServer->sendControl(ControlMsg::injectTouchEvent(AMOTION_EVENT_ACTION_UP, AMOTION_EVENT_BUTTON_PRIMARY,
                                                                       AMOTION_EVENT_BUTTON_PRIMARY, 0,
                                                                       frameSrcSize, dstPos, 0.0));
            } else if (mouseEvent->type() == QEvent::MouseMove) {
                scrcpyServer->sendControl(ControlMsg::injectTouchEvent(AMOTION_EVENT_ACTION_MOVE, AMOTION_EVENT_BUTTON_PRIMARY,
                                                                       AMOTION_EVENT_BUTTON_PRIMARY, 0,
                                                                       frameSrcSize, dstPos, 1.0));
            }
        }
    }
    return QObject::eventFilter(watched, event);
}

//ScrcpyServer
void ScrcpyServer::sendControl(const QByteArray& controlMsg) {
    if (controlSocket) {
        controlSocket->write(controlMsg);
    }
}
```

Note that the screenSize parameter must be the frame size of the original video stream. If the widget showing the video is scaled, positions have to be mapped back to the original frame proportionally (that is what framePixmapRatio does above) for the taps to land in the right place.

Source code of the demo program: https://github.com/daonvshu/qt-scrcpyservice
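Touch injection is only one message type. As a further sketch beyond the demo, here is a key event message following the serialization in control_msg.c of scrcpy 2.x; the 14-byte layout is taken from those sources, and SC_CONTROL_MSG_TYPE_INJECT_KEYCODE plus the AKEY constants come from scrcpy's headers, so verify them against the version you ship:

```cpp
// Sketch of an inject-keycode message, mirroring injectTouchEvent above.
// Layout per scrcpy 2.x control_msg.c:
// type(1) action(1) keycode(4) repeat(4) metastate(4) = 14 bytes, big-endian.
static QByteArray injectKeycodeEvent(android_keyevent_action action, uint32_t keycode,
                                     uint32_t repeat, uint32_t metaState)
{
    char bytes[14];
    bytes[0] = SC_CONTROL_MSG_TYPE_INJECT_KEYCODE;
    bytes[1] = action;
    ByteUtil::bitConvert(*(uint32_t*)(bytes + 2), &keycode);
    ByteUtil::bitConvert(*(uint32_t*)(bytes + 6), &repeat);
    ByteUtil::bitConvert(*(uint32_t*)(bytes + 10), &metaState);
    return { bytes, 14 };
}

// e.g. press and release BACK (AKEYCODE_BACK = 4):
// scrcpyServer->sendControl(ControlMsg::injectKeycodeEvent(AKEY_EVENT_ACTION_DOWN, 4, 0, 0));
// scrcpyServer->sendControl(ControlMsg::injectKeycodeEvent(AKEY_EVENT_ACTION_UP, 4, 0, 0));
```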