
apprtcdemo

How to modify (add filters to) the camera stream that WebRTC is sending to other peers/server

Question: Scope: I am using RTCCameraPreviewView to show the local camera stream let videoSource = self.pcFactory.avFoundationVideoSource(with: nil) let videoTrack = self.pcFactory.videoTrack(with: videoSource, trackId: "video0") //setting the capture session to my RTCCameraPreviewView: (self.previewView as! RTCCameraPreviewView).captureSession = (videoTrack.source as! RTCAVFoundationVideoSource).captureSession stream = self.pcFactory.mediaStream(withStreamId: "unique_label") audioTrack = self.pcFactory.audioTrack(withTrackId: "audio0") stream.addAudioTrack(audioTrack) var device: AVCaptureDevice? for captureDevice in

2021-10-09 00:03:14    Category: Tech Share    ios   iphone   swift   webrtc   apprtcdemo
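As a rough illustration of the filtering approach asked about above: the avFoundationVideoSource path used in the question does not expose a per-frame hook, but the newer RTCCameraVideoCapturer API lets a delegate sit between the capturer and the video source. The sketch below is hypothetical (the class name, the CIPhotoEffectNoir filter and the in-place render are illustrative assumptions, not part of the original question):

import WebRTC
import CoreImage

// Hypothetical proxy between RTCCameraVideoCapturer and RTCVideoSource:
// it applies a Core Image filter to every captured frame before forwarding
// it, so the filtered frames are what get encoded and sent to peers.
final class FilteringCapturerProxy: NSObject, RTCVideoCapturerDelegate {
    private let target: RTCVideoSource                       // real sink for frames
    private let context = CIContext()
    private let filter = CIFilter(name: "CIPhotoEffectNoir") // example filter

    init(target: RTCVideoSource) {
        self.target = target
    }

    func capturer(_ capturer: RTCVideoCapturer, didCapture frame: RTCVideoFrame) {
        // Only CVPixelBuffer-backed frames are filtered; anything else passes through.
        guard let rtcBuffer = frame.buffer as? RTCCVPixelBuffer,
              let filter = filter else {
            target.capturer(capturer, didCapture: frame)
            return
        }
        let pixelBuffer = rtcBuffer.pixelBuffer
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        if let output = filter.outputImage {
            // Render the filtered image back into the same pixel buffer.
            // In a real app you may prefer to allocate a fresh BGRA buffer instead.
            context.render(output, to: pixelBuffer)
        }
        let filtered = RTCVideoFrame(buffer: rtcBuffer,
                                     rotation: frame.rotation,
                                     timeStampNs: frame.timeStampNs)
        target.capturer(capturer, didCapture: filtered)
    }
}

// Usage sketch: the proxy receives frames from the camera capturer and
// forwards the filtered frames to the video source that feeds the track.
// let videoSource = pcFactory.videoSource()
// let proxy = FilteringCapturerProxy(target: videoSource)
// let capturer = RTCCameraVideoCapturer(delegate: proxy)
// let videoTrack = pcFactory.videoTrack(with: videoSource, trackId: "video0")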

How to get frame data in AppRTC iOS app for video modifications?

Question: I am currently trying to make some modifications to the incoming WebRTC video stream in the AppRTC app for iOS in Swift (which in turn is based on this Objective-C version). To do so, I need access to the data stored in the frame objects of the RTCI420Frame class (the basic frame class of the Objective-C implementation of libWebRTC). In particular, I need an array of bytes, [UInt8], and the size of the frames. This data is to be used for further processing and for adding some filters. The problem is that all operations on RTCVideoTrack / RTCEAGLVideoView are done under the hood of the pre-compiled libWebRTC.a, which is built from the official WebRTC repository linked above; getting a custom build is rather complicated, so I would prefer to use the build available in the sample iOS project, which to my understanding should contain all the available functionality. I have been looking at the RTCVideoChatViewController class, in particular remoteView / remoteVideoTrack, but had no success in accessing the frames themselves, and I spent a lot of time digging through the libWebRTC sources in the official repository, but still cannot work out how to access the frame data for my own manipulations. Glad to get any help! Answer 1: Shortly after posting the question I got lucky and found the sneaky data! You have to add the following property to the RTCEAGLVideoView.h file: @property(atomic, strong)

2021-10-08 04:11:45    Category: Tech Share    swift   video-streaming   webrtc   apprtcdemo   apprtc
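For readers on a newer WebRTC build, roughly the same frame data can be reached without patching RTCEAGLVideoView.h by attaching a custom renderer to the remote track. The sketch below assumes the RTCVideoRenderer protocol and the RTCVideoFrameBuffer/toI420() API of recent iOS builds; the class name FrameTapRenderer is made up for illustration:

import WebRTC

// Hypothetical renderer that can be attached to a remote RTCVideoTrack to
// read raw frame bytes and the frame size, instead of patching
// RTCEAGLVideoView.h as in the answer above.
final class FrameTapRenderer: NSObject, RTCVideoRenderer {
    // Called by WebRTC whenever the incoming frame size changes.
    func setSize(_ size: CGSize) {
        print("incoming frame size: \(size)")
    }

    // Called by WebRTC for every decoded remote frame.
    func renderFrame(_ frame: RTCVideoFrame?) {
        guard let frame = frame else { return }
        let i420 = frame.buffer.toI420()   // convert to an I420 planar buffer

        // Copy the Y plane into a [UInt8]; the U and V planes work the same
        // way via dataU/strideU and dataV/strideV.
        let height = Int(i420.height)
        let yBytes = [UInt8](UnsafeBufferPointer(start: i420.dataY,
                                                 count: Int(i420.strideY) * height))
        // ... pass yBytes (and the chroma planes) to your own filter pipeline.
        _ = yBytes
    }
}

// Usage sketch:
// let tap = FrameTapRenderer()
// remoteVideoTrack.add(tap)   // tap now receives every remote frame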

how to customise AppRTC to receive call

Question: I have integrated AppRTC in my project and I can join a WebRTC call in a room on https://apprtc.appspot.com using a RoomName appClient = ARDAppClient(delegate: self) appClient?.createLocalMediaStream() appClient?.connectToRoomWithId(String(roomId), options: nil) I can hardcode a RoomName in my app and install the app on two devices A and B, and if I start the call from both devices at the same time on A and B, I can successfully have a WebRTC call between them. Now I would like to make a real VOIP call, that is, place a call from my app on device A and receive the call in my app on device B. AFAIK I have to do the signaling part here to connect from device A to the app on device B. Any help is much appreciated! Answer 1: What you need is a signaling server. Both parties connect to it, and through it they can negotiate a room name before starting the call. Once the room name is agreed, the two peers simply connect to that room and can see each other. There are signaling servers designed for WebRTC available online, or you can build your own, which is not complicated. All it really needs to do is register clients and act as a postman between them.

2021-10-07 02:57:57    Category: Tech Share    ios   webrtc   apprtcdemo   signaling   apprtc
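To make the "postman" role of the signaling server concrete, here is a minimal Swift sketch. The WebSocket URL and the JSON invite format are invented for illustration; its only job is to agree on a room name so that both devices can then call appClient?.connectToRoomWithId(...) exactly as in the question above:

import Foundation

// Minimal signaling sketch, assuming a hypothetical WebSocket server at
// wss://example.com/signal and a made-up JSON message format.
final class SignalingClient {
    private let socket: URLSessionWebSocketTask
    var onInvite: ((String) -> Void)?          // called with the agreed room name

    init(url: URL = URL(string: "wss://example.com/signal")!) {   // hypothetical URL
        socket = URLSession.shared.webSocketTask(with: url)
        socket.resume()
        listen()
    }

    // Device A: invite the callee into a room.
    func sendInvite(roomId: String, to callee: String) {
        let payload = ["type": "invite", "room": roomId, "to": callee]
        if let data = try? JSONSerialization.data(withJSONObject: payload),
           let text = String(data: data, encoding: .utf8) {
            socket.send(.string(text)) { _ in }
        }
    }

    // Device B: wait for an invite, then hand the room name to AppRTC.
    private func listen() {
        socket.receive { [weak self] result in
            if case .success(.string(let text)) = result,
               let data = text.data(using: .utf8),
               let obj = try? JSONSerialization.jsonObject(with: data),
               let msg = obj as? [String: String],
               msg["type"] == "invite",
               let room = msg["room"] {
                // e.g. appClient?.connectToRoomWithId(room, options: nil)
                self?.onInvite?(room)
            }
            self?.listen()                     // keep listening for further messages
        }
    }
}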

how to customise AppRTC to receive call

I have integrated AppRTC in my project and I am able to join the WebRTC call in the room of https://apprtc.appspot.com using RoomName appClient = ARDAppClient(delegate: self) appClient?.createLocalMediaStream() appClient?.connectToRoomWithId(String(roomId), options: nil) I am able to hardcode a RoomName in my app and install the app on two devices A & B, and if I make a call from both devices at the same time in A & B then I am able to successfully have a WebRTC call between them. Now I'd like to do a real VOIP call, that is, make a call from my app in device A and receive the call at my app in

2021-07-01 20:19:37    Category: Q&A    ios   webrtc   apprtcdemo   signaling   apprtc

How to get frame data in AppRTC iOS app for video modifications?

I am currently trying to make some modifications to the incoming WebRTC video stream in the AppRTC app for iOS in Swift (which in turn is based on this Objective-C version). To do so, I need access to the data which is stored in the frame objects of class RTCI420Frame (which is a basic class for the Objective-C implementation of libWebRTC). In particular, I need an array of bytes: [UInt8] and Size of the frames. This data is to be used for further processing & addition of some filters. The problem is, all the operations on RTCVideoTrack / RTCEAGLVideoView are done under the hood of pre

2021-06-24 01:56:30    Category: Q&A    swift   video-streaming   webrtc   apprtcdemo   apprtc

How to modify (add filters to) the camera stream that WebRTC is sending to other peers/server

Scope: I am using RTCCameraPreviewView to show the local camera stream let videoSource = self.pcFactory.avFoundationVideoSource(with: nil) let videoTrack = self.pcFactory.videoTrack(with: videoSource, trackId: "video0") //setting the capture session to my RTCCameraPreviewView: (self.previewView as! RTCCameraPreviewView).captureSession = (videoTrack.source as! RTCAVFoundationVideoSource).captureSession stream = self.pcFactory.mediaStream(withStreamId: "unique_label") audioTrack = self.pcFactory.audioTrack(withTrackId: "audio0") stream.addAudioTrack(audioTrack) var device: AVCaptureDevice? for

2021-06-03 11:02:08    Category: Q&A    ios   iphone   swift   webrtc   apprtcdemo

openwebrtc demo is not working in Chrome

Question: Chrome supports WebRTC, but I am not able to run the openwebrtc demo http://demo.openwebrtc.org:38080/ while I am able to run apprtc at https://apprtc.appspot.com/. So why this problem? What is the difference between openwebrtc and apprtc? Is there any difference in their implementations? I know both use the WebRTC APIs, so my gut feeling is that the demo sample I use for openwebrtc is not served over HTTPS, so Chrome does not let it access the camera and therefore it does not work, while the apprtc sample is HTTPS. Answer 1: Open the web console: getUserMedia() no longer works on insecure origins. To use this feature, you should consider switching your application to a secure origin, such as HTTPS. See https://goo.gl/rStTGz for more details. This is a restriction in Chrome; the page works fine in Firefox.

2021-05-09 01:16:42    Category: Tech Share    webrtc   apprtcdemo   apprtc   openwebrtcdemo   openwebrtc

ApprtcDemo with local server works between browsers but not Android native to browser

Question: I am developing a chat application and have finished it. Now I also want to implement video chat. After a lot of research I decided to go with the "WebRTC" library. What have I done? 1) I am able to run AppRtcDemo on a local server and it works fine between browsers. Reference: http://www.webrtc.org/reference/getting-started 2) I am able to build the Android AppRtcDemo, but when I run it, it says "Cross origin does not support". After some research I found in a webrtc discussion that to resolve this issue I need to set up my own TURN server. 3) So I installed the latest rfc5766TurnServer recommended by webrtc and got the TURN server running successfully. Reference: http://code.google.com/p/rfc5766-turn-server/ I made the following changes to ApprtcDemo (web) and (Android) to work with the TURN server: 1) apprtc.py Instead of: turn_url = 'https://computeengineondemand.appspot.com/' turn_url = turn_url + 'turn?' + 'username=' + user + '&key=4080218913' point to my TURN server: turn_url = 'http://192.168.5.85:3478/?service=turn

2021-04-29 22:30:29    Category: Tech Share    android   webrtc   apprtcdemo   rfc5766turnserver
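The question above changes the server side (apprtc.py) to point at a self-hosted TURN server; the client-side counterpart is to hand the TURN URL and credentials to the peer connection as an ICE server. The question concerns the Android demo, but the equivalent iOS/Swift call is sketched below; the host (taken from the question), username and password are placeholders for your own rfc5766-turn-server instance:

import WebRTC

// Client-side sketch of using a self-hosted TURN server: the TURN URL and
// credentials are passed to the peer connection configuration as an ICE server.
func makePeerConnectionConfig() -> RTCConfiguration {
    let config = RTCConfiguration()
    config.iceServers = [
        RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"]),
        RTCIceServer(urlStrings: ["turn:192.168.5.85:3478?transport=udp"],  // placeholder host from the question
                     username: "user",                                       // placeholder credentials
                     credential: "password")
    ]
    return config
}

// Usage sketch:
// let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
// let pc = pcFactory.peerConnection(with: makePeerConnectionConfig(),
//                                   constraints: constraints,
//                                   delegate: self)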

ApprtcDemo with local server works between browsers but not Android native to browser

I am developing a chat application and am done with it. Now I want to implement video chat as well. After researching a lot I decided to go with the "WebRTC" library. What have I done? 1) Able to run AppRtcDemo on a local server, and it's working fine between browsers. Reference: http://www.webrtc.org/reference/getting-started 2) Able to build the Android AppRtcDemo. But when I run it, it says "Cross origin does not support". After some research I found in a webrtc discussion that to resolve this issue I need to set up my own turn server. 3) So I installed the latest rfc5766TurnServer recommended by webrtc. I got success running the turn

2021-04-13 20:04:16    Category: Q&A    android   webrtc   apprtcdemo   rfc5766turnserver

openwebrtc demo is not working in Chrome

Chrome supports WebRTC, but I am not able to run the openwebrtc demo http://demo.openwebrtc.org:38080/ while I am able to run apprtc at https://apprtc.appspot.com/. So why this problem? What is the difference between openwebrtc and apprtc? Is there any different implementation in these? I know both use WebRTC APIs, so my gut feeling is that the demo sample I use for openwebrtc is not https, so Chrome doesn't let it access the camera and mic, and so it's not working, while the apprtc sample is https

2021-04-11 09:32:18    Category: Q&A    webrtc   apprtcdemo   apprtc   openwebrtcdemo   openwebrtc