For anyone curious, this comment from the code shows how it's done:
> startCameraAndStream opens the webcam with GoCV, sends raw frames to FFmpeg (via stdin), reads IVF from FFmpeg (via stdout), and writes them into the WebRTC video track.
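Roughly, the plumbing looks like this. This is a minimal sketch of the pipeline described above, not the repo's exact code: the FFmpeg flags, the 640x480/30fps assumptions, and the wiring of the `TrackLocalStaticSample` are all illustrative.

```go
package main

import (
	"os/exec"
	"time"

	"github.com/pion/webrtc/v3"
	"github.com/pion/webrtc/v3/pkg/media"
	"github.com/pion/webrtc/v3/pkg/media/ivfreader"
	"gocv.io/x/gocv"
)

func startCameraAndStream(videoTrack *webrtc.TrackLocalStaticSample) error {
	// Open the default webcam with GoCV and pin the frame size so it
	// matches what we tell FFmpeg to expect on stdin.
	webcam, err := gocv.OpenVideoCapture(0)
	if err != nil {
		return err
	}
	defer webcam.Close()
	webcam.Set(gocv.VideoCaptureFrameWidth, 640)
	webcam.Set(gocv.VideoCaptureFrameHeight, 480)

	// FFmpeg reads raw BGR frames on stdin and writes VP8-in-IVF to stdout.
	ffmpeg := exec.Command("ffmpeg",
		"-f", "rawvideo", "-pix_fmt", "bgr24", "-s", "640x480", "-r", "30", "-i", "pipe:0",
		"-c:v", "libvpx", "-f", "ivf", "pipe:1",
	)
	stdin, err := ffmpeg.StdinPipe()
	if err != nil {
		return err
	}
	stdout, err := ffmpeg.StdoutPipe()
	if err != nil {
		return err
	}
	if err := ffmpeg.Start(); err != nil {
		return err
	}

	// Producer: push raw frames from the webcam into FFmpeg's stdin.
	go func() {
		defer stdin.Close()
		img := gocv.NewMat()
		defer img.Close()
		for webcam.Read(&img) {
			if img.Empty() {
				continue
			}
			if _, err := stdin.Write(img.ToBytes()); err != nil {
				return
			}
		}
	}()

	// Consumer: parse IVF frames from FFmpeg's stdout and hand each one
	// to the WebRTC video track as a media.Sample.
	ivf, _, err := ivfreader.NewWith(stdout)
	if err != nil {
		return err
	}
	for {
		frame, _, err := ivf.ParseNextFrame()
		if err != nil {
			return err
		}
		if err := videoTrack.WriteSample(media.Sample{Data: frame, Duration: time.Second / 30}); err != nil {
			return err
		}
	}
}

func main() {
	// In a real app this track would be added to a PeerConnection before streaming.
	videoTrack, err := webrtc.NewTrackLocalStaticSample(
		webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeVP8}, "video", "pion",
	)
	if err != nil {
		panic(err)
	}
	panic(startCameraAndStream(videoTrack))
}
```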
If you want to build it another way, you could do any of these; there are examples for all of them in the repo!

* Use the FFmpeg libraries directly
* Use GStreamer
* Use libvpx; an example is in the mediadevices repo (see the sketch below)
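For the libvpx route, a rough sketch following the pion/mediadevices examples might look like the following. The function name `addCameraTrack`, the bitrate, and the constraint values are my own assumptions; check the mediadevices repo for the current API.

```go
package main

import (
	"github.com/pion/mediadevices"
	"github.com/pion/mediadevices/pkg/codec/vpx"
	_ "github.com/pion/mediadevices/pkg/driver/camera" // register the camera driver
	"github.com/pion/mediadevices/pkg/prop"
	"github.com/pion/webrtc/v3"
)

func addCameraTrack(pc *webrtc.PeerConnection) error {
	// Encode VP8 in-process with libvpx instead of shelling out to FFmpeg.
	vp8Params, err := vpx.NewVP8Params()
	if err != nil {
		return err
	}
	vp8Params.BitRate = 500_000 // assumed bitrate, tune for your use case

	// Note: the same selector should also Populate() the MediaEngine used
	// to build the PeerConnection; omitted here for brevity.
	selector := mediadevices.NewCodecSelector(
		mediadevices.WithVideoEncoders(&vp8Params),
	)

	stream, err := mediadevices.GetUserMedia(mediadevices.MediaStreamConstraints{
		Video: func(c *mediadevices.MediaTrackConstraints) {
			c.Width = prop.Int(640)
			c.Height = prop.Int(480)
		},
		Codec: selector,
	})
	if err != nil {
		return err
	}

	// mediadevices tracks implement webrtc.TrackLocal, so they can be
	// attached to the PeerConnection directly.
	for _, track := range stream.GetTracks() {
		if _, err := pc.AddTransceiverFromTrack(track,
			webrtc.RTPTransceiverInit{Direction: webrtc.RTPTransceiverDirectionSendonly},
		); err != nil {
			return err
		}
	}
	return nil
}
```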