Swift: send compressed video frames using GPUImage

I’m writing a Swift app that sends the iPhone camera’s video frames over the network, so I can later display them in a macOS app.

Currently, I’m grabbing video frames from an AVCaptureSession and getting a CVPixelBuffer in the captureOutput delegate method.
Since each frame is huge (raw pixels), I’m converting the CVPixelBuffer to a CGImage with VTCreateCGImageFromCVPixelBuffer, and then to a UIImage with JPEG compression (50% quality). I then send that JPEG through the network and display it in the macOS app.
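For reference, here’s roughly what that looks like today (a minimal sketch of my delegate; send(_:) is just a placeholder for the networking code):

import AVFoundation
import UIKit
import VideoToolbox

class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // CVPixelBuffer -> CGImage -> UIImage -> JPEG, all per frame
        var cgImage: CGImage?
        let status = VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        guard status == noErr,
              let image = cgImage,
              let jpegData = UIImage(cgImage: image).jpegData(compressionQuality: 0.5) else { return }

        // send(jpegData) // placeholder for the networking code
    }
}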

As you can see, this is far from ideal, and it runs at only ~25 FPS on an iPhone 11. After some research, I came across GPUImage 2. It seems I could grab the data from the camera and apply something like this (so that the transformation is done on the GPU):

do {
    camera = try Camera(sessionPreset: .vga640x480) // `camera` is a retained property
    let pictureOutput = PictureOutput()
    pictureOutput.encodedImageFormat = .jpeg
    // encodedImageAvailableCallback delivers the compressed Data directly,
    // avoiding a UIImage round trip before sending
    pictureOutput.encodedImageAvailableCallback = { imageData in
        // Send the JPEG data through the network here
    }
    camera --> pictureOutput
    camera.startCapture()
} catch {
    fatalError("Could not initialize camera: \(error)")
}
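That callback should hand me compressed Data ready to ship. For the transport itself, I’m thinking of something like this (just a sketch using the Network framework; the host, port, and the 4-byte length prefix are placeholders I made up):

import Network

// Hypothetical sender; the macOS side would read the length prefix
// to split the TCP stream back into individual frames.
let connection = NWConnection(host: "192.168.1.10", port: 9999, using: .tcp)
connection.start(queue: .global())

func send(_ frameData: Data) {
    var length = UInt32(frameData.count).bigEndian
    var packet = Data(bytes: &length, count: 4)
    packet.append(frameData)
    connection.send(content: packet, completion: .contentProcessed { error in
        if let error = error { print("Send failed: \(error)") }
    })
}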

I should then be able to decode and display those frames in the macOS app. Is there a better way to implement this whole process? Maybe I could use the iPhone’s H.264 hardware encoder (VideoToolbox) instead of converting individual images to JPEG, but it seems that it’s not that straightforward (and from what I’ve read, GPUImage does something like that internally).
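In case it helps frame the question, here’s my rough understanding of what the VideoToolbox route would involve (an untested sketch; compressionSession and encode(_:presentationTime:) are names I invented):

import CoreMedia
import VideoToolbox

var compressionSession: VTCompressionSession?

VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 640,
    height: 480,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: { _, _, status, _, sampleBuffer in
        guard status == noErr, let sampleBuffer = sampleBuffer else { return }
        // Each callback delivers one compressed H.264 sample;
        // extract the NAL units here and send them over the network.
    },
    refcon: nil,
    compressionSessionOut: &compressionSession
)

if let session = compressionSession {
    // Favor low latency over compression efficiency for streaming
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
}

// Feed each CVPixelBuffer from captureOutput into the encoder:
func encode(_ pixelBuffer: CVPixelBuffer, presentationTime: CMTime) {
    guard let session = compressionSession else { return }
    VTCompressionSessionEncodeFrame(
        session,
        imageBuffer: pixelBuffer,
        presentationTimeStamp: presentationTime,
        duration: .invalid,
        frameProperties: nil,
        sourceFrameRefcon: nil,
        infoFlagsOut: nil
    )
}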

Any help is appreciated, thanks in advance!
