Implement Real-Time Filtering With CIFilter | by Bahadır Sönmez | Nov, 2022

Perform Core Image Filtering on AVFoundation

Photo by Habib Dadkhah on Unsplash

In my previous article, I talked about making a custom filter with CIFilter. In this article, I will talk about how to use CIFilter for real-time filtering. Camera access and camera permission are required for the app to work. Make sure you add the Privacy – Camera Usage Description key to Info.plist.

First, let’s create a class named CameraCapture to process and deliver the images captured by the camera. This class is initialized with the camera position and a callback closure.

typealias Callback = (CIImage?) -> ()

private let position: AVCaptureDevice.Position
private let callback: Callback

init(position: AVCaptureDevice.Position = .front, callback: @escaping Callback) {
    self.position = position
    self.callback = callback
}

Define an AVCaptureSession and a DispatchQueue in the class. It is important to use the .userInitiated quality-of-service class for the queue, because its output will always be visible in the UI.

private let session = AVCaptureSession()
private let bufferQueue = DispatchQueue(label: "someLabel", qos: .userInitiated)

Since the session is private, write two public functions to start and stop it.

func start() { session.startRunning() }
func stop() { session.stopRunning() }

Create a function for session configuration and call it after super.init(). To process the images captured by the camera, CameraCapture must conform to the AVCaptureVideoDataOutputSampleBufferDelegate protocol.

private func configureSession() {
    // 1
    session.sessionPreset = .hd1280x720
    // 2
    let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera], mediaType: .video, position: position)
    guard let camera = discovery.devices.first, let input = try? AVCaptureDeviceInput(device: camera) else {
        // Error handling
        return
    }
    session.addInput(input)
    // 3
    let output = AVCaptureVideoDataOutput()
    output.setSampleBufferDelegate(self, queue: bufferQueue)
    session.addOutput(output)
}

Let’s take a step-by-step look at what’s inside the function.

1. Set the image quality with sessionPreset.
2. Find an appropriate video-capture device with AVCaptureDevice.DiscoverySession, create the capture input with AVCaptureDeviceInput, and add it to the session.
3. Create the output with AVCaptureVideoDataOutput, set the class as its sample buffer delegate, and add the output to the session.

The captured image needs to be converted to a CIImage and fed into the callback closure. Write an extension in which CameraCapture conforms to AVCaptureVideoDataOutputSampleBufferDelegate for this.

extension CameraCapture: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvImageBuffer: imageBuffer)
        self.callback(image.transformed(by: CGAffineTransform(rotationAngle: 3 * .pi / 2)))
    }
}

Make a CIImage with the sampleBuffer in the delegate function and pass it to the callback. Since the incoming image is rotated, it is necessary to rotate it by 270 degrees. As a result, the following class has been created.
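Assembled from the snippets above, the complete CameraCapture class might look like the following sketch (the queue label and session preset are the ones used earlier; the error-handling branch is left minimal):

```swift
import AVFoundation
import CoreImage

class CameraCapture: NSObject {
    typealias Callback = (CIImage?) -> ()

    private let position: AVCaptureDevice.Position
    private let callback: Callback
    private let session = AVCaptureSession()
    private let bufferQueue = DispatchQueue(label: "someLabel", qos: .userInitiated)

    init(position: AVCaptureDevice.Position = .front, callback: @escaping Callback) {
        self.position = position
        self.callback = callback
        super.init()
        configureSession()
    }

    func start() { session.startRunning() }
    func stop() { session.stopRunning() }

    private func configureSession() {
        // 1. Set the image quality.
        session.sessionPreset = .hd1280x720
        // 2. Find a suitable camera and wrap it in a capture input.
        let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
                                                         mediaType: .video,
                                                         position: position)
        guard let camera = discovery.devices.first,
              let input = try? AVCaptureDeviceInput(device: camera) else {
            // Error handling
            return
        }
        session.addInput(input)
        // 3. Create the video output and receive frames on the user-initiated queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: bufferQueue)
        session.addOutput(output)
    }
}

extension CameraCapture: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvImageBuffer: imageBuffer)
        // The camera feed arrives rotated; turn it 270° so it displays upright.
        callback(image.transformed(by: CGAffineTransform(rotationAngle: 3 * .pi / 2)))
    }
}
```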

After creating CameraCapture, filtering can be done using this class without any problems. Make a ViewController with a UIImageView and a CameraCapture instance, for example:

class RealtimeFilterViewController: UIViewController {
    var imageView: UIImageView!
    var cameraCapture: CameraCapture?

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView = UIImageView(frame: view.bounds)
        view.addSubview(imageView)
        cameraCapture = CameraCapture(position: .front, callback: { image in })
        cameraCapture?.start()
    }
}

Now it’s time to filter the image and show it in the callback. Select and apply any of the built-in filters; let’s choose the xRay filter and apply it in the callback closure. Eventually, cameraCapture looks like this:

cameraCapture = CameraCapture(position: .front, callback: { image in
    guard let image = image else { return }
    let filter = CIFilter.xRay()
    filter.inputImage = image
    let uiImage = UIImage(ciImage: filter.outputImage!.cropped(to: image.extent))
    self.imageView.image = uiImage
})

Let’s run it like this. But what is that? Nothing appears, and a message is constantly logged to the console.

2022-11-08 15:06:14.829234+0300 RealtimeFiltering[2903:883376] [api] -[CIContext(CIRenderDestination) _startTaskToRender:toDestination:forPrepareRender:forClear:error:] The image extent and destination extent do not intersect.

The message is quite clear: the image extent and the destination extent do not intersect. We should define a function to transform and scale the image to the extent of our view. Create an extension with this function:

import CoreImage

extension CIImage {
    func transformToOrigin(withSize size: CGSize) -> CIImage {
        let originX = extent.origin.x
        let originY = extent.origin.y
        let scaleX = size.width / extent.width
        let scaleY = size.height / extent.height
        let scale = max(scaleX, scaleY)
        return transformed(by: CGAffineTransform(translationX: -originX, y: -originY))
            .transformed(by: CGAffineTransform(scaleX: scale, y: scale))
    }
}

Now, let’s use this function when defining uiImage, and bam! We have a working real-time filtering application.

let uiImage = UIImage(ciImage: (filter.outputImage!.cropped(to: image.extent).transformToOrigin(withSize: self.view.bounds.size)))

Eventually, RealtimeFilterViewController should look like this:
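Putting the pieces together, the view controller at this stage might look like the following sketch (assuming the CameraCapture class and the CIImage extension defined above; since the callback fires on the capture queue, the UI update is dispatched to the main queue here):

```swift
import UIKit
import CoreImage.CIFilterBuiltins

class RealtimeFilterViewController: UIViewController {
    var imageView: UIImageView!
    var cameraCapture: CameraCapture?

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView = UIImageView(frame: view.bounds)
        view.addSubview(imageView)
        cameraCapture = CameraCapture(position: .front, callback: { [weak self] image in
            guard let self = self, let image = image else { return }
            let filter = CIFilter.xRay()
            filter.inputImage = image
            // Crop to the original extent, then scale to fill the view.
            let uiImage = UIImage(ciImage: filter.outputImage!
                .cropped(to: image.extent)
                .transformToOrigin(withSize: self.view.bounds.size))
            DispatchQueue.main.async { self.imageView.image = uiImage }
        })
        cameraCapture?.start()
    }
}
```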

This works perfectly for a simple filter. The output image looks like this:

input image → output image

But what if multiple filters are used as a chain? Let’s try it. Change the cameraCapture definition as follows:

cameraCapture = CameraCapture(position: .front, callback: { image in
    guard let image = image else { return }
    let filter = CIFilter.thermal()
    let filter2 = CIFilter.xRay()
    let filter3 = CIFilter.motionBlur()
    filter.inputImage = image
    filter2.inputImage = filter.outputImage!
    filter3.inputImage = filter2.outputImage!
    let uiImage = UIImage(ciImage: filter3.outputImage!.cropped(to: image.extent).transformToOrigin(withSize: self.view.bounds.size))
    self.imageView.image = uiImage
})

It still works, but looking at the resource consumption, it is literally draining the device.

CPU, memory, and energy usage of realtime filtering apps

This method is not efficient at all. So what to do? Fortunately, Apple is aware of this and provides a more efficient way: MTKView. Create a class called MetalRenderView that inherits from MTKView.

The application will crash if the device does not support the Metal framework. The most important part of MetalRenderView is the renderImage function. It is called when an image is assigned and fits the image into the MTKView. For more information about MTKView, Apple’s documentation can be consulted.

Now let’s show the filtered image with the help of MetalRenderView. First, replace imageView in RealtimeFilterViewController with a MetalRenderView:

var metalView: MetalRenderView!

Second, change the following line in viewDidLoad…

imageView = UIImageView(frame: view.bounds)

…with this:

metalView = MetalRenderView(frame: view.bounds, device: MTLCreateSystemDefaultDevice())

Then, change these two lines at the end of the callback…

let uiImage = UIImage(ciImage: (filter3.outputImage!.cropped(to: image.extent).transformToOrigin(withSize: self.view.bounds.size)))
self.imageView.image = uiImage

…with this:

self.metalView.setImage(filter3.outputImage?.cropped(to: image.extent))

MetalRenderView handles the transformToOrigin step on its own. Now, RealtimeFilterViewController should look like this:
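With the replacements above applied, the final view controller might look like this sketch (assuming the CameraCapture and MetalRenderView classes defined earlier):

```swift
import UIKit
import MetalKit
import CoreImage.CIFilterBuiltins

class RealtimeFilterViewController: UIViewController {
    var metalView: MetalRenderView!
    var cameraCapture: CameraCapture?

    override func viewDidLoad() {
        super.viewDidLoad()
        metalView = MetalRenderView(frame: view.bounds, device: MTLCreateSystemDefaultDevice())
        view.addSubview(metalView)
        cameraCapture = CameraCapture(position: .front, callback: { [weak self] image in
            guard let self = self, let image = image else { return }
            // Chain three built-in filters, then hand the result to the Metal view.
            let filter = CIFilter.thermal()
            let filter2 = CIFilter.xRay()
            let filter3 = CIFilter.motionBlur()
            filter.inputImage = image
            filter2.inputImage = filter.outputImage!
            filter3.inputImage = filter2.outputImage!
            self.metalView.setImage(filter3.outputImage?.cropped(to: image.extent))
        })
        cameraCapture?.start()
    }
}
```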

Now, let’s run the application again and see the difference. It looks a little better, and the difference becomes more valuable as the number of filters increases or the filters get more demanding.

CPU, memory, and energy usage of realtime filtering apps

Yes, we now have a fully functioning and more efficient real-time filtering application. It can be extended with various filters and UI enhancements. The app could also take still pictures, but that’s a topic for another article.
