(Quasi-)real-time video processing on iOS
In previous posts, I showed you how to create a custom camera using AVFoundation and how to process an image with the Accelerate framework. Let’s now combine both results to create (quasi-)real-time video processing (I’ll explain later what I mean by quasi).
iOS image processing with the Accelerate framework
Some time ago, my friend John Fox asked me how to reproduce a blurring image effect in the iOS SDK. Core Image on iOS does not provide that effect. You can find on the Internet a couple of solutions for iOS that perform the convolution as a matrix multiplication. That’s an acceptable approach, but it does not take advantage of the hardware acceleration.