Drop Filter



This is a brand new series about camera filters. Yeah, actually, it’s the 4th blog post I’ve written :D.

So I’m sure there is a question in your mind: why write only about camera filters? Why not write about other things, such as algorithms, common mistakes when developing apps, or anything else?

Yup, the answer is simple: because I really love it :D. It turns the worst shots into better ones, a poor photographer into a professional, and makes everything seen through the camera look amazing 😀

Here is the end result. Please take a look to see what we will build.

^ So, isn’t it awesome :))? Are you ready to implement it?


Throughout this blog, we’ll work with GPUImage. Rather than being an introduction to GPUImage, this post assumes you are already familiar with it. If not, just spend a few minutes on Google; there are plenty of great articles about it, so there’s no reason to repeat them here.

And if you’re a step-by-step person, please create a new project in Xcode and follow my instructions 😀

Install GPUImage with CocoaPods

Navigate to the root of your project and create a Podfile with the content below.
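A minimal Podfile might look like this (the target name DropFilter and the deployment target are assumptions; adjust them to your project):

```ruby
platform :ios, '7.0'

target 'DropFilter' do
  pod 'GPUImage'
end
```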

Open a terminal, cd to the project’s directory, and run
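```shell
pod install
```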


…… Leave it alone; pod will do everything for you. Just step away or make yourself a cup of coffee.

When it’s finished, you will see something like this


and open your DropFilter.xcworkspace instead of the .xcodeproj.

Write a simple live filter camera

Before going deeper, we should build a simple live filter camera first to understand things clearly.

Here is the anatomy: we’re going to put a GPUImageView on top of the controller’s view.


GPUImageView is a subclass of UIView, so it has all of UIView’s useful properties. It simply wraps all the code that handles rendering the data coming from the camera or a filter.

Open the storyboard, add a UIView to the root view controller, and add constraints to guarantee the camera screen will be full-screen on any device.


Switch to the Identity Inspector and configure it: Class is GPUImageView and Label is Top Camera View.


That’s it for the interface; now we’re going to write the code.

1 – Create a new outlet from the GPUImageView named topCameraImageView.

2 – Add helper code to the interface.

We use GPUImageStillCamera, a subclass of GPUImageVideoCamera. By reusing it, we avoid the pain of working directly with AVFoundation and OpenGL ES. Many thanks to Brad Larson.

GPUImageGrayscaleFilter is just a simple filter that helps us apply grayscale to the camera easily.
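The helper code from step 2 might look roughly like this (a sketch, assuming the outlet from step 1 and ivar names matching the rest of the post):

```objectivec
#import <GPUImage/GPUImage.h>

@interface ViewController ()

// Outlet connected to the GPUImageView in the storyboard
@property (weak, nonatomic) IBOutlet GPUImageView *topCameraImageView;

@end

@implementation ViewController {
    GPUImageStillCamera     *_camera;          // wraps AVFoundation + OpenGL ES
    GPUImageGrayscaleFilter *_grayscaleFilter; // simple grayscale filter
}
```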

3 – Configure them
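A sketch of the configuration, assuming the names above (the session preset is an assumption; AVCaptureSessionPresetPhoto gives the highest still quality):

```objectivec
- (void)viewDidLoad {
    [super viewDidLoad];

    // High quality, back camera, portrait as defaults
    _camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                                  cameraPosition:AVCaptureDevicePositionBack];
    _camera.outputImageOrientation = UIInterfaceOrientationPortrait;

    _grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];

    // Wire the chain: camera -> grayscale -> on-screen view
    [_camera addTarget:_grayscaleFilter];
    [_grayscaleFilter addTarget:self.topCameraImageView];

    [_camera startCameraCapture];
}
```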

^ As you can read, it’s pretty straightforward. We created an instance of the camera with high quality, the back camera, and portrait mode as defaults.

Here is the flow of raw data in the app: GPUImageStillCamera gets raw data from the camera and sends it to a sub-filter for processing. Moreover, GPUImage is built around the decorator pattern, which means you can chain as many filters as you like. Finally, the processed data is sent to the GPUImageView, which presents it on screen.
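The decorator-style chaining described above means extra filters can simply be linked in between (the blur filter here is just an example, not part of this app):

```objectivec
GPUImageGrayscaleFilter *grayscale = [[GPUImageGrayscaleFilter alloc] init];
GPUImageGaussianBlurFilter *blur   = [[GPUImageGaussianBlurFilter alloc] init];

[_camera addTarget:grayscale];             // raw frames -> grayscale
[grayscale addTarget:blur];                // grayscale -> blur
[blur addTarget:self.topCameraImageView];  // final output on screen
```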


So, build and run on your iPhone. Here is the result of what we did.

Two live filters

We’re moving to the harder part of this blog. From now on, we will apply two live filters to the camera at the same time. As the saying goes, a picture is worth a thousand words, so I prepared some useful diagrams to explain.

Flow of data: instead of one flow, we now use two. One is for the grayscale filter, the other for the Amatorka filter. Both process the data and send it to the top/bottom image views, which present it on screen simultaneously.

I admit it costs roughly twice the CPU/GPU of a single filter. But that isn’t a problem; even an iPhone 4S still has enough power to render it fluently at around 30 FPS. I benchmarked it, trust me 😀


And here is the view hierarchy. We create a new GPUImageView (called bottomImageView) and a mask layer.


The mask layer is just a CALayer with frame = { 0, 0, width / 2, height }, assigned as the mask of topImageView’s layer. Only the portion of the original layer covered by the mask stays visible; the uncovered half is hidden, so we can see bottomImageView underneath.


Enough theory; time to implement it yourself.

1 – Open the storyboard, add a new UIView (called bottomImageView, with its class set to GPUImageView), and place it below topImageView.


2 – Open ViewController.m and add an outlet for it, plus a new mask layer and a new filter.
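The new declarations might look like this (a sketch; names follow the ones used later in the post):

```objectivec
@interface ViewController ()

@property (weak, nonatomic) IBOutlet GPUImageView *topCameraImageView;
@property (weak, nonatomic) IBOutlet GPUImageView *bottomImageView;

@end

@implementation ViewController {
    GPUImageStillCamera     *_camera;
    GPUImageGrayscaleFilter *_grayscaleFilter;
    GPUImageAmatorkaFilter  *_amatorkaFilter; // new filter for the bottom view
    CALayer                 *_maskLayer;      // new mask layer
}
```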

Finally, we update the implementation section with new code.

In configureFilter, we initialize GPUImageAmatorkaFilter. In configureImageView, we also initialize bottomImageView. Please note that we add _amatorkaFilter as _camera’s target, and bottomImageView as _amatorkaFilter’s target.
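Put together, the two flows might be wired like this (a sketch using the configureFilter / configureImageView names from the description above):

```objectivec
- (void)configureFilter {
    _grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
    _amatorkaFilter  = [[GPUImageAmatorkaFilter alloc] init];
}

- (void)configureImageView {
    // Flow 1: camera -> grayscale -> top view
    [_camera addTarget:_grayscaleFilter];
    [_grayscaleFilter addTarget:self.topCameraImageView];

    // Flow 2: camera -> Amatorka -> bottom view
    [_camera addTarget:_amatorkaFilter];
    [_amatorkaFilter addTarget:self.bottomImageView];
}
```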

The magic happens in the initMask method. We create a new CALayer and assign it to _topCameraImageView’s layer as a mask. Because the top and bottom image views share the same frame, they simply overlay each other; both filters are rendered simultaneously, so people can’t notice the trick behind it. It works exactly as expected 😀
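A minimal initMask sketch, assuming the ivars above:

```objectivec
- (void)initMask {
    CGSize size = self.topCameraImageView.bounds.size;

    _maskLayer = [CALayer layer];
    _maskLayer.frame = CGRectMake(0, 0, size.width / 2, size.height);
    _maskLayer.backgroundColor = [UIColor blackColor].CGColor; // any opaque color works

    // Only the area covered by the mask stays visible, so the right
    // half of topCameraImageView disappears, revealing bottomImageView.
    self.topCameraImageView.layer.mask = _maskLayer;
}
```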


In action

Make it better

So, if you got through the two sections above, I guarantee you understand what I’m doing. But we shouldn’t release an app with only this basic feature. We should think about how to improve it professionally……

I’m thinking about “What if we could experience or preview filters with our finger?” and “Instead of a rectangle, should we try another shape for the mask layer?”

Yeah, as indie devs, we should keep asking ourselves questions like these. Each unique feature will attract people. Thinking differently is the key to success.

1 – Triangle mask
I don’t like the rectangle mask anymore, so let’s try a triangle instead 😀. Replace your initMask with the new one.
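One way to sketch it with a CAShapeLayer (the exact corner points are assumptions; tweak them until the slant looks right):

```objectivec
- (void)initMask {
    CGSize size = self.topCameraImageView.bounds.size;

    // A polygon whose right edge is slanted instead of vertical
    UIBezierPath *path = [UIBezierPath bezierPath];
    [path moveToPoint:CGPointZero];
    [path addLineToPoint:CGPointMake(size.width * 0.75f, 0)];
    [path addLineToPoint:CGPointMake(size.width * 0.25f, size.height)];
    [path addLineToPoint:CGPointMake(0, size.height)];
    [path closePath];

    CAShapeLayer *shape = [CAShapeLayer layer];
    shape.frame = self.topCameraImageView.bounds;
    shape.path  = path.CGPath;

    _maskLayer = shape;
    self.topCameraImageView.layer.mask = shape;
}
```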


2 – User gestures
As I said before, I want to use my finger to preview any filter I like. It’ll be a unique experience :D. Let’s add a UIPanGestureRecognizer and FeBasicAnimationBlock (a helper that wraps CABasicAnimation’s delegate into a block).



Switch back to ViewController.m

Add the pan gesture to the @interface

And a few new methods
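A sketch of the gesture wiring and handler (FeBasicAnimationBlock is left out here; updateMaskWithTopX: is a hypothetical helper that rebuilds the slanted path at the given x position):

```objectivec
- (void)initGesture {
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    [self.view addGestureRecognizer:pan];
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint location = [pan locationInView:self.view];

    if (pan.state == UIGestureRecognizerStateChanged) {
        // Disable implicit animations so the mask tracks the finger instantly
        [CATransaction begin];
        [CATransaction setDisableActions:YES];
        [self updateMaskWithTopX:location.x]; // hypothetical helper
        [CATransaction commit];
    }
}
```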

Finally, run on your iPhone and preview the live filters with your finger 😀

Filter name

It would be great if we added a filter name label on each side. Of course, the labels must translate along with your finger, so we should implement a small helper API to calculate the center of the cross line.

Yeah, to achieve that, it’s time to grab your notebook and pencil and go back to high school.
Sin, cos, tan, and cotan are waiting for you 😀

It’s easy to follow what I wrote: the goal is to find the position of point O, the center of the cross line.


Add two labels to the @interface

Initialize the two labels
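Initializing them might look like this (the label names and texts are assumptions):

```objectivec
_topFilterNameLabel = [[UILabel alloc] init];
_topFilterNameLabel.text      = @"Grayscale";
_topFilterNameLabel.textColor = [UIColor whiteColor];
[_topFilterNameLabel sizeToFit];
[self.view addSubview:_topFilterNameLabel];

_bottomFilterNameLabel = [[UILabel alloc] init];
_bottomFilterNameLabel.text      = @"Amatorka";
_bottomFilterNameLabel.textColor = [UIColor whiteColor];
[_bottomFilterNameLabel sizeToFit];
[self.view addSubview:_bottomFilterNameLabel];
```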

Add two lines of code after the UIGestureRecognizerStateChanged case.

And here is the math.
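A sketch of that calculation, assuming the dividing line runs from (xTop, 0) at the top edge down to the bottom edge at a fixed tilt angle theta (the sign convention depends on which way your mask slants):

```objectivec
// Midpoint O of the dividing line.
// The line goes from (xTop, 0) to (xBottom, h) where xBottom = xTop - h * tan(theta),
// so O = ((xTop + xBottom) / 2, h / 2).
- (CGPoint)crossLineCenterWithTopX:(CGFloat)xTop angle:(CGFloat)theta {
    CGFloat h       = CGRectGetHeight(self.view.bounds);
    CGFloat xBottom = xTop - h * tanf(theta);
    return CGPointMake((xTop + xBottom) / 2.0f, h / 2.0f);
}
```

The two labels can then be positioned on either side of O.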


In action

What’s next ?

“That’s one small step for you, one giant leap for your career.” Sounds similar to Neil Armstrong’s quote when he stepped onto the moon 😀? Yup, everything we did is just small steps, but if we understand them clearly, we can build an awesome live filter by ourselves 😀

There are many things left to improve: complex Instagram-style filters, and swiping to preview many live filters in the camera view 😀 Here are the achievements I’ve made over one month 😀

Thanks for reading 😀

