Did you ever wonder how Snapchat is able to give you a dog nose so accurately? Well, Vox did the research so you don't have to!
The technology came from a Ukrainian startup called Looksery, which Snapchat acquired in September 2015.
The filters use computer vision, the same technology that lets you deposit a check with your phone, lets Facebook recognize who's in your photos, and lets self-driving cars do their thang.
The first step is detection. How does the computer know which part of the image is a face when all it receives is the data for the color value of each individual pixel? Well, it uses areas of contrast between light and dark parts of the image to identify the face.
This is the Viola-Jones algorithm. It repeatedly scans the image, summing and comparing the pixel values of adjacent rectangular regions; when enough characteristic light-dark patterns line up (the eye region is darker than the forehead and cheeks, for example), it concludes that there's a face there.
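Here's a minimal sketch of that light-dark contrast idea. Viola-Jones makes these rectangle comparisons fast with an "integral image" (a running sum of pixel values), which lets it total any rectangle in constant time. The tiny image and regions below are made up for illustration:

```python
# Sketch of the integral-image trick behind Viola-Jones detection.
# The image, regions, and values here are hypothetical toy data.

def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def region_sum(ii, top, left, bottom, right):
    """Sum of pixels in an inclusive rectangle, in constant time."""
    total = ii[bottom][right]
    if top > 0:
        total -= ii[top - 1][right]
    if left > 0:
        total -= ii[bottom][left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1][left - 1]
    return total

# Toy 4x4 "image": a dark band (like the eye region) above a light band.
img = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [200, 200, 200, 200],
    [200, 200, 200, 200],
]
ii = integral_image(img)
dark = region_sum(ii, 0, 0, 1, 3)   # top half: 80
light = region_sum(ii, 2, 0, 3, 3)  # bottom half: 1600
print(light - dark)  # a big difference = strong light/dark contrast here
```

The real detector runs thousands of these rectangle comparisons at every position and scale, which is why the constant-time sums matter.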
But to apply lipstick, eyeliner, and all the other magical elements in Snapchat filters, the app also has to locate your facial features. It does this with an active shape model, a statistical model of a face shape that's been trained by people manually marking facial feature borders on tons of images.
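The core idea can be sketched in a few lines: start from the average landmark positions the model learned from all those hand-labeled images, nudge each point toward what the image actually shows, and pull the result back toward the average so the shape stays face-like. Every coordinate and the blending scheme below are simplified stand-ins, not Looksery's actual model:

```python
# Toy sketch of the active shape model idea. The landmark coordinates
# and the single-step blend are hypothetical simplifications.

MEAN_SHAPE = [(30.0, 40.0), (70.0, 40.0), (50.0, 70.0)]  # eyes + mouth

def fit_shape(mean_shape, edge_targets, rigidity=0.5):
    """Blend each mean-shape point toward the image's edge evidence.

    rigidity=1.0 trusts the statistical model completely;
    rigidity=0.0 trusts the raw image evidence completely.
    """
    fitted = []
    for (mx, my), (tx, ty) in zip(mean_shape, edge_targets):
        fitted.append((
            rigidity * mx + (1 - rigidity) * tx,
            rigidity * my + (1 - rigidity) * ty,
        ))
    return fitted

# Suppose edge detection found features slightly right of where the
# average face would put them:
targets = [(34.0, 40.0), (74.0, 40.0), (54.0, 70.0)]
print(fit_shape(MEAN_SHAPE, targets))  # landmarks shift halfway right
```

A real fit iterates this adjust-and-constrain loop until the landmarks settle, but the give-and-take between the statistical model and the image is the same.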
Once your facial features have been located, those points are used as coordinates to create a mesh, a 3D mask that can rotate and scale with your movements.
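That "rotate and scale with your movements" step boils down to applying the same transform to every point of the mask that the tracked landmarks underwent. Here's a minimal 2D sketch (the real mesh is 3D, and the overlay points below are invented):

```python
# Sketch of tracking an overlay with head motion: rotate and scale the
# mask's points about an anchor (say, between the eyes). Pure 2D here
# for simplicity; the coordinates are hypothetical.
import math

def transform(points, angle_deg, scale, center):
    """Rotate points by angle_deg and scale them about a center point."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    cx, cy = center
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        out.append((
            cx + scale * (dx * cos_a - dy * sin_a),
            cy + scale * (dx * sin_a + dy * cos_a),
        ))
    return out

# A made-up two-point "dog nose" overlay, tracked as the head tilts 90
# degrees and moves closer to the camera (2x scale):
nose = [(48.0, 55.0), (52.0, 55.0)]
print(transform(nose, 90.0, 2.0, center=(50.0, 50.0)))
```

In the app this happens every frame: the landmarks update, the transform is re-estimated, and the whole mask follows your face.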