
Inside Apple’s Quest To Transform Photography

How does Apple think about iPhone camera design? Obsessively.


This fall, when hundreds of gorgeous, expertly lit portrait shots of friends, relatives, and their pets inevitably begin to dominate your Instagram feed, feel free to thank 17th-century Dutch master painters like Vermeer.

It's the day after Apple's Sept. 12 iPhone event and Apple Senior Vice President Phil Schiller is enthusiastically explaining the origins of the Portrait Lighting feature in the new iPhone 8 Plus and iPhone X. "We didn't just study portrait photography. We went all the way back to paint," he says.

As with every iPhone that preceded them, Apple is touting the cameras in the iPhone 8 Plus and the forthcoming iPhone X as its best ever. This year the company is particularly proud of the pair, which boast a marquee "Portrait Lighting" feature that brings a range of professional-looking effects to the already great photos the dual camera system introduced on the iPhone 7 Plus is capable of taking.

This year's leap, however, feels particularly meaningful. A number of early reviews of the iPhone 8 obsess over the camera — TechCrunch, for example, chose to review the phone exclusively as a camera. And there's a decent argument to be made that the enhancements to the camera systems in the 8 Plus and the X are some of the biggest upgrades in the new line. The camera's effects don't rely on filters. They're the result of Apple's new dual camera system working in concert with machine learning to sense a scene, map it for depth, and then change lighting contours over the subject. It's all done in real time, and you can even preview the results thanks to the company's enormously powerful new A11 Bionic chip. Deployed at Apple's scale, it has the power to be transformative for modern photography, with millions of amateur shots suddenly professionalized. In many ways it's the fullest realization of the democratization of high-quality imagery that the company has been working toward since the iPhone 4.
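Conceptually, the pipeline described here — segment the subject by depth, estimate the contours of the scene, then reshape the light falling on it — can be sketched in a few lines. To be clear, this is a toy illustration, not Apple's implementation: the function name, the depth threshold, and the simple Lambertian-style shading are all assumptions for the sake of the sketch.

```python
import numpy as np

def portrait_relight(image, depth, subject_max_depth=1.5, light_dir=(0.3, -0.5, 0.8)):
    """Toy depth-based relighting: brighten subject pixels whose
    depth-derived surface contours face a virtual light, and dim
    the background (a crude "Stage Light"-style effect)."""
    # Segment the subject: pixels closer than the depth threshold.
    mask = depth < subject_max_depth

    # Approximate surface normals from the depth map's gradients.
    dzdy, dzdx = np.gradient(depth)
    normals = np.dstack([-dzdx, -dzdy, np.ones_like(depth)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    # Lambertian-style shading term against the virtual light direction.
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    shade = np.clip(normals @ light, 0.0, 1.0)

    # Relight the subject; black out (mostly) the background.
    out = image.astype(float)
    out[mask] *= (0.6 + 0.6 * shade[mask])[:, None]
    out[~mask] *= 0.2
    return np.clip(out, 0, 255).astype(np.uint8)
```

The real system, of course, runs a far more sophisticated version of this on dedicated silicon, in real time, on every frame of the preview.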

And to get it right, Apple relied on what it does best: enthusiastic study and deconstruction of the art form it wishes to mimic and advance. In the case of the iPhone 8 Plus and X, this meant poring over the way others have used lighting throughout history — Richard Avedon, Annie Leibovitz, Vermeer.

"If you look at the Dutch Masters and compare them to the paintings that were being done in Asia, stylistically they're different," Johnnie Manzari, a designer on Apple's Human Interface Team, says. "So we asked, why are they different? And what elements of those styles can we recreate with software?"

And then Apple went into the studio and attempted to do just that. "We spent a lot of time shining light on people and moving them around — a lot of time," Manzari says. "We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work."

It's all a bit much, but it hints at the rigorous — perhaps even monomaniacal — attention to detail that works for Apple when it is paired with the company's technological resources, like machine learning. Describing the design process, Schiller takes pains to note the collaboration between the esoteric study and the raw tech.

"There’s the Augmented Reality team, saying, 'Hey, we need more from the camera because we want to make AR a better experience and the camera plays a role in that,'" Schiller says. "And the team that's creating Face ID, they need camera technology and hardware, software, sensors, and lenses to support on-device biometric identification. And so there are many roles the camera plays, either as a primary thing — to take a picture — or as a support thing, to help unlock your phone or enable an AR experience. And so there's a great deal of work between all the teams and all of these elements."

And when all these sides work together there's the potential to create a new paradigm for phone-based photography. When I ask Schiller about the evolution of the iPhone's camera, he acknowledges that the company has been deliberately and incrementally working towards a professional-caliber camera. But he quickly follows up with an addendum that tells you most everything you need to know about Apple and camera design: "It's never just 'let's make a better camera,'" he says. "It's what camera can we create? What can we contribute to photography?"

The company's ubiquitous "Shot on iPhone" advertising campaign, a series of iPhone-shot Time magazine covers, and the professional photographers using the iPhone as they would a Canon or Leica already bear witness to what it can accomplish. And the debut of the updated dual lens camera systems in the iPhone 8 Plus and the iPhone X seems likely to reaffirm it. Though still in beta, Portrait Lighting generally works well — miraculously well in some instances. My lone disappointment with it was failing to pull off a "Girl With a Pearl Earring"-style Stage Light portrait of my dopey dog Fergus. Turns out, portrait mode is designed for people.

Fashion and art photographer Kevin Lu (@sweatengine on Instagram) says he's been legitimately impressed by it, despite the occasional glitch. "It really opens the door to a lot of possibilities for me," he observes. And glitches do happen — after all, Apple is rolling Portrait Lighting out as a beta. But when it works — and in my limited experience it did so frequently — the results are often wonderful.

What Apple's doing is using its software to light a photo as a lighting technician might and, more broadly, stripping away the complexity of the professional cameras and rigs you'd typically need to achieve those effects.

"We're in a time where the greatest advances in camera technology are happening as much in the software as in the hardware," Schiller says. "And that obviously plays to Apple's strengths over traditional camera companies."

But isn't something lost when you use software to simplify and automate a process that's historically been artistic? After all, there's something a bit dystopian feeling about pushing a button and essentially flattening the playing field between professionals and amateurs.

"This is not about dumbing things down," Manzari observes, noting that as devices become more professional, they often become more intimidating. "This is about accessibility. It's about helping people take advantage of their own creativity."

Manzari's point is that there are a lot of great photographers who are not professionally trained. They don't need or want to deal with an array of lenses and tools to calibrate focus and depth of field when they're shooting pictures. Nor should they have to. So why not take all the stuff away and give them something that can take a great shot?

“I think the attempt was less to mimic any one specific style and more to try to touch on the range of styles that exist, so we can try to find something for everybody, from core elements of style, rather than a particular design point,” Schiller says when I observe that “Stage Light” really does look like Vermeer, when it blacks out a subject's background.

"Truthfully, it wasn’t ‘make this one look like X,’” he says. “It was more, ‘let’s get enough range so that everyone has some different choices for different situations that cover key use cases.’ So it's learning from the way others have used lighting throughout history, and around the world.”

It's worth noting that Apple has been working towards this in ways that are far less flashy than Portrait Lighting. The cameras on the 8 Plus and the X, for example, detect a snowy scene and automatically adjust white balance, exposure, and other settings so you don't need to worry about them. "It's all seamless; the camera just does what it needs to," says Schiller. "The software knows how to take care of it for you. There are no settings."

There are no settings. In other words, yes, "it just works."
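The scene-aware behavior Schiller describes can be thought of as a detector feeding a preset. The sketch below is purely illustrative — the heuristic (lots of bright, low-saturation pixels means snow) and the adjustment values are assumptions, not Apple's actual logic, which runs on far richer machine-learned scene classification.

```python
import numpy as np

def detect_snow(image):
    """Crude snow heuristic: a large share of bright, low-saturation pixels."""
    img = image.astype(float) / 255.0
    brightness = img.max(axis=2)
    saturation = img.max(axis=2) - img.min(axis=2)
    snowy = (brightness > 0.8) & (saturation < 0.1)
    return snowy.mean() > 0.5

def auto_adjust(image):
    """Apply a snow preset when detected: boost exposure (meters tend to
    underexpose bright snow) and pull back the blue cast slightly."""
    if not detect_snow(image):
        return image
    out = image.astype(float) * 1.3   # roughly a +0.4 EV exposure boost
    out[..., 2] *= 0.95               # warm the white balance (RGB order assumed)
    return np.clip(out, 0, 255).astype(np.uint8)
```

The point of "no settings" is exactly this shape: the detection and the correction both happen before the user ever sees a dial.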

Perhaps none of this should come as a surprise. Not only has Apple slowly been building toward this level of photography with the iPhone for the better part of a decade, but the blurring of the lines between professionalism and amateurism — laying a friendly, simple veneer atop dizzying and complex technology — has been a hallmark of Apple's innovation since its inception.

It's the principle behind the company's creation of a computer desktop and drag-and-drop file folder organization system. And it's the same ethos behind programs like iMovie, iPhoto, and GarageBand, which gave anyone with an interest and a bit of creativity tools that looked almost like the pros'. Inside the new iPhone cameras that's all taken a step further. No skeuomorphic interfaces to navigate. Just a button, some lenses, and any number of complex algorithms and processors. And some striking results. Vermeer-esque, even.

"We think the best way to build a camera is by asking simple, foundational questions about photography," Manzari says. "What does it mean to be a photographer? What does it mean to capture a memory? If you start there — and not with a long list of possible features to check off — you often end up with something better. When you take away the complexity of how the camera works, the technology just disappears. Then people can apply all their creativity to that moment they're capturing. And you get some incredible photographs."

John Paczkowski is the managing editor for BuzzFeed San Francisco. Formerly deputy managing editor for Re/code and AllThingsD, he's been covering the intersection of technology and culture since 1997.

Contact John Paczkowski at John.Paczkowski@buzzfeed.com.
