
The future of photography is a camera made of code

On devices like the iPhone X, what we think of as a camera is largely a collection of software algorithms, ones that increasingly let us take photos no traditional camera could ever take.

A portrait photo feature new to the iPhone shows how far the field of computational photography has come

An Apple employee demonstrates Portrait Lighting on an iPhone X inside the Steve Jobs Theatre at Apple Park in Cupertino, Calif. The new feature uses facial recognition and augmented reality technologies to simulate different types of lighting on a person's face. (Matthew Braga/CBC News)

Back in 2010, a team from Stanford University's computer graphics lab got their hands on a Nokia N900. It had a pretty good camera by smartphone standards at the time, but the researchers thought they could make it better with a little bit of code.

The Stanford team, led by professor Mark Levoy, was working on the cutting edge of a nascent field known as computational photography. The theory was that software algorithms could do more than dutifully process photos; they could actually make the photos better.

"The output of these techniques is an ordinary photograph, but one that could not have been taken by a traditional camera," is how the group described its efforts at the time.

Fast forward to today, and many of the techniques that Levoy and his team worked on, yielding features like HDR and better photos in low light, are now commonplace. And in Cupertino, Calif., on Tuesday, Apple's iPhone event was another reminder of just how far smartphone technology has come.
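The HDR idea mentioned above can be sketched simply: combine several bracketed exposures of the same scene, weighting each pixel toward the shots where it was well exposed. The sketch below is purely illustrative; the function names and the triangle weighting scheme are assumptions for demonstration, not any vendor's actual algorithm.

```python
def merge_exposures(frames, exposures):
    """Merge bracketed exposures into one radiance estimate per pixel.

    Each pixel's radiance is estimated as value / exposure_time, averaged
    with weights that favour mid-range (well-exposed) values -- the core
    idea behind simple HDR merging. Frames are nested lists of 0-255
    values; the weighting scheme here is illustrative only.
    """
    h, w = len(frames[0]), len(frames[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for frame, t in zip(frames, exposures):
                v = frame[y][x]
                weight = min(v, 255 - v) + 1   # favour mid-range pixels
                num += weight * (v / t)        # radiance estimate
                den += weight
            out[y][x] = num / den
    return out

# Two bracketed shots of the same scene: short and long exposure.
short = [[16, 128]]   # 1/100 s
long_ = [[64, 255]]   # 1/25 s -- the bright pixel has clipped to 255
hdr = merge_exposures([short, long_], [0.01, 0.04])
```

Because a clipped pixel (255) gets almost no weight, the merged result leans on whichever exposure actually captured detail there.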

What we think of as a camera is largely a collection of software algorithms that expands with each passing year.

The iPhone X has a front-facing camera system that senses depth, and can be used to unlock the device using facial recognition. But it is also used for photo processing when taking selfies. (Matthew Braga/CBC News)

Take Portrait Lighting, a feature new to the iPhone 8 Plus and iPhone X. Apple says it "brings dramatic studio lighting effects to iPhone." And it's all done in software, of course. Here's how an Apple press release describes it:

"It uses the dual cameras and the Apple-designed image signal processor to recognize the scene, create a depth map and separate the subject from the background. Machine learning is then used to create facial landmarks and add lighting over contours of the face, all happening in real time."

In other words, Apple is combining techniques used in augmented reality and facial recognition to create a photo that, to paraphrase the Stanford team, no traditional camera could take. On the iPhone X, the company is also using its facial recognition camera system, which can sense depth, to do similar tricks.
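The pipeline Apple describes — a depth map, subject segmentation, then synthetic lighting — can be sketched in a heavily simplified form. Everything below (the depth threshold, the uniform gain standing in for contour-aware relighting) is an illustrative assumption, not Apple's implementation:

```python
import numpy as np

def separate_subject(image, depth, near_threshold=1.5):
    """Split an image into subject and background using a depth map.

    Pixels closer than `near_threshold` (hypothetical units) are treated
    as the subject -- a crude stand-in for depth-based segmentation.
    """
    subject_mask = depth < near_threshold          # boolean (H, W)
    subject = image * subject_mask[..., None]      # keep subject pixels
    background = image * ~subject_mask[..., None]  # keep background pixels
    return subject, background, subject_mask

def relight(image, mask, gain=1.5):
    """Brighten masked (subject) pixels to mimic a studio-light effect."""
    out = image.astype(np.float32)
    out[mask] = np.clip(out[mask] * gain, 0, 255)
    return out.astype(np.uint8)

# Toy 4x4 RGB frame: subject (depth 1.0) on the left, background (3.0) right.
image = np.full((4, 4, 3), 100, dtype=np.uint8)
depth = np.where(np.arange(4) < 2, 1.0, 3.0) * np.ones((4, 1))
subject, background, mask = separate_subject(image, depth)
lit = relight(image, mask)
```

A real implementation would use learned facial landmarks to shape the lighting over the contours of the face; the uniform gain here only shows where the depth mask fits into the pipeline.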

While the underlying techniques behind many of these features aren't necessarily new, faster and more capable processors have made it feasible to do them on a phone. (Apple says its new phones even have a dedicated chip for machine learning tasks.)

The computational photography features found in the iPhone 8 Plus and iPhone X were demonstrated in the lobby outside the Steve Jobs Theatre following Tuesday's announcement. (Matthew Braga/CBC News)

With the iPhone 7 Plus, Apple introduced a feature called Portrait Mode, on which Portrait Lighting is built. It uses machine learning to blur the background of an image, creating the illusion of a portrait lens's shallow depth of field, an effect called bokeh. Samsung introduced a similar feature, called Live Focus, on its recently announced Note 8.
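At its core, this kind of simulated bokeh amounts to: blur the whole frame, then composite the sharp subject back over it using a segmentation mask. A minimal sketch, where the naive box blur is a stand-in for the much fancier lens-blur kernels real portrait modes use:

```python
import numpy as np

def box_blur(channel, radius=1):
    """Naive box blur over a 2-D array -- a placeholder for the
    depth-varying lens blur real portrait modes approximate."""
    h, w = channel.shape
    out = np.zeros_like(channel, dtype=np.float32)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = channel[y0:y1, x0:x1].mean()
    return out

def fake_bokeh(image, subject_mask, radius=1):
    """Composite the sharp subject over a blurred background."""
    blurred = box_blur(image.astype(np.float32), radius)
    return np.where(subject_mask, image, blurred)

# Toy grayscale frame: one bright "subject" pixel on a dark background.
image = np.zeros((5, 5), dtype=np.float32)
image[2, 2] = 255.0
subject_mask = np.zeros((5, 5), dtype=bool)
subject_mask[2, 2] = True
result = fake_bokeh(image, subject_mask)
```

The subject pixel stays sharp while its light bleeds into neighbouring background pixels, which is the visual signature of bokeh.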

And it probably won't come as a surprise that Levoy, the Stanford professor, joined Google in 2011, not long after his team published a paper detailing their Nokia N900 work. He's still doing computational photography research, and recent work on improving the quality of HDR images made its way into Google's most recent Pixel phone.

It used to be that those post-processing tricks put the emphasis on post. You'd take your photo and then have to bring it into an app on your phone or laptop to get a similar kind of effect, or wait as the smartphone's camera did the processing itself. But with each new generation of smartphone, the algorithms get faster, more capable, and fade further into the background, turning code into its own kind of lens.