The iPhone 7 Plus’s new dual-camera setup is great, but one feature stands out more than any other: Portrait mode. Using both cameras, the iPhone 7 Plus can work out where things in the scene are in relation to each other and, without any Photoshop work, make a smartphone photo look like it was taken with a DSLR. Let’s look at how it works and how to use it.
Update: This article was originally written when Portrait Mode was first introduced on the iPhone 7 Plus. It’s since been included on other iPhones, including all iPhone 11 models as well as the iPhone XR, iPhone XS, and iPhone X.
How the iPhone Emulates Shallow Depth of Field and the Portrait Look
RELATED: How to Manipulate Depth of Field to Take Better Photos
In photography, an image with a shallow depth of field is one where the subject is in focus but everything else is blurry and out of focus. It creates a really pleasing look that’s especially flattering for portraits.
RELATED: What Is Aperture?
Normally, this effect is created with a wide aperture: the wider the aperture, the shallower the depth of field. Due to the limits of smartphone camera sensors, however, it’s impossible to get a very shallow depth of field even with a wide-aperture lens. The small sensor size and the need for all the components to fit inside a phone just won’t allow it.
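To put rough numbers on that, here’s a quick back-of-the-envelope comparison using the standard depth-of-field approximation (roughly 2 × f-number × circle of confusion × subject distance² ÷ focal length², for subjects well inside the hyperfocal distance). The sensor and lens figures below are ballpark assumptions, not official specs, but the gap they show is the point:

```swift
import Foundation

/// Approximate depth of field (in metres) for a subject at distance `s`,
/// using DOF ≈ 2 * N * c * s^2 / f^2 (valid when s is well below the
/// hyperfocal distance). Lengths are in millimetres except the subject distance.
func approximateDOF(fNumber N: Double,
                    circleOfConfusionMM c: Double,
                    focalLengthMM f: Double,
                    subjectDistanceM s: Double) -> Double {
    let sMM = s * 1000.0
    let dofMM = 2.0 * N * c * sMM * sMM / (f * f)
    return dofMM / 1000.0
}

let subjectDistance = 2.0 // metres

// Ballpark figures for a phone telephoto camera (assumed values, not specs).
let phoneDOF = approximateDOF(fNumber: 2.8,
                              circleOfConfusionMM: 0.004,
                              focalLengthMM: 6.6,
                              subjectDistanceM: subjectDistance)

// A full-frame DSLR with the "equivalent" 56mm lens at the same aperture.
let dslrDOF = approximateDOF(fNumber: 2.8,
                             circleOfConfusionMM: 0.029,
                             focalLengthMM: 56.0,
                             subjectDistanceM: subjectDistance)

print(String(format: "Phone DOF: ~%.1f m, DSLR DOF: ~%.2f m", phoneDOF, dslrDOF))
// Roughly 2 m of the scene stays sharp on the phone vs. ~0.2 m on the DSLR,
// which is why the phone can't blur the background optically.
```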
The iPhone 7 Plus’s Portrait mode, seen in the image above, fakes the effect. Instead of relying on a wide-aperture lens, it uses its two cameras to build a depth map of the scene and selectively blurs the areas it knows are farther away. When it gets it right, it emulates the look of a portrait shot on a DSLR really well.
To understand how this works, hold your index finger about eighteen inches in front of your face and stare at it. First close your left eye. Then open your left eye and close your right eye. When you switch, you should see your finger appear to move in relation to the background. This is the parallax effect in action.
Each of your eyes views the finger from a slightly different perspective. Your brain merges the two inputs into one, so you rarely notice the effect in everyday life, but this binocular vision is what gives you the ability to perceive depth. You can pick a glass up from the table without spilling it only because each eye sees it from a different point, which lets your brain triangulate its position relative to you.
With two cameras, the iPhone 7 Plus also has a form of binocular vision. By analyzing how the image differs between the two cameras, it can create a depth map and work out where different objects sit in the scene.
Hold your finger in front of your face again, this time a little closer. Notice how it moves more against the background than before? Now hold it as far away as you can. See how it moves less?
The same thing happens with the iPhone: objects close to the phone shift a lot between the two cameras’ images, while things that are far away, like the background, barely move at all.
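This isn’t Apple’s actual implementation, but the underlying geometry is standard stereo vision: the apparent shift of a point between the two cameras (its disparity) is inversely proportional to its distance. A minimal sketch, assuming you already know the cameras’ focal length in pixels and the baseline between the two lenses:

```swift
/// Estimate distance from the camera using classic stereo geometry:
/// depth = focalLength * baseline / disparity.
/// `focalLengthPixels` is the focal length expressed in pixels,
/// `baselineMeters` is the distance between the two lenses, and
/// `disparityPixels` is how far a feature shifts between the two images.
func depthInMeters(focalLengthPixels: Double,
                   baselineMeters: Double,
                   disparityPixels: Double) -> Double? {
    guard disparityPixels > 0 else { return nil } // no shift means effectively "infinitely" far away
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Illustrative numbers only: a ~10 mm baseline and a matched feature
// that shifts 24 pixels between the wide-angle and telephoto images.
if let depth = depthInMeters(focalLengthPixels: 2400, baselineMeters: 0.010, disparityPixels: 24) {
    print("Estimated depth: \(depth) m") // 1.0 m: large shifts mean close objects
}
```

Repeat that for every matched point between the two images and you have a depth map: large disparities mean nearby objects, tiny ones mean background.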
With the depth map built, all that’s left is for the phone to work out which areas you want sharp and which should be blurred to create the portrait look. Using a combination of machine learning and the depth information it has gathered, the iPhone makes a best guess at the subject of the shot and keeps it in focus; everything else gets blurred to some degree. Most of the time, it gets it pretty much right.
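As a toy illustration of what “selectively blurs” means (again, not Apple’s pipeline), the sketch below blends a sharp image with a pre-blurred copy of it, pixel by pixel, using each pixel’s distance from the subject’s depth as the blend weight. The function name, the 0.5 m falloff, and the flattened-grayscale representation are all just assumptions for the example:

```swift
/// Blend a sharp image with a blurred copy, using the depth map as a mask.
/// All three arrays are flattened grayscale images of the same size, with
/// pixel values in 0...1 and depths in metres. Pixels near `subjectDepth`
/// stay sharp; pixels farther away fade toward the blurred version.
func applyFakeDepthEffect(sharp: [Double],
                          blurred: [Double],
                          depthMap: [Double],
                          subjectDepth: Double,
                          falloffMeters: Double = 0.5) -> [Double] {
    precondition(sharp.count == blurred.count && sharp.count == depthMap.count)
    return (0..<sharp.count).map { i in
        // 0 = keep sharp, 1 = fully blurred, ramping up with distance from the subject.
        let distanceFromSubject = abs(depthMap[i] - subjectDepth)
        let blurAmount = min(max(distanceFromSubject / falloffMeters, 0), 1)
        return sharp[i] * (1 - blurAmount) + blurred[i] * blurAmount
    }
}
```

A real pipeline would vary the blur radius per pixel rather than blending two fixed layers, but the principle is the same: the depth map decides how blurred each part of the frame ends up.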
How to Use Portrait Mode
Open the Camera app on your iPhone. To get to Portrait mode, swipe once to the left or tap where it says Portrait above the shutter button.
Using Portrait mode is largely automatic. Frame your subject in the viewfinder; if you want to adjust the exposure or tell the camera what the subject is, tap them on the screen.
When the effect is locked in, you’ll see a live preview, and the Depth Effect box at the bottom of the screen will turn yellow.
Your subject needs to be between about 0.5m and 2.5m from the camera. If they’re too close or too far away, you’ll get a warning and the effect won’t work.
When you’re ready, tap the shutter button to take a shot. You should get something that looks a little like the photo below.
In addition to the photo with the Depth Effect applied, you’ll also get a regular photo without it, in case things don’t work out.
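If you’d rather get at the same depth data in your own app, AVFoundation exposes it on supported devices running iOS 11 or later (this wasn’t possible when Portrait mode first shipped). Here’s a minimal, stripped-down sketch; error handling, permissions, and the camera preview are left out:

```swift
import AVFoundation

class DepthCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // The dual (wide + telephoto) camera is what makes depth capture possible.
        guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        guard session.canAddOutput(photoOutput) else { return }
        session.addOutput(photoOutput)

        // Ask for a depth map alongside the photo, if the hardware supports it.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

        session.commitConfiguration()
        session.startRunning()
    }

    func capture() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // photo.depthData holds the per-pixel depth (or disparity) map
        // that drives effects like the one described above.
        if let depthData = photo.depthData {
            print("Got a depth map: \(depthData.depthDataMap)")
        }
    }
}
```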
It’s worth noting that while it’s billed as Portrait mode, you can do a lot more with it. I love using it to take photos of small, nearby objects, like this bumblebee.
The depth effect works really well here.
How Good Is the Depth Effect?
Apple is quite clear that Portrait Mode is still in beta, and occasionally it shows. When there are soft edges between the subject and the background, it works great. However, when there are hard edges or transparent areas, like in the image below, the wrong areas can get blurred.
Similarly, the effect will never look identical to a photo taken with a DSLR and a wide-aperture lens; it only approximates it. If you zoom in and check the edges, you’ll probably find some odd artifacts.
Overall, though, Portrait mode is a great addition to the iPhone. It might not always look perfect, but the Depth Effect is a great way to isolate the subjects in your images. It won’t work for every photo, but it can make your portraits and close-up shots stand out.