When you begin developing an app with camera features, it’s tempting to choose the single back-facing wide lens (the “1x lens”): it’s a safe choice, available on every iPhone model since the first. But this choice comes with a problem: you are unintentionally deciding for iOS, telling it to use the wide lens, and only the wide lens, for photo capture.
This becomes a very real issue with the iPhone 13 Pro: not allowing iOS to pick the best lens to use results in poor (and often unusable) image quality. Here are two apps I frequently use on my iPhone:
Lululemon (left) cannot scan a credit card1 properly on the iPhone 13 Pro. Foodnoms (right) has trouble scanning the barcode up close (it’s a box of Nespresso pods with an equally small barcode).
Why is this happening?
Since the iPhone 7 Plus, the first with a dual camera system, iOS has been able to decide the best lens to use to capture the image, depending on a few factors (and possibly many more):
- The zoom factor, obviously: on the iPhone 13 Pro, the 3x lens physically cannot take a 2x photo, so it is not considered
- A lens might be blocked by a finger or some other object
- A lens might be out of focus for a close-up shot
- A telephoto lens might produce poor quality in a low-light environment, and iOS decides it might as well crop the image from the 1x lens
The iPhone 13 Pro, with its macro photography capabilities, really kicked off the discussion on “smart” lens switching. The 26 mm wide-angle lens (the “1x lens”) on the iPhone 13 Pro has a minimum focus distance of approximately 8–10 cm. That’s about as long as the diagonal of your credit cards, and a much longer distance than the minimum focus distance on older iPhone models.
Many developers, when starting out with AVFoundation or copying and pasting code from Stack Overflow, inadvertently tell iOS to stick to the 1x lens no matter what. They use it to scan barcodes, identify credit cards, or even take photos. It all worked well until hell broke loose with the iPhone 13 Pro.
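The problematic pattern typically looks something like this (a minimal sketch; the exact call varies from app to app, but the key mistake is explicitly requesting `.builtInWideAngleCamera`):

```swift
import AVFoundation

// The common anti-pattern: explicitly pinning capture to the 1x wide lens.
// With this device, iOS will never switch to the ultra-wide or telephoto
// lens, even when the subject is closer than the wide lens's minimum
// focus distance (as with a credit card on the iPhone 13 Pro).
let wideOnly = AVCaptureDevice.default(
    .builtInWideAngleCamera,
    for: .video,
    position: .back
)
```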
What’s the solution, as a developer?
Unless you are developing a camera app that must strictly use the user-specified lens, you should defer the lens choice to iOS. For the rear-facing cameras, you should adopt the first available AVCaptureDevice object in the following order:
- Triple Camera, a system that switches automatically among the three lenses
- Dual Wide Camera, a system that switches between 0.5x and 1x as appropriate
- Dual Camera, a system that switches between 1x and the telephoto lens (2x, 2.5x, or 3x depending on the phone model)
- Wide Camera, the 1x single-lens camera
The table below lists the availability of each camera device type:
| Device | Triple Camera | Dual Wide Camera | Dual Camera | Wide Camera |
|---|---|---|---|---|
| iPhone 13 Pro | Yes | Yes | Yes | Yes |
Device type availability on key iPhone models, from the latest to the oldest still supported by iOS 15.
Many developers chose the wide lens because it’s “safe” and available on all iPhone models. They are wrong. Don’t do that.
To put all the above in code:
```swift
import AVFoundation

func selectBackCamera() -> AVCaptureDevice? {
    let deviceTypes: [AVCaptureDevice.DeviceType] = [
        .builtInTripleCamera,
        .builtInDualWideCamera,
        .builtInDualCamera,
        .builtInWideAngleCamera,
    ]
    let discoverySession = AVCaptureDevice.DiscoverySession(
        deviceTypes: deviceTypes,
        mediaType: .video,
        position: .back
    )
    let selectedDevice = discoverySession.devices.first
    return selectedDevice // this can be `nil`
}
```
Note that the discovery session returns devices in the order you specified via `deviceTypes` (as per Apple documentation). The first element in the `devices` array is therefore the proper device for a general-purpose camera.
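To illustrate where the discovered device goes, here is a minimal sketch of feeding it into a capture session (error handling elided; the function name and structure are my own, not from the original article):

```swift
import AVFoundation

// A minimal sketch: discover the best back camera and add it to a session.
// Returns nil when no camera is available (e.g. in the Simulator).
func makeBackCameraSession() -> AVCaptureSession? {
    let deviceTypes: [AVCaptureDevice.DeviceType] = [
        .builtInTripleCamera,
        .builtInDualWideCamera,
        .builtInDualCamera,
        .builtInWideAngleCamera,
    ]
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: deviceTypes,
        mediaType: .video,
        position: .back
    )
    guard let device = discovery.devices.first,
          let input = try? AVCaptureDeviceInput(device: device)
    else { return nil }

    let session = AVCaptureSession()
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)
    return session
}
```

Because the multi-lens device types sort first, iOS is free to switch lenses behind the scenes; your session code does not change at all.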
I traced a credit card on the iPad. I didn’t want to expose my card number, and I couldn’t redact it either, as that would defeat the purpose of showing the blur. In practice, the Lululemon app managed to scan two of my credit and debit cards and struggled with the others on the iPhone 13 Pro. The app scanned all cards fine, without breaking a sweat, on my iPhone XS. ↩