
The iPhone 12 Pro’s cameras have some new tricks that serious photographers will love

Apple's iPhone 12 Pro phones have new camera features, including a larger image sensor, a faster main camera lens, improved image stabilization, a lidar sensor for autofocus in low light, and a longer-range telephoto lens on the iPhone 12 Pro Max.


Screenshot by Stephen Shankland / CNET

Apple’s iPhone 12 and iPhone 12 Mini add significant new photography capabilities, but the camera hardware and computational photography software in the top-end iPhone 12 Pro models really show how hard Apple is working to attract photo and video enthusiasts.

Changes to the iPhone 12 Pro models include new features for merging multiple images into one superior shot and a lidar sensor for improved autofocus. The iPhone 12 Pro Max also has a larger sensor on the main camera for better low-light performance, a telephoto camera that better magnifies distant subjects, and better stabilization to counteract your trembling hands.



The iPhone 12, iPhone 12 Mini, iPhone 12 Pro and iPhone 12 Pro Max debuted at Apple’s iPhone 12 launch event Tuesday. The iPhone 12 (from $799, £799, AU$1,349) and the 12 Mini (from $699, £699, AU$1,199) stick to last year’s design with regular, ultrawide and selfie cameras.

The bigger photography improvements come with the 12 Pro (from $999, £999, AU$1,699) and 12 Pro Max (from $1,099, £1,099, AU$1,849), which add a third, telephoto camera for more distant subjects and, on the Pro Max, a larger image sensor. The iPhone 12 Pro has the same 2x telephoto reach as previous iPhones – a 52mm-equivalent focal length – while the 12 Pro Max stretches to 2.5x, or a 65mm-equivalent lens.
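The zoom factors and focal lengths are two views of the same number: the equivalent focal length scales linearly with the zoom factor. Assuming the roughly 26mm-equivalent main camera that recent iPhones use as the 1x baseline (an assumption here, not stated above), the arithmetic is:

```python
def equivalent_focal_length(base_mm, zoom):
    # Equivalent focal length scales linearly with the zoom factor.
    return base_mm * zoom

# Assuming a ~26mm-equivalent main camera as the 1x baseline:
telephoto_pro = equivalent_focal_length(26, 2.0)      # 52mm on the 12 Pro
telephoto_pro_max = equivalent_focal_length(26, 2.5)  # 65mm on the 12 Pro Max
```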

The iPhone 12 and iPhone 12 Mini are also significantly improved. They gain night mode photos, which now also work on the ultrawide and selfie cameras, and an improved HDR mode for challenging scenes that mix light and dark elements.

Apple’s efforts in this area reflect the fact that consumers consider the camera to be one of the most important features of a smartphone, along with processor and network speed. We take photos and videos to document our lives, to share with friends and family and to enjoy artistic expression.

Computational photography tricks

HDR stands for high dynamic range – the ability to capture shadow detail without turning highlights into a washed-out mess. All the new iPhones get third-generation Smart HDR technology, which Apple says better captures details like silhouetted faces. It also uses machine learning to judge processing choices like brightening dark areas.



The iPhone 12 Pro models get another computational photography technology Apple calls ProRaw. iPhones and Android phones have been able to take raw photos for years – an unprocessed alternative to JPEG that lets photographers decide how best to manipulate an image. Apple ProRaw combines Apple’s computational photography with the raw format, so photographers get its noise reduction and dynamic range along with the flexibility of raw images, according to Apple. It’s similar to the computational raw technology Google introduced with the Pixel 3 in 2018.

Google pioneered the processing tricks known as computational photography, helping to erase the comfortable lead in image quality that Apple’s early iPhones had enjoyed for years.

With the iPhone 11, however, Apple adopted its own versions of some Google techniques, such as combining multiple underexposed frames into one shot to capture shadow detail without turning the sky into an overexposed whiteout. Google calls it HDR+; Apple calls it Smart HDR. A related technology called Deep Fusion combines frames for better detail and texture, especially in low light.
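The core idea behind these multi-frame techniques can be sketched in a few lines. This is a toy illustration, not Apple's Smart HDR or Google's HDR+ pipeline (which align frames, weight them, and tone-map locally): averaging a burst of deliberately underexposed frames suppresses random sensor noise while keeping highlights unclipped, and the shadows are then brightened afterward.

```python
import numpy as np

def merge_frames(frames):
    """Average a burst of aligned, underexposed frames.

    Averaging N frames cuts random sensor noise by roughly sqrt(N),
    and because every frame was underexposed, highlights stay unclipped.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

def tone_map(image, gain=4.0):
    """Brighten shadows with a simple global curve (a stand-in for the
    local tone mapping that real HDR pipelines apply after merging)."""
    scaled = np.clip(image * gain / 255.0, 0.0, 1.0)
    return (np.sqrt(scaled) * 255.0).astype(np.uint8)

rng = np.random.default_rng(0)
scene = np.full((4, 4), 40.0)   # a dark, underexposed toy "scene"
frames = [scene + rng.normal(0, 5, scene.shape) for _ in range(8)]
merged = merge_frames(frames)   # noise is much lower than any single frame
result = tone_map(merged)       # shadows lifted after the clean merge
```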

On the iPhone 12, Apple’s Deep Fusion technology taps all of the major parts of the A14 Bionic chip, including the main CPU, image signal processor, graphics processor and neural engine. That means Apple can apply Deep Fusion to every camera on every iPhone 12 model, according to the company. And it means the iPhone’s portrait shots now work in night mode, matching an ability Google added with its Pixel 5.



Better iPhone camera hardware

The larger sensor on the iPhone 12 Pro Max – 47% larger than the main camera sensor on the iPhone 11 – allows bigger pixels. That design choice raises the sensor’s cost but collects more light for better color, less noise and improved performance in dim conditions.

The 12 Pro Max also stabilizes images by moving the sensor rather than the lens elements. According to Apple, that lets you take handheld shots with exposures as long as a surprisingly slow 2 seconds.

All iPhone 12 models also benefit from a larger f/1.6 aperture on the main camera for better light-gathering ability. And the ultrawide camera now gets optical image stabilization.

The phones also gain better video abilities, with 10-bit encoding to better capture color and brightness, and support for Dolby Vision HDR video technology. The iPhone 12 Pro models can record HDR at 60 frames per second, the iPhone 12 and 12 Mini at 30 frames per second.

Ordinary 4K and 1080p video can be recorded at up to 60 frames per second, and 1080p slow-motion video reaches 240 frames per second. Time-lapse videos are now stabilized.

The Apple iPhone 12 brings night mode to ultrawide and selfie cameras, not just the main camera.


Screenshot by Stephen Shankland / CNET

What the iPhone doesn’t do

But Apple didn’t chase every headline-grabbing photography technique.

For example, the iPhone 12 doesn’t use pixel binning, a technique that pairs much higher-resolution sensors with on-chip pixel merging for photographic flexibility. Pixel binning combines data from groups of four or nine neighboring pixels to produce the color information for a single pixel in the captured photo. When there’s enough light, the phone can skip the binning and take a photo at much higher resolution, which yields more detail or more room to crop to the important part of the scene.
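The binning step itself is simple to sketch. Here is a minimal grayscale toy version of 2x2 binning with numpy; real sensors bin same-color pixels within a color-filter mosaic, which this illustration ignores:

```python
import numpy as np

def bin_pixels(sensor, factor=2):
    """Sum each factor x factor block of sensor pixels into one output
    pixel (2x2 binning merges four neighbors, 3x3 merges nine)."""
    h, w = sensor.shape
    assert h % factor == 0 and w % factor == 0
    blocks = sensor.reshape(h // factor, factor, w // factor, factor)
    return blocks.sum(axis=(1, 3))

raw = np.arange(16).reshape(4, 4)  # toy 4x4 "sensor" readout
binned = bin_pixels(raw)           # 2x2 result: quarter resolution,
                                   # four times the signal per pixel
```

In bright light a quad-Bayer sensor can skip this step and read out every pixel individually, which is the full-resolution mode the paragraph above describes.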

Another new trick the iPhone skipped is a telephoto camera with much higher magnification. The Huawei P40 Pro Plus, for example, has an impressive 10x optical zoom. This is hard because the laws of physics make telephoto optics physically long, but smartphone makers like Huawei and Samsung tackle the problem with a mirror that bends the light path sideways inside the phone.

Apple might have other tricks up its sleeve, however. In 2017, Apple acquired the image sensor startup InVisage, whose QuantumFilm technology showed great promise for downsizing image sensors or improving image quality.

Apple already accomplishes a lot with computational photography alone, notably a portrait mode that simulates the blurred-background “bokeh” of high-end cameras and the lighting effects that can be applied to those portraits.
