Which cell phone has good pixels?

Recently, Xiaomi released a phone with a 1-inch camera sensor, Lenovo's Moto launched a 200-megapixel phone, and the iPhone 14 series due in September is expected to use a 48-megapixel sensor, which has sparked plenty of discussion. Even Apple is moving toward the high-pixel era; it seems the smartphone imaging race is accelerating once again.

As we know, as early as 2018, after the Sony IMX586 sensor came out, 48-megapixel models were introduced one after another. Together with the popularity of the Huawei P20 series, four-in-one pixel binning in phone sensors became widely known.

We wrote a dedicated article about this technology at the time. So-called four-in-one pixel binning arranges four pixels of the same color together in a Quad Bayer array so that they can act as one large pixel. With this technology, the IMX586 can not only output high-resolution 48-megapixel photos, but also merge pixels to output 12-megapixel photos of better quality.
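To make the idea concrete, here is a minimal sketch of 2x2 binning in Python. It is only an illustration: the function name and array sizes are made up for this example, the raw data is random, and on a real phone the merging is done in sensor hardware or the ISP on a proper Quad Bayer color mosaic rather than in user code.

```python
import numpy as np

def quad_bayer_bin(raw: np.ndarray) -> np.ndarray:
    """Merge each aligned 2x2 block of same-color pixels into one value.

    In a Quad Bayer layout every 2x2 block shares one color filter, so
    averaging the block behaves like a single larger pixel: a quarter of
    the resolution (e.g. 48 MP -> 12 MP) with better signal-to-noise.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "sensor dimensions must be even"
    blocks = raw.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

# Toy example: an 8000x6000 (48 MP) readout binned down to 4000x3000 (12 MP).
raw_48mp = np.random.randint(0, 1024, size=(6000, 8000), dtype=np.uint16)
binned_12mp = quad_bayer_bin(raw_48mp)
print(raw_48mp.shape, "->", binned_12mp.shape)  # (6000, 8000) -> (3000, 4000)
```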

Even then, four-in-one binning was nothing new. As photographers know, the idea can be traced back to Nikon's first digital SLR, the D1, launched in 1999: its sensor, jointly developed by Nikon and Sony, had roughly 10.8 million photosites but binned them to output 2.7-megapixel images.

On mobile phones, earlier 20-megapixel cameras such as the iPhone 6 front camera, Xiaomi Note 3 front camera, Nokia 7 Plus front camera, vivo X9 front camera, Huawei nova 3e front camera, and OPPO R11s rear camera all adopted four-in-one pixel binning.

However, sensors at the time did not go as high as 48 megapixels, for the simple reason that the ISPs of the day lacked the computing power to process 48-megapixel photos, let alone run enhancement algorithms such as multi-frame noise reduction and AI scene recognition on them.

This is why 48-megapixel sensors output 12-megapixel photos by default. Once the full-resolution mode is switched on, the results are often not as good as the binned 12-megapixel output. The same problem persisted in the later 64-megapixel and 108-megapixel models.

Fortunately, it did not take long for smartphone processors to catch up, clearing the way for large sensors with high pixel counts. For a while, sensors customized for phone makers became popular, such as the IMX689, IMX789 and IMX766V. Built on four-in-one binning with greatly enhanced focusing performance, they noticeably improved both image quality and the shooting experience on high-megapixel Android phones.

Apple, on the other hand, stuck with 12 megapixels through the iPhone 13 series in 2021. Was that because the A15 was not powerful enough? Obviously not. Apple simply prefers to spend its computing power on computational photography features rather than on more pixels, constantly polishing things like portrait mode, night mode and Cinematic mode.

This year, the iPhone 14 series will finally be upgraded to 48 megapixels. It is almost certain to adopt four-in-one binning as well, with 12-megapixel output by default. One advantage of 48 megapixels is support for 8K video recording; the binned output is also expected to bring better night imaging, an area where Apple has long been compared unfavorably with Android phones. As for lens flare and ghosting, 48 megapixels will not bring any improvement.
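The 8K claim comes down to simple pixel arithmetic. The sketch below walks through it; the exact sensor resolutions are illustrative assumptions, not confirmed iPhone specifications, but 8K UHD is a fixed 7680 x 4320 pixels per frame.

```python
# Rough pixel-count arithmetic behind the "48 MP enables 8K video" claim.
uhd_8k = 7680 * 4320        # pixels per 8K UHD frame, about 33.2 million
sensor_12mp = 4032 * 3024   # about 12.2 million: too few to cover an 8K frame
sensor_48mp = 8064 * 6048   # about 48.8 million: enough even after a 16:9 crop

print(f"8K frame:      {uhd_8k:,} px")
print(f"12 MP sensor:  {sensor_12mp:,} px -> cannot supply 8K natively")
print(f"48 MP sensor:  {sensor_48mp:,} px -> can read out a full 8K region")
```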

So why did Apple choose 48 megapixels now? Besides the further jump in the A16's computing power, another important reason is that there is not much room left to polish computational photography alone.

Apple aside, almost every Chinese phone maker is now working hard on self-developed imaging chips, ostensibly to boost computing power. Although vivo, Xiaomi and OPPO have greatly improved their imaging, the gains come not only from self-developed chips but also from partnerships with camera makers to refine color tuning, an advantage Apple already had.

It is worth mentioning that with the P50 series, Huawei pioneered a "full-link" image information restoration system in the industry, built chiefly around the concepts of XD Optics and the "Primary Color Engine".

Unfortunately, although Huawei could have tightly coupled these technologies with its self-developed Kirin chips, the forced switch to Qualcomm Snapdragon chips meant that many algorithms and tunings had to be redone. Together with shortages of imaging components, this disrupted Huawei's original imaging roadmap and slowed the development of computational optics.

So is the iPhone 14 series' upgrade to 48 megapixels really pointless? Of course not. Although everyone says that high pixel counts do not equal high image quality, the leap to 48-megapixel ultra-high resolution will certainly encourage many people to buy and help Apple further boost sales.

So where should stagnating smartphone imaging go from here? Is computational photography the end point? For now, we can only wait and see.