
iPhone 12 In-Depth Review: So, How Is the iPhone 12?

You've probably seen all kinds of reports by now. We'll get to the specific details later; first, let's answer the question everyone cares about most:

How is the iPhone 12 at photography?

One word: awesome. Two words: can't print them.

So in this article, we'll take our time unpacking the imaging capabilities of the 12 series.

The scary part is that Apple has started stacking hardware.

For a long time, Apple was "stingy" with hardware while the whole Android camp fought fierce "pixel wars" and "telephoto wars." This time, a bigger sensor has finally arrived: sensor area is up a full 47%. Sensor area largely determines the image quality of a photo, especially in low light, and a smaller sensor is the fundamental reason Apple's night shots have trailed Android flagships in recent years. With the sensor upgraded outright, overall night-time image quality is bound to improve substantially. According to Apple, low-light video shooting has improved by as much as 87%, an improvement obvious to the naked eye.

More importantly, the iPhone doesn't chase high pixel counts. The Pro Max stays at 12 megapixels, which in turn allows a larger 1.7 μm pixel size that effectively keeps low-light noise under control.
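To put the numbers in perspective, here is a minimal Swift sketch of the arithmetic, assuming the previous generation's commonly cited 1.4 μm pixel size as the baseline (that figure is our assumption, not something stated here):

```swift
import Foundation

// Rough light-gathering comparison between two pixel pitches.
// The 1.4 µm figure for the previous generation is widely
// reported but is our assumption here; 1.7 µm is from the text.
let oldPixelArea = 1.4 * 1.4  // µm² per pixel
let newPixelArea = 1.7 * 1.7  // µm² per pixel
let areaGain = newPixelArea / oldPixelArea

// With the pixel count fixed at 12 MP, total sensor area grows
// by the same factor — about 1.47x, matching the quoted 47%.
print(String(format: "Area gain: %.0f%%", (areaGain - 1) * 100))
// Prints: Area gain: 47%
```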

That said, we haven't yet introduced the three-camera setup of the Pro series.

It's worth emphasizing that this year, the Pro and Pro Max cameras are different.

The two share identical specs only on the ultra-wide. And speaking of the ultra-wide: when we reviewed the iPhone 11 last year, we noted that while the ultra-wide lens framed beautifully, its image quality fell off visibly in low light. This year Apple has plugged that gap directly, focusing on night performance and lifting overall low-light image quality.

On the wide-angle lens, commonly called the main camera, the Max has a larger sensor than the Pro, which, as discussed above, clearly means better image quality.

But the more obvious difference is the telephoto: the Max's equivalent focal length is 65mm versus the Pro's 52mm. In other words, the Max reaches longer. Don't underestimate those 13mm, because 65mm enters the "sweet-spot focal length" for portraits. At this focal length, portraits gain pleasing subject-background separation and better control of facial distortion.

For photos, you can shoot proper "portrait work"; for video, you can get cleaner close-ups.

I know: on paper, the Max doesn't reach as far as many Android phones. But a camera's focal length isn't about bigger numbers; it should be judged by the subjects you shoot. 65mm is a very considered choice, because it still lets you shoot portraits indoors.

By contrast, with a longer focal length, unless your home spans a few hundred square meters, a portrait is basically just a face. In other words, 65mm is a genuinely practical telephoto. And judging from Apple's official samples, the rendering is lovely too.
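To get a feel for what those 13mm buy you, here is a small sketch computing the horizontal angle of view against the 36mm-wide full-frame reference that "equivalent focal length" is defined on; the formula is standard lens geometry, not anything Apple-specific:

```swift
import Foundation

// Horizontal angle of view for a 35mm-equivalent focal length.
// 36 mm is the width of a full-frame sensor, the reference
// that "equivalent focal length" is defined against.
func horizontalFOV(equivalentFocalLength f: Double) -> Double {
    2 * atan(36.0 / (2 * f)) * 180 / .pi
}

print(horizontalFOV(equivalentFocalLength: 52)) // ≈ 38.2°
print(horizontalFOV(equivalentFocalLength: 65)) // ≈ 31.0°
// The 65mm framing is noticeably tighter — the compression that
// flatters faces and separates the subject from the background.
```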

And I've saved one big trick for last, because this time the Pro Max has a genuinely exclusive weapon:

Sensor-shift optical image stabilization

I know that phrase probably means nothing to you. But the technology isn't actually complicated, and it has been common in professional cameras for years. If I remember correctly, this is the first time it has been brought to a phone.

To understand it, first get familiar with a photography concept called the "safe shutter": when the shutter speed is too low, hand shake blurs the picture and drags down image quality. Sensor-shift optical stabilization lets the phone's sensor move in step with your hands. For example, if your hand drifts down, the sensor actively shifts up, keeping itself steady. It sounds simple, but it demands a huge number of calculations and adjustments in a very short time, up to 5,000 per second.
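Conceptually, the control loop looks something like the toy sketch below: read the hand's motion from a gyroscope, command the sensor to move the opposite way. All names and units here are illustrative; only the 5,000-per-second figure comes from the paragraph above:

```swift
// Toy model of sensor-shift stabilization. Not Apple's actual
// control system — just the core idea: counter-move the sensor.
struct GyroSample {
    let pitch: Double  // measured hand rotation, rad/s
    let yaw: Double
}

struct SensorOffset {
    var x: Double = 0  // sensor displacement, arbitrary units
    var y: Double = 0
}

func stabilize(_ gyro: GyroSample,
               current: SensorOffset,
               gain: Double,
               dt: Double) -> SensorOffset {
    var next = current
    next.x -= gyro.yaw * gain * dt    // hand goes right → sensor goes left
    next.y -= gyro.pitch * gain * dt  // hand drifts down → sensor moves up
    return next
}

// Run the loop at the quoted 5,000 adjustments per second.
let dt = 1.0 / 5_000
```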

This technology solves two major photographic pain points:

1. For photos, it enables long exposures. Without it, the handheld safe shutter is the reciprocal of the focal length: at 26mm, for example, the safe shutter is 1/26 s, and anything slower smears. With it, Apple officially claims handheld shutter speeds of up to 2 seconds. What does 2 seconds mean? Enough to capture light trails of moving traffic, and in some extreme conditions even stars (see the sketch after this list).

2. For video, stabilization enters a new era. In the past, iPhone video stabilization was computational, achieved by cropping and re-aligning the frame. Now, with the sensor itself moving, less of the frame is sacrificed and motion looks more natural. Simply put, handheld shooting gets you smoother camera moves.
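A quick sketch of the safe-shutter arithmetic from point 1. The reciprocal rule is standard photography lore, and the 2-second figure is Apple's claim as quoted; the stop calculation is just a logarithm:

```swift
import Foundation

// Classic handheld "safe shutter" rule: 1 / focal length.
func safeShutterSeconds(equivalentFocalLength f: Double) -> Double {
    1.0 / f
}

let unstabilized = safeShutterSeconds(equivalentFocalLength: 26) // ≈ 1/26 s
let stabilized = 2.0  // Apple's claimed handheld limit with sensor-shift

// Each doubling of exposure time is one stop of light.
let stopsGained = log2(stabilized / unstabilized)
print(String(format: "%.1f stops gained", stopsGained)) // ≈ 5.7 stops
```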

Of course, beyond the changes above there are smaller refinements, such as the brand-new seven-element wide-angle lens and the larger f/1.6 aperture, which make the whole image sharper.

Compared with previous models, the Max takes the longest stride in hardware. There are conventional lens upgrades, and there are new tricks like sensor-shift stabilization. It stays restrained on pixel count and focal length while staying at the cutting edge of camera technology.

Scarier still, Apple's computational photography has taken another big step forward.

At least for now, nobody understands "computational photography" better than Apple.

Today, Apple has three signature moves in computational photography, each matched to different lighting conditions.

First is Night mode for low light, which stacks multiple frames to improve the brightness, purity, and color of the whole image. To be fair, Apple was not the first to ship a night mode. But having used the night modes of many brands' phones year-round, I'd say the iPhone's night mode holds "the aesthetic high ground."
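The core of "stacking multiple frames" is averaging: random sensor noise cancels out while the signal stays, improving the signal-to-noise ratio by roughly √N. A minimal sketch, with frames reduced to plain luminance arrays for clarity:

```swift
// Minimal frame-stacking sketch: average N aligned frames to
// suppress random noise. A real night mode also aligns, weights,
// and tone-maps frames; this shows only the core averaging step.
func stack(_ frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var sum = [Float](repeating: 0, count: first.count)
    for frame in frames {
        for i in 0..<sum.count {
            sum[i] += frame[i]
        }
    }
    // Noise standard deviation drops by about 1/sqrt(N).
    let n = Float(frames.count)
    return sum.map { $0 / n }
}
```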

When Night mode debuted on the iPhone 11 last year, we pointed out that it cares more about "color": it pursues faithful color in dim environments rather than ever-higher brightness. If anything, it shows real restraint with brightness, concentrating on recreating the atmosphere of the night. This year's Night mode keeps that character. The sample below is the best illustration:

The two biggest changes to Night mode this year: it is no longer limited to the main camera and now works on the ultra-wide as well; and, more interestingly, the front camera supports it too, making nighttime selfies "more beautiful."

The other, bigger change is that Night mode now plugs into time-lapse. I've wanted this feature for ages, because the iPhone has long been the most stable way to shoot time-lapses, but once the sun went down, never mind full night, image quality dropped off sharply. Now, with Night mode as a helper, we can shoot "day to night" time-lapses without worry. In effect, the dark frames are replaced with high-quality Night mode frames inserted directly into the time-lapse, making night time-lapses far richer.
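You can picture the "day to night" trick as a per-frame decision: once the scene falls below some darkness threshold, a Night-mode-processed frame is used instead of a standard one. A hypothetical sketch; the threshold and types are ours, not Apple's pipeline:

```swift
// Hypothetical sketch of a night-aware time-lapse: swap in
// Night-mode frames once the scene gets dark. The threshold and
// Frame type are invented for illustration.
struct Frame {
    let meanLuminance: Double  // 0 = black, 1 = white
}

func captureMode(for frame: Frame,
                 darkThreshold: Double = 0.15) -> String {
    frame.meanLuminance < darkThreshold
        ? "night-mode frame"  // multi-frame stacked capture
        : "standard frame"
}
```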

Next is Deep Fusion for low light, which uses multi-frame synthesis to recover subject detail in dim environments. It works more like advanced, region-by-region sharpening that gives the image extra texture. The important change this time: every camera on the Max supports Deep Fusion.

And then there's the iPhone's specialty, Smart HDR, now at its latest version, Smart HDR 3. In the past, our idea of HDR stopped at region-by-region brightness adjustment, like lifting the dark parts of a frame. With Smart HDR 3, the iPhone segments the scene directly: it learns what you're shooting and makes genuine lighting adjustments. For a portrait, it recognizes the face and treats it as its own region. It also identifies natural scenes, making the sky purer once it recognizes sky, or giving cracked earth more texture once it recognizes ground. You can see why Smart HDR 3 has become a key weapon for improving image texture.
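Segmentation-driven HDR can be sketched as per-region tone adjustment: a mask labels each pixel, and each label gets its own treatment. The region classes and gain values below are invented for illustration, not Apple's tuning:

```swift
// Illustrative sketch of segmentation-aware tone mapping. The
// region classes and gains are made up, not Apple's parameters.
enum Region {
    case sky, face, ground, other
}

func adjusted(luminance: Float, region: Region) -> Float {
    switch region {
    case .sky:    return min(luminance * 0.90, 1.0)  // deepen and purify the sky
    case .face:   return min(luminance * 1.10, 1.0)  // lift faces gently
    case .ground: return min(luminance * 1.05, 1.0)  // bring out texture
    case .other:  return luminance
    }
}
```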

Of course, everything above is "the past," improved and refined. What comes next is:

A new beginning.

From now on, the entire iPhone 12 lineup supports Dolby Vision recording, the first camera system that can shoot Dolby Vision. Simply put, Dolby Vision brings two advantages:

1. Outstanding color, because color depth jumps from 8-bit to 10-bit. Don't underestimate a two-bit change: anyone in film and video knows the growth is exponential, a huge leap in the number of representable colors that lets the iPhone capture some 700 million colors, 60 times more than before (see the quick math after this list). In other words, video shot on the iPhone 12 series will have more delicate, lifelike color.

2. Better dynamic range.
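The "exponential" point above is easy to verify: three channels at 8 bits each versus 10 bits each. The naive count gives about a 64x jump, in the same ballpark as the quoted 700 million colors and "60 times":

```swift
import Foundation

// Color counts for 8-bit vs 10-bit per channel, three channels.
let colors8bit = pow(2.0, 24.0)   // 2^(8×3) ≈ 16.8 million
let colors10bit = pow(2.0, 30.0)  // 2^(10×3) ≈ 1.07 billion

print(colors10bit / colors8bit)   // 64.0 — roughly the quoted "60x"
```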

Note that in the past, this technology required a professional team and heavy post-production. Now the 12 series grades Dolby Vision as it shoots, in real time, frame by frame. For example, the Pro captures two frames at different exposure values, analyzes them with Apple's custom image signal processor to build a histogram of tone values for each frame, and then generates the Dolby Vision metadata from that histogram.
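Here is a loose sketch of that per-frame step: histogram the frame's tone values, then summarize them. The FrameMetadata struct is invented for illustration; real Dolby Vision metadata is far richer and defined by Dolby's own specification:

```swift
// Loose sketch of per-frame tone analysis. The FrameMetadata
// struct is invented; actual Dolby Vision metadata is far more
// detailed and follows Dolby's specification.
struct FrameMetadata {
    let minTone: Float
    let maxTone: Float
    let meanTone: Float
}

func analyze(_ luminances: [Float]) -> FrameMetadata {
    // Build a 256-bin histogram of tone values in [0, 1].
    var histogram = [Int](repeating: 0, count: 256)
    for l in luminances {
        histogram[min(255, max(0, Int(l * 255)))] += 1
    }
    // Summarize the histogram into simple per-frame statistics.
    let lo = histogram.firstIndex { $0 > 0 } ?? 0
    let hi = histogram.lastIndex { $0 > 0 } ?? 0
    let mean = luminances.isEmpty
        ? 0
        : luminances.reduce(0, +) / Float(luminances.count)
    return FrameMetadata(minTone: Float(lo) / 255,
                         maxTone: Float(hi) / 255,
                         meanTone: mean)
}
```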

And not just shooting: the 12 series also carries Dolby Vision editing capability, letting you edit Dolby Vision video in real time.

Obviously, all of the above demands serious image-processing and computing power.

Which is why only the iPhone pulls all of this off: the 12 runs the most powerful smartphone chip on the planet, the A14 Bionic.

This isn't actually the A14's first appearance, so all you need to know here is that it is the core reason the iPhone can do computational photography this fast.

Most formidable of all, iPhone photography is taking a road of its own.

Lidar is a very ambitious bet. Apple is spreading it across its product lines, and it first appeared on the latest iPad Pro. Simply put, lidar lets the device understand the "three-dimensional structure" of its surroundings. In the past, our phones could only reason about a scene in two dimensions and were helpless in the third; with lidar, they gain true 3D information.

That sounds abstract, and indeed, before the 12, lidar offered users very few practical scenarios. But Apple has cleverly applied it to photography on the 12, turning it into a signature iPhone skill.

First, once you have real 3D data, one feature obviously takes a great leap forward: Portrait mode. Portrait mode works by identifying the spatial layers of a scene, separating them, and deciding which plane to blur. Until now it has relied mainly on machine-learned "guessing" plus parallax calculations between lenses. That approach has a serious flaw: it passively receives light, so in dim environments Portrait mode tends to fall apart; if the image can't be captured cleanly, there is nothing to guess from. Lidar, by contrast, goes on the offensive. Never mind low light: even in total darkness it can still recover the 3D structure of the scene.
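With a genuine depth map in hand, the blur decision reduces to simple geometry: blur each pixel in proportion to how far it sits from the focal plane. A minimal sketch, with all types and constants our own:

```swift
// Minimal depth-driven bokeh sketch: the farther a pixel sits
// from the subject's depth, the stronger the blur. Constants
// are illustrative, not Apple's portrait pipeline.
func blurRadius(pixelDepth: Float,
                subjectDepth: Float,
                falloffMeters: Float = 2,
                maxRadius: Float = 12) -> Float {
    let separation = abs(pixelDepth - subjectDepth)
    return maxRadius * min(separation / falloffMeters, 1)
}

// Because lidar measures depth actively, the depth map — and
// hence the blur — still works in near-total darkness, where
// passive depth-from-parallax falls apart.
let radius = blurRadius(pixelDepth: 3.5, subjectDepth: 1.2) // fully blurred
```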

So with the lidar system, the Pro series still delivers striking results at night. With Portrait mode's success rate so much higher after dark, we get beautiful nighttime bokeh straight out of the lens. For example, the shot below.

You'll notice the blur falls off very naturally across the frame, and because night scenes have more point light sources, the bokeh even resembles that of many fast SLR lenses.

And lidar shines in focusing, too. Again because it actively emits light, it can in theory track a moving subject under any lighting, and this is the basis of the iPhone's very fast focusing. Apple officially claims low-light autofocus is up to six times faster than before.

Being able to focus in the dark isn't a party trick; it pays off most when shooting video at night.

Of course, all of these claims still need real-world testing.

If you watched this year's keynote, you'll have noticed that most of it was about photography, and the rest was about 5G. 5G is already familiar to domestic users, and here Apple is the latecomer. But we also know that over the past year, 5G has still not fundamentally improved our lives. There are many reasons, and as Cook said, for 5G to deliver, hardware, software, and carriers all need to advance.

On hardware, 5G support is a given, and carrier build-out is visible to the naked eye. The crux is the software ecosystem, and Apple's answer is faster speeds and lower latency. Intriguingly, the on-stage game demo was the mobile version of League of Legends, giving Tencent quite the billing.

Still, we all know 5G is just a prelude, and many questions remain to be answered.

The other hot topic is of course the "no charger" decision, which I know comes wrapped in controversy.

Whether it's about cost or Apple's environmental goals, we all know that leaving out the charger can't be the measure of whether to choose an Apple product. In the end, you only need to know one thing:

Whether iPhone photography has fallen behind has been hotly debated for the past two years. But one thing is certain: photography is a test of

"The combination of art and art."

The "technique" half means technical capability, the cornerstone of everything. It is precisely the advance of sensors and algorithms that produced today's phone-photography landscape.

The "art" half is a choice of ideas: it sets the direction technique serves; it is the aesthetics of photography and one's understanding of photography.

Apple today may fall short in some aspects of "technique," or take an unconventional route, but in its understanding of "art" it keeps its own mature philosophy. Don't believe me? Look back through all the samples in this article: aren't they still ahead aesthetically?

This is the real capital of phone photography: built for those who know how to create, and for those who love to create.