How Phase Detection Autofocus Works

When it comes to digital single-lens reflex (DSLR) camera technology, there is a great deal of confusion about how phase detection autofocus actually works. If you have ever wondered how and why a camera might have an autofocus problem, this article will shed some light on what happens inside the camera, in terms of focusing, when a photograph is taken.

There is an overwhelming amount of negative feedback about autofocus issues on such fine tools as the Canon 5D Mark III, Nikon D800, Pentax K-5, and other digital SLR cameras, and most photographers do not seem to realize that the underlying problem is not necessarily a specific model or brand of camera, but rather the specific way in which these cameras acquire focus.

If you search the internet, you can find thousands of autofocus complaints about various DSLRs dating back more than ten years. The front-focus and back-focus problems we see in current cameras are therefore nothing new: they have been present ever since the first DSLR with a phase-detect sensor was manufactured, and they are not the result of any recent technological change.

How DSLR Cameras Work

To understand this topic in more depth, it is necessary first to become familiar with how a DSLR camera operates. The diagrams typically used to describe DSLR cameras show only a single reflex mirror tilted at a 45-degree angle.

They do not show that there is a secondary mirror behind the reflex mirror that reflects some of the light onto a phase-detection sensor, but that is what actually happens behind the scenes. Take a look at the simplified drawing I created below, based on the Nikon D800:


Here is an explanation of each number shown in the graphic above:

  1. Ray of light
  2. Main/Reflex Mirror
  3. Secondary Mirror (also known as the “Sub-Mirror”)
  4. Camera Shutter and Image Sensor
  5. Eccentric pin (1.5mm hex) for adjusting the Main Mirror
  6. Eccentric pin (1.5mm hex) for adjusting the Secondary Mirror
  7. Phase Detect Sensor (AF Sensor)
  8. Pentaprism
  9. Viewfinder

Let’s look at what happens inside the camera every time a photo is taken. Light rays enter the camera through the lens (1). The main mirror (2), tilted at a 45-degree angle, is partially transparent: it reflects most of the light vertically up into the pentaprism (8). The pentaprism then performs a seemingly miraculous transformation, redirecting the vertical light horizontally and flipping the inverted image, so that what you see through the viewfinder (9) matches the scene in front of the lens.

A small amount of light passes through the main mirror and is reflected downward by the secondary mirror (3), which is also tilted at an angle (54 degrees on many recent Nikon cameras, as seen above). This light then reaches the Phase Detect / AF Sensor (7), which directs it onto a set of small sensors (two sensors for each AF point). The camera analyzes and compares the images received from these sensors (this is analogous to how a rangefinder determines whether the subject is in focus), and if the images do not match, it instructs the lens to make the necessary adjustments (more on this below).

Although the process described above seems relatively straightforward, it has a significant flaw: the Phase Detect sensor is the component that tells the lens how to adjust, yet the picture itself is captured by an entirely different device, the image sensor at the back of the camera. Why is that a problem? Remember that when you take a photo, the primary and secondary mirrors flip up, the shutter opens, and light from the lens strikes the image sensor (4) directly.

For phase detection autofocus to work correctly, the distance between the lens mount and the camera sensor must match the distance between the lens mount and the Phase Detect sensor. Even a very small shift in position will make the autofocus inaccurate. Autofocus problems will also occur if the angle of the secondary mirror is not precisely what it should be.
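A quick thin-lens calculation illustrates how tight this tolerance is. The Python sketch below uses a hypothetical 50mm lens and made-up numbers; it shows that shifting the sensor plane by just 0.1mm moves the plane of sharp focus for a subject at 2m by roughly 14cm:

```python
def image_distance(focal_mm, object_mm):
    # Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image distance d_i
    return 1.0 / (1.0 / focal_mm - 1.0 / object_mm)

def focused_object_distance(focal_mm, sensor_mm):
    # Which object distance is rendered sharp if the sensor sits at sensor_mm?
    return 1.0 / (1.0 / focal_mm - 1.0 / sensor_mm)

focal = 50.0                                          # hypothetical 50 mm lens
subject = 2000.0                                      # subject 2 m away
ideal = image_distance(focal, subject)                # ~51.28 mm behind the lens
actual = focused_object_distance(focal, ideal + 0.1)  # sensor plane off by 0.1 mm
print(round(ideal, 2), round(actual))                 # sharp plane lands ~14 cm closer
```

The exact numbers depend on the focal length and subject distance, but the point stands: sub-millimeter misalignment between the two light paths produces visible front- or back-focus.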

How Phase Detect Sensor Works

As mentioned before, phase-detect technology operates in a manner analogous to that of rangefinder cameras. The light reflected off the secondary mirror is received by two or more tiny image sensors with microlenses mounted directly above them; the number of these sensors depends on how many focus points the AF system has.

These sensors sit in pairs, one on either side of the lens axis, as indicated by (7) in the illustration above (the illustration exaggerates this behavior by showing two separate light beams reaching two separate sensors). For each focus point that you see in the viewfinder, there is a pair of small sensors measuring phase difference.

In fact, a modern phase detection device contains far more than two sensors, positioned very close to one another. When an object is in focus, light beams from the extreme edges of the lens converge precisely at the center of each sensor in the pair (just as they would on the image sensor).

If the image on both sensors looks identical, the subject is sharp. If an object is not in focus, the light no longer converges and instead strikes the sensors at different angles, as shown in the diagram below (picture courtesy of Wikipedia):


Figures 1 through 4 illustrate four different focus conditions: (1) incorrect, (2) correct, (3) too far, and (4) much too far. As the graphs show, the phase difference between the two profiles can be used to determine not only which direction to move the focus, but also how large a change to make. Note that in practice it is the lens, not the sensor, that moves.
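This direction-and-magnitude measurement can be approximated in software by cross-correlating the two intensity profiles. The Python sketch below is a simplified illustration only (real AF modules use dedicated hardware and more sophisticated signal processing); the profiles and the 3-sample shift are invented:

```python
import numpy as np

def phase_offset(profile_a, profile_b):
    # Estimate how far profile_b is shifted relative to profile_a (in samples).
    # The sign tells the direction of the focus correction; the magnitude
    # indicates roughly how far the lens needs to move.
    a = profile_a - profile_a.mean()
    b = profile_b - profile_b.mean()
    corr = np.correlate(b, a, mode="full")      # cross-correlation at all lags
    return int(np.argmax(corr)) - (len(a) - 1)  # lag of the best match

# The same sharp feature seen by the two sensor strips, shifted by 3 samples:
x = np.arange(64)
left = np.exp(-((x - 30) ** 2) / 18.0)
right = np.exp(-((x - 33) ** 2) / 18.0)
print(phase_offset(left, right))  # -> 3
```

Swapping the two profiles flips the sign, which corresponds to the focus error being in the opposite direction (front focus versus back focus).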

Since the phase-detection system can determine whether an object is front-focused or back-focused, it can give the camera lens precise instructions on which direction to adjust its focus and by how much. When a camera successfully focuses on a target, the following closed-loop autofocus sequence takes place:

  1. Light that passes through opposite edges of the lens is analyzed by two image sensors.
  2. Based on how the light strikes these sensors, the AF system determines whether an object is front- or back-focused, and by how much. It then instructs the lens to adjust its focus accordingly.
  3. The steps above are repeated until the desired focus is attained. If the lens cannot attain focus, it resets and begins reacquiring focus, which can result in “hunting.”
  4. Once the autofocus system determines that the subject is in sharp focus, it indicates this with a green dot inside the viewfinder or by emitting a beep.
All of this takes place in a very short amount of time, which is why the contrast-detection system is significantly slower than the phase-detection system (contrast detection relies on shifting focus back and forth until focus is obtained, with a great deal of image analysis taking place at the image-sensor level).
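The closed-loop sequence above can be sketched in a few lines of Python. Everything here is a toy model (the measurement and lens-drive callbacks, the tolerance, and the iteration limit are all invented for illustration):

```python
def autofocus(measure_offset, move_lens, tolerance=0.01, max_iterations=20):
    # Closed loop: measure the phase offset, command a corrective lens move,
    # and repeat until the offset is within tolerance ("green dot") or we
    # give up (a real lens would then reset and start "hunting").
    for _ in range(max_iterations):
        offset = measure_offset()
        if abs(offset) <= tolerance:
            return True
        move_lens(-offset)  # move opposite to the measured error
    return False

# Toy lens: its focus error shrinks by exactly the commanded amount.
state = {"error": 0.8}
locked = autofocus(lambda: state["error"],
                   lambda step: state.update(error=state["error"] + step))
print(locked, state["error"])  # -> True 0.0
```

A real system converges over several iterations because each measurement is noisy and the lens motor overshoots or undershoots; the toy model locks on after a single corrective move.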

The phase-detection autofocus system is a fairly sophisticated technology that receives upgrades virtually every time a higher-end camera line is refreshed, and these advances have made the system faster and more accurate.

Both the total number of autofocus points and the number of more dependable cross-type autofocus points have been growing steadily over the past several years. For instance, the Canon 1D X and the Canon 5D Mark III each contain a staggering 61 focus points, 41 of which are of the cross-type variety. Take a good look at this intricate matrix of focusing sensors:


Not only has the total number of AF points grown, but so has their reliability. The vast majority of professional cameras made today come equipped with autofocus systems that are extremely fast and highly configurable, allowing them to track subjects and acquire focus continuously.

DSLR Autofocus Problems

As can be seen above, phase detection autofocus is quite complicated and demands a great deal of precision to deliver accurate results. During assembly, the phase-detect autofocus (AF) system has to be correctly positioned and aligned; this is of the utmost importance.

Even a minute variation will throw the autofocus off, and such variations occur relatively frequently during manufacturing. This is the primary reason why phase detection has been a source of problems ever since the introduction of the first DSLR equipped with a phase-detect sensor. Taking such discrepancies into account, DSLR manufacturers developed a high-precision calibration system that allows each camera to be calibrated individually during quality control and assurance (QA).

If a phase-detect sensor alignment fault is discovered, an automated computerized test runs through each and every focus point and adjusts it in the camera until the problem is resolved. After the misaligned points are re-calibrated and re-adjusted, the compensation values are stored in the camera firmware. You can think of this as a process very similar to AF Fine Tune and AF Micro Adjust at the phase detection level, except that it is done independently for each AF focus point.
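Conceptually, the stored compensation values behave like a small per-focus-point lookup table. The Python sketch below is purely illustrative (the point names, units, and error values are invented; real firmware formats are proprietary):

```python
# Misalignment measured during factory QA for each AF point (fictitious units):
measured_error = {"center": 3, "top_left": -2, "bottom_right": 0}

# The firmware stores a compensation value that cancels the measured error:
compensation = {point: -error for point, error in measured_error.items()}

def corrected_reading(point, raw_phase_reading):
    # Apply the stored per-point compensation to a raw phase-detect reading.
    return raw_phase_reading + compensation[point]

print(corrected_reading("center", 3))  # -> 0: the misalignment cancels out
```

User-facing AF Fine Tune works the same way, except the photographer supplies a single correction per lens rather than one per focus point.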
