
Longer focal length doesn’t mean more reach (in the way you’re probably thinking)

Grab a coffee. This is a long one, but I think it's worth it.

People say "longer lens means more zoom." That is sort of true in the way that "bigger sensor means brighter image" is sort of true. It describes what you observe without identifying what actually causes it. Now the same structural error lives in how people think about focal length, resolution, and "reach."

The current popular intuition is something like: focal length determines how much detail you can resolve on a distant subject. 200mm resolves twice as much detail as 100mm. 600mm is a "long reach" lens. If you want to photograph birds, you need long glass. And the related claim: a crop sensor gives you "extra reach" because it multiplies your focal length by 1.5x or 2x or whatever the crop factor is.

This is wrong in the same way that "higher ISO means more noise" is wrong. Focal length is correlated with resolving power, but it is not the cause.

Strip the problem down. Forget sensors and pixels entirely. Just ask: what determines the finest angular detail an optical system can resolve?

The answer, as with so many things in optics, is the entrance pupil: the physical diameter of the aperture as seen from the front of the lens. Nothing else. The minimum resolvable angle between two point sources is:

θ = 1.22 λ / D (the Rayleigh criterion; the 1.22 comes from the first zero of the Bessel function, which arises from the Fourier transform of a circular aperture. Note this is the small-angle approximation you will commonly see. It holds well here because we are dealing in arcseconds.)

where λ is the wavelength of light and D is the entrance pupil diameter. Focal length does not appear. A 50mm lens and a 500mm lens with the same entrance pupil diameter resolve the same angular detail in the scene. (For those about to object with étendue, keep reading first.) This is not an approximation or a special case. It is the diffraction limit. It is the wave nature of light. It cannot be overcome by any lens design.
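To make this concrete, here is a short Python sketch (the 71mm pupil is an illustrative value) showing that the angular resolution depends only on the entrance pupil diameter, with focal length never entering the formula:

```python
import math

WAVELENGTH = 550e-9  # green light, in metres
RAD_TO_ARCSEC = 180 / math.pi * 3600

def angular_resolution_arcsec(pupil_diameter_m):
    """Rayleigh criterion: theta = 1.22 * lambda / D. Focal length is absent."""
    return 1.22 * WAVELENGTH / pupil_diameter_m * RAD_TO_ARCSEC

# A 50mm and a 500mm lens with the same 71mm entrance pupil resolve
# exactly the same angular detail in the scene.
for focal_length_mm in (50, 500):
    print(f"{focal_length_mm}mm lens, 71mm pupil: "
          f"{angular_resolution_arcsec(0.071):.2f} arcsec")
```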

When a wavefront passes through a circular aperture, it does not focus to a perfect point. It forms a pattern called an Airy disc, a bright central blob surrounded by faint rings, caused by the edges of the wavefront interfering with themselves. The diameter of this blob is the smallest "dot" the system can produce. Two objects closer together than one Airy disc width merge into a single indistinguishable blob. The entrance pupil diameter alone sets this limit.

So what does focal length actually do?

Focal length determines the plate scale, the mapping between angles in the real world and physical distance on the image plane. A longer focal length spreads the same angular information across more millimeters of sensor. The Airy disc gets physically bigger, but it still represents the same angular size in the scene, because the plate scale grew proportionally.

Here is a worked example.

N = f/D (f-number = focal length / entrance pupil diameter)

D_airy = 2.44λN (Airy disc diameter = 2.44 × wavelength × f-number)

θ = 1.22λ/D (angular resolution = 1.22 × wavelength / entrance pupil diameter)

Take a 200mm f/2.8 lens. Entrance pupil is 200/2.8 = 71mm. Angular resolution at 550nm is 1.22 × 0.00055 / 71 = 9.5 microradians, about 2.0 arcseconds. The Airy disc diameter on the sensor is 2.44 × 0.00055 × 2.8 = 3.7 microns.

Now take a 50mm lens with the same 71mm entrance pupil. That is a 50mm f/0.7 lens, extremely fast but let us stay in the thought experiment. Angular resolution: same formula, same D, same result. 2.0 arcseconds. But the Airy disc on the sensor is now 2.44 × 0.00055 × 0.7 = 0.94 microns.

(Note there is a small-angle approximation here too: it treats the linear distance on the sensor as x = fθ. When the angle is not small, you need x = f × tan(θ).)
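The worked example can be checked in a few lines of Python (the 50mm is written as f/0.704 so both lenses share exactly the same 71mm pupil):

```python
import math

WAVELENGTH_UM = 0.55  # 550nm, in microns
RAD_TO_ARCSEC = 180 / math.pi * 3600

def lens_report(focal_mm, f_number):
    """Return (pupil in mm, angular resolution in arcsec, Airy disc in microns)."""
    pupil_mm = focal_mm / f_number                  # D = f / N
    theta = 1.22 * WAVELENGTH_UM * 1e-3 / pupil_mm  # radians
    airy_um = 2.44 * WAVELENGTH_UM * f_number       # physical size on the sensor
    return pupil_mm, theta * RAD_TO_ARCSEC, airy_um

print(lens_report(200, 2.8))   # same pupil, same angle, ~3.7 micron Airy disc
print(lens_report(50, 0.704))  # same pupil, same angle, ~0.94 micron Airy disc
```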

Same angular detail in the scene. Same ability to separate two distant points. But the 50mm lens crammed that information into a spot four times smaller on the sensor. To actually capture it, you would need pixels around 0.24 microns. Those do not exist. The smallest pixels in production are around 0.5 microns on smartphone sensors, and even those are pushing practical limits of photon collection.

The 200mm lens spreads that same information across 3.7 microns. Easy to sample with any modern sensor. That is what focal length buys you. Not more angular detail (at least not directly; I hear you, étendue people, we will get there). More room to record the angular detail that the entrance pupil already captured.

This is where the "crop sensor gives extra reach" myth breaks down. Put a 200mm f/2.8 on APS-C. The crop sensor sees the central portion of the image circle. People say you now have "300mm equivalent reach." You do not have more angular resolution. The entrance pupil is still 71mm. The Airy disc is still 3.7 microns. You have the same angular detail landing on the sensor. The subject fills more of the frame when you view the image at full size, but only because the frame itself is smaller. That is not reach. That is cropping.

Here is a concrete example. Take a 24MP APS-C sensor and a 24MP full-frame sensor. Same megapixel count, different pixel sizes. The APS-C sensor has smaller, more densely packed pixels, about 3.9 microns versus 6.0 microns on the full-frame. Mount the same 200mm f/2.8 on both. The Airy disc is 3.7 microns on both, because it depends only on f-number and wavelength. The APS-C sensor samples the Airy disc with more pixels per disc, so within its narrower field of view, it actually records more spatial samples on the subject than the full-frame sensor does. That is a real advantage, but it came from pixel density, not from crop factor doing something magical.

Now level the playing field. Put a 56MP full-frame sensor behind the same lens. This sensor has roughly the same pixel density as the 24MP APS-C, about 3.9 micron pixels, just spread across a larger area. Crop the full-frame image to the APS-C field of view. You now have the same number of pixels on the subject, the same sampling of the same Airy disc, the same angular resolution, and the same framing. The images are indistinguishable. But you also kept the rest of the frame, which you can use or discard.

The crop sensor did not add anything the full-frame sensor could not match at equal pixel density. You subtracted field of view and called it zoom.
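A quick sketch of the equal-pixel-density argument (the sensor dimensions are typical assumed values, and a 24MP APS-C sensor is treated as 6000 × 4000 pixels):

```python
APSC_W_MM, APSC_H_MM = 23.5, 15.6   # assumed APS-C dimensions
FF_W_MM, FF_H_MM = 36.0, 24.0

pitch_mm = APSC_W_MM / 6000         # 24MP APS-C -> ~3.9 micron pitch
ff_mp = (FF_W_MM / pitch_mm) * (FF_H_MM / pitch_mm) / 1e6
cropped_mp = (APSC_W_MM / pitch_mm) * (APSC_H_MM / pitch_mm) / 1e6

print(f"pixel pitch: {pitch_mm * 1000:.2f} microns")    # same density everywhere
print(f"full frame at that pitch: {ff_mp:.0f} MP")      # the ~56MP sensor
print(f"cropped to APS-C framing: {cropped_mp:.0f} MP") # identical to native APS-C
```

The crop recovers the native APS-C image exactly; the crop factor never did anything that pixel pitch had not already done.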

Now here is where it connects to why longer lenses seem to give more reach, and why that correlation is real even though focal length is not the cause.

There is a hard physical limit on how large an entrance pupil can be for a given focal length. The f-number is f/D, and the minimum possible f-number is f/0.5. This comes from geometry. At f/0.5, the entrance pupil diameter is twice the focal length. The light cone converging onto the sensor has a half-angle of 90 degrees. Light is arriving from the full forward hemisphere. There is no more light to collect. Rays from steeper angles would have to come from behind the lens, which is physically meaningless.

This is conservation of étendue, which comes from the second law of thermodynamics. The same principle that prevents a magnifying glass from heating something above the temperature of the sun. You cannot compress a light field into a smaller area without increasing the angular spread, and 2π steradians is the ceiling.

So for any focal length, there is a maximum entrance pupil:

D_max = 2f

Which gives a maximum angular resolution:

θ_min = 1.22 λ / (2f) = 0.61 λ / f

For a 50mm lens: D_max = 100mm, θ_min = 6.7 microradians, about 1.4 arcseconds. For a 200mm lens: D_max = 400mm, θ_min = 1.7 microradians, about 0.35 arcseconds. For a 600mm lens: D_max = 1200mm, θ_min = 0.56 microradians, about 0.12 arcseconds.
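The étendue ceiling can be tabulated directly; this sketch reproduces the three numbers above:

```python
import math

WAVELENGTH = 550e-9  # metres
RAD_TO_ARCSEC = 180 / math.pi * 3600

def theta_min_arcsec(focal_mm):
    """Best-case resolution at the f/0.5 limit: theta = 0.61 * lambda / f."""
    return 0.61 * WAVELENGTH / (focal_mm * 1e-3) * RAD_TO_ARCSEC

for focal_mm in (50, 200, 600):
    print(f"{focal_mm}mm: D_max = {2 * focal_mm}mm, "
          f"theta_min = {theta_min_arcsec(focal_mm):.2f} arcsec")
```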

The 600mm lens can resolve about 12 times finer detail than the 50mm lens. Not because 600mm is a magic number, but because 600mm of focal length is allowed to have a 1200mm entrance pupil, and 50mm is only allowed a 100mm entrance pupil. The longer focal length unlocked a bigger aperture. The aperture did the work.

This has a counterintuitive consequence. A lens can have more focal length and less resolving power at the same time. Consider two lenses you can actually buy. A 135mm f/1.8, like the Sony 135mm GM, has an entrance pupil of 75mm. A 600mm f/11, like the Canon RF 600mm f/11, has an entrance pupil of 54.5mm. The 135mm resolves about 1.8 arcseconds. The 600mm resolves about 2.6 arcseconds. The portrait lens sees finer angular detail on a distant subject than the super-telephoto.

The 600mm still produces a larger image on the sensor. The subject fills more pixels. But it is magnifying a blurrier angular image. This is the optical equivalent of what microscopists call empty magnification, exceeding the resolving power of the objective and just making the blur bigger. A 100% crop from the 600mm looks detailed because the image is large, but a crop from the 135mm rescaled to match framing contains more real angular information. The extra focal length did not see more. It just spread less across more sensor.

So why not just crop the 135mm and throw away the 600mm? Because the 135mm's Airy disc at f/1.8 is 2.4 microns. To Nyquist sample that on full frame, you would need roughly 2,400 megapixels. No current sensor comes close. The 600mm's Airy disc at f/11 is 14.8 microns. A modern high-resolution sensor can fully sample that. The 600mm has less angular information, but it spreads that information across a scale that real pixels can capture. The 135mm has more angular information locked inside spots too small for any existing sensor to resolve. The focal length is doing the same job as magnification in a microscope eyepiece. It does not add detail. It makes existing detail large enough to see.
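The sensor requirement is easy to compute; this sketch shows why the 135mm's information is currently unrecordable while the 600mm's is not:

```python
WAVELENGTH_UM = 0.55
FF_W_MM, FF_H_MM = 36, 24

def nyquist_megapixels(f_number):
    """Full-frame megapixels needed to Nyquist-sample the Airy disc."""
    pitch_um = 0.61 * WAVELENGTH_UM * f_number  # ~4 px per disc diameter
    return (FF_W_MM * 1000 / pitch_um) * (FF_H_MM * 1000 / pitch_um) / 1e6

print(f"135mm f/1.8 needs {nyquist_megapixels(1.8):,.0f} MP")  # gigapixel class
print(f"600mm f/11 needs  {nyquist_megapixels(11):,.0f} MP")   # buyable today
```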

Now, nobody builds lenses at f/0.5. Real telephoto lenses sit around f/2.8 to f/5.6. But the logic is the same: a 600mm f/4 has a 150mm entrance pupil. A 50mm lens would need to be f/0.33 to match that, which is physically impossible. The focal length enabled the entrance pupil to exist.

This is exactly the same structure as the sensor size argument for noise. Bigger sensor does not inherently mean less noise. Bigger sensor enables a larger entrance pupil at the same f-number, which collects more light. The sensor enabled the entrance pupil. The entrance pupil did the work.

Bigger sensor → enables larger entrance pupil at practical f-numbers → more photons → less noise. Longer focal length → enables larger entrance pupil at physically possible f-numbers → finer angular resolution → more "reach."

It is the same causal chain with the same hidden variable. If you skip the middle step and draw a direct arrow from the thing you can see (sensor size, focal length) to the outcome you care about (noise, resolution), you get the wrong answer for the same reason.

Here is the pixel constraint worked out a little more. Even if you had the entrance pupil and the angular resolution, you still need to record it. The Airy disc at the image plane has a physical size:

d_Airy = 2.44 λ N

To Nyquist sample the Rayleigh resolution, you need about 4 pixels across the Airy disc diameter, which means a pixel pitch of about 0.61λN. At f/0.5, the Airy disc is about 0.67 microns, requiring roughly 0.17 micron pixels. We cannot build those. At f/4, the Airy disc is 5.4 microns, requiring 1.35 micron pixels, within reach of current smartphone sensors. At f/11, the Airy disc is 14.7 microns, requiring 3.7 micron pixels, and diffraction begins to limit resolution even on high-density sensors.

The Nikon P1000 is a perfect example of this. It has a 1/2.3 inch sensor, one of the smallest in any camera, with 1.34 micron pixels and 16 megapixels. Scaled to full-frame area, that pixel density would be around 480MP. At full zoom, the actual focal length is 539mm at f/8. That gives an entrance pupil of 67mm, which resolves about 2.0 arcseconds, nearly identical to an 85mm f/1.2, a 200mm f/2.8, a 400mm f/6, or a 600mm f/9 on full frame. A 600mm f/4, the kind of lens that costs $13,000-$15,000, has a 150mm entrance pupil and resolves about 0.92 arcseconds. That is only 2.2× finer angular resolution than the P1000 for 15× the price. The angular resolving power of this thousand-dollar superzoom is closer to professional super-telephoto territory than anyone gives it credit for. Nobody frames it that way because the spec sheet says "1/2.3 inch sensor" and people stop reading.
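The P1000 comparison reduces to a single function of the entrance pupil; the lens choices here mirror the ones named above:

```python
import math

WAVELENGTH = 550e-9
RAD_TO_ARCSEC = 180 / math.pi * 3600

def arcsec(pupil_mm):
    """Rayleigh angular resolution for an entrance pupil given in millimetres."""
    return 1.22 * WAVELENGTH / (pupil_mm * 1e-3) * RAD_TO_ARCSEC

for name, pupil_mm in [("P1000 at 539mm f/8", 539 / 8),
                       ("200mm f/2.8", 200 / 2.8),
                       ("600mm f/4", 600 / 4)]:
    print(f"{name}: {pupil_mm:.0f}mm pupil, {arcsec(pupil_mm):.2f} arcsec")
```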

The tiny sensor sees an extremely narrow field of view, about 0.7 degrees across, which is why the equivalent FF focal length is 3000mm. Could a full-frame camera just crop to the same framing? In principle, yes. A 200mm f/2.8 on a 45 megapixel full-frame body has the same angular resolution. But its field of view is about 12 degrees. Crop that down to 0.7 degrees and you keep about 1/225th of the pixels. That is 0.2 megapixels. The angular information is there in the Airy discs but there are not nearly enough pixels sampling it.

This is what the tiny pixels on the P1000 are actually doing. They are not helping with light. They are not hurting it either. Low-light performance on the P1000 is poor because the entrance pupil is 67mm at f/8. That is how much light is collected. The sensor records whatever arrives, and 1.34 micron pixels with poor per-pixel SNR can be binned to recover the same signal-to-noise as fewer larger pixels, exactly as I described in the sensor size post. (https://www.reddit.com/r/photography/comments/1s893oh/neither_pixel_size_nor_sensor_size_help_low_light/) The tiny pixels are not the reason the low-light images look bad. The small entrance pupil is.

What the tiny pixels do is ensure there are enough spatial samples across a sensor that is only 6.17mm wide. If you used 5 micron pixels instead, typical of a full-frame sensor, you would have about 1 megapixel. Unusable. The pixel size is a resolution variable, not a light variable, and this camera is a pure demonstration of that fact.

The P1000 does not have insane reach because of its sensor or its focal length. It has insane reach because 539mm of focal length allowed a 67mm entrance pupil to exist at a buildable f/8, the tiny pixels tiled the sensor with enough samples to capture what the entrance pupil resolved, and the focal length spread the Airy disc across enough microns for those pixels to sample it. Entrance pupil set the resolving power.

So the full hierarchy of constraints on "reach" is:

Hard physics: entrance pupil sets the angular resolution ceiling. This is diffraction and cannot be beaten.

Hard physics: entrance pupil cannot exceed twice the focal length. This is étendue and cannot be beaten. To get a bigger entrance pupil, you must go to a longer focal length.

Soft engineering: pixel pitch must be fine enough to Nyquist sample the Airy disc. Currently limited to about 0.5 microns, improving over time. Longer focal lengths relax this requirement by spreading the information across more microns.

Soft engineering: lens aberrations in practice prevent most lenses from reaching diffraction-limited performance across the full field. Real lenses fall short of what the entrance pupil theoretically allows.

Only the first two are fundamental. The last two are fabrication and design constraints that improve with technology.

Here is a final thought experiment that ties this together.

Take a hypothetical 35mm f/0.5 lens. Entrance pupil: 70mm. Airy disc: 0.67 microns. Nyquist pixel pitch: 0.17 microns. On a full-frame sensor (36 × 24mm), you would need about 214,000 pixels across the long axis. That is roughly 31 gigapixels to fully sample the diffraction limit across the full frame. Field of view is about 63 degrees.

Now take a 600mm f/8.6 lens, same 70mm entrance pupil. Airy disc: 11.5 microns. Nyquist pixel pitch: 2.9 microns. On full frame, about 12,400 pixels across the long axis. That is about 105 megapixels. Field of view is about 3.6 degrees.

Same angular resolution. Same ability to distinguish fine detail on a distant subject. But the 35mm system requires about 300 times more pixels to capture it. And those pixels need to be 17 times smaller. Both systems see 2.0 arcsecond detail, but the 600mm system lets you record that detail with a sensor that is within reach of current technology.
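The two systems in the thought experiment, computed side by side (pixel pitch taken as one quarter of the Airy disc diameter, as above):

```python
WAVELENGTH_UM = 0.55
FF_W_MM, FF_H_MM = 36, 24

def system(focal_mm, f_number):
    """Return (Airy disc in microns, pixel pitch in microns, full-frame MP)."""
    airy_um = 2.44 * WAVELENGTH_UM * f_number
    pitch_um = airy_um / 4  # ~4 px across the disc
    mp = (FF_W_MM * 1000 / pitch_um) * (FF_H_MM * 1000 / pitch_um) / 1e6
    return airy_um, pitch_um, mp

for focal, n in ((35, 0.5), (600, 8.6)):
    airy, pitch, mp = system(focal, n)
    print(f"{focal}mm f/{n}: Airy {airy:.2f}um, pitch {pitch:.2f}um, {mp:,.0f} MP")
```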

If you could somehow build the 31 gigapixel sensor with 0.17 micron pixels, the 35mm image cropped to the same 3.6 degree field of view would contain exactly the same angular information as the 600mm image. Same detail on the bird, same resolution on distant text, same "reach." The 600mm did not see anything the 35mm could not. It just made the information large enough to record with real hardware.

And if you cropped the other direction, kept the full 63 degree field of the 35mm system, you would have a panoramic image with telephoto-grade resolution across the entire frame. This is what the episcope lens in that DIY Perks video achieves. A 432mm f/5 lens with a 500mm image circle, which is a 35mm equivalent field of view with the angular resolution of a long telephoto, all in one frame. The image circle is so large that it would take approximately 70 gigapixels to fully sample. The only reason this is exotic is because no sensor that large exists.

The community got the ISO argument right. Now apply the same logic one level up. ISO does not cause noise, lack of light causes noise. Sensor size does not cause less noise, a larger entrance pupil causes less noise. Focal length does not cause reach, a larger entrance pupil causes reach. The only thing that determines how much angular detail you capture is the entrance pupil. Everything else, focal length, sensor size, pixel count, crop factor, is either an enabling constraint or a way of distributing the information that the entrance pupil already captured.

Here's a quick exercise to test if you understood this post:

You have a Canon R5 (45MP, 4.39µm pixels) and an 800mm f/5.6. Does a 1.4× teleconverter help? Does a 2× help? Does stacking both help?
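If you want to check your reasoning numerically, the comparison is the Airy disc against the R5's pixel pitch at each combination (assuming each teleconverter multiplies the f-number while leaving the entrance pupil unchanged):

```python
WAVELENGTH_UM = 0.55
PIXEL_UM = 4.39  # Canon R5 pixel pitch

def pixels_per_disc(f_number):
    """How many pixels span the Airy disc; ~4 are needed for Nyquist."""
    return 2.44 * WAVELENGTH_UM * f_number / PIXEL_UM

for label, n in (("bare, f/5.6", 5.6), ("1.4x TC, f/8", 8.0),
                 ("2x TC, f/11.2", 11.2), ("1.4x + 2x TC, f/15.7", 15.7)):
    print(f"{label}: {pixels_per_disc(n):.1f} px per Airy disc")
```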

A note on sampling: throughout this post I used Nyquist sampling of the Rayleigh resolution, which requires about 4 pixels across the Airy disc diameter. This captures all information up to the Rayleigh resolution limit. The diffraction MTF does not actually reach zero at the Rayleigh frequency. It reaches about 9% contrast there, and continues to transmit information (at rapidly diminishing contrast) out to a cutoff frequency about 22% higher. Fully sampling this extended MTF requires about 4.88 pixels per Airy disc diameter instead of 4. The extra 22% of spatial frequency bandwidth contains about 1.5% of the total MTF area. Whether that last 1.5% matters depends on your application. For photography, lens aberrations, atmospheric turbulence, and sensor noise will almost always obliterate sub-9% contrast detail before you get a chance to sample it. For space telescopes and lithography systems, it matters. The framework and every comparison in this post are the same under either criterion.

submitted by /u/jimmystar889

