In the days of 35mm film photography, a good lens could resolve about 80 lines (technically line pairs) per millimeter. That translates to about 2880 line pairs on the long side of the standard 24 x 36mm frame.
A standard HD TV flat screen has 1920 pixels on the long side of the panel. My old HD TV measures 1020mm on the long side, giving us about 1.9 pixels per mm.
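The whole post boils down to one-line divisions of this kind, so for readers who want to follow along, here is the arithmetic so far as a minimal Python sketch (using the figures just quoted):

```python
# 35mm film: a good lens resolving ~80 line pairs per mm across a 36 mm frame
print(80 * 36)        # 2880 line pairs on the long side

# HD TV: 1920 pixels across a panel measuring ~1020 mm
print(1920 / 1020)    # ~1.88, i.e. roughly 1.9 pixels per mm
```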
For the purposes of this post I will assume that for a lens
to resolve all the detail which the
sensor is able to represent, it will
need to have a resolution of one line pair per sensor pixel. That might not be
technically entirely correct but should be close enough for the present
exercise.
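Put as a formula, the assumption is simply: required lens resolution in line pairs per mm equals pixels on the long side divided by sensor width in mm. Here it is as a one-line Python helper, reused in the sketches below:

```python
def required_lppmm(pixels_long_side: int, sensor_width_mm: float) -> float:
    """Lens resolution needed to match the sensor, assuming one
    line pair per sensor pixel (the simplification stated above)."""
    return pixels_long_side / sensor_width_mm
```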
The Canon EOS D30, introduced in the year 2000, had 3 million pixels (2160 x 1440) on an APSC sensor measuring 22.0 x 14.9 mm. That is 2160 pixels on the long side of the sensor, equivalent to 98 pixels per mm, nominally requiring a lens resolution of 98 line pairs per millimeter.
The Canon EOS 40D put 10 megapixels (3888 x 2592) onto a sensor measuring 22.2 x 14.8 mm, giving us 175 pixels per millimeter.
The EOS R10 has 6000 pixels on the long side giving us 270
pixels per millimeter.
The R7 with 6960 pixels on the long side has 314 pixels per millimeter.
Some Fujifilm cameras already have 40 million pixels on APSC. If, as rumored, the R7 Mk 2 gets 40 million pixels, it will have about 348 pixels per millimeter.
By the way, a full frame sensor with the same pixel pitch as the R7 would have 32.5 x 1.6 x 1.6 = 83 Mpx. No currently available full frame camera has such a high pixel count.
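Using the helper above, the figures for all the bodies mentioned so far can be reproduced in a few lines; the sensor widths are approximate published values, so expect differences of a line pair or two from rounding:

```python
cameras = {
    "EOS D30 (3 Mpx)":      (2160, 22.0),
    "EOS 40D (10 Mpx)":     (3888, 22.2),
    "EOS R10 (24 Mpx)":     (6000, 22.2),
    "EOS R7 (32.5 Mpx)":    (6960, 22.2),
    "40 Mpx APSC (rumour)": (7746, 22.2),  # sqrt(40e6 * 3/2) ~ 7746 px long side
}
for name, (px, width_mm) in cameras.items():
    print(f"{name}: {required_lppmm(px, width_mm):.0f} lppmm")
# -> 98, 175, 270, 314 and ~349 lppmm respectively

# Full frame at the R7 pixel pitch: scale by the 1.6x crop factor squared
print(32.5 * 1.6 * 1.6)   # ~83 Mpx
```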
My current camera is a Canon EOS R7 with 32.5 Mpx on a 22.2 x 14.8mm APSC sensor. I devised a simple test to discover whether one of my sharpest lenses could resolve all the detail potentially available from this sensor.
I used the Canon RF 28mm f2.8 lens which, although inexpensive and fitted with three large plastic moulded aspheric elements, is actually one of the sharpest lenses in the RF catalogue.
My standard test chart measures 1200mm on the long side. I attach a millimeter rule to the chart, near the centre. If I make the chart fill the frame we have 1200 lines on the long side of the frame. This has to be projected onto the sensor, which measures 22.2 mm on the long side, giving us a required lens resolution of about 54 lines (technically line pairs) per mm.
The RF 28mm f2.8 is easily able to achieve this.
Next I double the distance between the chart and the focal plane of the camera. Now the millimeter scale represents 2400 lines across the long side of the frame, for a required lens resolution of about 108 lppmm. The RF 28mm f2.8 manages this easily.
Then I increase the camera distance to 3x that which fills the frame with the chart. The lines on the rule now represent 3600 lines across the long side of the frame, requiring a resolution of about 162 lppmm from the lens, which it can manage.
At a distance of 4x the original we are asking the lens for about 216 lppmm, which it cannot manage. All the lines are mushed together into a blur.
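The test arithmetic is the same division at each step; a quick sketch, assuming the 22.2 mm sensor width used above:

```python
chart_lines = 1200        # one line per mm on the 1200 mm chart
sensor_width_mm = 22.2

for distance_multiple in (1, 2, 3, 4):
    lines_on_sensor = chart_lines * distance_multiple
    print(f"{distance_multiple}x distance: "
          f"{lines_on_sensor / sensor_width_mm:.0f} lppmm required")
# -> 54, 108, 162 and 216 lppmm
```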
So the on-sensor resolution of this lens tops out somewhere around
180 lppmm.
Now let us check how much lens resolution our current model cameras require if all the spatial information which the sensor could deliver is actually to be revealed.
The EOS 40D with 10 Mpx has 3888 pixels on the long side
requiring a lens resolution of 175 lppmm, which our RF 28mm f2.8 lens could
just about manage if we could somehow mount it on the 40D.
The EOS R10 has 6000 pixels on the long side of the frame, requiring a lens resolution of 6000 divided by 22.2, which is 270 lppmm.
The EOS R7 has 6960 pixels on the long side of the sensor
requiring a lens resolution of 314 lppmm.
A 40 Mpx Canon APSC sensor would require a lens resolution
of 348 lppmm.
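Setting the roughly 180 lppmm ceiling measured above against those requirements makes the shortfall explicit; another small sketch:

```python
lens_ceiling_lppmm = 180   # rough ceiling measured for the RF 28mm f2.8 above

requirements = {"EOS 40D": 175, "EOS R10": 270, "EOS R7": 314, "40 Mpx APSC": 348}
for body, needed in requirements.items():
    verdict = "within reach" if needed <= lens_ceiling_lppmm else "out of reach"
    print(f"{body}: needs {needed} lppmm -> {verdict}")
# Only the 40D's requirement falls within what this lens can deliver
```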
We have a problem.
Either the anti-aliasing filter on these Canon sensors is substantially reducing the real-world resolution of the sensor, or the lens lacks sufficient resolution, or both.
Which raises the question: why stack so many pixels onto the sensor? Is it just marketing or is there a real advantage?
There is a mismatch between the theoretical resolution of the lens-sensor system and the actual real-world resolution.
We have been seeing this in camera reviews for several years. For instance, when the Fujifilm X100VI arrived in 2024 with a 40 Mpx sensor, reviewers compared its image quality with that of the older X100V with its 26 Mpx sensor, and found no difference in the ability to resolve fine details. In this case the lens appears to be unable to keep up with the sensor.
When I compare the R10 with the R7 on their ability to resolve fine subject details, I find at most a barely detectable difference, even with the best available lenses.
If current and rumored high pixel count camera sensors are to reveal their full potential, maybe we need a new line of ultra high resolution lenses, or perhaps a less resolution-robbing alternative to anti-aliasing filters, especially for APSC models with the highest pixel densities.

It's always dangerous to come to conclusions based on theory without experiments to verify the theoretical findings.
Do you ever see aliasing in your images, either colour moire or the harder to detect luminance aliasing? If you do, you are using a lens that outresolves the sensor. When the sensor has insufficient resolution to resolve spatial frequencies 2x higher than the finest detail produced by the lens, the result will inevitably be aliasing artefacts polluting the accuracy of the data. Luminance aliasing in particular (colour moire is obvious) is not always easy to recognise until you have tuned into what it looks like, but once recognised it can't be unseen.
Jim Kasson's blog is one of the finest technical resources out there for understanding what is going on in digital technology, I recommend it if you can cope with the maths and engineering. He does try to provide layman's explanations as well.
In this post https://blog.kasson.com/the-last-word/sensors-outresolving-lenses/ Jim demonstrates through extensive testing what a GFX100 100MP sensor image looks like with a lens that outresolves the sensor. It looks nice and crisp but the aliasing artefacts in the centre of the Siemens star are very obvious. In the next test, he shows what the image looks like if the lens optical image is softened sufficiently so the sensor can fully resolve it without aliasing artefacts. I think that image might be a shock to people used to seeing incredibly crisp views at 100% pixel peeping.
Jim has extensively tested the finest sensors and lenses and concluded it will take up to 800MP to fully outperform the very best lenses used at their optimal apertures and to remove all traces of aliasing.
Of course, there is no shortage of weak lenses that won't need this kind of sensor resolution, and if you don't shoot the finest lenses at relatively wide apertures, simple diffraction will remove the high frequency detail for you. No aliasing then, but also much reduced detail.
Hi, thanks for the feedback. Andrew