DxO dynamic range vs. ISO comparisons


drmarkf
21st March 2017, 12:44 AM
I hadn't seen this nifty graphic before, but what it does do is illustrate the uncanny work the Sony and Olympus engineers have done to dig performance out of the new E-M1 mkII sensor.

http://photonstophotos.net/Charts/PDR.htm#Nikon%20D500,Olympus%20OM-D%20E-M1,Olympus%20OM-D%20E-M1%20Mark%20II,Sony%20ILCE-7RII

Playing around with this app seems to confirm the approx. 1 stop improvement over the E-M1 mki, the marked step up in recorded information offered by the full-frame sensor of the A7Rii (again, no surprise), and the distinctly modest relative performance of some supposedly industry-leading competitors. I expected the 5D mkiv wouldn't show well, but I was surprised by how closely this little chip matches the D500, D750 and D810.

DxO Mark testing has its problems overall because of how they factor in sensor resolution, among other things, but the dynamic range testing they do must be straightforward and absolute, I think.

I'm not surprised Steve Gosling's landscapes looked so good on Sunday!

pdk42
21st March 2017, 09:51 AM

This is Bill Claff's site. I contributed a bunch of images from my Pen-F so that he could provide data for it. And therein lies a caution: he gets the data by asking people to perform testing according to his method - e.g. http://photonstophotos.net/Collaborations/Dynamic_Range_Collaboration.htm

Of course, this approach has obvious potential for inconsistent results. I haven't looked in detail at whether his subsequent processing of the results might compensate for such variability, but it's a concern nonetheless.

As regards DR being "absolute" - caution is needed here too. By definition DR is the range between the darkest usable area and the brightest.

The bright end is the easiest to be absolute about, since the raw file shows quite clearly when we've hit the limit of the numerical pixel value. But even then, with real sensors, some channels will saturate before others, and a good raw processor can recognise this and make a half-decent job of colour-shifting recovered areas where only partial colour data remains. Most test measurements assume that maximum recorded brightness is reached when any channel clips, but you might get a little more with good highlight recovery in your PP package.
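To make the channel-clipping point concrete, here's a minimal sketch of how a tester might check how much of each raw colour channel has hit the white level. This is my own illustration, not Bill Claff's or DxO's actual procedure; the 12-bit white level, black level and the synthetic patch are all assumed values.

```python
import numpy as np

# Assumed sensor constants for the illustration (not from any real camera).
WHITE_LEVEL = 4095   # clipping value for a 12-bit raw file
BLACK_LEVEL = 256    # black offset

def clipped_fraction(raw_channels):
    """Return the fraction of pixels at the white level for each channel.

    A channel counts as clipped where its raw value has reached the
    white level; with a neutral daylight subject the green channel
    typically gets there first, which is why partial highlight
    recovery (rebuilding colour from the surviving channels) is
    possible at all.
    """
    return {name: float(np.mean(data >= WHITE_LEVEL))
            for name, data in raw_channels.items()}

# Synthetic overexposed patch: green clips, red and blue still hold detail.
rng = np.random.default_rng(0)
patch = {
    "R": rng.normal(3500, 30, (64, 64)).clip(BLACK_LEVEL, WHITE_LEVEL),
    "G": rng.normal(4200, 30, (64, 64)).clip(BLACK_LEVEL, WHITE_LEVEL),
    "B": rng.normal(3000, 30, (64, 64)).clip(BLACK_LEVEL, WHITE_LEVEL),
}
print(clipped_fraction(patch))   # e.g. {'R': 0.0, 'G': 1.0, 'B': 0.0}
```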

For shadows, the question becomes: what counts as a usable dark area? As the light drops the noise rises, of course - but at what point does it become unusable? For some (e.g. DxO) it's when the signal-to-noise ratio falls to 1:1, i.e. as much noise as signal. I doubt many would consider that "usable", at least not if there's any image detail in there. In any case, since there's no formal definition of what constitutes the low end of the DR scale, it's hard to compare measurements across different testers, and harder still across different sensor sizes.
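As a rough illustration of how much the choice of noise floor matters, here's a back-of-envelope sketch of the usual dynamic-range arithmetic (DR in stops = log2 of saturation signal over the shadow floor). The full-well and read-noise figures are invented for the example and don't describe any particular camera.

```python
import math

# Assumed figures for the illustration only (not any real camera).
FULL_WELL_E  = 25_000   # saturation signal, in electrons
READ_NOISE_E = 3.0      # read noise, in electrons

def dr_stops(snr_threshold):
    """Dynamic range in stops for a chosen minimum acceptable SNR.

    In the deep shadows the noise is dominated by read noise, so the
    signal needed to reach the threshold is roughly
    snr_threshold * read_noise; raising the threshold raises the
    'usable' floor and shrinks the measured DR.
    """
    shadow_floor = snr_threshold * READ_NOISE_E
    return math.log2(FULL_WELL_E / shadow_floor)

for threshold in (1, 4, 20):      # 1:1 is the DxO-style engineering cut-off
    print(f"SNR >= {threshold}:1  ->  {dr_stops(threshold):.1f} stops")

# SNR >= 1:1 gives ~13.0 stops with these numbers; demanding SNR >= 20:1
# drops the same sensor to ~8.7 stops. The sensor hasn't changed, only the
# definition of 'usable', which is why figures from different testers
# aren't directly comparable.
```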

Bottom line for me is that these tests are useful as a comparative measure within a single test site, but not otherwise.

drmarkf
21st March 2017, 10:38 AM
Absolutely fascinating. I never cease to be amazed at the engagement, knowledge and skill of people on this site!

I'll temper my absolutism a bit, although this sort of testing is at least an attempt at scientific analysis with control of some of the important variables. Very refreshing in comparison with the usual opinion-based stuff one reads 90% of the time.