For film, it's easy to measure ISO since all you do is determine the exposure required to build a particular density on the emulsion after development. Of course, you need to calibrate the development process, the measuring equipment and so on, but that's all quite doable. For digital, though, it's not nearly so clear, since the final image brightness can be manipulated within very wide margins after exposure. There are at least three approaches you could take:
- The exposure needed to saturate the sensor (clipping) can be used as the baseline. The ISO is then (somewhat arbitrarily) calculated from the exposure needed to reach a particular fraction of pixel saturation, using the light reflected from an 18% grey card at a given illumination level (i.e. what your Weston Master would tell you!). The standard calls for that 18% grey to land at 12.7% of pixel saturation, which leaves roughly half a stop of headroom above a 100% diffuse white before clipping (see the sketch after this list).
- The level of noise that is acceptable in the final image. In this test, the S/N ratio of the image is used to determine what counts as an acceptable exposure. Two S/N thresholds are used - 40:1 and 10:1 - both arrived at by subjective evaluation of prints at a particular resolution and viewing distance.
- A so-called Standard Output Sensitivity (SOS) test, which measures the exposure needed to produce a specific digital level in an RGB file with a specific gamma. This includes all the processing the camera applies to the file, so it can be thought of as a "JPEG" measure.
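To make the arithmetic concrete, here is a minimal sketch of the three calculations, assuming the formulas as given in ISO 12232: saturation-based speed S = 78/H_sat, noise-based speed S = 10/H at the target S/N, and SOS S = 10/H_sos, where the exposure H is in lux-seconds at the sensor. The function names and input values are purely illustrative, not measurements from any real camera.

```python
# Sketch of the three ISO 12232-style speed calculations described above.
# Exposures H are in lux-seconds at the focal plane; the inputs below are
# made-up numbers chosen only to show the arithmetic.

def saturation_speed(h_sat):
    """Saturation-based speed: S = 78 / H_sat.
    The factor 78 places an 18% grey target at 10/78 ~= 12.8% of
    saturation, i.e. about half a stop of headroom above diffuse white."""
    return 78.0 / h_sat

def noise_based_speed(h_at_target_snr):
    """Noise-based speed: S = 10 / H, where H is the exposure at which the
    image reaches the target S/N (40:1 or 10:1)."""
    return 10.0 / h_at_target_snr

def standard_output_sensitivity(h_sos):
    """SOS: S = 10 / H_sos, where H_sos is the exposure that produces a
    digital level of 118 in an 8-bit sRGB output file."""
    return 10.0 / h_sos

def srgb_encode(linear):
    """sRGB transfer curve, included only to show why the SOS target level
    is 118: it is 18% grey pushed through the sRGB gamma."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

if __name__ == "__main__":
    print(saturation_speed(0.78))            # 100.0 - sensor clips at 0.78 lx*s
    print(noise_based_speed(0.125))          # 80.0  - S/N hits 40:1 at 0.125 lx*s
    print(standard_output_sensitivity(0.1))  # 100.0 - level 118 at 0.10 lx*s
    print(round(srgb_encode(0.18) * 255))    # 118   - 18% grey as 8-bit sRGB
```

Note how the same camera setting can legitimately yield different numbers depending on which definition is used, which is part of why quoted ISO values vary between manufacturers.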
Most manufacturers will use some combination of the above to arrive at a number, but there are other complications such as spectral characteristics and white balance settings which should also be taken into account.
The bottom line is that measuring ISO is complex and trying to claim that manufacturers are in some way misleading us is, well, misleading!