Ah yes; put simply, with CR the "magic rays" have to penetrate more layers in the detector stack than they do with DR, which came along later.

That also probably explains the tendency of radiographers to bump up their exposure factors - the phenomenon known as “exposure creep”. In the early days of CR, radiologists often complained of poor quality ("under-exposed") images, so radiographers got used to avoiding grief by cranking up the factors. Unfortunately, the resulting higher patient doses never seemed to count for much with anyone.

The technical reason for those "poor quality" images was actually the broad dynamic range of digital detectors compared with traditional film: where under-exposed film simply came out pale, an under-exposed digital plate still yields a viewable image, but one degraded by visible noise (quantum mottle) - and that noise can be reduced (never entirely eliminated) by increasing the exposure factors.
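To put rough numbers on the trade-off (just a back-of-envelope sketch, not measurements from any real system): quantum noise follows Poisson statistics, so the signal-to-noise ratio only improves with the square root of the exposure, while patient dose rises in direct proportion to it. A few lines of Python (with a made-up "mas_factor" standing in for the relative mAs) make the point:

    import math

    def relative_snr(mas_factor):
        # Poisson statistics: SNR scales with the square root of the exposure (mAs),
        # while patient dose scales linearly with it.
        return math.sqrt(mas_factor)

    for factor in (1, 2, 4):
        print(f"{factor}x mAs -> dose x{factor}, SNR x{relative_snr(factor):.2f}")

In other words, quadrupling the dose only doubles the SNR - which is why exposure creep is such an expensive way of buying image quality.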

Also, of course, with CR the cassettes have to be read out as a separate process, introducing yet another opportunity for loss of image definition (which may be overcome, yet again, by cranking up the exposure factors). To my mind the unavoidable - and undesirable - increase in patient doses that all this leads to is a good (?) example of the "Law of Unintended Consequences"!

Lastly, it would be nice if someone could produce some data showing whether patient doses have actually decreased now that the latest DR technology is in use.


If you don't inspect ... don't expect.