Sensor characteristics (gain, read noise, dark current, full well capacity, dynamic range, and quantum efficiency) can be derived from noise analysis. This is made possible by the unique noise signature of collecting light itself: the noise in the light collected is the square root of the number of photons recorded. So to derive sensor performance, an analyst can measure the noise in a group of pixels covering a uniformly lit area and, by squaring that noise (the standard deviation), derive the average photon count per pixel. Do this for a range of intensities, and what is called the Photon Transfer Curve may be derived. From the numbers recorded in the raw data file, the gain, the conversion from a digital number to photon count, can be derived. There is also a conversion gain: usually one detected photon produces one electron recorded. Sometimes the conversion gain is 2 or 4, so this must be derived too, but it becomes clear in a digital camera when stepping through ISOs and making a series of measurements at each ISO. With zero light on the sensor, the read noise plus downstream electronics noise, in electrons, can be derived. It is elegantly simple and works very well in practice if one has access to true raw data.
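The photon transfer method above can be sketched in a few lines. This is a minimal simulation, not a real camera measurement: the gain and read noise values are assumed for illustration, and synthetic Poisson/Gaussian noise stands in for real raw frames.

```python
import numpy as np

rng = np.random.default_rng(42)
true_gain = 2.0        # electrons per data number (DN); an assumed value
read_noise_e = 4.0     # read noise in electrons; an assumed value

def simulate_flat(mean_electrons, shape=(512, 512)):
    """A uniformly lit raw frame: Poisson photon noise plus Gaussian read noise."""
    electrons = rng.poisson(mean_electrons, shape) + rng.normal(0, read_noise_e, shape)
    return electrons / true_gain   # the camera converts electrons to DN

# Measure mean signal and noise variance at several light levels.
means, variances = [], []
for level in (200, 1000, 5000, 20000):      # mean electrons per pixel
    frame = simulate_flat(level)
    means.append(frame.mean())
    variances.append(frame.var())

# For photon-noise-limited data, variance_DN = mean_DN / gain + constant,
# so the slope of variance vs. mean gives 1/gain.
slope, intercept = np.polyfit(means, variances, 1)
derived_gain = 1.0 / slope                   # electrons per DN

# Read noise from a zero-light frame, converted to electrons:
bias = rng.normal(0, read_noise_e, (512, 512)) / true_gain
derived_read_e = bias.std() * derived_gain

print(f"derived gain: {derived_gain:.2f} e-/DN (true {true_gain})")
print(f"derived read noise: {derived_read_e:.2f} e- (true {read_noise_e})")
```

On synthetic data the fit recovers the assumed gain and read noise almost exactly, which is the point of the text above: the method is elegantly simple when the data are truly raw.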
But it is becoming increasingly evident that the raw data recorded in digital cameras are not truly raw, and this has significant consequences. Say the raw data are filtered and the measured noise is reduced by a factor of two. Then the derived sensor characteristics are in error: dynamic range appears improved, and low-level noise looks lower compared to a camera that has no filtering. Review sites may declare one camera better than another based on unequal data, and consumers may make a choice based on that unequal data.
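To make that error concrete, here is back-of-the-envelope arithmetic with assumed illustrative numbers (a 2 e-/DN gain, 60,000 e- full well, 4 e- read noise; no specific camera). How the distortion shows up depends on whether the analyst derives the gain from the same filtered data or applies a known gain:

```python
# Illustrative arithmetic only: how a 2x noise reduction from in-camera
# filtering distorts photon-transfer-derived specs. All numbers are assumed.
gain = 2.0             # true gain, electrons per DN
full_well_e = 60000.0  # true full well, electrons
read_noise_e = 4.0     # true read noise, electrons

signal_dn = full_well_e / gain        # signal near full well, in DN
shot_var_dn = full_well_e / gain**2   # true shot-noise variance, DN^2
read_dn = read_noise_e / gain         # true read noise, DN

# Filtering halves the measured noise (std), so variance drops 4x.
filt_var_dn = shot_var_dn / 4
filt_read_dn = read_dn / 2

derived_gain = signal_dn / filt_var_dn        # 4x the true gain
derived_full_well = signal_dn * derived_gain  # 4x the true full well
derived_dr = derived_full_well / (filt_read_dn * derived_gain)
true_dr = full_well_e / read_noise_e

# If the analyst instead applies the true gain to the filtered dark frames,
# read noise simply appears halved:
apparent_read_e = filt_read_dn * gain

print(f"derived gain: {derived_gain} e-/DN (true {gain})")
print(f"derived full well: {derived_full_well} e- (true {full_well_e})")
print(f"apparent read noise: {apparent_read_e} e- (true {read_noise_e})")
print(f"dynamic range inflation: {derived_dr / true_dr}x")
```

With these assumptions the derived gain and full well are inflated four-fold, the read noise appears halved, and the dynamic range looks twice as large as it really is.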
I want to be clear that none of this is cheating by the camera manufacturers. There is no law that "raw" data must be truly raw. Camera manufacturers are trying to deliver cameras that make great images, and they have decided to implement processing that, in their assessment, delivers the best images they can. But filtering can have detrimental consequences for image quality. I have yet to see a filtering algorithm that works perfectly in all situations. I prefer a camera with true raw data; then I can choose a specific noise filtering algorithm, and settings for that algorithm, best tuned for a particular image.
If the raw data are filtered, the derived sensor specs will look better without necessarily improving image quality over a photographer custom-tuning a filtering algorithm in a raw converter. In the future, a better filtering algorithm may be invented, but if the raw data are already filtered, you may not be able to use the new algorithm effectively.
Another common problem with review sites: when different cameras are compared using image results that have been put through a raw converter, the results can be wildly different. The same raw converter settings may be used on each camera's images, but those settings are not necessarily optimum for every camera. For example, see this post on dpreview, which optimizes the raw converter settings and completely changes the impressions of the cameras: Comparisons: Are they kidding?
There are multiple methods people use to try to detect filtering. Commonly a Fast Fourier Transform (FFT) analysis is done, but FFTs do not detect all filtering. One form of filtering that FFT analysis does not necessarily reveal is the clipping now found in the raw data of multiple cameras. Bernard Delley (I believe) first showed this: hot-pixel suppression in D800 and D4, and Mark Shelly has further refined the analysis: AA filter, spatial filter and star colours.
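As a rough illustration of what an FFT analysis looks for: unfiltered photon and read noise is spatially white, so its power spectrum is flat; spatial smoothing suppresses high spatial frequencies. This is a minimal sketch on synthetic noise (a 3x3 box filter standing in for in-camera smoothing), not any camera's actual processing.

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.normal(0.0, 10.0, (512, 512))   # unfiltered noise is spatially white

# Stand-in for mild in-camera smoothing: a circular 3x3 box filter.
filtered = sum(np.roll(np.roll(raw, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0

def hf_lf_ratio(frame):
    """Noise power far from DC divided by power near DC.
    Close to 1 for white noise; well below 1 if high frequencies were smoothed."""
    p = np.abs(np.fft.fftshift(np.fft.fft2(frame - frame.mean())))**2
    n = frame.shape[0]
    c, w = n // 2, n // 8
    low = p[c - w:c + w, c - w:c + w].mean()     # low spatial frequencies
    outer = np.ones(p.shape, dtype=bool)
    outer[c - 3*w:c + 3*w, c - 3*w:c + 3*w] = False
    high = p[outer].mean()                       # highest spatial frequencies
    return high / low

r_raw = hf_lf_ratio(raw)
r_filt = hf_lf_ratio(filtered)
print(f"white noise ratio: {r_raw:.2f}")   # close to 1
print(f"filtered ratio:    {r_filt:.2f}")  # far below 1
```

By contrast, a clipping step that replaces only a handful of outlier pixels barely changes this spectrum, which may be why clipping can escape FFT-based detection.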
Questions: what other filtering could have escaped detection so far, and what effect could that have on sensor analyses? I ask because, looking at short exposures of light frames, the raw data from some cameras show texture that looks odd (sometimes wormy), indicating either spatial filtering or an odd sensor response. The bottom line may be that with filtering, sensor noise analyses yield 1) an inaccurate sensor gain, 2) a read noise that appears lower, and 3) a dark current that appears lower. I am also skeptical because some of the derived gains do not follow expected trends in newer-generation sensors. For example, the derived specs on the Nikon D850 are lower than those of earlier cameras. Is that true, or is there less/no spatial filtering, so the derived sensor characteristics are more realistic?
The bottom line: question cameras that show great sensor specs, and see if the results pan out in real-world imaging. I will personally avoid cameras with demonstrated filtering. What matters in the end is what kind of image you can produce and whether it meets your standards. If you process raw data, produce the best image you can out of the raw converter. This may mean different settings in the raw converter for different cameras, as in the post Comparisons: Are they kidding? Given internet-reported issues, different strategies in the raw converter can sometimes mitigate those problems, and one can still make great images with the camera.
Below is a list of links and discussions on filtering found in various cameras.
What effect filtering has depends on your application. In some situations it may help an image, but in others, especially images with small details, artifacts may appear that are difficult or impossible to remove. Personally, I will choose cameras with minimal or no detectable raw data filtering, because I want to choose the noise reduction in post-processing that benefits my image the most.
To date, I have seen no evidence of significant raw data filtering in Canon cameras. Some Canon cameras have banding issues at low ISOs (2018 models and earlier; often worse in earlier cameras), and banding can be difficult to fix with filtering. Cameras with larger pixels from all manufacturers tend to show some fixed-pattern/banding issues at low ISOs.
The Shelly clipping analysis shows many artifacts, especially on stars, but the effects will be similar on other small, bright subjects: catchlights in eyes, glints off small objects, backlit subjects (e.g. water drops, water spray), or distant lights in a night city photo.
Summary of the AA filter, spatial filter and star colours Shelly clipping analysis (cloudynights.com thread):
There appear to be two clipping algorithms:
Other problems / filtering detected:
- Nikon Z7: horizontal banding due to PDAF pixels.
- The Sony "Star Eater" problem.
- Nikon D850 vs Canon 6D2: https://www.dpreview.com/forums/post/62148439. What surprises me in the Nikon image: 1) many stars are turned green, implying filtering of the raw data; 2) many bright stars are gone or turned to colored mush (red, magenta, and blue mush), which also implies significant filtering. Is the filtering in the camera, or in the raw converter?
- Astrophotography: Dark Subtraction and Spatial Filtering
- Spatial filtering detected in Panasonic Lumix DC-S1R raw data https://www.dpreview.com/forums/thread/4379863 (April, 2019)
Procedures for performing this analysis are described in:
Procedures for Evaluating Digital Camera Noise, Dynamic Range, and Full Well Capacities; Canon 1D Mark II Analysis