

Defective Pixels - what they are, what can be done ?

All CCD / CMOS camera sensors will suffer from some defective pixels. Given the size of the sensor array plus the almost unbelievable pixel counts, this is inevitable (if they are to be sold at an unbelievably low cost).

The most common defect is the non-functional or 'dark' pixel. Needless to say, in astro-imaging this type of defect is of little or no consequence.

However the other common type of defect, pixels that are 'stuck' permanently 'on' (i.e. 'shorted out'), known as 'hot' pixels, is a disaster (since you could be misled into thinking you have just discovered a new supernova :-) )

The final type of pixel 'error' is one that is excessively 'sensitive' (so keeps running into saturation) or one that is very insensitive (so returns very low values).

The above might give the impression that defects are limited to single pixels - this is rarely the case. Defects often come in 'clumps': one or two 'hot' pixels might be 'surrounded' by a few 'over sensitive' ones, and a 'dark' pixel might be surrounded by insensitive ones. Fortunately, it is relatively rare for defects to spread beyond directly adjacent pixels.

What can be done ?

A1. Dark pixels are typically simply ignored.

A2. To eliminate 'hot' pixels, Dark Frame subtraction is performed (in RAW mode). All 'hot spots' will be eliminated (since they will be stuck at 100% & thus 100% subtracted from themselves).

A3. Variations in pixel sensitivity can be corrected to some extent by the use of 'flats' (a sketch of both corrections follows below).
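
The sketch below combines A2 and A3, assuming the Light, Dark and Flat frames have already been decoded from RAW into plain 16-bit numpy arrays (the decoding itself, and any averaging of multiple darks / flats, is not shown):

import numpy as np

def calibrate(light, dark, flat):
    # Work in float to avoid wrap-around when subtracting 16-bit integers
    light = light.astype(np.float32)
    dark = dark.astype(np.float32)
    flat = flat.astype(np.float32)

    # A2: Dark Frame subtraction - a 'hot' pixel stuck at the same value in
    # both frames cancels to (near) zero
    corrected = light - dark

    # A3: divide by the normalised Flat to even out pixel-to-pixel sensitivity
    gain = (flat - dark) / np.mean(flat - dark)
    corrected = corrected / np.clip(gain, 1e-3, None)

    return np.clip(corrected, 0, 65535).astype(np.uint16)

(In practice the Flat would be corrected with its own matching dark; a single dark is used here only to keep the sketch short.)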

However no amount of processing with Dark Frames (or Flats) can do anything to 'replace' the eliminated 'hot spot' or any of the 'missing' dark pixels with 'real data' - and this has serious implications for Colour Imaging.

Bad pixels, what are the implications for Colour Imaging ?

Dark Frame subtraction will remove 'hot spot' pixels, turning them into 'darks'.

Unfortunately, the generation of an RGB image pixel involves processing all 4 sub-pixels in the 2x2 Bayer Matrix array: to derive each R, G & B component, the overall pixel 'intensity' is first found by processing all 4 sub-pixels .. then the 4 individual sub-pixels (R, 2xG & B) are examined to derive the colour 'balance'.

Thus one 'hot spot' (or dark pixel) in any 2x2 Bayer array will 'contaminate' the whole RGB pixel ... first about 25% of the intensity is lost, then, because the RGB 'balance' is lost (a dark sub-pixel will 'take out' the entire R or B, or half the G), a false colour is generated.
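
To make the 'contamination' concrete, here is a toy numerical example (illustration only - the simple sum-and-ratio scheme below just follows the description above, it is not any camera's actual demosaic algorithm):

def rgb_from_bayer(r, g1, g2, b):
    intensity = r + g1 + g2 + b        # overall 'intensity' from all 4 sub-pixels
    green = (g1 + g2) / 2.0            # the two green sub-pixels are averaged
    return (r, green, b), intensity

print(rgb_from_bayer(100, 120, 118, 90))   # ((100, 119.0, 90), 428) - a sensible grey-ish pixel
print(rgb_from_bayer(100, 120, 118, 0))    # ((100, 119.0, 0), 338)  - intensity down ~20%, blue lost entirely, so a false colour results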

How can we minimise the loss of image detail ?

Another problem is (of course) that there is normally no way to tell whether some part of the image being observed has fallen on a defective pixel - and whilst a single 'dark' pixel in the midst of a galaxy image may not be noticed at first glance, careful examination might lead to a lot of erroneous conclusions re: black holes or similar :-)

One trick that can be pulled when using modern 'stacking' software, which is able to re-align frames, is to (very) slightly reposition (shift or twist) the CCD between shots. This will ensure that image elements such as individual stars etc. will not fall on exactly the same hot spots / dark pixels in every exposure. After stacking, elements of the image that would otherwise be 'invisible' due to falling on defective pixels will now be seen.
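
A minimal sketch of the stacking step, assuming the individual frames have already been registered (aligned) by the stacking software - the registration itself is not shown:

import numpy as np

def median_stack(aligned_frames):
    # 'aligned_frames' is assumed to be a list of already-registered 2-D arrays.
    # Because the sensor was shifted between shots, each defect falls on a
    # different part of the image, so the median simply votes the bad values out.
    stack = np.stack([f.astype(np.float32) for f in aligned_frames])
    return np.median(stack, axis=0)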

Defect mapping

One way to locate defective pixels is to generate a 'Flat', exposed at about 50% intensity. The RAW Flat can then be examined manually to locate both 'dark' and 'hot' pixels (i.e. those well below 50% and those well above 50%).

Of course, it's much easier to perform this in software, especially if manual control of the lower / upper thresholds is possible.

The row / column position of the defective pixels can then be recorded and a 'badpixels' file generated.
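
A minimal sketch of that software step, assuming the ~50% Flat has already been decoded into a 2-D numpy array. The thresholds below are only illustrative, and the three-column 'column row date' layout is the one dcraw's '.badpixels' file is understood to expect (a date of 0 meaning 'always bad'):

import numpy as np

def write_badpixels(flat, filename="badpixels.txt", low=0.25, high=0.75):
    # Normalise so a correctly exposed pixel sits near 0.5, then flag anything
    # well below (dark / insensitive) or well above (hot / over-sensitive) that.
    norm = flat.astype(np.float32) / float(flat.max())
    rows, cols = np.where((norm < low) | (norm > high))
    with open(filename, "w") as f:
        for r, c in zip(rows, cols):
            f.write(f"{c} {r} 0\n")        # column, row, date (0 = always bad)
    return len(rows)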

How is defect repair / replacement possible ?

[Image: the central 2x2 Bayer Matrix (dark shading) with its defective blue sub-pixel, surrounded by its 4 'nearest neighbour' 2x2 arrays (light shading)]

Let us assume, in the example left, the central 2x2 Bayer Matrix (dark shading) has a 'bad' blue sub-pixel (the bold 'b' with 'strike out'). The repair process uses the blue sub-pixels from the 4 "nearest neighbour" 2x2 Bayer Matrix arrays (the bold 'B' pixels in the light shaded sets) to generate a replacement for the bad 'b'. The values of the 4 'B' sub-pixels can simply be added together and the total divided by 4 to obtain a replacement b.
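
In code, the repair described above might look like this (a sketch only, operating on the un-demosaiced sensor data, which is assumed to be a 2-D numpy array; in a Bayer mosaic the four nearest same-colour neighbours sit two photosites away up / down / left / right):

def repair_pixel(raw, row, col):
    # Average the 4 nearest SAME-colour sub-pixels (2 photosites away), skipping
    # the immediate next-door pixels for the reasons given in the next paragraph.
    values = []
    for dr, dc in ((-2, 0), (2, 0), (0, -2), (0, 2)):
        r, c = row + dr, col + dc
        if 0 <= r < raw.shape[0] and 0 <= c < raw.shape[1]:
            values.append(float(raw[r, c]))
    raw[row, col] = int(round(sum(values) / len(values)))
    return raw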

Note that the sub-pixels immediately 'next door' (Rgr G g RGR) to the defective one are not used in the repair process. Of course they are the 'wrong' colour, however since they are the closest 'spatially' you might expect their 'average intensity' to be used in some way. However, semi-conductor device manufacturing defects typically form 'localised cluster defects'. So, chances are, the next door pixels will be affected by the defect and, to avoid any 'blooming' or 'bleed over' caused by a 'hot' defective 'b', the immediate next door pixels are avoided (in fact, it's likely that one or more of the 'next door' pixels will also be defective and have to be 'repaired' in turn).

Software promising to replace bad pixels

A1. UFraw (using dcraw)

If your own generated '.badpixels' file is found, the underlying code (dcraw) will (it claims) replace the selected pixels. Although UFraw may do this before RGB processing is imposed, it's virtually impossible to tell what has actually occurred, because it's not possible to extract the data as 16-bit TIFF or FITS from UFraw (the only TIFF you can get is 8-bit RGB, which may be better than .jpg, but not by much).

A2. Adobe DNG Converter

Whilst you can't specify your own list of bad pixels, it is understood that some sort of 'automatic' bad pixel detection and replacement is performed on every incoming RAW image before it is saved. The problem, however, is that it's anyone's guess how Adobe decides the difference between a 'hot' pixel and a single bright star .. or how it can tell the difference between a 'dark' pixel and an asteroid .. so you will never know if Adobe DNG has 'filtered out' your discovery of a 'supernova' or of an asteroid by 'occultation'.

This 'feature' is not well known, cannot be disabled, and the algorithm is both hidden and proprietary. One only has to wonder what else Adobe does to corrupt the RAW data before saving it. Steer well clear.

Next page :- Astro image processing
