
Monochrome



cjackson
2012-Aug-09, 06:41 PM
Why are NASA's images of celestial objects monochromatic? Does it offer an advantage over color photography?

Shaula
2012-Aug-09, 06:48 PM
Better contrast for a given exposure. Multispectral CCDs tend to have lower resolution, or they require more complex filter systems whose images are not quite coincident in time. Panchromatic is generally easier and more robust.

NEOWatcher
2012-Aug-09, 06:49 PM
Bandwidth.

Best to get a quicker first look before making pretty pictures.

Ara Pacis
2012-Aug-09, 07:35 PM
Color CCDs have a filter mosaic over the photosites, creating three different color sensors in one array (the Bayer pattern (http://en.wikipedia.org/wiki/Bayer_pattern), for example), which works well enough with bright objects on Earth. Some more advanced cameras on Earth use three CCDs behind a prism, but that cuts down on light transmission. For space observation, with so many distant and therefore small targets, it's better to use the sensor's full resolution for each exposure and take multiple exposures through different filters.
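To illustrate the filter-wheel approach, here is a rough numpy sketch (the frame data, the percentile stretch, and the function name are made up for illustration; this is not any mission's actual pipeline) of stacking three separately filtered, full-resolution monochrome exposures into a color composite:

import numpy as np

def combine_filtered_frames(red, green, blue):
    # Each input frame is a full-resolution monochrome exposure taken through one filter.
    def stretch(frame):
        lo, hi = np.percentile(frame, (1.0, 99.5))  # clip outliers, then scale to 0..1
        return np.clip((frame - lo) / (hi - lo), 0.0, 1.0)
    # Every channel keeps the sensor's full pixel grid, unlike a Bayer mosaic.
    return np.dstack([stretch(red), stretch(green), stretch(blue)])

# Stand-in data: three 512x512 exposures that would really come from three filtered shots.
rng = np.random.default_rng(0)
frames = [rng.poisson(50, (512, 512)).astype(float) for _ in range(3)]
rgb = combine_filtered_frames(*frames)
print(rgb.shape)  # (512, 512, 3)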

ngc3314
2012-Aug-11, 04:22 PM
Furthermore, the best data product for the science goals often uses specialized filters (narrowband, tuned to a redshifted spectral feature, or outside the visible spectral range) which would not be possible with a typical Bayer-mask sensor. It is often important to get as much signal as possible in a given time, so the detector setups are optimized for this.
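For example (the redshift value below is arbitrary), choosing a narrowband filter for a redshifted emission line is just a matter of scaling the rest wavelength by (1 + z):

# Rest-frame H-alpha is at about 656.3 nm; at redshift z it shows up at 656.3 * (1 + z).
halpha_rest_nm = 656.3
z = 0.05                      # placeholder redshift, purely for illustration
observed_nm = halpha_rest_nm * (1.0 + z)
print(round(observed_nm, 1))  # ~689.1 nm, so you'd want a filter centered near 689 nm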

There are a few instruments that manage multiple wavelength bands at once, such as the dichroic beamsplitter on GALEX, which delivers near- and far-ultraviolet images simultaneously. X-ray detectors can get reasonable precision in energy as well as position for each photon, and there has been some progress on small arrays of optical detectors that give similar data - so far good enough for color, but not really for pixel-by-pixel spectra.
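As a toy illustration of the per-photon energy idea (the event list below is random and the band edges are placeholders, not any instrument's real calibration), binning an X-ray event list of positions and energies into one image per energy band looks roughly like this:

import numpy as np

def events_to_band_images(x, y, energy, shape, bands):
    # One image per (lo, hi) energy band, built up from individual photon events.
    images = []
    for lo, hi in bands:
        sel = (energy >= lo) & (energy < hi)
        img, _, _ = np.histogram2d(y[sel], x[sel], bins=shape,
                                   range=[[0, shape[0]], [0, shape[1]]])
        images.append(img)
    return images

# Fake event list: 10000 photons with random positions and a toy exponential spectrum (keV).
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 256, 10000), rng.uniform(0, 256, 10000)
energy = rng.exponential(2.0, 10000)
soft, hard = events_to_band_images(x, y, energy, (256, 256), [(0.5, 2.0), (2.0, 10.0)])
print(soft.sum(), hard.sum())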

cjameshuff
2012-Aug-11, 05:00 PM
As mentioned, typical single-sensor color cameras use a Bayer-pattern filter that tiles red, green, and blue filters over the sensor pixels. The resolution for each color channel is thus lower than the actual sensor resolution: typically half the number of pixels for green and one quarter for red and blue. The three channels are also not aligned with each other, so processing to combine them into a single image can leave colored fringes. The resulting color image is a stack of three scaled-up single-color images rather than three full-resolution ones. That's generally a decent compromise for consumer photography, but not ideal for scientific imagery.
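For a concrete sense of that resolution loss, here is a small numpy sketch (assuming the common RGGB cell layout; real raw processing also involves demosaicing, which isn't shown) that just pulls the raw samples for each color out of a Bayer mosaic:

import numpy as np

def split_bayer_rggb(mosaic):
    # In an RGGB mosaic, each 2x2 cell has one red, two green, and one blue photosite.
    r  = mosaic[0::2, 0::2]
    g1 = mosaic[0::2, 1::2]
    g2 = mosaic[1::2, 0::2]
    b  = mosaic[1::2, 1::2]
    return r, np.concatenate([g1.ravel(), g2.ravel()]), b

sensor = np.arange(64, dtype=float).reshape(8, 8)  # stand-in 8x8 sensor readout
r, g, b = split_bayer_rggb(sensor)
print(sensor.size, r.size, g.size, b.size)  # 64 photosites -> 16 red, 32 green, 16 blue samples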

Sensitivity is also reduced because the filters block light outside their bands, and the Bayer mask interferes with using other filters to produce images outside the visible range or in narrower bands within it. MSL has cameras whose Bayer filters are specially designed to be transparent outside the visible range so they remain useful for such work, but even these reduce flexibility, not all color cameras have such filters, and within the visible range they still get in the way.

These instruments are generally not taking pictures of fast-moving objects, so there's no real problem with taking three exposures through different filters.