Color balancing techniques

Now that we know how to change the colour balance, how do we know what to actually set it to?

There are a great number of tools and techniques in StarTools that let you home in on a good colour balance. Before delving into them, it is highly recommended to switch 'Style' to 'Scientific (Color Constancy)' during colour balancing, even if that is not your preferred style for rendering the colour of the end result. This is because the Color Constancy feature makes it much easier to colour balance by eye, thanks to its ability to show continuous, constant colour throughout the image. Once a satisfactory colour balance is achieved, feel free to switch to any alternative style of colour rendering.


White reference by clicking a pixel

If you know that a particular pixel or area in your image is supposed to be a shade of neutral white or gray, simply clicking on it is sufficient to let StarTools compute the right Red, Green and Blue bias settings to make that pixel appear neutral. This technique is particularly useful if you have a star of spectral type G2V (sun-like) in your image. The reasoning is that the sun is the perfect daylight white reference, and so any similar star elsewhere in the galaxy should be too.
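The underlying correction can be sketched in a few lines. This is a minimal numpy illustration of the principle (not StarTools' actual implementation): the bias for each channel is chosen so that the sampled pixel's channels all land on their common mean, making it neutral gray.

```python
import numpy as np

def neutralize_bias(pixel_rgb):
    """Return per-channel bias multipliers that turn the sampled
    pixel into a neutral gray (all channels equal to their mean)."""
    pixel = np.asarray(pixel_rgb, dtype=float)
    return pixel.mean() / pixel

# A slightly green-heavy G2V star pixel...
bias = neutralize_bias([180.0, 200.0, 160.0])

# ...applied to the whole image, neutralizes that pixel.
corrected = np.array([180.0, 200.0, 160.0]) * bias  # -> [180, 180, 180]
```

Multiplying every pixel in the image by the same bias vector then shifts the entire image by the correction that made the reference pixel neutral.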


White point reference by mask sampling

Some stars put in a mask.
We can calibrate against a big enough population of non-associated foreground stars by putting them in a mask, clicking 'Sample' in the Color module, and then applying the found bias values to the whole image.

Upon launch, or upon clicking the Sample button, the Color module samples whatever mask is set and adjusts the Red, Green and Blue bias settings accordingly (note that the set mask also ensures the Color module only applies its changes to the masked-in pixels!).

We can use this same behaviour to sample larger parts of the image that we know should be white. This method exploits the fact that stars come in all sorts of sizes and temperatures (and thus colours!), and that this distribution is essentially random. If we sample a large enough population, the average star should fall somewhere in the middle. Our sun is a very average star, and its colour is exactly the white reference we're after. Therefore, if we sample a large enough number of pixels containing a large enough number of stars, we should arrive at a good colour balance.

A color balanced M101.
A reasonably good color balance achieved by putting all stars in a mask using the Auto feature and sampling them.

We can accomplish this in two ways: we either sample all stars (but only stars!) in a wide enough field, or we sample a whole galaxy that happens to be in the image (note that the galaxy must be of a certain type and reasonably close to be a good candidate - preferably a barred spiral galaxy much like our own Milky Way).

Whichever you choose, we need to create a mask, so we launch the Mask editor. Here we can use the Auto feature to select a suitable set of stars, or we can use the Flood Fill Brighter or Lasso tool to select a galaxy. Once the selection is made, return to the Color module and click Sample. StarTools will now determine the correct Red, Green and Blue bias values so that the white reference pixels in the mask come out neutral.

To apply the new colour balance to the whole image, launch the Mask editor once more and click Clear, then click Invert to select the whole image. Upon return to the Color module, the whole image will now be balanced by the Red, Green and Blue bias values we determined earlier with just the white reference pixels selected.
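The whole procedure amounts to averaging the masked-in pixels and deriving the bias from that average. A hypothetical numpy sketch of the principle (not StarTools' internals):

```python
import numpy as np

def sample_bias(image, mask):
    """Average the RGB values of all masked-in pixels and return
    the bias multipliers that make that average come out neutral."""
    img = np.asarray(image, dtype=float)       # shape (H, W, 3)
    stars = img[np.asarray(mask, dtype=bool)]  # masked-in pixels, (N, 3)
    mean_rgb = stars.mean(axis=0)
    return mean_rgb.mean() / mean_rgb

# Toy 2x2 image: left column holds "star" pixels, right column background.
image = [[[200, 220, 180], [10, 11, 9]],
         [[190, 210, 170], [12, 10, 11]]]
mask = [[True, False],
        [True, False]]

bias = sample_bias(image, mask)
balanced = np.asarray(image, dtype=float) * bias  # applied to the whole image
```

After applying the bias to the whole image, the average colour of the sampled star population is neutral, which is exactly the "average star is white" assumption at work.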


White balancing in MaxRGB mode

MaxRGB mode showing green channel dominance.
Major green channel dominance in the core points to color imbalance in that area.

StarTools comes with a unique colour balancing aid called MaxRGB. This mode is exceptionally useful if you are trying to colour balance by eye but suffer from colour blindness, or use a screen that is not colour calibrated very well.

MaxRGB showing green dominance removed.
Reducing the green bias has removed green dominance in the core, leaving only spurious/random green dominance due to noise.

The MaxRGB aid allows you to view which channel is dominant per-pixel: if a pixel is mostly red, it is shown as red; if it is mostly green, it is shown as green; and if it is mostly blue, it is shown as blue.
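Per pixel, this rendering is essentially a winner-takes-all over the three channels. A minimal numpy sketch of the assumed behaviour (not StarTools' source):

```python
import numpy as np

def max_rgb(image):
    """Render a MaxRGB-style view: each pixel keeps only its
    dominant channel; the other channels are zeroed."""
    img = np.asarray(image, dtype=float)
    peak = img.max(axis=-1, keepdims=True)    # per-pixel dominant value
    return np.where(img == peak, img, 0.0)    # zero the non-dominant channels

# A green-dominant pixel renders pure green, a red-dominant one pure red.
view = max_rgb([[[30.0, 80.0, 20.0],   # -> [0, 80, 0]
                 [90.0, 40.0, 10.0]]]) # -> [90, 0, 0]
```

Rendered this way, any large-scale bias in the colour balance shows up immediately as a patch of one solid colour, regardless of how subtle the cast is in the normal view.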

Green corrected image.
Switching from MaxRGB mode to Normal mode confirms the image still looks good.

By cross-referencing the normal image with the MaxRGB image, it is possible to find deficiencies in the colour balance. For example, the colour green is very rarely dominant in space (with the exception of highly dominant OIII emission areas in, for example, the Trapezium in M42).

Therefore, if we see large areas of green, we know we have too much green in our image and should reduce the green bias accordingly. Similarly, if we have too much red or blue in our image, the MaxRGB mode will show many more red than blue pixels (or vice versa) in areas that should show an even mix (for example, the background). Again, we then know we should adjust the red or blue bias accordingly.


White balancing by known features and processes

A color balanced M101.
M101 exhibiting a nice yellow core, bluer outer regions, red/brown dust lanes and purple HII knots, while the foreground stars show a good distribution of color temperatures from red to orange, yellow, white to blue.

StarTools' Color Constancy feature makes it much easier to see colours and to spot processes, interactions, emissions and chemical composition in objects. In fact, the Color Constancy feature makes colouring comparable between different exposure lengths and different gear. This allows the user to start spotting colours that repeat in comparable features of similar objects. Such features are, for example, the yellow cores of galaxies (due to the relative over-representation of older stars as a result of gas depletion), the bluer outer rims of galaxies (due to the relative over-representation of bright blue young stars as a result of the abundance of gas), and the pink/purplish HII area 'blobs' in their discs. Red/brown dust lanes (white light filtered by dust) complement a typical galaxy's rendering.

Similarly, HII areas in our own galaxy (e.g. most nebulae), when viewed in StarTools' Color Constancy mode, display the exact same colour signature found in those galaxies: a pink/purple resulting from predominantly deep red Hydrogen-alpha emissions, mixed with much weaker blue/green Hydrogen-beta and Oxygen-III emissions and (more dominantly) reflected blue light from the bright young blue giants that are often born in these areas and shape the gas around them.

Dusty areas where the bright blue giants have 'boiled away' the Hydrogen through radiation pressure (for example, the Pleiades) reflect the blue light of any surviving stars, becoming distinctly blue reflection nebulae. Sometimes gradients can be spotted where (gas-rich) purple gives way to (gas-poor) blue (for example, in the Rosette core), catching this process in the act.

Diffraction spikes, while artefacts, can also be of great help when calibrating colours; their "rainbow" patterns (though skewed by the dominant colour of the star whose light is being diffracted) should show a nice continuum of colouring.

Finally, star temperatures in a wide enough field should be evenly distributed; the numbers of red, orange, yellow, white and blue stars should be roughly equal. If any of these colours are missing or over-represented, we know the colour balance is off.


Colour balancing of data that was filtered by a light pollution filter

Colour balancing of data that was filtered by a light pollution filter is fundamentally impossible; narrow (or wider) bands of the spectrum are missing, and no amount of colour balancing is going to bring them back and achieve proper colouring. A typical filtered data set, when properly colour balanced, will show a distinct lack of yellow and some green. It's by no means the end of the world - it's just something to be mindful of.

Correct colouring may be achieved, however, by shooting deep luminance data with the light pollution filter in place, while shooting colour data without the filter, after which the two are processed separately and finally combined. Colour data is much more forgiving in terms of signal quality and noise; the human eye is much more sensitive to noise in luminance data than it is to noise in colour data. By making clever use of that fact, and by performing some trivial light pollution removal in Wipe, the best of both worlds can be achieved.