Chances are you have used noise reduction at some stage. In astrophotography, the problem with most noise reduction routines is that they have no idea how much worse the noise grain has become (or will become) in your image as you process(ed) it. These routines have no idea how you stretched and processed your image earlier, or how you will in the future. And they certainly have no idea how you squashed and stretched the noise component locally with wavelet sharpening or local contrast optimisation.
In short, the big problem is that separate image processing routines and filters have no idea what came before, nor what will come after, when you invoke them. All pixels are treated the same, regardless of their history (is this pixel from a high-SNR area or a low-SNR area? Who knows?). Current image processing routines and filters are still as 'dumb' as they were in the early 90s. It's still "input, output, next". They pick a point in time, look at the signal and the estimated noise component, and do their thing.
Without knowing how the signal and its noise component evolved to become your final image, trying to, for example, squash noise accurately is fundamentally impossible. What is too much in one area is too little in another, all because of the way prior filters have modified the noise component beforehand. The same is true for applying noise reduction before stretching (e.g. at the linear stage); noise grain is ultimately only a problem once it becomes visible, and at that stage it has not become visible yet.
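To make this concrete, here is a small sketch (not StarTools code; the gamma stretch and all values are illustrative assumptions) of why noise that is uniform and barely visible in linear data surfaces unevenly after stretching:

```python
import numpy as np

rng = np.random.default_rng(0)

def stretch(x, gamma=0.25):
    """A simple gamma stretch, standing in for any non-linear stretch."""
    return np.clip(x, 0.0, 1.0) ** gamma

def noise_after_stretch(level, sigma, n=100_000):
    """Standard deviation of the noise once a flat patch is stretched."""
    samples = level + rng.normal(0.0, sigma, n)
    return stretch(samples).std()

# Identical noise in the linear data (sigma = 0.005 everywhere)...
sigma = 0.005
faint, bright = 0.02, 0.60

# ...ends up several times more visible in the faint area,
# because the stretch's slope is much steeper near zero.
print(noise_after_stretch(faint, sigma))
print(noise_after_stretch(bright, sigma))
```

A noise reduction pass applied uniformly at the linear stage cannot anticipate this difference; one applied after the stretch no longer knows where the difference came from.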
The separation of image processing into dumb filters and objects is one of the biggest problems for signal fidelity in astrophotography image processing software today. It is the sole reason final images are poorer, and learning curves steeper, than they need to be. Without addressing this fundamental problem, "having more control with more filters and tools" is an illusion. The IKEA effect aside, long workflows with endless tweaking and corrections do not make for better images. On the contrary, they make for much poorer images.
Now imagine that every tool, every filter, every algorithm could work backwards from the finished image, tracing signal evolution, per pixel, all the way back to the source signal. That's Tracking!
With Tracking, each module has a much better understanding of how the signal (and its noise component) was stretched and modified, per pixel.
Tracking how you process your data also allows the noise reduction routines in StarTools to achieve superior results.
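The idea can be sketched as follows. This is not StarTools' actual algorithm, but a minimal first-order illustration, with a hypothetical gamma stretch and a simple box-filter denoiser, of how a tracked per-pixel noise estimate could drive noise reduction strength:

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical 1-D "image": a linear ramp plus uniform linear-stage noise.
signal = np.linspace(0.0, 1.0, 512)
sigma_linear = 0.01
image = np.clip(signal + rng.normal(0.0, sigma_linear, signal.size), 0.0, 1.0)

# The stretch the user applied (illustrative).
gamma = 0.3
stretched = image ** gamma

# Tracking: propagate the per-pixel noise estimate through the stretch
# using the stretch's local slope (a first-order approximation).
slope = gamma * np.maximum(image, 1e-6) ** (gamma - 1.0)
sigma_tracked = sigma_linear * slope

# Denoise with per-pixel strength proportional to the tracked noise:
# faint, heavily stretched pixels get smoothed hard, bright pixels barely.
smoothed = np.convolve(stretched, np.ones(9) / 9, mode="same")
weight = sigma_tracked / sigma_tracked.max()
result = weight * smoothed + (1.0 - weight) * stretched
```

A real implementation would propagate the estimate through every operation in the same way (wavelet sharpening, local contrast changes, and so on), composing a per-pixel noise picture across the whole processing history rather than through a single stretch.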
The Stereo 3D module can be used to synthesise depth information based on astronomical image feature characteristics.
The 'WebVR' button in the module exports your image as a standalone HTML file.
You can convert everything you see to a format you find convenient. Give it a try!