Mathematical implementation of the Signal Evolution Tracking engine

Image processing in a traditional application: every step uses the output of the previous step as its input. This workflow shows a sequence where the user subtracts a gradient "g", performs deconvolution on the linear data, and then performs a non-linear gamma correction stretch.
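As a minimal sketch (not actual application code), such a conventional workflow amounts to a straightforward composition of destructive steps, each consuming only the previous result. The array names, gradient model and gamma value below are illustrative assumptions:

```python
import numpy as np

def subtract_gradient(data, g):
    """Remove an estimated background gradient from the linear data."""
    return np.clip(data - g, 0.0, None)

def deconvolve(data):
    """Placeholder for a deconvolution step operating on linear data."""
    return data  # a real implementation would sharpen using a PSF model

def gamma_stretch(data, gamma=0.5):
    """Non-linear stretch; after this, the data is no longer linear."""
    return np.clip(data, 0.0, 1.0) ** gamma

# Each step overwrites the previous result; later steps never see the original.
linear = np.random.default_rng(0).random((64, 64))
gradient = np.linspace(0.0, 0.1, 64)[None, :]
result = gamma_stretch(deconvolve(subtract_gradient(linear, gradient)))
```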

In conventional processing engines, every pixel, as you see it, is the result of the operation that was last carried out (excepting some simple screen stretch capabilities used to visualise linear data). Operations are carried out one after the other and exist in a linear stack (typically accessible via an 'undo' history). The individual operations, however, have no concept of what other operations preceded them, what operations will follow them, or what the result was or will be. Signal flows one way in time: forward. Conventional software does not feed signal back, nor propagate it back and forth, in order to refine the final result of the stack or 'undo' history.
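A hypothetical sketch of such an engine (class and method names are assumptions, chosen for illustration): operations are pushed onto a linear stack, each receiving only the previous output, and the undo history simply pops the last result.

```python
import numpy as np

class ConventionalEngine:
    """Destructive, one-way pipeline: each operation only sees the last result."""

    def __init__(self, linear_data):
        self.history = [linear_data]          # doubles as the 'undo' stack

    def apply(self, operation):
        # The operation receives no context: no knowledge of what came
        # before, what will follow, or what the final result will be.
        self.history.append(operation(self.history[-1]))

    def undo(self):
        if len(self.history) > 1:
            self.history.pop()

    @property
    def current(self):
        return self.history[-1]

engine = ConventionalEngine(np.random.default_rng(1).random((32, 32)))
engine.apply(lambda img: img - img.mean())            # crude background removal
engine.apply(lambda img: np.clip(img, 0, 1) ** 0.5)   # non-linear stretch
engine.undo()                                          # forward-only; no feedback
```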

An "illegal" sequence where deconvolution is erroneously applied after a non-linear gamma correction stretch.

Some software platforms even mistakenly implement astronomical signal processing as a formalised object-oriented framework. An object-oriented approach, by definition, implements strict decoupling of the individual operations, and formalises complete unawareness, on the part of the algorithms contained therein, of where and when in the signal flow they are being invoked. This design completely destroys any ability of such algorithms to know what augmenting data or statistics may be available to them to do a better job. Worse, such software allows for entirely nonsensical signal flows that violate mathematical principles and the physics these principles are meant to model. The result is lower-quality images through less sophisticated (but more numerous) algorithms, rounding errors, user-induced correction feedback loops (invoking another module to correct the output of the last), and steeper learning curves than necessary.
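A sketch of the kind of decoupled interface being criticised (hypothetical names, not any particular product's code): every module implements the same narrow contract and can therefore be chained in any order, including nonsensical ones, with no access to workflow-wide statistics.

```python
import numpy as np
from abc import ABC, abstractmethod

class Operation(ABC):
    """Strictly decoupled module: it sees only a pixel buffer, never the
    workflow it is part of, the operations around it, or their statistics."""

    @abstractmethod
    def run(self, pixels: np.ndarray) -> np.ndarray:
        ...

class GammaStretch(Operation):
    def run(self, pixels):
        return np.clip(pixels, 0, 1) ** 0.5

class Deconvolution(Operation):
    def run(self, pixels):
        return pixels  # placeholder; assumes (incorrectly, here) linear input

# Nothing in this design prevents an 'illegal' ordering:
pipeline = [GammaStretch(), Deconvolution()]
image = np.random.default_rng(3).random((32, 32))
for op in pipeline:
    image = op.run(image)
```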

In StarTools, your image as-you-see-it is the result of an ever-changing equation that is mathematically sound and simplified. This equation shows a sequence where the user subtracts a gradient "g", performs a gamma correction, and performs deconvolution (noting that the deconvolution step is normally considered out-of-sequence). As a result of this equation building, linear versus non-linear processing is completely abstracted away.
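A toy illustration of the equation-building idea (using sympy purely for demonstration; StarTools' actual internals are not public): the workflow is held as a single expression in terms of the source pixel, so a deconvolution requested after a stretch can still be inserted at the mathematically correct, linear point in the equation rather than at the point in time the user asked for it.

```python
import sympy as sp

D, g, gamma = sp.symbols('D g gamma', positive=True)
deconv = sp.Function('deconv')    # stands in for the deconvolution operator

# User actions, in order: subtract gradient, gamma-stretch, then deconvolve.
expr = D - g                      # 1. gradient subtraction (linear domain)
expr = expr ** gamma              # 2. non-linear gamma stretch

# 3. Deconvolution is requested "out of sequence", but because the whole
# workflow is one equation, it can be substituted at the linear stage:
expr = expr.subs(D - g, deconv(D - g))
print(expr)                       # deconv(D - g)**gamma
```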

In contrast, StarTools works by constantly re-building and refining a single equation, for every pixel, that transforms the source data into the image-as-you-see-it. This means there is no concept of linear versus non-linear processing, there are no screen stretches with lookup tables, there is no scope for illegal sequences, there is no overcooking or noise grain/artefact propagation, and there are no rounding errors. What you see is the shortest, purest transformation of your linear signal into a final image. And what you see is what you get.
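A minimal sketch of this principle (a hypothetical class, not StarTools source): instead of storing intermediate pixel buffers, the engine stores the steps themselves and re-derives the image-as-you-see-it from the original linear data every time the chain changes, so no rounding errors or baked-in artefacts accumulate between steps.

```python
import numpy as np

class EquationEngine:
    """Keeps the workflow as a list of steps, never as intermediate images."""

    def __init__(self, linear_source):
        self.source = linear_source   # pristine linear data, never overwritten
        self.steps = []               # the 'equation', as composable callables

    def add_step(self, name, fn):
        self.steps.append((name, fn))

    def remove_step(self, name):
        self.steps = [s for s in self.steps if s[0] != name]

    def render(self):
        # Re-evaluate the full transformation from the source every time.
        image = self.source
        for _, fn in self.steps:
            image = fn(image)
        return image

engine = EquationEngine(np.random.default_rng(4).random((32, 32)))
engine.add_step('wipe', lambda img: img - img.min())
engine.add_step('stretch', lambda img: np.clip(img, 0, 1) ** 0.5)
preview = engine.render()          # what you see...
engine.remove_step('wipe')         # ...and changing an early step simply
preview = engine.render()          #    re-derives the result from the source
```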

An algorithmic tour de force: enhancing a standard Richardson-Lucy deconvolution iteration by basing the regularization step on a forward-propagated version of the previous iteration, thereby taking into account artefact propagation across the user's full workflow in the stretched, processed domain. This makes an algorithm like deconvolution privy to statistics that are far outside its normal linear-only purview. Substitution and equation solving like this, encompassing the entire workflow, is impossible in conventional software.
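As a rough sketch of the concept (not the actual StarTools algorithm, which is unpublished): a standard Richardson-Lucy update, with a damping mask computed not on the linear estimate itself, but on the estimate pushed forward through the remainder of the workflow; here a simple gamma stretch stands in for the user's full stretch, and the function and parameter names are assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def forward_propagate(linear_img, gamma=0.5):
    """Stand-in for the rest of the user's workflow (stretching, etc.)."""
    return np.clip(linear_img, 0, None) ** gamma

def rl_deconvolve(observed, psf, iterations=20, threshold=0.05):
    psf_flipped = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode='same') + 1e-12
        ratio = fftconvolve(observed / blurred, psf_flipped, mode='same')
        update = estimate * ratio                     # standard R-L step

        # Regularization driven by the *stretched* domain: judge how visible
        # the change would be after forward propagation, and damp the update
        # where it would amplify artefacts in the final, processed image.
        visible_change = np.abs(forward_propagate(update) -
                                forward_propagate(estimate))
        damping = np.clip(threshold / (visible_change + 1e-12), 0.0, 1.0)
        estimate = estimate + damping * (update - estimate)
    return estimate

# Illustrative use with a Gaussian PSF and synthetic data:
rng = np.random.default_rng(5)
x = np.arange(-7, 8)
psf = np.exp(-(x[:, None]**2 + x[None, :]**2) / 8.0)
psf /= psf.sum()
truth = rng.random((64, 64))
observed = fftconvolve(truth, psf, mode='same')
restored = rl_deconvolve(observed, psf)
```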

Even more ground-breaking: substituting some of the equation's variables for the equation itself (or parts thereof) allows complex feedback of signal to occur. This effectively provides standard algorithms, like deconvolution or noise reduction, with precise knowledge about a "future" or "past" of the signal. Such algorithms are able to accurately calculate how the other algorithms will behave in response to their actions anywhere on the timeline. The result is that such algorithms are augmented with comprehensive signal evolution statistics and intelligence for the user's entire workflow, letting them yield greatly superior results to those of equivalent algorithms in conventional software. Applying this innovation to otherwise standard, well-known algorithms is, in fact, the subject of most of StarTools' research and development efforts.
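A simplified sketch of this kind of feedback (hypothetical code, not StarTools internals): a noise-reduction step is handed the downstream portion of the equation as a callable, so it can evaluate how its residual noise will look in the "future" of the signal, after the rest of the workflow, and modulate its own strength accordingly.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def denoise_with_lookahead(linear_img, downstream, strength=1.0):
    """Noise reduction that consults the rest of the equation ('downstream')
    to see how strongly its residual will be amplified in the final image."""
    smoothed = uniform_filter(linear_img, size=3)
    residual = linear_img - smoothed

    # Ask the downstream equation how visible this residual will become.
    amplification = np.abs(downstream(linear_img) - downstream(smoothed))
    weight = np.clip(strength * amplification / (amplification.mean() + 1e-12),
                     0.0, 1.0)

    # Denoise harder where the future of the signal amplifies the noise more.
    return linear_img - weight * residual

# The 'future' of the signal: a hard non-linear stretch applied later on.
downstream = lambda img: np.clip(img, 0, 1) ** 0.25
noisy = np.clip(np.random.default_rng(6).normal(0.2, 0.05, (64, 64)), 0, 1)
cleaned = denoise_with_lookahead(noisy, downstream)
```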

The power of StarTools' novel engine is not only expressed in higher signal fidelity and the lifting of limitations of conventional engines; its power is also expressed in ease of use. Illegal or mathematically incongruent paths are closed off, while parameter tweaks always yield useful and predictable results. Defaults just work for most datasets, demonstrating that the new engine is universally applicable, consistent and rooted in a mathematically sound signal processing paradigm.