No AI-based neural hallucination that invents "plausible" signal

StarTools prides itself on robustly implementing physics-based algorithms and on documentary fidelity.

StarTools does not encourage or enable practices that introduce unquantifiable "make believe" signal without transparently warning the user of the consequences for documentary fidelity. Unsophisticated algorithms that resort to basic neural hallucination in response to an impulse have no place in documentary astrophotography; they invariably introduce signal that was never recorded (documented).

StarTools' principal developer is not exactly a Luddite when it comes to AI - he studied it and is named inventor on a number of AI-related patents. More than most, he is excited by how AI improves our daily lives, from augmented medical diagnoses to self-driving cars. The future of AI - overall - is an incredibly bright one.

"I reject your reality and substitute my own"

The flipside of AI is that it can be used for deception, whether born of ignorance, insecurity or malice. Neural hallucination - the lowest-hanging AI fruit - is quite literally not a substitute for real recorded detail, just as educated guesses are not a substitute for real measurements.

Just as most would scoff at applying an AI Instagram filter and passing the result off as a real photo of a person, so should a documentary astrophotographer scoff at applying an AI "Instagram filter" to their data and passing it off as something that was recorded for real.

StarTools will never sell you "game changing" snake oil, nor open up its platform for other actors to do the same. In honest, documentary astrophotography, the game is about turning signal into detail in the face of noise. We choose to focus our development efforts on giving you the best odds in that game, but we will never help you rig it.