Can you provide sources for a few of these completely different LLMs?
add even a small amount of change into an LLM […] radically alter the output
You mean perturbing the parameters of the LLM? That’s hardly surprising IMO. And I’m not sure that’s convincing evidence of independence, unless you have a source for this?
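(For context, a minimal sketch of the kind of perturbation being discussed. This is a toy linear model in numpy, not a real LLM, and all names and sizes here are illustrative assumptions: the point is only that adding small noise to the weights can flip which "token" gets the highest logit.)

```python
# Toy sketch (NOT a real LLM): perturb the weights of a tiny linear
# "model" and compare its greedy argmax "token" choices before/after.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy model: one linear layer mapping a 16-dim input
# state to logits over a 50-entry toy "vocabulary".
W = rng.normal(size=(16, 50))
x = rng.normal(size=(10, 16))  # 10 toy input states

def greedy_tokens(weights):
    # Greedy decoding analogue: pick the highest-logit "token" per input.
    return np.argmax(x @ weights, axis=1)

base = greedy_tokens(W)
# Small additive Gaussian noise on every weight (scale is arbitrary here).
perturbed = greedy_tokens(W + rng.normal(scale=0.2, size=W.shape))

print("baseline  :", base)
print("perturbed :", perturbed)
print("positions changed:", int(np.sum(base != perturbed)))
```

How many positions flip depends entirely on the noise scale and the margins between the top logits, which is why a perturbation result on its own doesn't say much about whether two models are independent.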