So let me get this straight… The model is trained on text harvested from the internet. Turns out that (Star Trek aside) if you ask politely, it's more likely to surface the data leading to the answer? Is anyone surprised? When someone asks nicely online, the human on the other side of the connection is more likely to write a helpful answer, and that exchange is then harvested for AI training.
“The Moon Is a Harsh Mistress” by Robert Heinlein. At one point in the story, the supercomputer/AI Mycroft Holmes decides to create a female persona to better understand humanity.
Heinlein predicted AIs with multiple personalities, using different identities to solve problems.
After thinking about it, the last one HAS to be “Padmé, when the younglings fell.” But I wanted too much, too quickly, and didn’t really think it through. Shaka moment.
You leave the galley after another night of Neelix’s terrible food. You walk by the holodeck and notice it’s always on. You go into engineering, and B’Elanna tells you that it runs on its own power supply. You suggest hooking that power supply up to the replicators. She stares at you and says, “We don’t have enough power to run the replicators.”
You need Samantha Wildman’s xenobiology expertise. The computer doesn’t detect her aboard. You go to Naomi and ask where her mother is. She looks at you funny and says, “mother? I never had a mother.”
Due to a transporter accident, a new life form is created, a merger of Neelix and Tuvok named Tuvix. Later, you see both Tuvok and Neelix seem to be back to their original selves. When you ask them what happened to Tuvix, they look at you strangely.
You start your mission with 38 irreplaceable photon torpedoes and two tricobalt devices. Seven years later Voyager has used 81 torpedoes. Nobody can explain why you were carrying the tricobalt devices.
“One thing is for sure: the model is not a Trekkie,” Catherine Flick at Staffordshire University, UK, told New Scientist.
“It doesn’t ‘understand’ anything better or worse when preloaded with the prompt, it just accesses a different set of weights and probabilities for acceptability of the outputs than it does with the other prompts,” she said.
It’s possible, for instance, that the model was trained on a dataset that has more instances of Star Trek being linked to the right answer, Battle told New Scientist.
The article doesn’t get it. There are more instances of Star Trek being linked to the right answer because Star Trek is the right answer.