WholeEnchilada,

Same thing my momma tole me in the Ozarks in the 1800s when i was all fainting.

EnderMB,

I work in AI.

We’ve known this about LLMs for many years. One of the reasons they weren’t widely used was hallucination, where they can be coerced into saying something confidently incorrect. OpenAI created a great set of tools that showed real utility for LLMs, and people were largely able to accept that even if an LLM is sometimes wrong, it’s good for basic tasks like writing a doc outline or filling in boilerplate in scripts.

Sadly, grifters have decided that LLMs are the future, and they’ve put them into applications where they have no more benefit than other, compositional models. While they’re great at orchestration, they’re just not suited to search, answering broad questions with limited knowledge, or voice-based search, which are exactly the areas where they’re being launched. That doesn’t even scratch the surface of LLMs being used for critical subjects that require knowledge of health or the law; the companies that have decided AI will build their software or run their HR departments are going to be totally fucked when a big mistake happens.

It’s an arms race that no one wants, and one that arguably hasn’t created anything worthwhile yet beyond a wildly expensive tool that will save you some time. What’s even sadder is that I bet you could go to any of these big tech companies and ask the ICs whether this is a good use of their time, and they’d say no. Tens of thousands of jobs were lost, and many worthwhile projects were scrapped, so some billionaire cunts could enter an AI pissing contest.

dudinax,

Wow geologists are dicks

Seudo,

It’s like if 4chan and Quora had a baby.

mrgreyeyes,

The AI is going to play World of Warcraft for the next few years whilst he comes of age.

sepi,

Looking forward to one of my stupid comments coming up as an answer for a real query on Google.

KillingTimeItself,

I think this is probably the best one of these so far.

zipzoopaboop,

She a classy lady who eats MINERALS tyvm

Decoy321,

Jesus Christ, Marie!!

Iron_Lynx,

And only quartz and amethyst at that smh

altima_neo,

How do people get these responses? I try, and it doesn’t show me any AI-generated text like that.

FlyingSquid,

Do you do it on a phone? It doesn’t do it on desktop for me.

altima_neo,

Yeah, on my phone

pjwestin,

You gotta opt in to it. There’s a little chemistry beaker in the corner you click on

dan,

My wife sees this feature but I don’t.

Kolanaki,

Geologists like to get stoned because geology rocks.

lauha,

For rock and stone!

GiddyGap,

That you, Bert?

2deck,

Just imagine how many not-so-obvious or nuanced ‘facts’ are being misrepresented. Right there, under billions of searches.

There will be ‘fixes’ for this, but it’s never been easier to shape ‘the truth’ and public opinion.

FlyingSquid,

It’s worse. So much worse. Now ChatGPT will have a human voice with simulated emotions that sounds eminently trustworthy and legitimately intelligent. The rest will follow quickly.

People will be far more convinced of lies being told by something that sounds like a human being sincere. People will also start believing it really is alive.

2deck,

Inb4 summaries and opinion pieces start including phrases like “think of the children”, “may lead to dire consequences” and “should concern everybody”

NeatNit,

“A human being sincere” is a nice little garden-path sentence :)

FlyingSquid,

My point is that, a lot of the time, people will trust something that sounds like it’s being said sincerely by a living person more than they will trust regular text results, because the “living person” sounds like they have emotions, which makes them sound like a member of our species, which makes them sound more trustworthy.

There’s a reason why predators sometimes disguise themselves, or part of themselves, as their prey. The anglerfish wouldn’t be as successful without that little light telling nearby fish “mate with me.”

NeatNit,

I didn’t make any comment about what you’re saying, I saw your point and had nothing to add.

A garden-path sentence is one where you read it wrong the first time around and have to backtrack to understand it, for example: “The old man the boat.”

“A human being” is normally a noun phrase, but then it turns out “being” is actually a verb.

en.wikipedia.org/wiki/Garden-path_sentence

Sam_Bass,

Everybody must get stoned
