[-] secrethat@kbin.social 10 points 1 year ago

As a data scientist, I find people seem to attribute anything computer-related that they don't understand to AI, or worse, ChatGPT. Shudder

[-] secrethat@kbin.social 3 points 1 year ago

PETA Bioweaponry!

[-] secrethat@kbin.social 16 points 1 year ago

...a priest, an imam (or guru/monk, etc.) and a rabbit walk into a hospital. "I'm a Type A," says the priest. "AB," says the imam. The rabbit looks around and says... "I think I'm a typo."

[-] secrethat@kbin.social 29 points 1 year ago

Does that mean they cut ties with Scientology?

[-] secrethat@kbin.social 12 points 1 year ago

get that snussy

[-] secrethat@kbin.social 4 points 1 year ago

get that snussy

[-] secrethat@kbin.social 2 points 1 year ago

Wait wait wait.... this was more civilised?

[-] secrethat@kbin.social 3 points 1 year ago

AFAIK it takes these large bodies of text and, rather than digesting them and keeping them in some sort of database, it holistically (and I'm generalising here) looks at how often certain words are strung together and takes note of that. Let's call those weights.

Then users can prompt something, and the 'magic' here is that it is able to pick out words with different weights based on the prompt, whether you're writing an angry email to your boss, code in Python, or the structure for a book.

But it is unable to recreate the book from a prompt.
People who know the topic more intimately, please correct me if I am wrong.
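
To make the "weights" idea a bit more concrete, here's a toy sketch in Python. It's purely illustrative and not how real LLMs actually work (they learn weights inside a neural network rather than literally counting word pairs), but it shows the general flavour of "note how often words are strung together, then sample from that":

```python
from collections import defaultdict, Counter
import random

# Toy illustration: count how often each word follows another ("weights"),
# then sample continuations from those counts. Real LLMs learn weights in
# a neural network instead of keeping literal co-occurrence counts.
corpus = "the cat sat on the mat the cat ate the fish".split()

weights = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    weights[prev][nxt] += 1  # note how often these two words appear together

def generate(prompt_word, length=5):
    word, out = prompt_word, [prompt_word]
    for _ in range(length):
        options = weights.get(word)
        if not options:
            break
        # pick the next word in proportion to its weight
        candidates, counts = zip(*options.items())
        word = random.choices(candidates, weights=counts)[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the mat"
```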

[-] secrethat@kbin.social 4 points 1 year ago

It's almost like someone would need a tool to climb over that wall.. something that is maybe 11 or 12 feet tall

[-] secrethat@kbin.social 3 points 1 year ago

behind every sign is a story

[-] secrethat@kbin.social 3 points 1 year ago

Well, you could in theory, for let's say AI-generated images, train a neural network model that picks up on artifacts that only seem to be present in AI art. The same goes for AI-generated text: you could look at how often a certain kind of text structure appears, or use something like emotion or sentiment analysis, since AI-generated text doesn't do as well at presenting genuine emotions.

Of course it's not 100% there yet. But to call these detectors bullshit is closing doors that are not fully realised.
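
As a rough sketch of what I mean for the text side, something like the following in Python: train a classifier on labelled human vs. AI samples and let it learn the surface patterns. The tiny dataset and labels here are made up purely for illustration; a real detector would need large labelled corpora and much better features:

```python
# Minimal sketch: learn surface patterns that differ between human-written
# and AI-generated text. The four examples below are invented for
# illustration only; this is not a working detector.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "honestly i was so mad i nearly threw my phone across the room",    # human (hypothetical)
    "ugh this update broke everything again, classic",                  # human (hypothetical)
    "As an AI language model, I aim to provide balanced perspectives.", # AI-ish (hypothetical)
    "In conclusion, it is important to consider multiple viewpoints.",  # AI-ish (hypothetical)
]
labels = ["human", "human", "ai", "ai"]

# TF-IDF over word n-grams captures recurring phrasing/structure;
# logistic regression learns which patterns lean towards each class.
detector = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
detector.fit(texts, labels)

print(detector.predict(["It is important to note that opinions may vary."]))
```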

[-] secrethat@kbin.social 10 points 1 year ago

But when the world needed them most, the robots(.txt) were nowhere to be found
