
The LLMentalist Effect: how chat-based Large Language Models replicate the mechanisms of a psychic’s con

The new era of tech seems to be built on superstitious behaviour

[–] matjoeman@kbin.social 2 points 1 year ago* (last edited 1 year ago) (1 children)

There are two possible explanations for this effect:

  1. The tech industry has accidentally invented the initial stages of a completely new kind of mind, based on completely unknown principles, using completely unknown processes that have no parallel in the biological world.
  2. The intelligence illusion is in the mind of the user and not in the LLM itself.

I agree with the author of the article, but I'm curious whether there is any well-understood model of biological intelligence that we could use to say whether an artificial intelligence system has parallels to it or not.

[–] Ragnell@kbin.social 2 points 1 year ago* (last edited 1 year ago) (1 children)

@matjoeman Well, we kind of do. Computers, at their most basic circuit level, use logic gates and perform their functions by doing mathematics. At the base, even for communication within microchips, everything must be binary coded. There may be octal, hex, or decimal layered over that on/off, but at the core there is binary: two possibilities. We expand on that by adding more paths, more logic gates, more complexity, but the signal is still a square wave. Two voltages.
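To make the "two possibilities at the core" point concrete, here's a tiny Python sketch (the specific value is arbitrary, chosen just for illustration) showing that decimal, hex, and octal are different spellings of one underlying bit pattern:

```python
# One value, several human-readable spellings -- but underneath it is
# always the same pattern of on/off bits.
value = 0b01001010
print(value)                  # 74       (decimal)
print(hex(value))             # 0x4a     (hexadecimal)
print(oct(value))             # 0o112    (octal)
print(format(value, "08b"))   # 01001010 (the bits themselves)
```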

Even when a computer recreates a sound for a human ear, a sound that is a sine wave, it is still digitally encoded, meaning it's a complex string of bits. It's two voltages being manipulated by logic gates to produce a sine wave of sound. But it's still two voltages.
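As an illustration of that encoding step, here's a minimal Python sketch (the 440 Hz tone, 8 kHz sample rate, and 16-bit depth are just example values, not anything specific from the discussion): it samples a continuous sine wave and quantizes each sample down to a string of bits, the way PCM audio does.

```python
import math

SAMPLE_RATE = 8000   # samples per second (example value)
FREQUENCY = 440.0    # an A4 tone, in Hz (example value)
BIT_DEPTH = 16       # signed 16-bit PCM, a common audio format

# Sample the first few points of the sine wave and reduce each
# continuous amplitude to a 16-bit integer -- i.e. to bits.
for n in range(8):
    t = n / SAMPLE_RATE
    amplitude = math.sin(2 * math.pi * FREQUENCY * t)        # continuous value in [-1, 1]
    sample = round(amplitude * (2 ** (BIT_DEPTH - 1) - 1))   # quantized integer
    bits = format(sample & 0xFFFF, "016b")                   # the same sample as raw bits
    print(f"t={t:.5f}s  amplitude={amplitude:+.4f}  sample={sample:+6d}  bits={bits}")
```

Every sample in the output is just sixteen on/off values; the sine wave only exists again once a DAC turns those bits back into a voltage.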

Human brains, however, process those sine waves, those complex frequencies spanning many voltages, as a spectrum. They aren't boiling it all down to two voltages, and they aren't building everything back up from two voltages.

I'm not saying we'll NEVER get AI, but I think we need a radically different way of transferring information WITHIN the microchip to achieve that level of complexity.

[–] matjoeman@kbin.social 1 points 1 year ago (1 children)

I don't think there's a fundamental reason why you couldn't program AI digitally. Maybe there's some high-level reason why it needs analog processing, but I doubt it.

ML models use floating-point numbers, which approximate continuous values. An analog computer like you're describing could maybe speed up those calculations, but it wouldn't change the fact that ML models just can't be intelligent because of how they work (in my opinion).
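For what it's worth, those floating-point numbers are themselves discrete: there are finitely many representable values, and everything in between gets rounded to a neighbour. A minimal sketch of that (assuming NumPy is installed, since float32 is the usual ML precision):

```python
import numpy as np

# 0.1 has no exact binary representation; the stored float32 is only
# the nearest representable value.
x = np.float32(0.1)
print(f"{x:.20f}")                    # ~0.10000000149011611938

# Gap between adjacent representable float32 values around 1.0:
# any "continuous" value in between is rounded to one of its neighbours.
print(np.spacing(np.float32(1.0)))    # ~1.1920929e-07
```

So even the "continuous" parameters inside an ML model live on a (very fine) grid of two-voltage bit patterns.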

[–] Ragnell@kbin.social 1 points 1 year ago* (last edited 1 year ago)

@matjoeman Maybe, but I don't think we're anywhere near the necessary complexity yet, and attempts to relate the way humans think to the way computers process information aren't useful.