this post was submitted on 26 Dec 2024
51 points (70.4% liked)


Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[–] VoterFrog@lemmy.world 17 points 12 hours ago (1 children)

ITT: A bunch of people who have never heard of information theory suddenly have very strong feelings about it.

[–] General_Effort@lemmy.world 3 points 10 hours ago

If they had heard of it, we'd probably get statements like: "It's just statistics." or "It's not information. It's just a probability."

[–] renegadespork@lemmy.jelliefrontier.net 68 points 20 hours ago (3 children)

We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.

[–] nelly_man@lemmy.world 6 points 3 hours ago

Bit in this context refers to the Shannon from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small of a number.
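In Python terms, the self-information (in shannons) of observing an event with probability p is just −log2(p); a minimal sketch:

```python
import math

def self_information(p: float) -> float:
    """Information (in bits/shannons) gained by observing an event
    that occurs with probability p."""
    return -math.log2(p)

# A 50% event carries exactly 1 bit of information.
print(self_information(0.5))       # 1.0
# A 1-in-1024 (~0.1%) event carries 10 bits.
print(self_information(1 / 1024))  # 10.0
```

So "10 bits" corresponds to pinning down one outcome out of roughly a thousand equally likely ones.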

[–] w3dd1e@lemm.ee 3 points 3 hours ago

Also, supposing it did, I’m quite sure that everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? Seems like there is a lot missing here.

[–] FooBarrington@lemmy.world 3 points 11 hours ago (1 children)

I also don't have 10 fingers. That doesn't make any sense - my hands are not numbers!

Ooooor "bits" has a meaning beyond what you assume, but it's probably just science that's stupid.

[–] renegadespork@lemmy.jelliefrontier.net 7 points 11 hours ago (1 children)

I can tell you’re trying to make a point, but I have no idea what it is.

[–] FooBarrington@lemmy.world 8 points 10 hours ago* (last edited 10 hours ago) (1 children)

You say "we don't think in bits because our brains function nothing like computers", but bits aren't strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.

To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That's because the concept of "10" is applicable both to math and topics that math can describe, just like "bits" are applicable both to information theory and topics that information theory can describe.

For the record: I didn't downvote you, it was a fair question to ask.

I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.

[–] renegadespork@lemmy.jelliefrontier.net 1 points 8 hours ago (2 children)

Bits are binary digits used for mechanical computers. Human brains are constantly changing chemical systems that don’t “process” binary bits of information so it makes no sense as a metric.

imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.

It’s not about how you measure it, it’s about using a unit system that doesn’t apply. It’s more like trying to calculate how much a star costs in USD.

[–] FooBarrington@lemmy.world 3 points 54 minutes ago* (last edited 28 minutes ago)

Maybe try looking into the topic instead of confidently repeating your wrong assertions? You're literally pulling a "my hand is not a number!" right now.

Just because you have a limited understanding of a unit, doesn't mean that unit is only applicable to what you know. Literally the star example I brought up.

[–] scratchee@feddit.uk 6 points 1 hour ago

Bits are also a unit of information from information theory. In that context they are relevant for anything that processes information, regardless of methodology, you can convert analogue signals into bits just fine.
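As a sketch of that analogue-to-bits conversion: digitizing a signal just means sampling and quantizing it, after which its bit rate is samples per second times bits per sample (the CD-audio numbers below are standard figures, used here only as an illustration):

```python
import math

def digital_bit_rate(sample_rate_hz: float, quantization_levels: int) -> float:
    """Bit rate of a digitized analogue signal: sample_rate samples per
    second, each carrying log2(quantization_levels) bits."""
    return sample_rate_hz * math.log2(quantization_levels)

# CD-quality audio: 44,100 samples/s at 16-bit (65,536-level) resolution.
print(digital_bit_rate(44_100, 65_536))  # 705600.0 bits/s per channel
```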

[–] Aatube@kbin.melroy.org 32 points 20 hours ago (3 children)

Because it’s a Techspot article, of course they deliberately confuse you as to what “bit” means to get views. https://en.wikipedia.org/wiki/Entropy_(information_theory) seems like a good introduction to what “bit” actually means.

[–] Buffalox@lemmy.world 6 points 20 hours ago (1 children)

Base 2 gives the unit of bits

Which is exactly what bit means.

base 10 gives units of "dits"

Which is not bits, but the equivalent of 1 digit in base 10.

I have no idea how you think this changes anything about what a bit is?

[–] Aatube@kbin.melroy.org 5 points 19 hours ago (1 children)

The external storage data and shannon are both called bits, exactly because they’re both base 2. That does not mean they’re the same. As the article explains it, a shannon is like a question from 20 questions.

[–] General_Effort@lemmy.world 1 points 13 hours ago (2 children)

Wrong. They are called the same because they are fundamentally the same. That's how you measure information.

In some contexts, one wants to make a difference between the theoretical information content and what is actually stored on a technical device. But that's a fairly subtle thing.

[–] typeswithpenis@lemmynsfw.com 5 points 11 hours ago

A bit in the data sense is just an element of the set of booleans. A bit in the entropy sense is the amount of information revealed by an observation with two equally probable outcomes. These are not the same thing because the amount of information contained in a bit is not always equal to one bit of entropy. For example, if a boolean is known to be 0, then the amount of information it contains is 0 bits. If it is known that the boolean is equally 0 or 1, then the information content is 1 bit. It depends on the prior probability distribution.
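That dependence on the prior distribution is easy to make concrete. A sketch of the binary entropy function, showing the cases described above:

```python
import math

def entropy_bits(p: float) -> float:
    """Shannon entropy (in bits) of a boolean with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # outcome already known: observing it reveals nothing
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(entropy_bits(0.0))  # 0.0   -> a boolean known to be 0 carries no information
print(entropy_bits(0.5))  # 1.0   -> a 50/50 boolean carries a full bit
print(entropy_bits(0.9))  # ~0.469 -> a biased boolean carries less than 1 bit
```

One stored boolean always occupies one bit of data, but its information content ranges anywhere from 0 to 1 bit of entropy.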

[–] Aatube@kbin.melroy.org 3 points 13 hours ago (1 children)

I don't see how that can be a subtle difference. How is a bit of external storage data only subtly different from information content that tells the probability of the event occurring is ½?

[–] Buffalox@lemmy.world 45 points 23 hours ago (60 children)

Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second.

This piece is garbage.

[–] GamingChairModel@lemmy.world 14 points 17 hours ago* (last edited 16 hours ago) (3 children)

Speaking, which is conveying thought, also far exceeds 10 bits per second.

There was a 2019 study of 17 different spoken languages which found that languages with a lower complexity rate (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages: about 39 bits per second.

Of course, it could be that the actual ideas and information in that speech is inefficiently encoded so that the actual bits of entropy are being communicated slower than 39 per second. I'm curious to know what the underlying Caltech paper linked says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik's cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?

EDIT: I read the preprint, available here. It purports to measure externally measurable output of human behavior. That's an important limitation in that it's not trying to measure internal richness in unobserved thought.

So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
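The arithmetic here, under the preprint's assumption of about 5 bits of entropy per English word, is a one-liner:

```python
def bits_per_second(words_per_minute: float, bits_per_word: float = 5.0) -> float:
    """Information rate implied by a word rate, assuming a fixed per-word
    entropy (the preprint assumes ~5 bits per English word)."""
    return words_per_minute * bits_per_word / 60

print(bits_per_second(120))  # 10.0 bits/s for 120 wpm typing
print(bits_per_second(160))  # ~13.3 bits/s for 160 wpm speech
```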

The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik's cube solving, memory contests).

It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings 10^9 bits of sensory perception down 9 orders of magnitude. If it turns out to be 8.5 orders of magnitude, that doesn't really change the result.

There's also a whole section addressing criticisms of the 10 bit/s number. It argues that claims of photographic memory tend to actually break down into longer periods of study (e.g., 45 minute flyover of Rome to recognize and recreate 1000 buildings of 1000 architectural styles translates into 4 bits/s of memorization). And it argues that the human brain tends to trick itself into perceiving a much higher complexity that it is actually processing (known as "subjective inflation"), implicitly arguing that a lot of that is actually lossy compression that fills in fake details from what it assumes is consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly categorize the bits of entropy involved in less accurate shortcuts taken by the brain.
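The Rome flyover figure works out roughly as follows, assuming (as the example implies) 1000 buildings each classified independently into one of 1000 styles:

```python
import math

buildings = 1000
styles = 1000
study_seconds = 45 * 60  # the 45-minute flyover

total_bits = buildings * math.log2(styles)  # ~9966 bits memorized
rate = total_bits / study_seconds
print(round(rate, 1))  # 3.7 bits/s, i.e. roughly the cited 4 bits/s
```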

I still think visual processing seems to be faster than 10, but I'm now persuaded that it's within an order of magnitude.

[–] RustyEarthfire@lemmy.world 2 points 14 hours ago

Thanks for the link and breakdown.

It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity/capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less, than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than a conditional one, going so far as to say a neural-computer interface would be limited to this rate.

"Thinking speed" is also a poor description for input/output measurement, akin to calling a monitor's bitrate the computer's FLOPS.

Visual processing is multi-faceted. I definitely don't think all of vision can be reduced to 50bps, but maybe the serial part after the parallel bits have done stuff like detecting lines, arcs, textures, areas of contrast, etc.

[–] scarabic@lemmy.world 2 points 14 hours ago

Right? They do nothing to expand upon what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck they feel like packaging them.

[–] meyotch@slrpnk.net 6 points 20 hours ago (7 children)

You may be misunderstanding the bit measure here. It’s not ten bits of information in the storage sense, i.e. barely a single byte. It’s ten binary yes/no decisions per second, which is enough to distinguish among 1024 distinct possibilities.

The measure comes from information theory but it is easy to confuse it with other uses of ‘bits’.

[–] Australis13@fedia.io 14 points 21 hours ago (18 children)

Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via=ihub

It doesn't look like these "bits" are binary, but "pieces of information" (which I find a bit misleading):

“Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about a million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
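The quoted estimate is just the arithmetic of binary search: 20 well-designed yes/no questions distinguish 2^20 ≈ one million items, and 20 bits over roughly two seconds gives the 10 bits/s figure. A sketch:

```python
import math

questions = 20
items_distinguishable = 2 ** questions
print(items_distinguishable)  # 1048576, about a million

seconds = 2  # "a few seconds" in the quote; 2 s gives the upper bound
print(questions / seconds)    # 10.0 bits/s

# Equivalently: picking one item out of a million carries ~20 bits.
print(round(math.log2(1_000_000), 1))  # 19.9
```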

The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:

To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.

[–] scarabic@lemmy.world 2 points 14 hours ago* (last edited 14 hours ago)

So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course this is still miserably badly defined.

[–] terminhell@lemmy.dbzer0.com 3 points 17 hours ago

Crazy how a biological analog lump is capable of even a fraction of what a brain can do.

[–] leaky_shower_thought@feddit.nl 6 points 22 hours ago (1 children)

I can agree to some extent with why it could be 10 bits/s.

The brain is known to take shortcuts when parsing/speed reading, but it slows down when we try to extract details from written works. It is also more tiring to scrutinize details than to just read articles.

I was surprised that they managed to measure the speed at all.
