this post was submitted on 11 May 2024
332 points (97.7% liked)

Technology

34889 readers
598 users here now

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


Ask in a DM before posting product reviews or ads. Otherwise, such posts are subject to removal.


Rules:

1: All Lemmy rules apply

2: Do not make low-effort posts

3: NEVER post naziped*gore stuff

4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.

5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)

6: no advertisement posts unless verified as legitimate and non-exploitative/non-consumerist

7: crypto related posts, unless essential, are disallowed

founded 5 years ago
top 50 comments
[–] chirospasm@lemmy.ml 180 points 6 months ago* (last edited 6 months ago) (8 children)

"We did the back-of-napkin math on what ramping up this experiment to the entire brain would cost, and the scale is impossibly large — 1.6 zettabytes of storage costing $50 billion and spanning 140 acres, making it the largest data center on the planet."

Look at what they need to mimic just a fraction of our power.
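
For scale, the napkin math roughly checks out. A quick Python sketch, assuming the ~1.4 PB reported for the 1 mm³ sample and a whole-brain volume of roughly 1.2 million mm³ (both approximate figures, not from the quote itself):

```python
# Back-of-napkin scale-up of the 1 mm^3 scan to a whole human brain.
# Assumptions: ~1.4 PB of data for the 1 mm^3 sample, and a whole-brain
# volume of ~1.2 million mm^3 (~1,200 cm^3). Both are rough figures.
sample_pb = 1.4            # petabytes per cubic millimetre scanned
brain_volume_mm3 = 1.2e6   # cubic millimetres in a human brain

total_pb = sample_pb * brain_volume_mm3
total_zb = total_pb / 1e6  # 1 zettabyte = 1,000,000 petabytes

print(f"Whole brain: ~{total_zb:.1f} ZB")  # ~1.7 ZB, close to the quoted 1.6 ZB
```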

[–] henfredemars@infosec.pub 51 points 6 months ago

It’s not even complete. You might have the physical brain tissue, but that tissue is stateful. The tissue contains potentials and electrical charges that must be included in a complete model.

[–] UnfortunateShort@lemmy.world 49 points 6 months ago (1 children)

Heh, we the best. Now excuse me while I use my amazing brainpower to watch questionable anime and get more depressed 🫥

[–] whoisearth@lemmy.ca 10 points 6 months ago (1 children)

Every day we stray further from God's light lol

[–] Zerfallen@lemmy.world 4 points 6 months ago

naruto-running from his light

[–] Korkki@lemmy.world 37 points 6 months ago

It's not like human brain memory or consciousness is that information-dense. They just scanned it at that high a resolution.

[–] SirEDCaLot@lemmy.today 21 points 6 months ago (1 children)

In fairness, the scan required such astronomical resources because of how they were scanning it. They took the cubic millimeter chunk and cut it into 5,000 super thin flat slices and then did extremely high detail scans of each slice. That's why they needed AI, to try and piece those flat layers back together into some sort of 3D structure.

Once they have the 3D structure, the scans are useless and can be deleted.

In time it should be possible to scan the tissue and get the 3D structure without such extreme data use.

[–] redcalcium@lemmy.institute 16 points 6 months ago (2 children)

Imagine donating your body to science; the scientists slice up your brain and scan the slices, then decades later you suddenly wake up in a virtual space because they're finally able to emulate a copy of your brain on a supercomputer.

[–] SirEDCaLot@lemmy.today 5 points 6 months ago (1 children)

Sounds good to me. You should also look into cryonics. Basically, you sign up with a company and donate your body to them; when you die, they pump you full of antifreeze and then vitrify you in liquid nitrogen. Right now there's no way to recover from it: the antifreeze is toxic, and we don't yet know how to undo the cell damage from freezing. But the idea is that someday in the future we will figure those things out, and then hopefully be able to thaw the frozen dead person, fix the damage caused by the freezing process, fix whatever problem killed them in the first place, and reanimate them.

For a lower fee, they will cut off your head and just freeze that. The idea being that someday in the future they will be able to transplant your brain into an artificially created body.

[–] TheRealKuni@lemmy.world 1 points 6 months ago

But they really only unfreeze people who knew Richard Dawkins and Mrs. Garrison. Then laugh at you when you want to play Nintendo Wii.

[–] USSEthernet@startrek.website 2 points 6 months ago
[–] zagaberoo@beehaw.org 11 points 6 months ago (2 children)

And the whole human body, brain and all, can run on ~100 watts. Truly astounding.

[–] BastingChemina@slrpnk.net 1 points 6 months ago

And we get these 100W from transforming food that we find on/in the ground.

[–] ThePantser@lemmy.world 8 points 6 months ago

Shouldn't be long before we have that much in our phones.

[–] Aopen@discuss.tchncs.de 6 points 6 months ago (1 children)

CERN datacenter has 1600 times less capacity

https://home.cern/news/news/computing/exabyte-disk-storage-cern

Although global storage capacity will be 125 times higher by 2025 than the whole scan would occupy

https://cybersecurityventures.com/the-world-will-store-200-zettabytes-of-data-by-2025/
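
A quick sanity check on those ratios, assuming CERN's roughly 1 EB of disk from the linked article and the projected 200 ZB global total (approximate figures):

```python
# Compare a hypothetical 1.6 ZB whole-brain scan against CERN's disk
# capacity (~1 EB per the linked article) and projected global storage
# (~200 ZB by 2025). All figures are approximate.
scan_zb = 1.6
cern_zb = 1e-3           # ~1 exabyte expressed in zettabytes
global_2025_zb = 200.0

print(f"Scan vs CERN: {scan_zb / cern_zb:.0f}x")               # ~1600x
print(f"Global 2025 vs scan: {global_2025_zb / scan_zb:.0f}x")  # ~125x
```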

[–] utopiah@lemmy.ml 2 points 6 months ago (2 children)

I'd be curious about the access speed comparison, because I'd assume the brain is RAM-equivalent, not SSD.

[–] Vivendi@lemmy.zip 2 points 6 months ago

The brain is a tightly coupled biological computer; its access speed is practically instantaneous.

Also, data storage and processing in the brain is still a largely unexplored field of science.

[–] TheRealKuni@lemmy.world 2 points 6 months ago

Just gotta lower the clock speed enough for us not to notice. As long as we don’t interact with the outside world, just other stored human brains, it can be slow as molasses and we won’t notice.

[–] riplin@lemm.ee 28 points 6 months ago (9 children)

That’s capturing everything. Ultimately you need only a tiny fraction of that data to emulate the human brain.

Numenta is working on a brain model to create functional sections of the brain. Their approach is different though. They are trying to understand the components and how they work together and not just aggregating vast amounts of data.

[–] henfredemars@infosec.pub 35 points 6 months ago (1 children)

No it does not. It captures only the physical structures. There’s also chemical and electrical state that’s missing.

[–] biscuitswalrus@aussie.zone 6 points 6 months ago

Think of this:

You find a computer from 1990. You take a picture (image) of one memory chip on a 1 KB RAM stick; there are 4 RAM sticks. You are using a DSLR camera, and your image in RAW comes out at 1 GB. Because there are 8 chips per stick and 4 sticks, you project it'll take 32 GB to image your 4 KB of RAM.

You've described nothing about the RAM. This measurement is meaningless other than telling you how detailed the imaging process is.

[–] kakes@sh.itjust.works 18 points 6 months ago (1 children)

Of course, that's not to say the data isn't also important. It's very possible that we're missing something crucial about how the brain functions, despite everything we know so far. The more data we have, the better we can build and test these more streamlined models.

[–] andrew_bidlaw@sh.itjust.works 6 points 6 months ago

These models would likely be tested against these real datasets, so they help each other.

[–] remotelove@lemmy.ca 13 points 6 months ago (3 children)

Ultimately you need only a tiny fraction of that data to emulate the human brain.

I am curious how that conclusion was formed as we have only recently discovered many new types of functional brain cells.

While I am not saying this is the case, that statement sounds like it was based on the "we only use 10% of our brain" myth, so that is why I am trying to get clarification.

[–] biscuitswalrus@aussie.zone 7 points 6 months ago

They took imaging scans. I just took a picture of a 1 MB memory chip, and omg, my picture is 4 GB in RAW. The RAM stick that chip was on could take dozens of GB to image!

[–] MajorSauce@sh.itjust.works 6 points 6 months ago

Not taking a position on this, but I could see a comparison with doing an electron scan of a painting. The scan would take an insane amount of storage while the (albeit ultra high definition) picture would fit on a Blu-ray.

[–] riplin@lemm.ee 4 points 6 months ago

Oh, I'm not basing that on the 10% mumbo jumbo, just that data capture usually over-captures. Distilling it down to just the bare functional essence will result in a far smaller data set. Granted, as you noted, there are new neuron types still being discovered, so what to discard is the question.

[–] GolfNovemberUniform@lemmy.ml 12 points 6 months ago (2 children)

I don't think any simplified model can work EXACTLY like the real thing. Ask rocket scientists.

[–] FaceDeer@fedia.io 3 points 6 months ago

Fortunately it doesn't have to be exactly like the real thing to be useful. Just ask machine learning scientists.

[–] riplin@lemm.ee 1 points 6 months ago

Given the prevalence of intelligence in nature using vastly different neurons, I'm not sure you even need an exact emulation of the real thing to achieve the same result.

[–] eleitl@lemmy.ml 8 points 6 months ago

No, that captures just the neuroanatomy, not properties like ion channel density, synapse type and strength, and all the things we don't know yet.

[–] tsonfeir@lemm.ee 3 points 6 months ago

I want the whole brain, not a republican.

[–] eguidarelli@lemmy.world 2 points 6 months ago

Never seen Numenta talked about in the wild! I worked with them on a pattern recognition project in college, and it was freakily similar to how toddlers learn about the world around them.

[–] can@sh.itjust.works 1 points 6 months ago

Ultimately you need only a tiny fraction of that data to emulate the human brain.

Point for simulation theory.

[–] Evil_Shrubbery@lemm.ee 27 points 6 months ago

Yes, humans kinda brute-forced intelligence with current assets - made it bigger (with some birthing issues) & more power hungry (with some cooling issues), but it mostly works.

[–] SouthFresh@lemmy.ml 20 points 6 months ago (2 children)

“Google to shutter human brains”

Why anyone teams up with a company whose greatest achievement is a high score on the "I wish they hadn't shut that down" list is beyond my understanding.

[–] GolfNovemberUniform@lemmy.ml 14 points 6 months ago

Because almost nobody else has enough money for such research and governments won't pay for it because it's not very useful

[–] fruitycoder@sh.itjust.works 5 points 6 months ago

I mean Kubernetes, Android, TensorFlow, and the only open-source PDK for silicon that I know of.

They have a lot of bedrock contributions in the tech space.

[–] comador@lemmy.world 17 points 6 months ago (1 children)

Storage vendors are rubbing their hands in delight while systems administrators, particularly backup admins, are cringing at the thought.

[–] fruitycoder@sh.itjust.works 2 points 6 months ago

An SUV full of tape might have the bandwidth needed to restore from backup, but bless the ops that gotta load those tapes.

[–] delirious_owl 10 points 6 months ago

How much of that is dead pixels?

[–] Lojcs@lemm.ee 7 points 6 months ago (3 children)

So a 4K movie is 100 GB? A 2-hour movie would make that 110 Mbps. Insane bitrate even for H.264 imo.
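
For reference, a rough check of that bitrate (treating 100 GB as decimal gigabytes over a 2-hour runtime):

```python
# Average bitrate of a 100 GB file played back over 2 hours.
size_bits = 100e9 * 8    # 100 GB in bits (decimal gigabytes)
duration_s = 2 * 3600    # 2 hours in seconds

print(f"{size_bits / duration_s / 1e6:.0f} Mbps")  # ~111 Mbps
```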

[–] onion@feddit.de 14 points 6 months ago

The movies as shown in cinema are ~600GB

[–] Tempo@lemmy.ml 13 points 6 months ago (1 children)

4K Blu-rays encoded in H.265 are usually on 100 GB discs, so I can see where they're coming from.

[–] cmnybo@discuss.tchncs.de 2 points 6 months ago

4K Blu-ray can be up to 144 Mbps, so that's reasonable.

[–] morbidcactus@lemmy.ca 2 points 6 months ago (1 children)

I have the LOTR director's cut on my server and haven't bothered re-encoding it because I'm not super experienced with keeping HDR10 intact when going to H.265 or equivalent. Return of the King alone is around 130 gigs across two files; Jellyfin says its bitrate is about 70 Mbps.

Titanic is only about 74 gigs.

[–] Lojcs@lemm.ee 2 points 6 months ago (1 children)

Maybe it's just me, but 4K 60 fps H.265 video above 20 Mbps looks indistinguishable to me from higher bitrates unless I pause on a frame side by side to compare. I'm not sure about H.264, but it can't be too many times worse.
