this post was submitted on 25 Apr 2024
447 points (95.0% liked)

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

[–] Kyrgizion@lemmy.world 79 points 6 months ago (5 children)

I think SOMA made it pretty clear we're never uploading jack shit; at best we're making a copy who'll feel as if they've been uploaded, while the original remains behind as well.

[–] Dasnap@lemmy.world 44 points 6 months ago (2 children)

A lot of people don't realize that a 'cut & paste' is actually a 'copy & delete'.

And guess what 'deleting' is in a consciousness upload?

[–] pixeltree@lemmy.blahaj.zone 14 points 6 months ago* (last edited 6 months ago) (2 children)

I mean, if I die instantaneously and painlessly, and consciousness is seemingly continuous for the surviving copy, why would I care?

My consciousness might not continue, but I lose consciousness every day. Someone exists who is me and lives their (my) life. I totally understand people's aversion to death, but I also don't see any difference from falling asleep and waking up. You lose consciousness, then a person who's lived your life and is you regains consciousness. Idk

[–] TopRamenBinLaden@sh.itjust.works 12 points 6 months ago* (last edited 6 months ago) (1 children)

You make a good point. We all might be being copied and deleted in our sleep every night, for all we know.

There'd be no way to know anything even happened to you as long as your memory was copied over to the new address with the rest of you. It would be just a gap in time to us, like a dreamless sleep.

[–] Dasnap@lemmy.world 5 points 6 months ago (1 children)

Most people don't like the idea of a suicide machine.

[–] pixeltree@lemmy.blahaj.zone 8 points 6 months ago* (last edited 6 months ago)

Yeah, and I completely understand that. Just from a logical perspective, though, let's say the process happens after you fall asleep normally at night. If you can't tell it happened, does it matter? I've been really desensitized to the idea of dying through suicidal ideation throughout most of my life (much better now), so I'm able to look at it without the normal emotional aversion. If teleportation existed via this same method, I don't think I'd have qualms about at least trying it. I certainly wouldn't expect other people to, but to me it doesn't seem like that big a deal. I wouldn't do a mind-upload scenario, but that's more so due to a complete lack of trust in system maintenance and security, and a doubt that true consciousness can be achieved digitally. If it's flesh and blood to flesh and blood, though? I'd definitely try it.

[–] HubertManne@kbin.social 12 points 6 months ago

It's the transporters all over again.

[–] TheYang@lemmy.world 25 points 6 months ago (7 children)

I wonder how you could ever "upload" a consciousness without Ship-of-Theseusing a brain.

Cyberpunk 2077 also has this "upload vs. copy" issue, but doesn't actually make you think about it too hard.

[–] KazuyaDarklight@lemmy.world 8 points 6 months ago (1 children)

That's more or less what I've always thought: to have a chance, you would need a method where mental processing starts out shared between both, then transfers more and more to the inorganic platform until it's at 100% and the organic one isn't working anymore.

[–] Schmoo@slrpnk.net 6 points 6 months ago* (last edited 6 months ago) (1 children)

The animated series Pantheon has a scene depicting exactly this, and it's one of the most disturbing things I've ever seen.

Edit: Here is the scene in question. It's explained he has to be awake during the procedure because the remaining parts of his brain need to continue functioning in tandem with the parts that have already been scanned.

[–] KazuyaDarklight@lemmy.world 4 points 6 months ago (1 children)

Interesting, but I would argue that's still a destructive copy process. "Old Man's War" did a good job of what I'm talking about. It was body to clone body, but the principle was similar: at the halfway point the person was experiencing existence in both bodies at once, seeing each body from the perspective of the other, until the transfer completed and they were in the new body and the old one slumped over.

[–] HopeOfTheGunblade@kbin.social 22 points 6 months ago (1 children)

Any sufficiently identical copy of me is me. A copy just means there are more me in the universe.

[–] Skullgrid@lemmy.world 10 points 6 months ago

reproduction 101

[–] Azzk1kr@feddit.nl 7 points 6 months ago (1 children)

That ending screwed with my mind. Existential horror at its finest!

[–] dev_null@lemmy.ml 5 points 6 months ago (3 children)

I was just annoyed at the protagonist for expecting anything else. The exact same thing had already happened to the protagonist twice (the initial copy at the beginning of the game, then the move to the other suit). Plus it's reinforced in the found notes for good measure. So by the ending, the player knows exactly what's going to happen, and so should the protagonist, but somehow he's surprised.

[–] highsight@programming.dev 6 points 6 months ago (2 children)

Ahh, but here's the question. Who are you? The you who did the upload, or the you that got uploaded, retaining the memories of everything you did before the upload? Go on, flip that coin.

[–] Kyrgizion@lemmy.world 12 points 6 months ago (1 children)

If you are the version doing the upload, you're staying behind. The other "you" pops into existence feeling as if THEY are the original, so from their perspective, it's as if they won the coin flip.

But the original CANNOT win that coin flip...

[–] Maven@lemmy.sdf.org 8 points 6 months ago (1 children)

But like.. do I care? "I" will survive, even if I'm not the one who does the surviving.

[–] Kyrgizion@lemmy.world 9 points 6 months ago (3 children)

I can't speak for anyone else, but I would. The knowledge that "a" me is out there, somewhere, safe and sound, is uplifting, but it's still quite chilling to realize you are staying wherever the hell you are. At least normally we die after enough time has passed, because our bodies decay.

spoiler: The SOMA protagonist wasn't that lucky...

[–] Maven@lemmy.sdf.org 4 points 6 months ago

Is it chilling? I was already going to stay where I am, whether I made a copy or not. Sharding off a replica to go on for me would be strictly better than not doing that.

[–] Localhorst86@feddit.de 5 points 6 months ago

Which instance of Theseus's ship am I?

[–] marble@sh.itjust.works 54 points 6 months ago

Implementation will be

{
    // TODO
    return true;
}
[–] mdhughes@lemmy.ml 34 points 6 months ago (1 children)

It's still a surviving working copy. "I" go away and reboot every time I fall asleep.

[–] jkrtn@lemmy.ml 9 points 6 months ago (2 children)

Why would you want a simulation version? You'll get saved at "well rested." It will be an infinite loop of being put to work for several hours and then deleted. You won't even experience that much; your consciousness is gone.

[–] mdhughes@lemmy.ml 9 points 6 months ago

Joke's on them, I've never been "well rested" in my life or my digital afterlife.

[–] Daxtron2@startrek.website 24 points 6 months ago (3 children)

Well yeah, if you passed a reference, then once the original is destroyed it would be null. The real trick is to make a copy and destroy the original reference at the same time; that way it never knows it wasn't the original.

[–] dwemthy@lemdro.id 7 points 6 months ago

I want a Transmetropolitan-style setup: burning my body to create the energy to boot up the nanobot swarm that my consciousness was just uploaded to.

[–] nialv7@lemmy.world 5 points 6 months ago (1 children)

I think you mean std::move

[–] Daxtron2@startrek.website 5 points 6 months ago

get your std away from me sir

[–] Schmoo@slrpnk.net 24 points 6 months ago (1 children)

If anyone's interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it's created and used by big tech companies is uncomfortably real.

The show got kinda screwed over on advertising and fell to obscurity because of streaming service fuck ups and region locking, and I can't help but wonder if it's at least partially because of its harsh criticisms of the tech industry.

[–] RedditWanderer@lemmy.world 22 points 6 months ago

I get this reference

[–] electricprism@lemmy.ml 17 points 6 months ago

There are many languages I would rather die than be written in

[–] Nobody@lemmy.world 14 points 6 months ago* (last edited 6 months ago) (5 children)

You see, with Effective Altruism, we'll destroy the world around us to serve a small cadre of ubermensch tech bros, who will then somehow in the next few centuries go into space and put supercomputers on other planets that run simulations of people. You might actually be in one of those simulations right now, so be grateful.

We are very smart and not just reckless, over-indulged douchebags who jerk off to the smell of our own farts.

[–] exocortex@discuss.tchncs.de 13 points 6 months ago (1 children)

Glad that isn't Rust code, or the pass-by-value function wouldn't be very nice.

[–] Cornelius@lemmy.ml 5 points 6 months ago

Borrow checker intensifies

[–] waigl@lemmy.world 12 points 6 months ago (2 children)

In a language that has exceptions, there is no good reason to return bool here…

[–] yogthos@lemmy.ml 11 points 6 months ago* (last edited 6 months ago) (1 children)

I think that really depends on the implementation details. For example, consider a thought experiment where artificial neurons can be created that behave just the same as biological ones. Then each of your neurons is replaced by an artificial version while you are still conscious. You wouldn't notice losing a single neuron at a time; in fact, this regularly happens already. Yet, over time, all your biological neurons could be replaced by artificial ones, at which point your consciousness will have migrated to a new substrate.

Alternatively, what if one of your hemispheres was replaced by an artificial one? What if an artificial hemisphere was added into the mix in addition to the two you have? What if a dozen artificial hemispheres were added, or a thousand? Would the two original biological ones still be the most relevant parts of you?

[–] dfyx@lemmy.helios42.de 10 points 6 months ago

Literally the plot twist in...

spoiler: SOMA

[–] alphapuggle@programming.dev 7 points 6 months ago

A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead. See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy

[–] xilliah@beehaw.org 6 points 6 months ago* (last edited 6 months ago)
public static Consciousness Instance;
[–] EmoDuck@sh.itjust.works 5 points 6 months ago

The Closest-Continuer schema is a theory of identity according to which identity through time is a function of appropriate weighted dimensions. A at time 1 and B at time 2 are the same just in case B is the closest continuer of A, according to a metric determined by continuity of the appropriate weighted dimensions.

Lonk

I don't think that I fully agree with it but it's interesting to think about

[–] Matriks404@lemmy.world 4 points 6 months ago (1 children)

What if every part of my body is replaced by computer parts, continuously? At what point do I lose my consciousness?

I think this question is hard to answer because not everyone agrees on what consciousness even is.

[–] xantoxis@lemmy.world 4 points 6 months ago

So, I'm curious.

What do you think happens in the infinite loop that "runs you" moment to moment? Passing the same instance of consciousness to itself, over and over?

Consciousness isn't an instance. It isn't static, it's a constantly self-modifying waveform that remembers bits about its former self from moment to moment.

You can upload it without destroying the original if you can find a way for it to meaningfully interact with processing architecture and media that are digital in nature; and if you can do that without shutting you off. Here's the kinky part: We can already do this. You can make a device that takes a brain signal and stimulates a remote device; and you can stimulate a brain with a digital signal. Set it up for feedback in a manner similar to the ongoing continuous feedback of our neural structures and you have now extended yourself into a digital device in a meaningful way.

Then you just keep adding to that architecture gradually, and gradually peeling away redundant bits of the original brain hardware, until most or all of you is being kept alive in the digital device instead of the meat body. To you, it's continuous and it's still you on the other end. Tada, consciousness uploaded.

[–] Clent@lemmy.world 4 points 6 months ago

It would be easier to record than to upload, since uploading requires at least a decode step. Given the fleeting nature of existence, how does one confirm the decoding? It also requires that we create a simulated brain, which seems more difficult and resource-intensive than forming a new biological brain remotely connected to your nervous system's inputs.

Recording all inputs in real time and playing them back across a blank nervous system would create an active copy. The inputs can be saved so they can be played back later in case of clone failure. As long as the inputs are recorded until the moment of death, the copy will be you minus the death, so you wouldn't be aware you're a copy. Attach it to a fresh body and off you go.

The failure mode would take your literal lifetime to re-form your consciousness, but what's a couple of decades to an immortal?

We already have the program to create new brains: it's in our DNA. A true senior developer knows better than to try to replicate black-box code that's been executing fine. We don't even understand consciousness well enough to pretend we're going to add new features, so why waste the effort creating a parallel system to a black box?

Scheduled reboots of a black-box system are common practice. Why pretend we're capable of skipping steps?

[–] essteeyou@lemmy.world 4 points 6 months ago

Conscience?
