this post was submitted on 26 Jan 2024
671 points (97.7% liked)


George Carlin Estate Files Lawsuit Against Group Behind AI-Generated Stand-Up Special: ‘A Casual Theft of a Great American Artist’s Work’

George Carlin's estate has filed a lawsuit against the creators behind an AI-generated comedy special featuring a recreation of the comedian's voice.

top 50 comments
[–] cubism_pitta@lemmy.world 107 points 9 months ago (2 children)

If it's wrong to use AI to put genitals in someone's mouth, it should probably be wrong to use AI to put words in their mouth as well.

[–] TheFriar@lemm.ee 19 points 9 months ago

Damn.

snaps

[–] ClamDrinker@lemmy.world 6 points 9 months ago (1 children)

I agree, and I get that it's a funny way to put it, but in this case they started the video with a massive disclaimer that they were not Carlin and that it was AI. So it's hard to argue they were putting things in his mouth. If anything, it sets a praiseworthy standard for disclosing when AI was involved, considering the hate mob that such a disclosure attracts.

[–] CleoTheWizard@lemmy.world 11 points 9 months ago (1 children)

The internet doesn't care, though. If I make fake pictures of people using their likeness and add a disclaimer, people will just repost them without the disclaimer and they will still do damage. Whether or not we can or should stop them is another story.

[–] ClamDrinker@lemmy.world 8 points 9 months ago* (last edited 9 months ago)

Completely true. But we cannot reasonably push the responsibility of the entire internet onto someone when they did their due diligence.

Like, some people post CoD footage to YouTube because it looks cool, and someone else either mistakenly or maliciously takes that footage and recontextualizes it as combat footage from an active warzone to shock people. Then people start reposting that footage with a fake explanation text on top of it, furthering the misinformation cycle. Do we now blame the people sharing their CoD footage for what other people did with it? Misinformation and propaganda are things society must work together to combat.

If it really matters, people will be out there warning others that the pictures being posted are fake. In fact, even before AI, that's what happened after a tragedy: people would post images claiming to show what happened, only for them to later be confirmed as coming from some other tragedy. Or how some video games get fake leaks because someone rebranded fan-made content as a leak.

Eventually it becomes common knowledge or easy to prove fake. Take this picture, for instance:

It's been well documented that the bottom image is fake, and as such anyone can now find out what was covered up. It's up to society to speak up when the damage is too great.

[–] fmstrat@lemmy.nowsci.com 68 points 9 months ago (103 children)

"This case is not just about AI, it’s about the humans that use AI to violate the law, infringe on intellectual property rights and flout common decency."

Well put.

[–] KairuByte@lemmy.dbzer0.com 24 points 9 months ago (10 children)

Eh…. I don’t know that I can agree with this.

I understand the intent behind it, but this specific instance is legitimately parallel to impersonation or satire. Hear me out.

They are impersonating his voice, using new content in his style, and making no claim to be legitimate.

So this comes down to “this is in bad taste” which, while I can understand and might even agree with… isn’t illegal.

The only novel concept here is that "scary tech" was used. There was no fraud, there was no IP violation, and no defamation. Where is the legal standing?

[–] doctorcrimson@lemmy.world 7 points 9 months ago (1 children)

They didn't write satire in his style; they sampled his actual work with a machine. It's not a parody of George Carlin, it's an inferior approximation of him.

[–] KairuByte@lemmy.dbzer0.com 4 points 9 months ago (23 children)

I didn’t say this was satire, I said it was in line with satire on a legal front. And why did you ignore the “impersonator” line immediately before it and jump straight into parody?

They sampled his work, yes, to get his voice, pacing, image, etc. They didn't then have it spit out copies, or even remixes, of his previous work; they had it create new content and made it clear it was not him.

I don’t see this as any different than an impersonator watching hundreds of hours of his routines, getting into character visually and verbally, and walking out on stage to do their own routine.

In fact, let me just ask directly: would you be taking issue with this if it was a real human, no AI involved, who had dressed and trained to move and sound approximately like the man, and then filmed it and put it online? Would you say that is illegal?

[–] Steve@communick.news 44 points 9 months ago (1 children)

I'm torn. I can see why they would be upset. And they may have a case with likeness rights.

But at the same time, this specific example isn't trying to claim any kind of authenticity. It goes out of its way to explain that it's not George. It seems clearly to be along the lines of satire, no different than an impersonator in an SNL-type sketch.

I guess I don't have any real problem with clearly fake AI versions of things. My only real problem would be with actual fraud. Like the AI Biden making calls trying to convince people not to vote in a primary. That's clearly criminal fraud, and an actual problem.

[–] A_Very_Big_Fan@lemmy.world 6 points 9 months ago* (last edited 9 months ago)

My only real problem would be with actual fraud. Like the AI Biden making calls trying to convince people not to vote in a primary.

That's the difference between impression and impersonation. My disappointment in the Lemmy community for not understanding the difference is immeasurable. We're supposed to be better than this but really we're no better than Reddit, running with ragebait headlines for the cheap dopamine hit that is the big upvote number.

If it were a human doing a Carlin impression, literally NOBODY would give a fuck about this video.

[–] randomaside@lemmy.dbzer0.com 30 points 9 months ago* (last edited 9 months ago) (7 children)

I've been thinking about this a lot, and if you think about it like they are selling a stolen product, then it can be framed differently.

Say I take several MegaMan games, take a copy of all the assets, and recombine them into a new MegaMan game called "Unreal Tales of MegaMan". The game has whole new levels inspired by Capcom's MegaMan. Many would argue that the work is transformative.

Am I allowed to sell that MegaMan game? I'm not a legal expert, but I think the answer would generally be no. My intention here is to mimic a property and profit off of a brand I do not own the rights to.

Generative AI uses samples of original content to create the derivative work that synthesizes an actor's voice. The creator of this special's intention is to make content from a brand that they can solely profit from.

If you used an AI to generate a voice like George Carlin's to voice the Reptilian Pope in your video game, I think you would have a different problem here. It's because they synthesized the voice, then called it George Carlin and sold it as a "new comedy special," that it begins to fall into the category of bootleg.

[–] pjwestin@lemmy.world 21 points 9 months ago* (last edited 9 months ago) (1 children)

You couldn't sell that game, even if you created your own assets, because Mega Man is a trademarked character. You could make a game inspired by Mega Man, but if you use any characters or locations from Mega Man, you would be violating their trademark.

AI, celebrity likeness, and trademark are all new territory, and the courts are still sorting out how corporations are allowed to use a celebrity's voice and face without their consent. Last year, Tom Hanks sued a company that used an AI-generated version of him for an ad, but I think it's still in court. How the courts rule on cases like this will probably determine how you can use AI-generated voices like in your Reptilian Pope example (though in that case, I'd be more worried about a lawsuit from Futurama).

This lawsuit is a little different, though; they're sidestepping the issue of likeness and claiming that AI is stealing from Carlin's works themselves, which are under copyright. It's more similar to the class-action lawsuit against ChatGPT, where authors are suing because the chatbot was fed their works to create derivative works without their consent. That case also hasn't been resolved yet.

Edit: Sorry, I also realized I explained trademark and copyright very poorly. You can't make a Mega Man game because Mega Man, as a name, is trademarked. You could make a game that has nothing to do with the Mega Man franchise, but if you called it Mega Man you would violate the trademark. The contents of the game (levels, music, and characters) are under copyright. If you used the designs of any of those characters but changed the names, that would violate copyright.

[–] Couldbealeotard@lemmy.world 8 points 9 months ago (3 children)

Celebrity likeness is not new territory.

Crispin Glover successfully sued the filmmakers of Back to the Future 2 for using his likeness without permission. Even with dead celebrities, you need permission from their estate in order to use their likeness.

[–] steelrat@lemmy.world 11 points 9 months ago

I'll take Lawyers Maximizing Billable Hours for $500, Alex

[–] afraid_of_zombies@lemmy.world 6 points 9 months ago (2 children)

Internet: this is awful; of course your heirs own your image as its stewards.

Also Internet: I have a right to take pictures of you, your car, and your house, or to record you without consent; edit them however I want; and make as much money as I want from those activities, and you have no rights. Because if technology allows me to do something, you have no expectation that I won't.

We are demanding that a public figure who is dead have more rights than a private person who is alive.

[–] Showroom7561@lemmy.ca 6 points 9 months ago (6 children)

What's the alleged crime? Comedy impersonation isn't illegal. And the special had numerous disclaimers that it was an impersonation of Carlin.

Sounds like a money grab by the estate, which Carlin himself probably would have railed on.

[–] Maggoty@lemmy.world 24 points 9 months ago (13 children)

Where's the line? Were they parodying Carlin? Or just using his likeness? Can Fox News do this with Biden?

This is a far larger thing than just a comedy impersonation.

[–] Wogi@lemmy.world 13 points 9 months ago (1 children)

It's something the law isn't equipped to handle as written.

[–] rottingleaf@lemmy.zip 5 points 9 months ago

And fear of things for which no law can be ready, imagined in their extremes, is how I got my current attitude toward everything legal.

As for the event itself - well, I suppose Carlin himself would be amused by it.

[–] 4AV@lemmy.world 7 points 9 months ago (1 children)

Whether it's presented as real seems a reasonable line to me.

Fox News could not use it to mislead people into thinking Biden said something that he did not, but parody like "Sassy Justice" from the South Park creators (using a Trump deepfake) would still be fine.

[–] Maggoty@lemmy.world 11 points 9 months ago

Fox News could run it with every disclaimer out there and it would still get picked up by every other conservative channel and site as legitimate.

This is why likenesses are protected.

[–] CerealKiller01@lemmy.world 19 points 9 months ago (9 children)

What do you mean by "comedy impersonation" - parody, or just copying a comedian?

If I were to set up a music show with a Madonna impersonator and slightly changed Madonna songs (or songs in her style), I'd get my pants sued off.

If Al Yankovic does a parody of a Madonna song, he's in the clear (He does ask for permission, but that's a courtesy and isn't legally mandatory).

The legal term is "transformative use". Parody, like when SNL has Alec Baldwin impersonating Trump, is a recognized type of transformative use. Baldwin doesn't straight-up impersonate Trump; he does so in a comedic fashion (the impersonation itself is funny, regardless of how funny Trump is). The same logic applies when parodying or impersonating a comedian.

[–] Mango@lemmy.world 4 points 9 months ago (3 children)

It's nothing like Carlin.

It's theft of his work.

Pick one.

[–] Scipitie@lemmy.dbzer0.com 4 points 9 months ago (4 children)

I read it like:

Mimicry, pacing, tone, and body language are parts of the work.

That they don't hit the main part (i.e., the humor) is just the icing.

Perhaps I'm too lenient, though.
