this post was submitted on 24 Aug 2024
164 points (96.6% liked)
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
I mean, yes, but that's like saying Bitcoin is used by criminals to buy drugs and weapons. The problem with that argument is that crime isn't its only use.
Wait till you hear about the idiots who unironically make that argument for banning Bitcoin too
Bitcoin is a bad example, since it's not designed as a private currency. Monero/XMR is actually usable.
That's like saying Voat isn't only used by incel trolls who got banned from reddit
Yep. The issue is that they put out a tool that does some good things, but is also heavily adopted by criminals who piggyback on it.
Should we let child abuse just proliferate with these tools because there's so much need for privacy? How do you weed out the bad without kneecapping the good? There's no good answer here: the same properties that make the tech good enable the bad parts, too.
There has to be a certain level of knowledge and acceptance of the bad parts to continue developing it. It's a catch-22: law enforcement has to pick between sacrificing privacy and allowing a tool to exist that proliferates child abuse material and other ills.
There are valid arguments for the importance of privacy, and valid arguments for making sure these crimes don't have a safe haven. Action toward either end will hurt some people and enrage others.
The standard I recall being established back in the nineties for whether strong encryption was even legal in the US was "substantial non-infringing use" or something similar. It's been a while.
The problem with key escrow or anything similar is that any prescribed circumvention mechanism is also available to the "bad guys".
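That point can be made concrete with a toy sketch (NOT real cryptography — just an illustration of the escrow problem, using a one-time-pad-style XOR as a stand-in for whatever cipher the system actually uses): whoever holds the escrowed key copy can decrypt exactly as well as the intended recipient, whether they obtained it lawfully or by theft.

```python
# Toy illustration of why key escrow is a single point of failure.
# The XOR "cipher" here is a stand-in, not a real encryption scheme.
import secrets

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR is symmetric: the same operation encrypts and decrypts.
    return bytes(k ^ b for k, b in zip(key, data))

msg = b"meet at noon"
session_key = secrets.token_bytes(len(msg))
ciphertext = xor_crypt(session_key, msg)

# Escrow: a copy of the session key is deposited with a third party.
escrow_copy = session_key

# Anyone holding the escrowed copy -- the government, a rogue insider,
# or an attacker who breached the escrow database -- decrypts identically:
recovered = xor_crypt(escrow_copy, ciphertext)
assert recovered == msg
```

The mechanism itself has no way to distinguish an authorized decryption from an unauthorized one; that distinction lives entirely in how well the escrow database is guarded.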
I think Telegram's stance would be that they can't moderate because of strong end-to-end encryption. Back in the day the parallel would have been made to the phone system or mail.
Of course this is all happening in France, so I have no idea what effect the combination of French and EU laws will have here, but I would still broadly expect that if a parallel can be made to mail or phone, Telegram would be in the clear. The phone company and mail service have no expectation of content moderation.
I guess we'll see.
The huge difference between mail or phone and telegram is that both mail and phone work with law enforcement, with useful records being made available upon subpoena. Telegram, by design, will not.
And if you think drawing that parallel helps Telegram: they would then also be required to maintain the same standards of oversight as the mail, with package inspections, drug dogs, entire teams of government officials investigating illegal activities, etc.
The criminals use it precisely because it is not a parallel to other available channels, as it circumvents those safeguards.