self

joined 2 years ago
[–] self@awful.systems 33 points 6 months ago (1 children)

between that thread’s activity pattern and how hard they tried to fudge the numbers on their own survey to make this feature look popular: boy there’s a lot of stank on this one

but hey here’s some worrying shit straight from the Proton team:

Our business audience was the most interested in a writing assistant, this is why we started gradually rolling it out starting with Business and Visionary plans. We will look into making it available to more users at a later date!

so there’s something utterly fucking obvious for the “it’s only for business users” posters to consider; they’re doing the same frog boiling shit that all LLM fuckheads do.

I’m tempted to crosspost David’s article and my mastodon thread to that community, since Proton hasn’t really replied otherwise, and they seem plenty active there answering softball questions and removing posts. I don’t look forward to the Kagi-level shitstorm in my inbox afterwards though

[–] self@awful.systems 27 points 6 months ago* (last edited 6 months ago)

it can run locally, but Proton discourages that in their marketing, it has very high system requirements, and it requires you to use a chromium-based browser (which is a non-starter for a solid chunk of Proton’s userbase). otherwise, it uses the cloud version of the feature, which works exactly like the quote describes, though Proton tries to pretend otherwise. it’s actually incredibly out of the ordinary that they pushed this feature at all without publishing anything about its threat model.

it’s unclear what happens if the feature’s enabled and set to local but you switch to a computer that can’t run the LLM. it’s also just fucked that there’s two identical versions of the same feature, but one of them exfiltrates your data.
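to sketch the failure mode I’m worried about (purely hypothetical — Proton hasn’t published the actual fallback logic, and all the names here are mine):

```python
# hypothetical sketch of the silent-downgrade problem: if the client
# quietly falls back to cloud when local inference isn't possible, a
# "local only" setting stops meaning anything. invented logic, not Proton's.

def scribe_backend(setting: str, gpu_ok: bool, chromium: bool) -> str:
    """return which backend would serve the request"""
    if setting == "off":
        return "disabled"
    if setting == "local" and gpu_ok and chromium:
        return "local"
    # the worrying case: user asked for local, this machine can't do it,
    # and the feature falls back to the cloud endpoint anyway
    return "cloud"

# same account, same "local" setting, two different machines:
print(scribe_backend("local", gpu_ok=True, chromium=True))   # local
print(scribe_backend("local", gpu_ok=False, chromium=True))  # cloud
```

same toggle, completely different privacy properties depending on which computer you happen to be sitting at.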

Besides, I just don’t want AI in general, is that too much to ask?

you’re not alone. the other insulting part of this is that the vast majority of Proton’s userbase indicated they didn’t want this feature in responses to Proton’s 2024 survey, which was effectively constructed to make it impossible to say no to the LLM feature, since the feature portion of the survey was stack ranked. the blog post introducing Scribe even lies about the results of the survey — an LLM wasn’t even close to being the most requested feature.
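here’s a toy example of why a stack-ranked survey can’t capture “no” (numbers invented, obviously not the real survey data):

```python
# toy illustration with made-up numbers: forced stack ranking makes an
# unwanted feature look mid-tier, because respondents who'd reject it
# outright still have to place it *somewhere* in the ranking

features = ["calendar improvements", "linux client", "llm assistant"]

# every respondent must rank all three, 1 = most wanted; three of the
# four don't want the LLM at all, but "last place" is the closest they
# can get to saying no
rankings = [
    {"calendar improvements": 1, "linux client": 2, "llm assistant": 3},
    {"linux client": 1, "calendar improvements": 2, "llm assistant": 3},
    {"llm assistant": 1, "linux client": 2, "calendar improvements": 3},
    {"calendar improvements": 1, "llm assistant": 2, "linux client": 3},
]

for f in features:
    avg = sum(r[f] for r in rankings) / len(rankings)
    print(f"{f}: average rank {avg:.2f}")
```

on a 1–3 scale the LLM averages 2.25 here, which reads as “third most requested” in a blog post, even though a rank of 3 from most respondents can mean “never”, not “eventually”.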

e: and for those curious who missed it in the article, the system requirements for the local version of the feature are here

[–] self@awful.systems 24 points 6 months ago

though to be honest, the fact that you think this is local-only and only affects business accounts perfectly demonstrates how fucking dangerous Proton’s marketing and design around this feature is

[–] self@awful.systems 23 points 6 months ago

read the fucking article before you multi-post your uninformed shit in this thread, thanks

[–] self@awful.systems 7 points 6 months ago

David’s article has some details on what the LLM is. I don’t think it’s trained on emails, but that doesn’t make me feel much better.

[–] self@awful.systems 16 points 6 months ago (26 children)

did you read the parts of the article that describe why the LLM is an issue?

[–] self@awful.systems 18 points 6 months ago (1 children)

they’re not end-to-end encrypted; their security model involves giving their server both your GPG private key and its passphrase, which means your inbox and other data can be trivially subpoenaed by German authorities.
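to make the problem concrete (a toy model, not their actual implementation):

```python
# toy threat model, not any provider's real code: if the server stores
# both the private key and its passphrase, "encrypted" mail is readable
# by whoever can compel the server operator
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Server:
    ciphertexts: List[str]        # the "encrypted" inbox at rest
    private_key: Optional[bytes]  # escrowed key, if the provider keeps one
    passphrase: Optional[str]     # ...and the passphrase protecting it

def subpoena_can_read(server: Server) -> bool:
    """plaintext is recoverable server-side iff the server holds
    both the key and the passphrase"""
    return server.private_key is not None and server.passphrase is not None

# actual e2e: the server holds ciphertext and nothing else
e2e = Server(ciphertexts=["..."], private_key=None, passphrase=None)
# key escrow dressed up as encryption
escrow = Server(ciphertexts=["..."],
                private_key=b"-----BEGIN PGP PRIVATE KEY BLOCK-----...",
                passphrase="hunter2")

print(subpoena_can_read(e2e))     # False
print(subpoena_can_read(escrow))  # True
```

the whole point of e2e is that the second case can’t exist: the server should never have enough material to decrypt anything, subpoena or not.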

I don’t think this is a replacement for Proton or Tutanota at all.

[–] self@awful.systems 19 points 6 months ago (2 children)

do you get banned from twitter if you call him a fucking asshole?

I’m working on a more detailed reply on mastodon but to be honest, I’m pretty sure he didn’t read the original post

[–] self@awful.systems 11 points 6 months ago (2 children)

only one of the 8 computers I own (and I’m not being cheeky here and counting embedded or retro systems, just laptops and desktops) is physically capable of meeting the model’s minimum requirements, and that’s only if I install chromium on the Windows gaming VM my bigger GPU’s dedicated to and access protonmail from there. nothing else I do needs a GPU that big, professional or otherwise — that hardware exists for games and nothing else. compared with the integrated GPUs most people have, a 2060’s fucking massive.

do you see how these incredibly high system requirements (for a webmail client of all things), alongside them already treating the local model as strictly optional, can act as a funnel redirecting people towards the insecure cloud version of the feature? “this feature only works securely on one of the computers where you write mail, at best” feels like a dark pattern to me.

[–] self@awful.systems 24 points 6 months ago

just a little violation of my trust for the company I pay for privacy and encryption services. as a treat.

[–] self@awful.systems 24 points 6 months ago

alternatively, if the only version of this that doesn’t break Proton’s e2e security model is the local-only version, maybe don’t ship the cloud hosted version of the feature under any circumstances

I’d still hate the feature because the LLM model’s derived from plagiarized work and the labor of exploited workers from the global south, but this didn’t have to be a fucking privacy catastrophe

[–] self@awful.systems 5 points 6 months ago

so, uh, you remember AskJeeves?

(alternative answer: the third buzzword in a row that’s supposed to make LLMs good, after multimodal and multiagent systems absolutely failed to do anything of note)
