If an organization runs a survey in 2024 on whether it should get into AI, then they’ve already bodged an LLM into the system and they’re seeing if they can get away with it. Proton Mail is a priva…
we appear to be the first to write up the outrage coherently too. much thanks to the illustrious @self
Your prompt — that is, the email you’re writing — is kept in plain text on their server
Besides, I just don’t want AI in general, is that too much to ask? I wonder how long it will be until there are companies actively promoting their lack of AI.
it can run locally, but Proton discourages that in their marketing; it has very high system requirements and requires a chromium-based browser (a non-starter for a solid chunk of Proton’s userbase). otherwise, it uses the cloud version of the feature, which works exactly like the quote describes, though Proton tries to pretend otherwise; it’s genuinely out of the ordinary for a privacy company that they pushed this feature at all without publishing anything about its threat model.
it’s unclear what happens if the feature’s enabled and set to local but you switch to a computer that can’t run the LLM. it’s also just fucked that there are two identical versions of the same feature, but one of them exfiltrates your data.
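to make that ambiguity concrete, here’s a minimal sketch (purely hypothetical; Proton hasn’t published Scribe’s actual capability detection or fallback logic) of the failure mode: a “local only” preference quietly degrading to the cloud version when the browser or hardware can’t run the model. the WebGPU check stands in for whatever detection Proton really does.

```typescript
// hypothetical sketch, not Proton's actual code: Scribe's real capability
// detection and fallback behaviour are undocumented.

type ScribeMode = "local" | "cloud";

// assumption: the local model needs WebGPU, which would explain the
// chromium-only requirement. navigator.gpu is only present where WebGPU exists.
function canRunLocalModel(): boolean {
  return typeof navigator !== "undefined" && "gpu" in navigator;
}

function resolveScribeBackend(preference: ScribeMode): ScribeMode {
  if (preference === "local" && !canRunLocalModel()) {
    // this is the branch nobody has documented: does it refuse to run,
    // or silently ship your draft off to Proton's servers?
    return "cloud";
  }
  return preference;
}
```

if that branch silently returns "cloud", someone who picked local for privacy reasons gets the exfiltrating version without ever being told.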
Besides, I just don’t want AI in general, is that too much to ask?
you’re not alone. the other insulting part of this is that the vast majority of Proton’s userbase indicated they didn’t want this feature in responses to Proton’s 2024 survey, which was effectively constructed to make it impossible to say no to the LLM feature: the feature portion of the survey was stack ranked, so you could only rank the LLM against other features, never reject it outright. the blog post introducing Scribe even lies about the results of the survey; an LLM wasn’t even close to being the most requested feature.
Doesn’t sound like it
Besides, I just don’t want AI in general, is that too much to ask? I wonder how long it will be until there are companies actively promoting their lack of AI.
It’s already happening, to some extent, though mostly not among the big corps. Grabbing some random examples I could find:
Cara blew up a few weeks ago off the back of Instagram going all-in on AI
Glaze and Nightshade earned a lot of popularity by giving artists a way to sabotage models trained on their scraped art
Dove made waves by directly taking shots at AI, too
Nintendo publicly eschewed using it, stating they’re focused on “delivering value that is unique to Nintendo and cannot be created by technology alone”.
Newgrounds put the hammer down on AI early, and disavowed it more publicly in March this year when it added an option to flag submissions as AI-made
Last, but not least, Beth Spencer cooked up a quick-and-dirty “Made with Human Intelligence” badge which has since blown the fuck up online
I’m probably missing some examples, but I think my point’s made.
e: and for those curious who missed it in the article, the system requirements for the local version of the feature are here