this post was submitted on 31 Jan 2025
202 points (95.1% liked)

Open Source


All about open source! Feel free to ask questions and share news and interesting stuff!

Community icon from opensource.org, but we are not affiliated with them.

founded 5 years ago

Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make it clear that the reason people love DeepSeek is that you can download it and run it securely on any of your own private devices or servers - unlike most of the competing SOTA AIs.

I can't speak for Proton, but the last couple of weeks have shown some very clear biases coming out.

(page 2) 26 comments
[–] wuphysics87@lemmy.ml 10 points 13 hours ago (2 children)

There are many LLMs you can use offline

[–] zante@slrpnk.net 7 points 12 hours ago

Proton have been too noisy from the very start.

[–] Telorand@reddthat.com 5 points 13 hours ago (8 children)

It might be that they're equating the name with the app and company, not the open source model, based on one of the first lines:

AI chat *apps* like ChatGPT collect user data, filter responses, and make content moderation decisions that are not always transparent.

Emphasis mine. The rest of the article reads the same way.

Most people aren't privacy-conscious enough to care who gets what data and who's building the binaries and web apps, so sounding the alarm is appropriate for people who barely know the difference between AI and AGI.

I get that people are mad at Proton right now (anyone have a link? I'm behind on the recent stuff), but we should ensure we get mad at things that are real, not invent imaginary ones based on contrived contexts.

[–] JOMusic@lemmy.ml 6 points 11 hours ago

Yeah, it's a fair call, but to me it's the very context of why people are mad at Proton that makes me suspicious of articles like this.

I can't find the original summary post someone made, but here's the last response from Proton CEO. Read the comments as well to get a good summary: https://www.reddit.com/r/ProtonMail/comments/1i2nz9v/on_politics_and_proton_a_message_from_andy/

TL;DR: Proton used their official accounts to share the CEO's pro-US-Republican thoughts as their official stance. They have since apologized and said they would use a personal account to share those thoughts. But (IMO) having now posted this blog on the actual Proton website, it says to me that there are serious bias-alignment issues at a company that is supposed to be a safe haven away from all of that.

[–] ReversalHatchery@beehaw.org 4 points 12 hours ago

It is certainly that. But recently it's become very trendy to hate Proton, so it's just easier to do that instead of thinking. I'm really disappointed in this community.

[–] ReversalHatchery@beehaw.org 3 points 12 hours ago* (last edited 12 hours ago) (4 children)

I'm not an expert at criticism, but I think it's fair on their part.

I mean, can you remind me what the hardware requirements are to run DeepSeek locally?
Oh, you need a high-end graphics card with at least 8 GB VRAM for that? And that's for the highly distilled variants! For the more complete ones you need multiple such graphics cards interconnected! How do you even do that with more than 2 cards on a consumer motherboard?

How many people do you think have access to such a system, I mean even 1 high-end GPU with just 8 GB VRAM, considering that more and more people only have a smartphone nowadays, and that these are very expensive even for gamers?
And as you will read in the 2nd referenced article below, memory size is not the only factor: the distill requiring only 1 GB VRAM still requires a high-end GPU for the model to be usable.
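The back-of-envelope math behind these VRAM numbers is simple: weight memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime. A small sketch, with illustrative numbers (the overhead factor and model sizes are assumptions, not official DeepSeek requirements):

```python
# Rough VRAM estimate for running an LLM locally.
# overhead=1.2 is an assumed fudge factor for KV cache and runtime buffers.

def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Approximate GiB of VRAM: weights * bytes/weight * overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return round(weight_bytes * overhead / 2**30, 1)

print(vram_gb(7, 4))     # -> 3.9  : a 4-bit 7B distill squeezes onto an 8 GB card
print(vram_gb(7, 16))    # -> 15.6 : the same model at fp16 already exceeds it
print(vram_gb(671, 4))   # hundreds of GiB: the full 671B model is multi-GPU territory
```

This is why only the smallest distills are realistic on consumer hardware, and why quantization (fewer bits per weight) matters so much.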

https://www.tomshardware.com/tech-industry/artificial-intelligence/amd-released-instructions-for-running-deepseek-on-ryzen-ai-cpus-and-radeon-gpus

https://bizon-tech.com/blog/how-to-run-deepseek-r1-locally-a-free-alternative-to-openais-o1-model-hardware-requirements#a6

https://codingmall.com/knowledge-base/25-global/240733-what-are-the-system-requirements-for-running-deepseek-models-locally

So my point is that when talking about DeepSeek, you can't ignore how they operate their online service, as most people will only be able to try that.

I understand that it's very trendy and cool to shit on Proton right now, but they have a very strong point here.
