this post was submitted on 15 Nov 2024
89 points (98.9% liked)

Futurology

1776 readers

founded 1 year ago
[–] chaosCruiser@futurology.today 7 points 14 hours ago* (last edited 14 hours ago) (1 children)

> However, if it is performance you are concerned about, "it's important to note that GPUs still far outperform NPUs in terms of raw performance," Jessop said, while NPUs are more power-efficient and better suited for running perpetually.

Ok, so if you want to run your local LLM on your desktop, use your GPU. If you’re doing that on a laptop in a cafe, get a laptop with an NPU. If you don’t care about either, you don’t need to think about these AI PCs.
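That rule of thumb boils down to a simple decision rule. Here's a toy sketch of it in Python (the function and its flags are hypothetical, not a real hardware-detection API): prefer the NPU when power efficiency matters, prefer the GPU when raw performance matters, and fall back to the CPU otherwise.

```python
def pick_backend(has_gpu: bool, has_npu: bool, on_battery: bool) -> str:
    """Toy illustration of the GPU-vs-NPU tradeoff described above:
    NPU for power efficiency (e.g. a laptop in a cafe), GPU for raw
    performance, CPU as the last resort."""
    if has_npu and on_battery:
        return "npu"   # power efficiency wins on battery
    if has_gpu:
        return "gpu"   # raw performance wins on a desktop
    return "npu" if has_npu else "cpu"

print(pick_backend(has_gpu=True, has_npu=False, on_battery=False))  # desktop -> gpu
print(pick_backend(has_gpu=True, has_npu=True, on_battery=True))    # cafe laptop -> npu
```

In practice the choice is made for you by the inference runtime's backend selection, but the tradeoff it weighs is the same one the article describes.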

[–] JohnDClay@sh.itjust.works 1 points 11 hours ago

Or use a laptop with a GPU? An NPU just seems to be slightly upgraded onboard graphics.