this post was submitted on 29 Apr 2024
454 points (93.8% liked)
Technology
The problem is that LLMs require a considerable amount of computing power to run, unlike the simple Markov chain predictions that current keyboards use. You could use a cloud-based service like ChatGPT, but most people wouldn't want their keyboard sending every keystroke to a remote server... and even if they didn't know or care, the round-trip latency wouldn't be good enough for real-time predictions.
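For context on why Markov predictions are so cheap: a first-order model is just a lookup table from the current word to the words that have followed it, so "inference" is a dictionary access. A minimal sketch in Python (the corpus and function names here are made up for illustration, not from any real keyboard app):

```python
from collections import defaultdict, Counter

def build_model(text):
    """First-order Markov model: map each word to a count of the words
    that follow it in the training text."""
    words = text.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word, k=3):
    """Suggest up to k most frequent next words after `word` --
    this lookup is all a Markov keyboard does per keypress."""
    return [w for w, _ in model[word].most_common(k)]

corpus = "the cat sat on the mat and the cat ran"
model = build_model(corpus)
print(predict(model, "the"))  # "cat" ranks first (it followed "the" twice)
```

An LLM, by contrast, runs billions of multiply-accumulates per generated token, which is the gap the NPUs mentioned below are meant to close.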
Now smartphone SoC makers like Qualcomm have started adding NPUs (neural processing units) to their latest chips (such as the Snapdragon 8 Gen 3, featured in the most recent flagship phones), but it will take a while before devices with NPUs become commonplace, and a while longer for developers to make or update apps that can take advantage of them.
But yeah, the good news is that it's coming; it's only a matter of "when". I suspect it won't be long before the likes of SwiftKey start taking advantage of this.