Like if I type "I have two appl..", for example, it will often suggest "apple" (singular) instead of the plural. It's just a small example, but it's really bad at predicting which variant of a word should follow the previous ones

[–] Knusper@feddit.de 18 points 11 months ago (7 children)

I guess the real question is: could we be using (simplistic) LLMs on a phone for predictive text?

There are some LLMs that can be run offline and that maybe wouldn't use enormous amounts of battery. But I don't know how good the quality of those is...

[–] ashe@lemmy.starless.one 46 points 11 months ago* (last edited 11 months ago) (1 children)

You can run an LLM on a phone (I tried it myself once, with llama.cpp), but even with the simplest model I could find, it generated maybe one word every few seconds while using 100% of the CPU. The quality is terrible, and your battery wouldn't last an hour.
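For the curious, here's a minimal sketch of that kind of experiment using the llama-cpp-python bindings (the model file name below is just a placeholder; any small quantized GGUF model would do):

```python
from llama_cpp import Llama

# Load a small quantized GGUF model from local storage.
# "tinyllama-1.1b.Q4_K_M.gguf" is a placeholder, not a specific recommendation.
llm = Llama(model_path="./tinyllama-1.1b.Q4_K_M.gguf", n_ctx=256, n_threads=4)

# Ask for a few continuation tokens, the way a predictive keyboard would.
out = llm("I have two appl", max_tokens=4)
print(out["choices"][0]["text"])
```

Even a ~1B-parameter model at 4-bit is roughly half a gigabyte on disk, which gives a sense of why this is heavy for a keyboard feature.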

[–] astraeus@programming.dev 3 points 11 months ago (1 children)

Does the AI processing have to be performed locally, or be constantly active?

[–] EatYouWell@lemmy.world 15 points 11 months ago (3 children)

No, but you open up a can of worms from a security standpoint if you send it out to be processed.

[–] Scubus@sh.itjust.works 7 points 11 months ago

I'm sure every phone having a keylogger won't end badly

[–] bassomitron@lemmy.world 15 points 11 months ago* (last edited 11 months ago) (1 children)

The kind of local/offline LLMs that would work on your phone would not be very good quality. There's been amazing progress in quantizing LLMs to get them working on weaker GPUs with less VRAM, and even on CPUs, so maybe it'll happen (see the toy sketch below), but I'm not an expert.

I also don't foresee them linking it up to a cloud-based LLM, as that'd be a shitload of queries and extremely expensive.
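To make the quantization point concrete, here's a toy sketch of the basic idea; real schemes like the 4-bit GGUF formats are more sophisticated, but the memory math is the same: store each weight as a small integer plus a shared scale factor.

```python
import numpy as np

# Toy 8-bit quantization: replace float32 weights (4 bytes each)
# with int8 values (1 byte each) plus a single shared scale factor.
weights = np.random.randn(1_000_000).astype(np.float32)

scale = np.abs(weights).max() / 127            # one float32 for the whole tensor
quantized = np.round(weights / scale).astype(np.int8)
restored = quantized.astype(np.float32) * scale  # approximate reconstruction

print(f"original:  {weights.nbytes / 1e6:.1f} MB")    # ~4.0 MB
print(f"quantized: {quantized.nbytes / 1e6:.1f} MB")  # ~1.0 MB
print(f"max error: {np.abs(weights - restored).max():.4f}")
```

Going from 4 bytes per weight to 1 (or half a byte, for 4-bit) is what lets a 7B-parameter model squeeze into a few GB of RAM.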

[–] astraeus@programming.dev 1 points 11 months ago* (last edited 11 months ago) (1 children)

OpenAI is probably already handling a significant number of queries. I think for daily use the LLM should simply initialize a word map based on user history and then update it occasionally, like once every week or two. Most people don't drastically change their vocabulary over the course of a few weeks.
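A word map like that doesn't even need an LLM; at its simplest it's just per-user bigram counts, rebuilt from typing history on whatever schedule you like. A hypothetical sketch (build_word_map and suggest are made-up names for illustration):

```python
from collections import Counter, defaultdict

def build_word_map(history: str) -> defaultdict:
    """Count which word follows which in the user's typing history."""
    words = history.lower().split()
    word_map = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        word_map[prev][nxt] += 1
    return word_map

def suggest(word_map: defaultdict, prev: str, n: int = 3) -> list:
    """Return the n words that most often followed `prev`."""
    return [word for word, _ in word_map[prev.lower()].most_common(n)]

# Rebuilt from on-device history once every week or two:
word_map = build_word_map("I have two apples and I have two oranges")
print(suggest(word_map, "two"))  # ['apples', 'oranges']
```

This adapts to the user, but like the keyboard in the original post it has no grammar; it only knows what it has seen before.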

[–] EatYouWell@lemmy.world 1 points 11 months ago

We're talking about orders of magnitude more queries if we start offloading predictive text like that.

[–] SpooksMcDoots@mander.xyz 6 points 11 months ago

OpenHermes 2.5 Mistral 7B competes with LLMs that require 10x the resources. You could try it out on your phone.

[–] Mr_Blott@lemmy.world 6 points 11 months ago

That was my next question, thanks!

Didn't think of battery use, makes sense

[–] Munkisquisher@lemmy.nz 5 points 11 months ago (1 children)

A pre-trained model isn't going to learn how you type the more you use it. Though with Microsoft owning SwiftKey, I imagine they'll try it soon.

[–] SidewaysHighways@lemmy.world 5 points 11 months ago

I was so heartbroken when I found out that Microsoft purchased Swiftkey. It was my favorite. Is there any way to still use it without Microsoft involved? Lawdhammercy

[–] neptune@dmv.social 4 points 11 months ago (1 children)

I think Apple has pitched this for a future iPhone, yes.

[–] squaresinger@feddit.de 3 points 11 months ago

They'll probably have to offload that to a server farm in real time. That's not gonna be easy.

[–] 0x4E4F@sh.itjust.works 0 points 11 months ago

I guess... why not... but the db is probably huge, like in the hundreds of GB (maybe even TB... who knows). You can't run that offline.