this post was submitted on 22 Jan 2024
394 points (94.4% liked)
Technology
I can already locally host a pretty decent AI chatbot on my old M1 MacBook (Llama 2 7B), and it writes at about the same speed I can read. It's probably already possible on top-of-the-line phones.
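For anyone wondering why a 7B model fits on a base M1 at all: a rough back-of-the-envelope sketch of the weight memory (the numbers are approximations; real model files add overhead for the KV cache, embeddings, and quantization scales):

```python
# Rough RAM estimate for the weights of a 7B-parameter model.
# Approximate only: actual files are a bit larger.

def model_size_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight size in GiB."""
    return n_params * bits_per_weight / 8 / 2**30

params = 7e9
fp16 = model_size_gib(params, 16)  # full fp16 weights
q4 = model_size_gib(params, 4)     # typical 4-bit quantization
print(f"fp16: {fp16:.1f} GiB, 4-bit: {q4:.1f} GiB")
```

So at 4-bit quantization the weights come in around 3–4 GiB, which is why even an 8 GB M1 can run it, while full fp16 (~13 GiB) would not fit.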
Lol, "old M1 laptop"? 3 to 4 years is not old, damn!
(I have a MacBookPro5,3 (mid 2009) running Arch, lol)
But nice to hear that the M1 (and thus theoretically even the iPad, if you are not talking about the M1 Pro / M1 Max) can already run Llama 2 7B.
Have you tried Mistral AI's model yet? It should be a bit more powerful and a bit more efficient, iirc. And it is Apache 2.0 licensed.
https://mistral.ai/news/announcing-mistral-7b/
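One concrete reason Mistral 7B tends to be more memory-efficient at inference, if I recall the published configs correctly (these architecture numbers are my assumptions, so double-check them): it uses grouped-query attention with 8 KV heads, where Llama 2 7B uses full multi-head attention with 32, so the KV cache per token of context is roughly 4x smaller:

```python
# Back-of-the-envelope KV-cache size per token of context.
# Assumed configs: both models have 32 layers and head_dim 128;
# Llama 2 7B keeps 32 KV heads (MHA), Mistral 7B keeps 8 (GQA).

def kv_cache_kib_per_token(n_layers: int, n_kv_heads: int,
                           head_dim: int, bytes_per_val: int = 2) -> float:
    """KiB of KV cache per context token; 2x for keys + values, fp16 by default."""
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_val / 1024

llama2_7b = kv_cache_kib_per_token(32, 32, 128)  # 512 KiB per token
mistral_7b = kv_cache_kib_per_token(32, 8, 128)  # 128 KiB per token
print(llama2_7b, mistral_7b)
```

That gap matters a lot on a RAM-constrained laptop once the context gets long.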
An iPhone XR/XS can run Stable Diffusion, believe it or not.
Huh, nice. I got the MacBook Air secondhand, so I thought it was older. Thanks for the suggestion, I'll try Mistral next, perhaps on my phone as a test.