submitted 3 months ago by shaked_coffee@feddit.it to c/fosai@lemmy.world

Today, thanks to a NetworkChuck video, I discovered OpenWebUI and how easy it is to set up a local LLM chat assistant. In particular, the ability to upload documents and use them as context for chats really caught my interest. So now my question is: let's say I've uploaded 10 different documents to OpenWebUI. Is there a way to ask llama3 which of the uploaded documents contains a certain piece of information (without having to explicitly tag all the documents)? And if not, is something like this possible with a different local LLM setup?
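One way to approach this without a huge context window is to query each document separately and collect the hits. A minimal sketch, where `ask_llm` is a hypothetical stand-in for whatever chat API you actually use (here faked with a substring check so the example is self-contained):

```python
# Hypothetical sketch: ask about each document separately instead of
# stuffing all ten into one prompt.

def ask_llm(question: str, document: str) -> bool:
    # Placeholder for a real LLM call (e.g. an HTTP request to a local
    # Ollama server); here we just do a naive substring match.
    return question.lower() in document.lower()

def find_documents(question: str, documents: dict[str, str]) -> list[str]:
    """Return the names of documents the model says contain the answer."""
    return [name for name, text in documents.items()
            if ask_llm(question, text)]

docs = {
    "notes.txt": "The server listens on port 8080.",
    "todo.txt": "Buy coffee beans.",
}
print(find_documents("port 8080", docs))  # ['notes.txt']
```

This trades one big prompt for N small ones, so each call only needs to fit a single document in context.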

[-] WalnutLum@lemmy.ml 2 points 3 months ago

Only if your model has a large enough token context to contain all the documents' info would you be able to do something like that
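The constraint above can be sanity-checked with back-of-the-envelope math. A rough sketch, using the common (approximate, English-text) rule of thumb of ~4 characters per token rather than a real tokenizer:

```python
# Rough estimate: do all the documents, plus room for the model's reply,
# fit inside the context window? The 4-chars-per-token figure is a
# heuristic, not an exact tokenizer count.

def fits_in_context(documents: list[str], context_tokens: int,
                    chars_per_token: float = 4.0,
                    reserved_tokens: int = 1024) -> bool:
    est_tokens = sum(len(d) for d in documents) / chars_per_token
    return est_tokens + reserved_tokens <= context_tokens

docs = ["x" * 20_000] * 10            # ten ~20 kB documents, ~50k tokens
print(fits_in_context(docs, 8192))     # False: far too big for an 8k window
print(fits_in_context(docs, 128_000))  # True
```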

[-] shaked_coffee@feddit.it 1 points 3 months ago

And where do I find out how much token context my LLM has?

[-] General_Effort@lemmy.world 2 points 3 months ago

It probably says somewhere on the page where you downloaded the model. It's also in the model's metadata; I forget exactly where it's displayed. Maybe in the terminal window.

Things you should know:

  • What a token is depends on the model.
  • Context takes a lot of (V)RAM.
  • People modify models to increase the context but that often doesn't work well. Watch out for the model missing things, esp. in the middle of the document.
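The VRAM point can be made concrete with some KV-cache arithmetic: the cache stores a key and a value tensor per layer for every context position. A sketch using the published Llama 3 8B shapes (32 layers, 8 KV heads under GQA, head dim 128); treat the numbers as illustrative:

```python
# Back-of-the-envelope KV-cache size: 2 tensors (K and V) per layer,
# each n_kv_heads * head_dim wide, one entry per context position.

def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   context_len: int, bytes_per_elem: int = 2) -> int:
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem

# Llama 3 8B with an fp16 cache:
print(f"{kv_cache_bytes(32, 8, 128, 8192) / 2**30:.1f} GiB at 8k")   # 1.0 GiB
print(f"{kv_cache_bytes(32, 8, 128, 65536) / 2**30:.1f} GiB at 64k")  # 8.0 GiB
```

The cache grows linearly with context length, which is why stretching a model to a huge window eats (V)RAM fast.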

L3 is probably not the right base for the task. Maybe Phi-3 or Cohere.

this post was submitted on 03 Jun 2024
14 points (100.0% liked)
