Discussion:
LLM Local Documents?
jeremy ardley
2024-06-25 04:00:02 UTC
Permalink
I've installed GPT4All, a desktop Large Language Model shell.

GPT4All provides a GUI interface to installed or remote Large Language
Models. I'm running Nous Hermes 2 Mistral DPO locally, and I can also use
remote APIs, I think including the OpenAI GPT series.

I want to get GPT4All to index and use the Debian documentation to
supplement the more general training in the model. GPT4All has a local
documents feature where you put all the documents you need into a
directory, which is then indexed.
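The general idea behind a local-documents feature can be sketched as a simple index-then-retrieve step. The code below is only an illustration of that concept in Python, assuming plain-text files; it is not GPT4All's actual LocalDocs implementation, whose chunking and embedding details differ.

```python
# Minimal sketch of a "local documents" retrieval step: index a directory of
# plain-text files, then rank documents against a query by word overlap.
# Illustrative only -- real tools use embeddings rather than word counts.
from pathlib import Path
from collections import Counter

def index_docs(doc_dir):
    """Build a simple word-frequency index, one Counter per document."""
    index = {}
    for path in Path(doc_dir).glob("*.txt"):
        words = path.read_text(encoding="utf-8", errors="ignore").lower().split()
        index[path.name] = Counter(words)
    return index

def top_documents(index, query, k=3):
    """Rank documents by how often the query's words appear in them."""
    terms = query.lower().split()
    scores = {name: sum(counts[t] for t in terms)
              for name, counts in index.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

The top-ranked documents would then be pasted into the model's context alongside the question, which is also why indexing too much loosely related material can dilute the relevant content.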

Does anyone have experience in doing this? Are there limitations on the
quantity and format of documents that can be used? I guess there may be
practical limits on just how much documentation is indexed, not so much
in storage as in diluting relevant content.

Jeremy
Jeff Peng
2024-06-25 06:10:02 UTC
Permalink
Post by jeremy ardley
I've installed GPT4All, a desktop Large Language Model shell.
Does GPT4All have a shell-only interface? My Debian machine is a remote
server and has no desktop.

regards.
jeremy ardley
2024-06-25 07:50:01 UTC
Permalink
Post by Jeff Peng
Does GPT4All have a shell-only interface? My Debian machine is a remote
server and has no desktop.
GPT4All hosts a web service on localhost (127.0.0.1).

There are many ways to access that, including using haproxy or running a
web browser on the server over X tunneling.
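Another common approach is SSH local port forwarding, which exposes the server's localhost-only service on your workstation. The port number 4891 and the `/v1/models` path below are assumptions based on GPT4All's OpenAI-compatible API server; check which port your instance actually listens on.

```shell
# Forward the remote GPT4All port to the local workstation over SSH.
# Port 4891 is an assumption -- verify it against your GPT4All settings.
ssh -L 4891:127.0.0.1:4891 user@your-debian-server

# Then, from the workstation, talk to the forwarded port, e.g.:
curl http://127.0.0.1:4891/v1/models
```

This avoids exposing the service to the network at all, since both ends of the tunnel bind to 127.0.0.1.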

But your big problem will be speed. All LLMs require a modern graphics
card / GPU to get acceptable speed, something like an NVIDIA 3060 with
12 GB of video RAM or better.

If your server is headless, you won't have that compute available, and
results will be very ordinary.
