3 public repositories match the local-llama topic.
LLM prompt augmentation with RAG: integrates external custom data from a variety of sources, allowing you to chat with those documents.
- Updated Mar 21, 2024
- Python
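The RAG approach described above can be sketched in a few lines: retrieve the snippets most relevant to a question from a local document store, then prepend them to the prompt. This is a toy, self-contained illustration — the keyword-overlap ranker and the helper names (`retrieve`, `augment_prompt`) are assumptions for the sketch; a real pipeline would use embeddings for retrieval and pass the augmented prompt to an LLM.

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query and keep the top k.

    A stand-in for embedding-based similarity search.
    """
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]


def augment_prompt(query: str, documents: list[str]) -> str:
    """Build the augmented prompt: retrieved context, then the question."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"


# Example document store with one relevant entry.
docs = [
    "The invoice total for March was 4200 EUR.",
    "Our office cat is named Miso.",
    "Invoices are archived in the finance shared drive.",
]
print(augment_prompt("What was the March invoice total?", docs))
```

The augmented prompt now carries the invoice sentence as context, so a model that never saw the documents can still answer from them.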
(Work in progress) A cross-platform desktop client for running LLaMA offline on CPU
A 50-line local LLM assistant in Python, built with Streamlit and GPT4All
- Updated Dec 31, 2023
- Python