Here are 797 public repositories matching the llama2 topic.
Get up and running with Llama 3, Mistral, Gemma, and other large language models.
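The Ollama entry above exposes a local HTTP API for running models such as Llama 3. A minimal sketch of calling it from Python, assuming an Ollama server is running at its default port (11434); the model name `llama3` is an example and must already be pulled locally:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(model: str, prompt: str) -> dict:
    # "stream": False asks Ollama for a single JSON object
    # instead of a stream of partial responses.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a generation request and return the model's text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be `generate("llama3", "Why is the sky blue?")` once the server is up; `build_payload` is a hypothetical helper, split out so the request shape is easy to inspect.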
Lobe Chat: an open-source, modern-design LLM/AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Bedrock / Azure / Mistral / Perplexity), multi-modal features (vision/TTS), and a plugin system. One-click free deployment of your private ChatGPT chat application. (TypeScript, updated May 3, 2024)
Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features, and more, letting you quickly go from prototype to production. (TypeScript, updated May 3, 2024)
Jan is an open-source alternative to ChatGPT that runs 100% offline on your computer, with multiple engine support (llama.cpp, TensorRT-LLM). (TypeScript, updated May 3, 2024)
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA), built towards GPT-4V-level capabilities and beyond. (Python, updated May 1, 2024)
Low-code framework for building custom LLMs, neural networks, and other AI models. (Python, updated May 3, 2024)
A self-hosted, offline, ChatGPT-like chatbot powered by Llama 2. 100% private, with no data leaving your device. New: Code Llama support! (TypeScript, updated Apr 23, 2024)
Scripts for fine-tuning Meta Llama 3 with composable FSDP & PEFT methods, covering single- and multi-node GPU setups. Supports default and custom datasets for applications such as summarization and Q&A, and a number of candidate inference solutions such as HF TGI and vLLM for local or cloud deployment. Includes demo apps showcasing Meta Llama 3 for WhatsApp & Messenger. (Jupyter Notebook, updated May 3, 2024)
Run any open-source LLMs, such as Llama 2 and Mistral, as an OpenAI-compatible API endpoint in the cloud. (Python, updated Apr 30, 2024)
Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading. (Python, updated Apr 29, 2024)
Finetune Llama 3, Mistral & Gemma LLMs 2-5x faster with 80% less memory. (Python, updated May 3, 2024)
Chinese LLaMA-2 & Alpaca-2 large models (phase-two project), plus 64K ultra-long-context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models). (Python, updated Apr 30, 2024)
Firefly: a training tool for large models, supporting training of Llama 3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama 2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large language models. (Python, updated Apr 28, 2024)
A series of large language models developed by Baichuan Intelligent Technology. (Python, updated Feb 26, 2024)
Fault-tolerant, highly scalable GPU orchestration and a machine learning framework designed for training models with billions to trillions of parameters. (Jupyter Notebook, updated Mar 23, 2024)
YaYi large language models: secure and reliable proprietary LLMs built for customers, based on the LLaMA 2 & BLOOM model families trained on large-scale Chinese and English multi-domain instruction data, developed by the Zhongke Wenge algorithm team. (Repo for YaYi Chinese LLMs based on LLaMA 2 & BLOOM.) (Python, updated Jan 17, 2024)
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need: with it, you're empowered to run inference with any open-source language, speech-recognition, or multimodal model, whether in the cloud, on-premises, or even on your laptop. (Python, updated May 3, 2024)
OpenCompass is an LLM evaluation platform supporting a wide range of models (Llama 3, Mistral, InternLM2, GPT-4, LLaMA 2, Qwen, GLM, Claude, etc.) over 100+ datasets. (Python, updated Apr 30, 2024)