llamacpp
Here are 171 public repositories matching this topic...
AGiXT is a dynamic AI Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.
Updated Nov 24, 2023 - Python
Run LLaMA/GPT models easily and fast in C#! 🤗 It's also easy to integrate LLamaSharp with semantic-kernel, Unity, WPF and web apps.
Updated Nov 27, 2023 - C#
Believe in AI democratization. LLaMA for Node.js, backed by llama-rs, llama.cpp and rwkv.cpp; works locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
Updated Aug 3, 2023 - Rust
A FastAPI service for semantic text search using precomputed embeddings and advanced similarity measures, with built-in support for various file types through textract.
Updated Nov 22, 2023 - Python
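Semantic search over precomputed embeddings, as in the service above, typically reduces to ranking documents by cosine similarity between a query vector and the stored vectors. A minimal sketch of that idea (the embeddings below are made-up stand-ins for vectors from a real embedding model, not this service's actual data or API):

```python
import math

# Hypothetical precomputed document embeddings; stand-ins for vectors
# produced by a real embedding model, not this service's actual data.
DOC_EMBEDDINGS = [
    [0.9, 0.1, 0.0],  # doc 0
    [0.0, 1.0, 0.0],  # doc 1
    [0.7, 0.7, 0.0],  # doc 2
]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, docs, k=2):
    """Indices of the k documents most similar to the query vector."""
    ranked = sorted(range(len(docs)),
                    key=lambda i: cosine(query, docs[i]),
                    reverse=True)
    return ranked[:k]

print(top_k([1.0, 0.0, 0.0], DOC_EMBEDDINGS))  # doc 0 closest, then doc 2
```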
The TypeScript library for building multi-modal AI applications.
Updated Nov 27, 2023 - TypeScript
Calculate token/s & GPU memory requirement for any LLM. Supports llama.cpp/ggml/bnb/QLoRA quantization
Updated Nov 4, 2023 - JavaScript
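As a rough illustration of what such a calculator computes, the dominant term in GPU memory for inference is parameter count times bytes per weight under the chosen quantization, plus overhead for the KV cache, activations and buffers. A back-of-the-envelope sketch (the 20% overhead factor is an illustrative assumption, not the tool's actual formula):

```python
# Back-of-the-envelope GPU memory estimate for LLM inference.
# The overhead factor is an illustrative assumption (KV cache,
# activations, buffers), not the linked calculator's formula.
BITS_PER_WEIGHT = {"fp16": 16, "int8": 8, "q4": 4}

def estimate_gpu_mem_gb(n_params_billions: float,
                        quant: str,
                        overhead: float = 0.2) -> float:
    """Estimated GPU memory in GB for serving a quantized model."""
    bytes_per_weight = BITS_PER_WEIGHT[quant] / 8
    # 1e9 params at 1 byte/weight is roughly 1 GB of weights.
    weights_gb = n_params_billions * bytes_per_weight
    return round(weights_gb * (1 + overhead), 2)

# A 7B model in 4-bit quantization:
print(estimate_gpu_mem_gb(7, "q4"))  # ≈ 4.2 GB
```

This is why 4-bit quantization (as in llama.cpp's GGML formats) lets a 7B model fit comfortably on consumer GPUs where fp16 (roughly 16.8 GB by the same arithmetic) would not.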
A simple "Be My Eyes" web app with a llama.cpp/llava backend
Updated Nov 28, 2023 - JavaScript
The "vicuna-installation-guide" provides step-by-step instructions for installing and configuring the Vicuna 13B and 7B models.
Updated Oct 10, 2023
♾️ toolkit for air-gapped LLMs on consumer-grade hardware
Updated Oct 27, 2023 - Python
AI-powered cybersecurity chatbot designed to provide helpful and accurate answers to your cybersecurity-related queries, as well as code analysis and scan analysis.
Updated Sep 2, 2023 - Python
Run any Large Language Model behind a unified API
Updated Nov 13, 2023 - Python
🪶 Lightweight OpenAI drop-in replacement for Kubernetes
Updated Nov 28, 2023 - Python
A Discord Bot for chatting with LLaMA, Vicuna, Alpaca, MPT, or any other Large Language Model (LLM) supported by text-generation-webui or llama.cpp.
Updated Jun 4, 2023 - Python
Inference Vision Transformer (ViT) in plain C/C++ with ggml
Updated Nov 27, 2023 - C++

