llamacpp
Here are 308 public repositories matching this topic...
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
Updated Jun 2, 2024 - Python
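The "single line of code" swap works because servers like Xinference expose an OpenAI-compatible HTTP API. A minimal stdlib-only sketch of that idea is below; the host, port, and model name are assumptions to adjust for your own deployment, not values from this page.

```python
# Hedged sketch: building a request for a local, OpenAI-compatible
# /v1/chat/completions endpoint using only the standard library.
# The base_url and model name are assumptions -- substitute your own.
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:9997/v1",
                       model="llama-3-instruct"):
    """Build the same request an OpenAI SDK client would send,
    just pointed at a local server instead of api.openai.com."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a server running, sending is one more line:
# reply = json.load(urllib.request.urlopen(build_chat_request("Hello!")))
```

Because only the base URL differs from a hosted OpenAI call, the same application code can target cloud, on-premises, or laptop deployments.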
Private & local AI personal knowledge management app.
Updated Jun 2, 2024 - TypeScript
A FastAPI service for semantic text search using precomputed embeddings and advanced similarity measures, with built-in support for various file types through textract.
Updated Jun 2, 2024 - Python
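The core of semantic search over precomputed embeddings is ranking documents by a similarity measure between vectors, most commonly cosine similarity. A minimal sketch with toy vectors (a real service would get them from an embedding model):

```python
# Hedged sketch of embedding-based semantic search: rank documents by
# cosine similarity between a query vector and precomputed document
# vectors. Vectors here are toy 2-D values for illustration only.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(query_vec, doc_vecs, top_k=2):
    """Return (doc_index, score) pairs sorted by similarity, best first."""
    scored = [(i, cosine_similarity(query_vec, v)) for i, v in enumerate(doc_vecs)]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]

docs = [[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]]
print(search([1.0, 0.1], docs))  # document 0 ranks first
```

Wrapping `search` in a FastAPI route and loading real embeddings from disk is the obvious next step, but the ranking logic itself is no more than this.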
Drop-in, local AI alternative to the OpenAI stack. Multi-engine (llama.cpp, TensorRT-LLM). Powers 👋 Jan
Updated Jun 2, 2024 - C++
Unified framework for building enterprise RAG pipelines with small, specialized models
Updated Jun 2, 2024 - Python
Build LLM-enabled FastAPI applications without build configuration.
Updated Jun 2, 2024 - Python
Your AI second brain. Get answers to your questions, whether they are online or in your own notes. Use online AI models (e.g., GPT-4) or private, local LLMs (e.g., Llama 3). Self-host locally or use our cloud instance. Access from Obsidian, Emacs, the desktop app, the web, or WhatsApp.
Updated Jun 2, 2024 - Python
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), letting users chat with models, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output or function calling.
Updated Jun 2, 2024 - Python
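Getting structured output from models that were not fine-tuned for JSON usually comes down to prompting for JSON and then parsing and validating whatever comes back, tolerating surrounding prose. A minimal sketch of that general idea (not llama-cpp-agent's actual API):

```python
# Hedged sketch of the general technique (NOT llama-cpp-agent's API):
# extract the first JSON object from a model reply, parse it, and
# check that the required keys are present.
import json

def parse_structured_reply(reply_text, required_keys):
    """Extract a JSON object from a model reply and validate its keys."""
    start, end = reply_text.find("{"), reply_text.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object in reply")
    obj = json.loads(reply_text[start:end + 1])
    missing = [k for k in required_keys if k not in obj]
    if missing:
        raise ValueError(f"missing keys: {missing}")
    return obj

# Models not tuned for JSON often wrap output in prose; this still parses:
reply = 'Sure! Here is the data: {"name": "Alice", "age": 30} Hope that helps.'
print(parse_structured_reply(reply, ["name", "age"]))
```

A production framework adds retries, grammar-constrained sampling, or schema validation on top, but the parse-and-validate loop is the essential core.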
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
Updated Jun 2, 2024 - Dart
LLM Telegram bot
Updated Jun 2, 2024 - Python