📑 Docs • 🌐 Website • 🤝 Contribute • ✍🏽 Blogs
✨ If you would like to help spread the word about Rig, please consider starring the repo!
Warning
Here be dragons! We plan to ship a torrent of features in the coming months, so future updates will contain breaking changes. As Rig evolves, we'll annotate changes and highlight migration paths as we encounter them.
Rig is a Rust library for building scalable, modular, and ergonomic LLM-powered applications.
More information about this crate can be found in the official documentation and the crate API reference.
- Agentic workflows that can handle multi-turn streaming and prompting
- Full GenAI Semantic Convention compatibility
- 20+ model providers, all behind one unified interface
- 10+ vector store integrations, all behind one unified interface
- Full support for LLM completion and embedding workflows
- Support for transcription, audio generation and image generation model capabilities
- Integrate LLMs in your app with minimal boilerplate
- Full WASM compatibility (core library only)
Below is a non-exhaustive list of companies and people who are using Rig:
- St Jude - Using Rig for a chatbot utility as part of proteinpaint, a genomics visualisation tool.
- Coral Protocol - Using Rig extensively, both internally and as part of the Coral Rust SDK.
- VT Code - A Rust-based terminal coding agent with semantic code intelligence via Tree-sitter and ast-grep. VT Code uses rig to simplify LLM calls and implement its model picker.
- Con - A GPU-accelerated terminal emulator with a built-in AI agent harness. It uses Rig as the provider abstraction layer for its integrated coding agents.
- Dria - a decentralised AI network. Currently using Rig as part of their compute node.
- Nethermind - Using Rig as part of their Neural Interconnected Nodes Engine framework.
- Neon - Using Rig for their app.build V2 reboot in Rust.
- Listen - A framework aiming to become the go-to framework for AI portfolio management agents. Powers the Listen app.
- Cairnify - helps users find documents, links, and information instantly through an intelligent search bar. Rig provides the agentic foundation behind Cairnify’s AI search experience, enabling tool-calling, reasoning, and retrieval workflows.
- Ryzome - Ryzome is a visual AI workspace that lets you build interconnected canvases of thoughts, research, and AI agents to orchestrate complex knowledge work.
- deepwiki-rs - Turn code into clarity. Generate accurate technical docs and AI-ready context in minutes—perfectly structured for human teams and intelligent agents.
- Cortex Memory - The production-ready memory system for intelligent agents. A complete solution for memory management, from extraction and vector search to automated optimization, with a REST API, MCP, CLI, and insights dashboard out-of-the-box.
- Ironclaw - A secure personal AI assistant.
- ilert - Incident management & alerting platform. Uses Rig as the multi-provider abstraction in its agentic LLM proxy powering ilert AI.
For a full list, check out our ECOSYSTEM.md file.
Are you also using Rig? Open an issue to have your name added!
Use the root rig facade when you want feature-gated access to companion crates,
or use rig-core directly when you only need the core provider abstractions.
cargo add rig
# or: cargo add rig-core

use rig::client::{CompletionClient, ProviderClient};
use rig::completion::Prompt;
use rig::providers::openai;
#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
// Create OpenAI client
let client = openai::Client::from_env()?;
// Create agent with a single context prompt
let comedian_agent = client
.agent(openai::GPT_5_2)
.preamble("You are a comedian here to entertain the user using humour and jokes.")
.build();
// Prompt the agent and print the response
let response = comedian_agent.prompt("Entertain me!").await?;
println!("{response}");
Ok(())
}

Note: using #[tokio::main] requires enabling tokio's macros and rt-multi-thread features, or just full to enable all features (cargo add tokio --features macros,rt-multi-thread).
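For reference, here is a minimal Cargo.toml dependency section for the quickstart above. This is a sketch: the rig version is taken from the feature-table snippet elsewhere in this README, and anyhow is included only because the example's main returns anyhow::Error; adjust versions to whatever cargo add resolves for you.

```toml
# Hypothetical minimal manifest for the quickstart example
[dependencies]
rig = "0.36.0"   # or rig-core, if you only need the core provider abstractions
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
anyhow = "1"
```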
You can find more examples in each crate's examples directory. Many provider-specific examples now also live as ignored live integration tests under tests/providers, organized by provider. When running those provider-backed tests, prefer provider-specific targets such as cargo test -p rig --test openai -- --ignored --test-threads=1 to avoid rate-limiting. More detailed use case walkthroughs are regularly published on our Dev.to Blog and added to Rig's official documentation at docs.rig.rs.
The root rig facade exposes companion crates behind one feature per integration:
rig = { version = "0.36.0", features = ["lancedb", "fastembed"] }

| Integration | Crate | Feature | Module path |
|---|---|---|---|
| AWS Bedrock | rig-bedrock | bedrock | rig::bedrock |
| AWS S3Vectors | rig-s3vectors | s3vectors | rig::s3vectors |
| Cloudflare Vectorize | rig-vectorize | vectorize | rig::vectorize |
| FastEmbed | rig-fastembed | fastembed | rig::fastembed |
| Google Gemini gRPC | rig-gemini-grpc | gemini-grpc | rig::gemini_grpc |
| Google Vertex AI | rig-vertexai | vertexai | rig::vertexai |
| HelixDB | rig-helixdb | helixdb | rig::helixdb |
| LanceDB | rig-lancedb | lancedb | rig::lancedb |
| Memory policies | rig-memory | memory | rig::memory |
| Milvus | rig-milvus | milvus | rig::milvus |
| MongoDB | rig-mongodb | mongodb | rig::mongodb |
| Neo4j | rig-neo4j | neo4j | rig::neo4j |
| PostgreSQL | rig-postgres | postgres | rig::postgres |
| Qdrant | rig-qdrant | qdrant | rig::qdrant |
| ScyllaDB | rig-scylladb | scylladb | rig::scylladb |
| SQLite | rig-sqlite | sqlite | rig::sqlite |
| SurrealDB | rig-surrealdb | surrealdb | rig::surrealdb |
rig::memory is available without the memory feature; it contains the core
conversation memory traits and in-memory backend re-exported from rig-core.
Enabling features = ["memory"] adds reusable history-shaping policy types from
the rig-memory companion crate to the same module.
We also have some other associated crates that have additional functionality you may find helpful when using Rig:
- rig-onchain-kit - the Rig Onchain Kit, intended to make interactions between Solana/EVM and Rig much easier to implement.