OLLM Documentation
A growing set of documentation pages. Start with the request/response contract, then explore additional topics as they are added.
Authentication
How to authenticate requests to the OLLM API
Getting Started
Quickstart guide for using the OLLM API
Response Guide
An interactive walkthrough of request/response anatomy, with hotspots showing where the output lives and how to handle errors.
Vercel AI SDK Integration
How to integrate OLLM with the Vercel AI SDK for access to high-security, zero-knowledge LLM providers.
Want to contribute?
Help improve the OLLM documentation. Edit pages directly on GitHub or add new ones.