OLLM

Learn how to use the OLLM API. Here you'll find a Postman Collection, tutorials, documentation, and code examples to get started.

Postman Collection

Import the Postman Collection to quickly experiment with the OLLM API. Includes a sample request showing how to use the OLLM gateway.

Tutorial & Documentation

Complete guide to using the OLLM API. Step-by-step tutorials and detailed API documentation.

Code Samples

JavaScript/TypeScript Example

A simple example of calling the OLLM API from JavaScript/TypeScript.

// OLLM API Example
async function callOLLM(prompt: string) {
  const response = await fetch('https://api.ollm.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer your-api-key'
    },
    body: JSON.stringify({
      model: 'near/GLM-4.6',
      messages: [
        {
          role: 'user',
          content: prompt
        }
      ]
    })
  });

  // Surface HTTP errors instead of failing later on a malformed body
  if (!response.ok) {
    throw new Error(`OLLM API error: ${response.status}`);
  }

  const data = await response.json();
  return data.choices[0].message.content;
}

// Usage
callOLLM('Why is the sky blue?')
  .then(response => console.log(response))
  .catch(error => console.error('Error:', error));
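For long responses you may prefer to stream tokens as they arrive rather than wait for the full completion. A minimal sketch, assuming the OLLM gateway is OpenAI-compatible (as the Python sample's use of the OpenAI client suggests) and therefore accepts `stream: true` and replies with server-sent events; `parseSSEChunk` and `streamOLLM` are illustrative names, not part of the official API:

```typescript
// Extract content tokens from a chunk of SSE text (pure, testable).
// Each complete frame looks like: data: {"choices":[{"delta":{"content":"…"}}]}
function parseSSEChunk(chunk: string): string[] {
  const tokens: string[] = [];
  for (const line of chunk.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === '[DONE]') continue; // end-of-stream sentinel
    try {
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) tokens.push(delta);
    } catch {
      // Ignore frames that are not valid JSON (e.g. split mid-chunk)
    }
  }
  return tokens;
}

// Stream a completion, invoking onToken for each content delta.
// Assumes a runtime with global fetch and web streams (Node 18+, browsers).
async function streamOLLM(prompt: string, onToken: (t: string) => void) {
  const response = await fetch('https://api.ollm.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer your-api-key'
    },
    body: JSON.stringify({
      model: 'near/GLM-4.6',
      stream: true, // assumption: OpenAI-compatible streaming
      messages: [{ role: 'user', content: prompt }]
    })
  });
  if (!response.ok || !response.body) {
    throw new Error(`OLLM API error: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Keep the last (possibly partial) line in the buffer
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';
    for (const token of parseSSEChunk(lines.join('\n'))) onToken(token);
  }
}

// Usage
streamOLLM('Why is the sky blue?', token => process.stdout.write(token))
  .catch(error => console.error('Error:', error));
```

The parser is kept separate from the network loop so the SSE handling can be unit-tested without hitting the API.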

More samples, guides & community projects

Python Example

A Python example that points the official OpenAI client at the OLLM gateway.

# OLLM API Example
from openai import OpenAI

client = OpenAI(
    base_url="https://api.ollm.com/v1",
    api_key="your-api-key"
)

response = client.chat.completions.create(
    model="near/GLM-4.6",
    messages=[{"role": "user", "content": "Why is the sky blue?"}]
)

print(response.choices[0].message.content)


Rust Example

A Rust example for integrating the OLLM API, using reqwest's blocking client.

// OLLM API Example (Rust)
// Requires the reqwest crate with the "blocking" and "json" features,
// plus serde_json.
use reqwest::blocking::Client;
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let res: serde_json::Value = Client::new()
        .post("https://api.ollm.com/v1/chat/completions")
        .bearer_auth("your-api-key")
        .json(&json!({
            "model": "near/GLM-4.6",
            "messages": [{
                "role": "user",
                "content": "Why is the sky blue?"
            }]
        }))
        .send()?
        .json()?;

    // as_str() avoids printing the JSON value's surrounding quotes
    println!("{}", res["choices"][0]["message"]["content"].as_str().unwrap_or(""));
    Ok(())
}
