Quickstart Guide
Start using the Ollama SDK in minutes with these examples
This guide walks through the most common use cases for the Ollama SDK so you can start building applications with Ollama right away.
Basic Text Generation
The most common use case for Ollama is generating text responses. Here’s how to do that with the SDK:
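The snippet below is a minimal sketch; it assumes the official Ollama Python client (`pip install ollama`) and a local Ollama server, and `llama3.2` is only an example model name, so adjust both to match your setup:

```python
import ollama

# Ask a model for a single, complete (non-streaming) response.
response = ollama.chat(
    model='llama3.2',  # example model; pull it first with `ollama pull llama3.2`
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)

# The reply text is in the message content.
print(response['message']['content'])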
Streaming Responses
For a better user experience with longer responses, you can stream the text as it’s generated:
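Continuing the same Python sketch, passing `stream=True` turns the call into an iterator of partial chunks that you can print as they arrive:

```python
import ollama

# Request a streamed response: the call returns an iterator of chunks.
stream = ollama.chat(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'Write a short poem about the ocean.'}],
    stream=True,
)

# Print each piece of text as it is generated instead of waiting for the full reply.
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
print()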
Creating Embeddings
Embeddings are vector representations of text that capture semantic meaning, useful for similarity search:
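A short sketch with the same client; `nomic-embed-text` is an example embedding model that you would need to pull first:

```python
import ollama

# Generate an embedding vector for a piece of text.
result = ollama.embeddings(
    model='nomic-embed-text',
    prompt='Ollama makes it easy to run language models locally.',
)

vector = result['embedding']     # list of floats
print(len(vector), vector[:5])   # dimensionality and a small preview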
Managing Models
You can list, pull, and get information about models:
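For illustration, the corresponding calls in the Python client look like this (pulling a model downloads it, which can take a while on first run):

```python
import ollama

# List the models already available on the local Ollama server.
for m in ollama.list()['models']:
    print(m['model'])

# Pull (download) a model by name.
ollama.pull('llama3.2')

# Show details (format, family, parameter size, quantization, ...) for a model.
info = ollama.show('llama3.2')
print(info['details'])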
Using the OpenAI Compatibility Layer
If you’re migrating from OpenAI’s API, you can use the compatibility layer:
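One way to do this, assuming the standard `openai` Python package and Ollama running on its default local port, is to point the client at the server's `/v1` endpoint; the API key is required by the client but ignored by Ollama:

```python
from openai import OpenAI

# Reuse existing OpenAI-style code by swapping the base URL.
client = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')

completion = client.chat.completions.create(
    model='llama3.2',
    messages=[{'role': 'user', 'content': 'Say hello in one sentence.'}],
)
print(completion.choices[0].message.content)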
Using Tool Calling
For models that support function calling, you can use the tool calling API:
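A sketch of the pattern with the Python client: describe the callable functions in JSON Schema form, pass them via `tools`, then inspect any `tool_calls` the model returns (`llama3.1` is an example of a tool-capable model, and `get_current_weather` is a hypothetical function your own code would implement):

```python
import ollama

# Describe the functions the model is allowed to call, in JSON Schema form.
tools = [{
    'type': 'function',
    'function': {
        'name': 'get_current_weather',
        'description': 'Get the current weather for a city',
        'parameters': {
            'type': 'object',
            'properties': {
                'city': {'type': 'string', 'description': 'Name of the city'},
            },
            'required': ['city'],
        },
    },
}]

response = ollama.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'What is the weather in Toronto?'}],
    tools=tools,
)

# If the model decided to call a tool, its name and arguments come back in the
# message; your application is responsible for actually running the function.
for call in response['message'].get('tool_calls') or []:
    print(call['function']['name'], call['function']['arguments'])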
CLI Usage
The SDK also includes a command-line interface for quick interactions:
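The SDK's own CLI commands aren't listed in this quickstart; as a rough stand-in, the core `ollama` CLI that ships with the server covers the same basics:

```bash
# Run a model and pass a prompt directly (or omit the prompt for an interactive session)
ollama run llama3.2 "Why is the sky blue?"

# Download a model and list what is installed locally
ollama pull llama3.2
ollama list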
Next Steps
Now that you’re familiar with the basics of the Ollama SDK, you can:
- Explore the complete API reference
- Learn about advanced usage patterns
- Check out the example projects
- Read about tool calling capabilities