# Ollama SDK Workshop Setup Guide
This document contains instructions for setting up your environment for the Ollama SDK workshop.
## Prerequisites
Before starting the workshop, ensure you have the following installed on your system:
- Node.js (v16+) - Download Node.js
  - Verify installation: `node --version`
- Ollama - Download Ollama
  - Verify installation: `ollama --version`
- Git - Download Git
  - Verify installation: `git --version`
- A code editor - VS Code recommended
- Terminal or command prompt
## Environment Setup
### 1. Clone the Workshop Repository
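The repository URL is not shown above; assuming a standard Git workflow, the step looks like this (the URL and directory name are placeholders — use the ones provided by your instructor):

```shell
# Clone the workshop repository (placeholder URL) and enter the
# project directory.
git clone <workshop-repo-url>
cd <workshop-repo-directory>
```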
### 2. Install Dependencies
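Assuming the workshop project is npm-based (it targets Node.js), dependencies install with:

```shell
# From the repository root, install the dependencies listed in
# package.json.
npm install
```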
### 3. Ensure Ollama is Running
Start the Ollama service:
- macOS/Linux: run `ollama serve` in a terminal
- Windows: Ollama should run automatically as a service after installation
Verify it's running by opening http://localhost:11434 in your browser. You should see a simple "Ollama is running" message.
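Alternatively, check from the terminal with `curl` (the fallback message here is just for illustration):

```shell
# The endpoint replies "Ollama is running" when the service is up;
# otherwise, print a hint instead of failing silently.
curl -s http://localhost:11434 || echo "Ollama is not reachable on port 11434"
```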
### 4. Download Required Models
For the workshop, we recommend having these models available:
Verify the models are available by running `ollama list`.
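The workshop's exact model list is not reproduced here; as an illustration, pulling one chat model and one embedding model (hypothetical choices — substitute whatever the workshop guide specifies) looks like this:

```shell
# Pull a chat model and an embedding model (illustrative names).
ollama pull llama3.2
ollama pull nomic-embed-text

# Confirm they appear in the local model list.
ollama list
```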
### 5. Install the Ollama SDK CLI
Verify installation:
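The package name is not given above; assuming the SDK CLI ships as an npm package, the install and verification steps would look like this (both names are placeholders — use the ones from the workshop materials):

```shell
# Install the SDK CLI globally via npm (placeholder package name).
npm install -g <ollama-sdk-package>

# Verify the installation by checking the reported version.
<ollama-sdk-cli> --version
```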
## Workshop Directory Structure
After setup, your workshop directory will look like this:
## Create Documents for Semantic Search
The semantic search example requires text documents. Create a sample documents directory:
Add some sample text files (copy some Wikipedia articles or other content):
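The steps above can be sketched as follows (the `documents/` directory name and file names are assumptions; any plain-text content works):

```shell
# Create a directory for the sample documents.
mkdir -p documents

# Add a few sample text files; Wikipedia-style prose gives the
# semantic search something meaningful to index.
cat > documents/ollama.txt <<'EOF'
Ollama is a tool for running large language models locally.
EOF

cat > documents/embeddings.txt <<'EOF'
Embeddings map text to vectors so that similar texts end up close together.
EOF
```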
## Verify Setup
Run the setup verification script:
You should see output confirming that:
- Ollama is running
- Required models are available
- The Ollama SDK is properly installed
- Example files are accessible
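The repository's actual verification script is not reproduced above; a minimal sketch of the checks it performs might look like this (the endpoint, directory name, and check order are assumptions):

```shell
#!/bin/sh
# Sketch of a setup-verification script (assumed checks, not the
# workshop's actual script).

# 1. Is Ollama running?
if curl -sf http://localhost:11434 >/dev/null; then
  echo "OK: Ollama is running"
else
  echo "FAIL: Ollama is not reachable on localhost:11434"
fi

# 2. Does the ollama CLI work and respond?
if command -v ollama >/dev/null && ollama list >/dev/null 2>&1; then
  echo "OK: ollama CLI can list models"
else
  echo "FAIL: ollama CLI missing or not responding"
fi

# 3. Are the sample documents in place?
if [ -d documents ]; then
  echo "OK: documents/ directory exists"
else
  echo "FAIL: documents/ directory not found"
fi
```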
## Troubleshooting

### Ollama not running
If Ollama isn’t running, try:
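A few common recovery steps (the platform commands below are assumptions; adjust for your OS):

```shell
# Check whether anything is already listening on Ollama's port.
lsof -i :11434

# On macOS/Linux, stop any stale process and restart the server.
# Note: `ollama serve` occupies the terminal, so run the rest of
# the workshop in a second terminal.
pkill ollama
ollama serve
```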
### Models not downloading
If you encounter issues pulling models:
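Two things worth checking (the model name is a placeholder; models are several gigabytes each, so disk space is a frequent culprit):

```shell
# Retry the pull; interrupted downloads typically resume.
ollama pull <model-name>

# Check free disk space where Ollama stores its models.
df -h ~/.ollama 2>/dev/null || df -h ~
```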
### SDK installation issues
If the SDK doesn’t install properly:
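Common npm fixes (the package name is a placeholder, as above):

```shell
# Clear npm's cache and retry the install.
npm cache clean --force
npm install -g <ollama-sdk-package>

# If global installs fail with permission errors, check where npm
# puts global packages and whether you can write there.
npm config get prefix
```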
### Network issues
If you’re behind a corporate firewall:
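You will likely need to route both npm and Ollama through the proxy (the proxy URL below is a placeholder):

```shell
# Point npm at your corporate proxy.
npm config set proxy http://proxy.example.com:8080
npm config set https-proxy http://proxy.example.com:8080

# Ollama respects the standard proxy environment variables.
export HTTPS_PROXY=http://proxy.example.com:8080
```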
## Next Steps
Once your environment is set up successfully, proceed to the Workshop Guide to begin the workshop.