Prerequisites

Before installing the SDK, make sure you have:

  1. Node.js (v16.0.0 or later) installed
  2. Ollama installed and running on your machine or a remote server
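
Both prerequisites can be checked from a terminal. The exact version numbers will vary; each command should simply complete without an error:

node --version
ollama --version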

Installing the SDK

You can install the Tekimax SDK using your preferred package manager:

npm install tekimax-sdk
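
If you prefer yarn or pnpm, the equivalent commands are:

yarn add tekimax-sdk
pnpm add tekimax-sdk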

Verify Installation

To verify that the SDK has been installed correctly, you can create a simple test script:

// test-sdk.js
const { OllamaClient } = require('tekimax-sdk');

async function testConnection() {
  const client = new OllamaClient();
  
  try {
    const response = await client.models.list();
    console.log('Available models:', response.models.map(m => m.name));
    console.log('Tekimax SDK is connected successfully!');
  } catch (error) {
    console.error('Failed to connect to Ollama:', error.message);
    console.log('Make sure Ollama is running on your machine');
  }
}

testConnection();
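
If your project uses ES modules instead of CommonJS, the same check can be written with an import statement and top-level await (this assumes the package also exposes ESM-compatible exports):

// test-sdk.mjs
import { OllamaClient } from 'tekimax-sdk';

const client = new OllamaClient();

try {
  const response = await client.models.list();
  console.log('Available models:', response.models.map(m => m.name));
} catch (error) {
  console.error('Failed to connect to Ollama:', error.message);
}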

Run the script using Node.js:

node test-sdk.js

If Ollama is running and the SDK is installed correctly, you should see a list of available models.

Installing the CLI (Optional)

The SDK also comes with a command-line interface (CLI) that you can install globally:

npm install -g tekimax-sdk

Once installed, you can use the CLI to interact with LLMs directly from your terminal:

tekimax-sdk list

This command lists all available models in your Ollama installation.

Configuration

By default, the SDK connects to Ollama at http://localhost:11434. If you’re running Ollama on a different host or port, you can configure the SDK:

const { OllamaClient } = require('tekimax-sdk');

const client = new OllamaClient({
  baseUrl: 'http://your-ollama-host:11434' // Replace with your Ollama host
});
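
If the host differs between environments, a common pattern is to read it from an environment variable in your own code and fall back to the default. The OLLAMA_HOST variable name below is only an illustration; the SDK does not read it automatically:

const { OllamaClient } = require('tekimax-sdk');

// Use the host from the environment if set, otherwise the local default.
const client = new OllamaClient({
  baseUrl: process.env.OLLAMA_HOST || 'http://localhost:11434'
});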

Troubleshooting

If you encounter issues during installation or connection:

  1. Ollama not running: Ensure Ollama is installed and that the server is actually running
  2. Connection refused: Check that the Ollama server is reachable at the configured URL (see the quick check below)
  3. Version compatibility: Make sure your Ollama version is compatible with this SDK
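
To rule out the SDK itself, you can query Ollama's REST API directly. A request to the /api/tags endpoint (which Ollama uses to list local models) should return JSON when the server is up; adjust the host if yours differs:

curl http://localhost:11434/api/tags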

For more details, see the Troubleshooting Guide or open an issue on our GitHub repository.

Future Model Support

While the SDK currently connects to Ollama’s API, we are actively developing connectors for:

  • OpenAI models
  • Claude by Anthropic
  • Gemini by Google
  • Meta’s Llama models
  • Mistral AI
  • Cohere
  • Other leading LLM providers

These will be available in future releases. We are actively seeking partnerships and collaborations with AI providers to enhance the SDK’s capabilities for educational purposes.

Next Steps

Now that you’ve installed the Tekimax SDK, check out the Quick Start guide to learn how to use it in your applications.