Disclaimer: This SDK is not affiliated with or endorsed by any Large Language Model provider. The tekimax-sdk was created independently to support educational workshops and promote AI literacy. We are actively looking for partnerships and collaborations to make this SDK more robust for AI literacy initiatives.

The tekimax-sdk is a comprehensive TypeScript/JavaScript SDK for working with Large Language Models. Currently, it interfaces with the Ollama API for local model execution, but we plan to expand support to other providers such as Claude and Gemini. We created this SDK specifically to simplify LLM integration for workshops, tutorials, and educational purposes.

Why We Built This SDK

This SDK was designed with education in mind:

  • Simplified LLM Integration: Makes working with large language models accessible to developers of all experience levels
  • Workshop-Ready: Includes ready-to-use examples and exercises for LLM workshops
  • Interactive Tutorials: Comes with interactive tutorials to help you learn LLM concepts
  • Local First: Currently focuses on running models locally with Ollama for privacy and learning
  • AI Literacy: Helps developers better understand LLM capabilities and limitations
  • Provider Flexibility: Future-proofed to work with multiple LLM providers

Current API Support

Currently, the SDK interfaces with Ollama, which lets you run open-source large language models, such as Llama 2, locally on your machine. This gives you the power of AI models without sending your data to external APIs.

Future Support (Coming Soon)

We’re actively developing connectors for:

  • OpenAI models
  • Claude by Anthropic
  • Gemini by Google
  • Meta’s Llama models
  • Mistral AI
  • Cohere
  • Other leading LLM providers

This will make the SDK even more versatile for educational environments.

Educational Purpose

This SDK is designed primarily for:

  • Educational workshops on LLM technology
  • Classroom instruction on AI capabilities
  • Developer training and skill enhancement
  • Research and experimentation with models

What can you do with this SDK?

With the Tekimax SDK, you can:

  • Generate text responses from prompts
  • Stream responses in real-time
  • Create embeddings for semantic search
  • Manage and interact with different models
  • Use OpenAI-compatible interfaces with local models
  • Leverage tool-calling capabilities (function calling)
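Embeddings produced by the SDK are plain arrays of numbers, so semantic search reduces to ranking documents by vector similarity. Below is a minimal, SDK-independent cosine-similarity helper in TypeScript; how you obtain the embeddings from the SDK is left out here, since the exact embedding method name isn't shown above.

```typescript
// Cosine similarity between two embedding vectors.
// Returns a value in [-1, 1]; higher means more semantically similar.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error('Embedding dimensions must match');
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  if (normA === 0 || normB === 0) return 0;
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank candidate documents against a query embedding (highest score first).
function rankBySimilarity(
  query: number[],
  docs: { id: string; embedding: number[] }[],
): { id: string; score: number }[] {
  return docs
    .map((d) => ({ id: d.id, score: cosineSimilarity(query, d.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```

In a workshop setting, the embedding vectors would come from an embedding model (for example one served by Ollama) and then be fed into rankBySimilarity to order search results.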

Included Workshops & Tutorials

This SDK includes:

  1. LLM Basics Workshop: A step-by-step introduction to working with language models
  2. Semantic Search Tutorial: Learn how to implement semantic search with embeddings
  3. Tool Calling Workshop: Explore how to extend LLM capabilities with external tools
  4. Interactive Chat Example: Build a chat application with streaming responses
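Tool calling (item 3 above) generally works by describing a function to the model in a JSON-schema-style format and then executing whichever function the model selects. The sketch below uses the widely adopted function-calling field names (name, description, parameters); the exact format the tekimax-sdk expects may differ, so treat this as an illustrative shape, not the SDK's confirmed API.

```typescript
// A tool definition in the common JSON-schema-style function-calling shape.
// Field names follow the widespread convention; the SDK's format may differ.
interface ToolDefinition {
  name: string;
  description: string;
  parameters: {
    type: 'object';
    properties: Record<string, { type: string; description: string }>;
    required: string[];
  };
}

const getWeatherTool: ToolDefinition = {
  name: 'get_weather',
  description: 'Get the current weather for a city',
  parameters: {
    type: 'object',
    properties: {
      city: { type: 'string', description: 'City name, e.g. "Austin"' },
    },
    required: ['city'],
  },
};

// Map tool names to local implementations, and dispatch a call the
// model requested. The weather lookup here is a stand-in for a real one.
const toolImplementations: Record<string, (args: { city: string }) => string> = {
  get_weather: ({ city }) => `Sunny in ${city}`, // hypothetical stub
};

function runToolCall(name: string, args: { city: string }): string {
  const impl = toolImplementations[name];
  if (!impl) throw new Error(`Unknown tool: ${name}`);
  return impl(args);
}
```

The workshop's job is then to wire this dispatch step into the chat loop: pass the tool definitions with the request, detect when the model asks for a tool, run it, and return the result to the model.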

Requirements

  • Node.js: v16.0.0 or higher
  • Ollama: Running locally or on a remote server (for Ollama API usage)
  • TypeScript (optional): For type safety

Installation

You can install the SDK using npm, yarn, or pnpm:

# Using npm
npm install tekimax-sdk

# Using yarn
yarn add tekimax-sdk

# Using pnpm
pnpm add tekimax-sdk

Quick Example

Here’s a simple example to get you started:

import { OllamaClient } from 'tekimax-sdk';

async function main() {
  // Initialize the client (currently using Ollama API)
  const ollama = new OllamaClient({
    baseUrl: 'http://localhost:11434' // Default Ollama server URL
  });
  
  // Generate text with a model
  const response = await ollama.generate({
    model: 'llama2',
    prompt: 'Explain quantum computing in simple terms',
    temperature: 0.7
  });
  
  console.log('Generated response:');
  console.log(response.response);
}

main().catch(console.error);

Command & Query Interface

Our SDK features a simple, intuitive command and query interface:

  • Commands: Execute actions (like pulling models)

    // Pull a model
    await ollama.pullModel({ name: 'llama2' });
    
  • Queries: Request information (like generating text)

    // Generate text (query)
    const response = await ollama.generate({ 
      model: 'llama2',
      prompt: 'What is AI?'
    });
    

Workshop Tutorial

Check out our included workshop documentation to get hands-on experience with LLMs.

Next Steps