Getting Started with Tekimax SDK
Learn how to use the TypeScript/JavaScript SDK for Large Language Models
Disclaimer: This SDK is not affiliated with or endorsed by any Large Language Model provider. The tekimax-sdk was created independently to support educational workshops and promote AI literacy. We are actively looking for partnerships and collaborations to make this SDK more robust for AI literacy initiatives.
The tekimax-sdk is a comprehensive TypeScript/JavaScript SDK for working with Large Language Models. Currently, it interfaces with the Ollama API for local model execution, and we plan to expand support to other providers such as Claude and Gemini. We created this SDK specifically to simplify LLM integration for workshops, tutorials, and educational purposes.
Why We Built This SDK
This SDK was designed with education in mind:
- Simplified LLM Integration: Makes working with large language models accessible for developers of all experience levels
- Workshop-Ready: Includes ready-to-use examples and exercises for LLM workshops
- Interactive Tutorials: Comes with interactive tutorials to help you learn LLM concepts
- Local First: Currently focuses on running models locally with Ollama for privacy and learning
- AI Literacy: Helps developers better understand LLM capabilities and limitations
- Provider Flexibility: Future-proofed to work with multiple LLM providers
Current API Support
Currently, the SDK interfaces with Ollama, which lets you run open-source large language models, such as Llama 2, locally on your machine. This gives you the power of AI models without sending your data to external APIs.
Future Support (Coming Soon)
We’re actively developing connectors for:
- OpenAI models
- Claude by Anthropic
- Gemini by Google
- Meta’s Llama models
- Mistral AI
- Cohere
- Other leading LLM providers
This will make the SDK even more versatile for educational environments.
Educational Purpose
This SDK is designed primarily for:
- Educational workshops on LLM technology
- Classroom instruction on AI capabilities
- Developer training and skill enhancement
- Research and experimentation with models
What can you do with this SDK?
With the Tekimax SDK, you can:
- Generate text responses from prompts
- Stream responses in real-time
- Create embeddings for semantic search
- Manage and interact with different models
- Use OpenAI-compatible interfaces with local models
- Leverage tool-calling capabilities (function calling)
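Under the hood, semantic search with embeddings comes down to comparing vectors, usually with cosine similarity. The sketch below is provider-independent and does not use the SDK's actual API; it only illustrates the math the embedding features build on:

```typescript
// Cosine similarity: the standard metric for ranking embedding vectors
// in semantic search. Returns 1 for identical directions, 0 for orthogonal.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank documents by similarity to a query embedding, highest first.
function rank(
  query: number[],
  docs: { id: string; embedding: number[] }[]
): { id: string; score: number }[] {
  return docs
    .map((d) => ({ id: d.id, score: cosineSimilarity(query, d.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```

In practice you would obtain the embeddings from a model (for example via Ollama's embeddings endpoint) and apply the same ranking.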
Included Workshops & Tutorials
This SDK includes:
- LLM Basics Workshop: A step-by-step introduction to working with language models
- Semantic Search Tutorial: Learn how to implement semantic search with embeddings
- Tool Calling Workshop: Explore how to extend LLM capabilities with external tools
- Interactive Chat Example: Build a chat application with streaming responses
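The tool-calling workshop covers a pattern worth previewing here: the model emits a structured call (typically JSON with a tool name and arguments), and your application dispatches it to a local function. The tool names and shapes below are illustrative, not the SDK's actual API:

```typescript
// Tool calling in a nutshell: the model produces a structured call,
// and the application routes it to a matching local function.
type ToolCall = { name: string; arguments: Record<string, unknown> };

// A registry of tools the model is allowed to invoke (names are examples).
const tools: Record<string, (args: any) => string> = {
  get_time: () => new Date().toISOString(),
  add: (args: { a: number; b: number }) => String(args.a + args.b),
};

// Dispatch a parsed tool call to its implementation.
function dispatch(call: ToolCall): string {
  const tool = tools[call.name];
  if (!tool) throw new Error(`Unknown tool: ${call.name}`);
  return tool(call.arguments);
}

// A model response such as {"name":"add","arguments":{"a":2,"b":3}}
// would be handled with: dispatch(JSON.parse(modelOutput))
```

The tool's return value is then sent back to the model so it can incorporate the result into its answer.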
Requirements
- Node.js: v16.0.0 or higher
- Ollama: Running locally or on a remote server (for Ollama API usage)
- TypeScript (optional): For type safety
Installation
You can install the SDK using npm, yarn, or pnpm:
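Assuming the package is published under the name used throughout this guide:

```shell
# Pick one, depending on your package manager
npm install tekimax-sdk

yarn add tekimax-sdk

pnpm add tekimax-sdk
```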
Quick Example
Here’s a simple example to get you started:
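Since the SDK wraps the Ollama API, the sketch below shows the equivalent raw call against Ollama's `/api/generate` endpoint. The SDK's own client surface may differ, so treat the function names here as illustrative:

```typescript
// Shape of a non-streaming request to Ollama's /api/generate endpoint.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Build the JSON payload Ollama expects for a one-shot generation.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

// Send the request to a running Ollama server and return the generated text.
async function generate(baseUrl: string, req: GenerateRequest): Promise<string> {
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  const data = (await res.json()) as { response: string };
  return data.response; // Ollama returns the generated text in `response`
}

// Usage (requires Ollama running locally with the model pulled):
// const text = await generate(
//   "http://localhost:11434",
//   buildGenerateRequest("llama2", "Explain embeddings in one sentence.")
// );
```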
Command & Query Interface
Our SDK features a simple, intuitive command and query interface:
- Commands: Execute actions (like pulling models)
- Queries: Request information (like generating text)
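The split can be sketched as follows. The types and stub behavior here are illustrative of the pattern, not the SDK's exact interfaces:

```typescript
// Commands mutate state (e.g. pull a model); queries return data
// (e.g. generate text) without side effects.
interface Command {
  kind: "pull";
  model: string;
}

interface Query {
  kind: "generate";
  model: string;
  prompt: string;
}

// Stand-in for local state; a real implementation would call Ollama.
const pulled = new Set<string>();

// Execute a command: here, "pull" a model (stubbed; Ollama exposes /api/pull).
function execute(cmd: Command): void {
  pulled.add(cmd.model);
}

// Run a query: here, "generate" text (stubbed; Ollama exposes /api/generate).
function runQuery(q: Query): string {
  if (!pulled.has(q.model)) throw new Error(`Model not pulled: ${q.model}`);
  return `response from ${q.model}`;
}
```

Separating the two keeps side-effecting operations (downloads, deletions) clearly distinct from read-only ones, which is helpful when teaching the API surface.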
Workshop Tutorial
Check out our included workshop documentation to get hands-on experience with LLMs:
- Workshop Setup - Set up your environment
- Workshop Guide - Follow our step-by-step guide
Next Steps
- Installation - Detailed installation instructions
- Quick Start - More examples to get you started quickly
- API Reference - Complete API documentation
- Guides - In-depth guides for specific use cases