# SDK Reference
The Gambiarra SDK provides Vercel AI SDK integration for using shared LLMs in your applications.
## Installation

```sh
npm install gambiarra-sdk
# or
bun add gambiarra-sdk
```

## Basic Usage
```ts
import { createGambiarra } from "gambiarra-sdk";
import { generateText } from "ai";

const gambiarra = createGambiarra({
  roomCode: "ABC123",
  hubUrl: "http://localhost:3000",
});

const result = await generateText({
  model: gambiarra.any(),
  prompt: "Hello, Gambiarra!",
});
```

## Configuration
### createGambiarra(options)

Creates a Gambiarra provider instance.
Options:
| Option | Type | Description | Required |
|---|---|---|---|
| `roomCode` | `string` | Room code to connect to | Yes |
| `hubUrl` | `string` | Hub URL | No (auto-discover) |
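The options in the table above can be sketched as a TypeScript shape. This is an illustrative guess inferred from the table, not the SDK's actual exported type, which may differ:

```typescript
// Hypothetical shape of the createGambiarra options, inferred from the
// table above -- the real exported type in gambiarra-sdk may differ.
interface GambiarraOptions {
  /** Room code to connect to (required). */
  roomCode: string;
  /** Hub URL; omit to let the SDK auto-discover the hub. */
  hubUrl?: string;
}

// Both forms satisfy the shape: explicit hub, or auto-discovery.
const explicit: GambiarraOptions = {
  roomCode: "ABC123",
  hubUrl: "http://localhost:3000",
};
const autoDiscover: GambiarraOptions = { roomCode: "ABC123" };
```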
## Model Routing

### gambiarra.any()

Route to any available participant.
```ts
const result = await generateText({
  model: gambiarra.any(),
  prompt: "Explain quantum computing",
});
```

### gambiarra.participant(id)

Route to a specific participant by nickname.
```ts
const result = await generateText({
  model: gambiarra.participant("joao"),
  prompt: "Write a haiku about TypeScript",
});
```

### gambiarra.model(name)

Route to a participant with a specific model.
```ts
const result = await generateText({
  model: gambiarra.model("llama3"),
  prompt: "What is the meaning of life?",
});
```

## Streaming

```ts
import { streamText } from "ai";

const stream = await streamText({
  model: gambiarra.model("llama3"),
  prompt: "Write a story about a robot",
});

for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}
```
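If you want the final result as one string rather than printing chunks as they arrive, the loop above can be wrapped in a small accumulator. The helper below is plain TypeScript, independent of the SDK; it assumes only that `stream.textStream` is an async iterable of string chunks, as used above:

```typescript
// Plain-TypeScript helper (not part of gambiarra-sdk): collect an async
// iterable of string chunks into a single string.
async function collectText(chunks: AsyncIterable<string>): Promise<string> {
  let out = "";
  for await (const chunk of chunks) {
    out += chunk;
  }
  return out;
}

// Hypothetical usage with the stream above:
// const fullStory = await collectText(stream.textStream);
```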
## Advanced Options

```ts
const result = await generateText({
  model: gambiarra.any(),
  prompt: "Explain recursion",
  temperature: 0.7,
  maxTokens: 500,
});
```
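Requests routed through a shared room can reject (for example, if the chosen participant becomes unavailable mid-request). A generic fallback wrapper, written in plain TypeScript and not part of gambiarra-sdk, is one way to degrade from a specific participant to any available one:

```typescript
// Generic fallback helper -- plain TypeScript, not part of gambiarra-sdk.
// Runs the primary call; if it rejects, runs the fallback instead.
async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>,
): Promise<T> {
  try {
    return await primary();
  } catch {
    return await fallback();
  }
}

// Hypothetical usage: prefer one participant, fall back to anyone.
// const { text } = await withFallback(
//   () => generateText({ model: gambiarra.participant("joao"), prompt }),
//   () => generateText({ model: gambiarra.any(), prompt }),
// );
```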