Quick Start

This guide will help you set up Gambiarra and start sharing LLMs on your network.

First, install the CLI. It lets you start hubs, create rooms, and join rooms as a participant.

Via curl (recommended):

curl -fsSL https://raw.githubusercontent.com/arthurbm/gambiarra/main/scripts/install.sh | bash

Via npm:

npm install -g gambiarra

Via bun:

bun add -g gambiarra

Next, install the SDK. It provides Vercel AI SDK integration so you can use shared LLMs from your applications.

npm install gambiarra-sdk
# or
bun add gambiarra-sdk

With everything installed, start a hub. The --mdns flag advertises the hub over mDNS so other machines on your local network can discover it:

gambiarra serve --port 3000 --mdns

Next, create a room on the hub. The command prints a room code that participants use to join:

gambiarra create
# Output: Room created! Code: ABC123

Join the room as a participant, sharing a model served by a local runtime. The endpoint below is Ollama's default local API address:

gambiarra join ABC123 \
  --endpoint http://localhost:11434 \
  --model llama3 \
  --nickname joao

Finally, use the shared models from your application through the Vercel AI SDK:

import { createGambiarra } from "gambiarra-sdk";
import { generateText } from "ai";

// Connect to the room through the hub.
const gambiarra = createGambiarra({
  roomCode: "ABC123",
  hubUrl: "http://localhost:3000",
});

// Generate text with a model shared in the room.
const result = await generateText({
  model: gambiarra.any(),
  prompt: "Hello, Gambiarra!",
});

console.log(result.text);
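
Because gambiarra.any() is passed as a Vercel AI SDK model above, it should also plug into other AI SDK calls such as streamText. The following is a minimal sketch, assuming the same room and hub as above and that the handle behaves like any other AI SDK model; streamText and textStream come from the Vercel AI SDK ("ai" package), not from gambiarra-sdk itself.

import { createGambiarra } from "gambiarra-sdk";
import { streamText } from "ai";

const gambiarra = createGambiarra({
  roomCode: "ABC123",
  hubUrl: "http://localhost:3000",
});

// Stream tokens as they arrive instead of waiting for the full response.
const result = await streamText({
  model: gambiarra.any(),
  prompt: "Hello, Gambiarra!",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

Streaming is usually the better fit for chat-style UIs, since models shared over a local network can have noticeably variable latency.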