Airbolt SDK

Minimal SDK for adding AI chat to your app. Works with vanilla JavaScript, React hooks, or a pre-built React component.

Installation

npm install @airbolt/sdk

Setup

  • Sign up at airbolt.ai and create a project
  • Add your provider API keys in project settings (OpenAI, Anthropic)
  • Copy your project ID (starts with proj_)

Usage

Vanilla JavaScript/TypeScript

    import { Airbolt } from '@airbolt/sdk';

    const client = new Airbolt({
      projectId: 'proj_YOUR_PROJECT_ID'
    });

    // Basic chat
    const response = await client.chat({
      model: 'gpt-5-nano',
      messages: [{ role: 'user', content: 'Hello!' }]
    });
    console.log(response.choices[0].message.content);

    // Streaming
    const stream = await client.stream({
      model: 'gpt-5-nano',
      messages: [{ role: 'user', content: 'Tell me a story' }]
    });
    for await (const chunk of stream) {
      process.stdout.write(chunk.choices[0].delta.content || '');
    }

React Hook

    import { useChat } from '@airbolt/sdk/react';

    function ChatApp() {
      const {
        messages,
        input,
        handleInputChange,
        handleSubmit,
        isLoading
      } = useChat({
        projectId: 'proj_YOUR_PROJECT_ID'
      });

      return (
        <div>
          {messages.map((m, i) => (
            <div key={i}>{m.role}: {m.content}</div>
          ))}
          <form onSubmit={handleSubmit}>
            <input
              value={input}
              onChange={handleInputChange}
              disabled={isLoading}
            />
            <button type="submit">Send</button>
          </form>
        </div>
      );
    }

Pre-built Component

    import { ChatInterface } from '@airbolt/sdk/react';

    function App() {
      return (
        <ChatInterface
          projectId="proj_YOUR_PROJECT_ID"
          theme="light" // or "dark"
          height="600px"
        />
      );
    }

Configuration Options

Client Configuration

    const client = new Airbolt({
      projectId: 'proj_YOUR_PROJECT_ID', // Required
      endUserId: 'user_123',             // Optional: track end users
      debug: true                        // Optional: enable console logging
    });

React Hook Options

    const {
      messages,           // Message history
      input,              // Current input value
      isLoading,          // Loading state
      error,              // Error state
      handleInputChange,  // Input change handler
      handleSubmit,       // Form submit handler
      append,             // Append a message programmatically
      reload,             // Retry the last request
      stop                // Stop an in-progress stream
    } = useChat({
      projectId: 'proj_YOUR_PROJECT_ID',
      model: 'gpt-5-nano',       // Optional: default model
      onFinish: (message) => {}, // Optional: called when a response finishes
      onError: (error) => {}     // Optional: called on error
    });
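
The append and stop helpers can drive the chat programmatically. A minimal sketch, assuming append accepts a message object in the same { role, content } shape as the messages array and that stop aborts the in-flight response:

    // Sketch only: append/stop usage inferred from the listing above.
    function QuickActions() {
      const { append, stop, isLoading } = useChat({
        projectId: 'proj_YOUR_PROJECT_ID'
      });

      return (
        <div>
          <button
            onClick={() => append({ role: 'user', content: 'Summarize our conversation' })}
            disabled={isLoading}
          >
            Ask for a summary
          </button>
          <button onClick={stop} disabled={!isLoading}>
            Stop
          </button>
        </div>
      );
    }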

Component Props

    <ChatInterface
      projectId="proj_YOUR_PROJECT_ID" // Required
      model="gpt-5-nano"               // Optional: AI model
      temperature={0.7}                // Optional: creativity (0-2)
      maxTokens={1000}                 // Optional: response length
      theme="light"                    // Optional: "light" or "dark"
      height="600px"                   // Optional: component height
      placeholder="Type a message..."  // Optional: input placeholder
      onMessage={(msg) => {}}          // Optional: message callback
      onError={(err) => {}}            // Optional: error callback
    />

TypeScript

    import {
      Airbolt,
      AirboltConfig,
      ChatParams,
      ChatResponse,
      Message
    } from '@airbolt/sdk';

    const client: Airbolt = new Airbolt({
      projectId: 'proj_YOUR_PROJECT_ID'
    });

    const response: ChatResponse = await client.chat({
      model: 'gpt-5-nano',
      messages: [{ role: 'user', content: 'Hello!' }]
    });

Error Handling

    try {
      const response = await client.chat({ ... });
    } catch (error) {
      switch (error.code) {
        case 'AUTH_ERROR':       // Invalid project ID
        case 'RATE_LIMIT_ERROR': // Too many requests
        case 'VALIDATION_ERROR': // Invalid parameters
        case 'NETWORK_ERROR':    // Connection issues
      }
    }
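
One way to act on these codes is a simple retry on rate limits. This is a rough sketch: the single retry and the two-second delay are arbitrary choices, not SDK behavior.

    // Sketch only: retry once on rate limiting, rethrow everything else.
    async function chatWithRetry(params) {
      try {
        return await client.chat(params);
      } catch (error) {
        if (error.code === 'RATE_LIMIT_ERROR') {
          await new Promise((resolve) => setTimeout(resolve, 2000)); // arbitrary back-off
          return client.chat(params);
        }
        throw error;
      }
    }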

Models

Airbolt supports multiple AI providers. Use the provider/model format for clarity (see the example after the lists):

OpenAI Models

  • GPT-5 Family: openai/gpt-5, openai/gpt-5-mini, openai/gpt-5-nano
  • GPT-4.1 Family: openai/gpt-4.1, openai/gpt-4.1-mini, openai/gpt-4.1-nano
  • GPT-4 Models: openai/gpt-4o, openai/gpt-4o-mini, openai/gpt-4-turbo
  • Reasoning Models: openai/o3, openai/o3-mini, openai/o1, openai/o1-mini
  • GPT-3.5 Models: openai/gpt-3.5-turbo

Anthropic Models

  • Claude 4 Family: anthropic/claude-sonnet-4-5, anthropic/claude-opus-4-1, anthropic/claude-haiku-4-1
  • Claude 3.7 Family: anthropic/claude-3.7-sonnet, anthropic/claude-3.7-haiku
  • Claude 3.5 Family: anthropic/claude-3.5-sonnet, anthropic/claude-3.5-haiku

The legacy format (without the provider prefix) is supported for backward compatibility.
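
For example, either provider can be selected per request through the same chat call shown under Usage:

    // Fully qualified provider/model identifiers
    const openaiResponse = await client.chat({
      model: 'openai/gpt-5-nano',
      messages: [{ role: 'user', content: 'Hello!' }]
    });

    const anthropicResponse = await client.chat({
      model: 'anthropic/claude-3.5-haiku',
      messages: [{ role: 'user', content: 'Hello!' }]
    });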


SDK Version: 0.2.0-alpha.1