Ship AI in Minutes

Airbolt is the easiest way to integrate AI into your app with zero backend code.

Currently in public beta.

Set up in minutes
No backend code
No credit card required

The Only Code You'll Write

Here's all the code you need to start using AI in your app:

import { ChatInterface } from '@airbolt/sdk/react';

function App() {
  return (
    <ChatInterface 
      projectId="proj_your_project_id"
    />
  );
}

That's It. Really.

Get your AI app running in a few simple steps. No complex infrastructure or backend code required.

01

Create an Airbolt project

Sign up, create a project, and add your provider API keys. No credit card required.

02

Set up the SDK

Install it with your favorite package manager and initialize it with just a few lines of code (a minimal sketch follows these steps).

03

Start building

Make your first API call in minutes. Adjust LLM settings in the dashboard with just a few clicks.
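
Concretely, steps 02 and 03 boil down to something like the sketch below. The install command and file path are illustrative (the package name matches the import path used above), and the project ID is a placeholder for the project you created in step 01.

// Install the SDK with your package manager of choice, e.g.:
//   npm install @airbolt/sdk

// src/main.tsx: mount the chat component at your app's entry point
import { createRoot } from 'react-dom/client';
import { ChatInterface } from '@airbolt/sdk/react';

createRoot(document.getElementById('root')!).render(
  <ChatInterface projectId="proj_your_project_id" />
);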

Built for Builders

Everything you need to integrate AI into your applications, without the complexity.

Zero Backend

Just add our SDK to any client app and start building. We handle all the backend complexity for you.

Secure by Default

Short-lived tokens and per-user rate limits built in. Zero configuration needed.

Bring Your Own LLM Keys

Your provider API keys stay encrypted on our servers with AES-256-GCM and are never exposed to the browser. Supports OpenAI, Anthropic, Google Gemini, and more.

Flexible Integration

Drop in our React chat component, or build custom UIs with hooks and JavaScript/TypeScript APIs. Streaming, state management, and error handling come enabled out of the box (a rough sketch of the hook-based approach follows below).

Coming soon: Bring your own auth • iOS/Android SDKs • Dynamic model routing • Tool use • And more
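
To give a feel for the custom-UI path under Flexible Integration, here is a rough hook-based sketch. The useChat hook, its options, and its return values are hypothetical stand-ins rather than the documented Airbolt API; check the docs for the real hook names and signatures.

// Hypothetical sketch of a custom chat UI built on an Airbolt hook.
// The hook name, options, and return shape are illustrative only.
import { useState } from 'react';
import { useChat } from '@airbolt/sdk/react'; // assumed export, verify in the docs

function CustomChat() {
  const [input, setInput] = useState('');
  // Assumed return shape: message list, a send function,
  // a streaming flag, and an error surface.
  const { messages, send, isStreaming, error } = useChat({
    projectId: 'proj_your_project_id', // placeholder
  });

  return (
    <div>
      {messages.map((m, i) => (
        <p key={i}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      {error && <p role="alert">{error.message}</p>}
      <form
        onSubmit={e => {
          e.preventDefault();
          send(input); // assumed to stream the assistant reply into `messages`
          setInput('');
        }}
      >
        <input value={input} onChange={e => setInput(e.target.value)} />
        <button type="submit" disabled={isStreaming || !input.trim()}>
          Send
        </button>
      </form>
    </div>
  );
}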

Skip The Boilerplate

Focus on building your app, not writing LLM boilerplate.

Without Airbolt

~200 lines
// components/Chat.tsx
import { useState } from 'react';

type Message = { id: string; role: 'user' | 'assistant'; content: string };

export default function Chat() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [input, setInput] = useState('');
  const [loading, setLoading] = useState(false);

  async function sendMessage(e: React.FormEvent) {
    e.preventDefault();
    if (!input.trim()) return;

    const userMessage: Message = {
      id: String(Date.now()),
      role: 'user',
      content: input,
    };

    setMessages(prev => [...prev, userMessage]);
    setInput('');
    setLoading(true);

    try {
      const res = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          messages: [...messages, userMessage].map(m => ({ role: m.role, content: m.content }))
        }),
      });

      if (!res.ok) throw new Error('Request failed');

      const data = await res.json();
      setMessages(prev => [
        ...prev,
        { id: String(Date.now() + 1), role: 'assistant', content: data.reply || '' },
      ]);
    } catch {
      setMessages(prev => [
        ...prev,
        { id: String(Date.now() + 2), role: 'assistant', content: 'Sorry, something went wrong.' },
      ]);
    } finally {
      setLoading(false);
    }
  }

  return (
    <div className="chat">
      <div className="messages">
        {messages.map(m => (
          <div key={m.id} className={m.role === 'user' ? 'user' : 'assistant'}>
            {m.content}
          </div>
        ))}
      </div>
      <form onSubmit={sendMessage} className="input">
        <input
          value={input}
          onChange={e => setInput(e.target.value)}
          placeholder="Ask something..."
          disabled={loading}
        />
        <button type="submit" disabled={loading || !input.trim()}>
          {loading ? 'Sending...' : 'Send'}
        </button>
      </form>
    </div>
  );
}

// pages/api/chat.ts
import type { NextApiRequest, NextApiResponse } from 'next';
import { initializeProvider } from './providers'; // Your provider logic

const provider = initializeProvider({
  apiKey: process.env.PROVIDER_API_KEY!
});

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') return res.status(405).end();

  try {
    const { messages } = req.body as { messages: { role: 'user' | 'assistant' | 'system'; content: string }[] };

    const completion = await provider.chat.completions.create({
      model: 'gpt-4o-mini', // or claude-3, gemini-pro, etc
      messages,
    });

    res.status(200).json({
      reply: completion.choices[0]?.message?.content ?? '',
    });
  } catch {
    res.status(500).json({ error: 'AI provider request failed' });
  }
}
State mgmt • API routes • Auth • Streaming • Errors • Deploy

With Airbolt

8 lines
import { ChatInterface } from '@airbolt/sdk/react';

function App() {
  return (
    <ChatInterface
      projectId="proj_your_project_id"
    />
  );
}
All features included automatically

Included Out of the Box

Secure API proxy
Encrypted key storage (BYOK)
Rate limiting
Streaming responses
Usage logging
Error handling & retries
React UI components
Multiple providers

Questions? We've Got Answers

How are my API keys handled?
You bring your own OpenAI API key. We store provider keys encrypted at rest and use them only to fulfill your requests.

Which models are supported?
Currently, we only support OpenAI models, but multi-provider support will be added in an upcoming release.

How much does Airbolt cost?
Using Airbolt is currently free. Bring your own OpenAI API keys and pay for token usage as you normally would.

Is Airbolt ready for production use?
Yes. Airbolt is actively developed and the SDK follows semantic versioning; we recommend locking your SDK version in production.

Ready to Ship AI-Powered Features?

Start building your app in minutes, or take a look at our docs first!