TypeScript SDK Overview

The Erdo TypeScript SDK enables you to integrate Erdo’s AI agents directly into your applications. Whether you’re building a Next.js app, a Node.js backend, or a React frontend, the SDK provides everything you need to invoke agents and render their results.

Packages

Package          Description
@erdoai/server   Server-side client for invoking Erdo agents
@erdoai/ui       React components for rendering agent results

Installation

npm install @erdoai/server @erdoai/ui

Quick Start

1. Create an API Token

Get your API token from the Erdo dashboard.

2. Invoke an Agent

import { ErdoClient } from '@erdoai/server';

const client = new ErdoClient({
  authToken: process.env.ERDO_AUTH_TOKEN,
});

// Invoke an agent and get the result
const result = await client.invoke('data-analyst', {
  messages: [{ role: 'user', content: 'What were our top products last month?' }],
});

console.log(result.messages);
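In practice you usually want the agent's final reply rather than the whole message array. A minimal helper for that might look like the sketch below; the `{ role, content }` message shape is taken from the invoke example above, and the helper name is illustrative, not part of the SDK.

```typescript
// Illustrative helper: pull the last assistant reply out of result.messages.
// Assumes the { role, content } message shape used in the invoke example.
interface Message {
  role: string;
  content: string;
}

function lastAssistantText(messages: Message[]): string | undefined {
  // Walk backwards so we find the most recent assistant message first.
  for (let i = messages.length - 1; i >= 0; i--) {
    if (messages[i].role === 'assistant') return messages[i].content;
  }
  return undefined;
}
```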

3. Stream Results

For real-time updates, use streaming:

for await (const event of client.invokeStream('data-analyst', {
  messages: [{ role: 'user', content: 'Analyze our sales data' }],
})) {
  switch (event.type) {
    case 'content':
      console.log('New content:', event.payload);
      break;
    case 'status':
      console.log('Status:', event.payload);
      break;
    case 'done':
      console.log('Complete!');
      break;
  }
}
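If you need the full response as well as incremental updates, you can fold the streamed events into an accumulated transcript. The sketch below assumes the `{ type, payload }` event shape from the example above, and that `content` payloads are appendable text; both are assumptions about the stream, not guarantees from the SDK.

```typescript
// Sketch: fold streamed events into a final transcript.
// Event shape ({ type, payload }) mirrors the streaming example above.
interface StreamEvent {
  type: 'content' | 'status' | 'done';
  payload?: string;
}

function accumulate(events: StreamEvent[]): { text: string; done: boolean } {
  let text = '';
  let done = false;
  for (const event of events) {
    // Append each content chunk; ignore status updates for the transcript.
    if (event.type === 'content' && event.payload) text += event.payload;
    if (event.type === 'done') done = true;
  }
  return { text, done };
}
```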

4. Render Results in React

Never expose your API key to the browser. Use one of two secure patterns: ephemeral tokens or proxy streaming.

Option A: Ephemeral Tokens (Recommended)

Your backend creates a short-lived, scoped token. The frontend uses it to stream directly from the Erdo API.
// app/api/authorize/route.ts (SERVER)
import { ErdoClient } from '@erdoai/server';

const client = new ErdoClient({ authToken: process.env.ERDO_AUTH_TOKEN });

export async function POST(request: Request) {
  const { botKey } = await request.json();
  // Add your own RBAC logic here
  const { token, tokenId, expiresAt } = await client.createToken({
    botKeys: [botKey],  // Bot keys (e.g., "my-org.data-analyst")
    expiresInSeconds: 3600,
  });
  return Response.json({ token, tokenId, expiresAt });
}
// app/page.tsx (CLIENT)
'use client';
import { useState } from 'react';
import { Content } from '@erdoai/ui';

function ChatInterface() {
  const [contents, setContents] = useState([]);

  const handleSubmit = async (query: string) => {
    // 1. Get ephemeral token from your backend
    const { token } = await fetch('/api/authorize', {
      method: 'POST',
      body: JSON.stringify({ botKey: 'data-analyst' }),
    }).then(r => r.json());

    // 2. Stream directly from Erdo API
    const response = await fetch('https://api.erdo.ai/bots/data-analyst/invoke', {
      method: 'POST',
      headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
      body: JSON.stringify({ messages: [{ role: 'user', content: query }] }),
    });

    // 3. Parse SSE stream and render results
    // See integration docs for full streaming example
  };

  return (
    <div>
      {contents.map((item, i) => <Content key={i} content={item} />)}
    </div>
  );
}
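Step 3 above is left to the integration docs, but the core of SSE parsing is small enough to sketch here. The function below splits a text buffer into complete `data:` events and returns any trailing partial event so it can be prepended to the next network chunk; the event JSON is assumed to match the `{ type, payload }` events from the streaming example.

```typescript
// Minimal SSE parsing sketch. Complete events are separated by a blank
// line ("\n\n"); the last block may be a partial event, so it is
// returned as `rest` to be prepended to the next chunk.
function parseSSEBuffer(buffer: string): { events: unknown[]; rest: string } {
  const events: unknown[] = [];
  const blocks = buffer.split('\n\n');
  const rest = blocks.pop() ?? ''; // last block may be incomplete
  for (const block of blocks) {
    for (const line of block.split('\n')) {
      if (line.startsWith('data: ')) {
        events.push(JSON.parse(line.slice('data: '.length)));
      }
    }
  }
  return { events, rest };
}
```

Each parsed event can then be appended to `contents` via `setContents` for rendering.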

Option B: Proxy Streaming

Your backend proxies the stream. The API key never leaves your server.
// app/api/invoke/route.ts (SERVER)
import { ErdoClient } from '@erdoai/server';

const client = new ErdoClient({ authToken: process.env.ERDO_AUTH_TOKEN });

export async function POST(request: Request) {
  const { botKey, messages } = await request.json();

  const stream = new ReadableStream({
    async start(controller) {
      try {
        for await (const event of client.invokeStream(botKey, { messages })) {
          controller.enqueue(new TextEncoder().encode(`data: ${JSON.stringify(event)}\n\n`));
        }
        controller.close();
      } catch (err) {
        // Surface upstream failures to the consumer instead of hanging the stream
        controller.error(err);
      }
    },
  });

  return new Response(stream, {
    headers: { 'Content-Type': 'text/event-stream' },
  });
}
See Integration Patterns for complete examples of both approaches.

Use Cases

Embed Data Analysis

Add AI-powered data analysis to your existing application:
// User asks a question in your app
const result = await client.invoke('data-analyst', {
  messages: [{ role: 'user', content: userQuestion }],
  datasets: ['sales-data', 'customer-data'],
});

// Render charts and insights

AI Tool in Chat Applications

Use Erdo as a tool alongside your LLM for data-intensive queries:
// When your LLM detects a data question, delegate to Erdo
if (isDataQuestion(userMessage)) {
  const erdoResult = await client.invoke('data-analyst', {
    messages: [{ role: 'user', content: userMessage }],
  });
  // Return rich visualizations to the user
}
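The `isDataQuestion` helper above is left to you. One naive, keyword-based sketch is shown below; the keyword list is purely illustrative, and a production router would more likely rely on your LLM's own tool- or function-calling to decide when to delegate.

```typescript
// Naive routing sketch: treat a message as a data question if it
// mentions any analytics-flavored keyword. Keyword list is illustrative.
const DATA_KEYWORDS = ['sales', 'revenue', 'report', 'average', 'trend', 'data'];

function isDataQuestion(message: string): boolean {
  const lower = message.toLowerCase();
  return DATA_KEYWORDS.some((kw) => lower.includes(kw));
}
```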

Automated Reporting

Generate reports programmatically:
const report = await client.invoke('data-analyst', {
  messages: [{ role: 'user', content: 'Generate a weekly sales report' }],
  parameters: {
    dateRange: 'last_7_days',
    format: 'detailed',
  },
});

Environment Variables

Configure the SDK using environment variables:
Variable          Description         Default
ERDO_AUTH_TOKEN   Your Erdo API key   Required
ERDO_ENDPOINT     API endpoint        https://api.erdo.ai
// Uses environment variables automatically
const client = new ErdoClient();
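The precedence implied by the table (explicit option, then environment variable, then default) can be sketched as below. This is an illustration of the resolution order, not the SDK's actual internals.

```typescript
// Hedged sketch of config resolution: explicit option wins over the
// environment variable, which wins over the documented default.
interface ErdoConfig {
  authToken?: string;
  endpoint: string;
}

function resolveConfig(
  options: { authToken?: string; endpoint?: string },
  env: Record<string, string | undefined>,
): ErdoConfig {
  return {
    authToken: options.authToken ?? env.ERDO_AUTH_TOKEN,
    endpoint: options.endpoint ?? env.ERDO_ENDPOINT ?? 'https://api.erdo.ai',
  };
}
```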

Next Steps