Text Chat

Send text messages to your AI character and handle streaming responses.

Sending Text

Use sendText() to send a message after connecting:

client.sendText('What is the weather like today?');

The method emits a text event over the WebSocket connection. The server processes the message through the AI pipeline and streams the response back via botResponse events.

info

sendText() throws an EstuaryError with code NOT_CONNECTED if the client is not connected. Always call connect() first.
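One way to handle the NOT_CONNECTED case is to wrap the call and report failure instead of throwing. This is a minimal sketch: the `TextClient` shape and the `trySendText` helper are illustrative, and the only assumption carried over from the docs is that the thrown error exposes a `code` field.

```typescript
// Sketch: guard sendText() against NOT_CONNECTED.
// `TextClient` and `trySendText` are illustrative names, not SDK exports.
interface TextClient {
  sendText(text: string): void;
}

function trySendText(client: TextClient, text: string): boolean {
  try {
    client.sendText(text);
    return true;
  } catch (err: any) {
    if (err?.code === 'NOT_CONNECTED') {
      // Not connected yet -- caller should connect() and retry.
      return false;
    }
    throw err; // unrelated errors still propagate
  }
}
```

A caller can then branch on the boolean instead of catching exceptions at every call site.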

Receiving Responses

Bot responses arrive as a stream of chunks. Each chunk is a BotResponse object:

client.on('botResponse', (response) => {
  if (response.isFinal) {
    // Full response is ready
    console.log('Complete response:', response.text);
  } else {
    // Partial chunk -- `partial` contains just this chunk's text
    process.stdout.write(response.partial);
  }
});

BotResponse Fields

| Field | Type | Description |
|---|---|---|
| `text` | string | The full accumulated response text so far |
| `partial` | string | The text content of just this chunk |
| `isFinal` | boolean | `true` when the response is complete |
| `messageId` | string | Unique identifier for this response |
| `chunkIndex` | number | Sequential index of this chunk (starts at 0) |
| `isInterjection` | boolean | `true` if this is a proactive message, not a reply to user input |
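The field table translates directly into a type. This interface is a sketch built from the fields above -- the names match the docs, but the interface itself is illustrative rather than the SDK's actual export.

```typescript
// Sketch of the BotResponse chunk shape, derived from the field table.
// Field names follow the docs; the interface is illustrative, not an SDK export.
interface BotResponse {
  text: string;            // full accumulated response so far
  partial: string;         // just this chunk's text
  isFinal: boolean;        // true on the last chunk
  messageId: string;       // unique per response
  chunkIndex: number;      // 0-based sequence position
  isInterjection: boolean; // true for unprompted messages
}

// Example first chunk: text and partial coincide at chunkIndex 0.
const chunk: BotResponse = {
  text: 'The weather',
  partial: 'The weather',
  isFinal: false,
  messageId: 'msg-001', // hypothetical id for illustration
  chunkIndex: 0,
  isInterjection: false,
};
```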

Streaming Pattern

A typical response arrives as multiple events:

botResponse { chunkIndex: 0, partial: "The weather", isFinal: false, text: "The weather" }
botResponse { chunkIndex: 1, partial: " today is", isFinal: false, text: "The weather today is" }
botResponse { chunkIndex: 2, partial: " sunny and", isFinal: false, text: "The weather today is sunny and" }
botResponse { chunkIndex: 3, partial: " warm.", isFinal: true, text: "The weather today is sunny and warm." }

The text field accumulates across chunks, so on isFinal: true it contains the full response.
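That accumulation rule can be stated as an invariant: at every chunk, `text` equals the concatenation of all `partial` values received so far. A small checker makes this concrete (the `Chunk` type and `checkAccumulation` helper are illustrative, not part of the SDK):

```typescript
// Sketch: verify the invariant that `text` accumulates the `partial` values.
// `Chunk` and `checkAccumulation` are illustrative names, not SDK exports.
interface Chunk {
  chunkIndex: number;
  partial: string;
  text: string;
  isFinal: boolean;
}

function checkAccumulation(chunks: Chunk[]): boolean {
  let acc = '';
  for (const c of chunks) {
    acc += c.partial;
    if (c.text !== acc) return false; // text must equal partials joined so far
  }
  return true;
}

// The four-chunk stream from the example above:
const stream: Chunk[] = [
  { chunkIndex: 0, partial: 'The weather', text: 'The weather', isFinal: false },
  { chunkIndex: 1, partial: ' today is', text: 'The weather today is', isFinal: false },
  { chunkIndex: 2, partial: ' sunny and', text: 'The weather today is sunny and', isFinal: false },
  { chunkIndex: 3, partial: ' warm.', text: 'The weather today is sunny and warm.', isFinal: true },
];
```

Because of this invariant, a UI can simply replace its displayed text with `response.text` on every chunk instead of appending `partial` values itself.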

Text-Only Mode

By default, sending text triggers both a text response (botResponse) and a voice response (botVoice). To suppress the voice response and receive text only, pass true as the second argument:

// Text response only -- no TTS audio generated
client.sendText('Give me a summary of our conversation.', true);

This is useful for UI-driven interactions where voice output is not needed, or to reduce latency and bandwidth when only the text matters.
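For request/response-style UI code, a text-only send pairs naturally with a Promise that resolves once the final chunk arrives. This is a sketch under two assumptions: the `on`/`sendText` surface matches the docs, and the client behaves as a standard Node event emitter (`off` is assumed available for cleanup); `askText` itself is an illustrative helper, not an SDK method.

```typescript
import { EventEmitter } from 'node:events';

// Sketch: send a text-only message and resolve with the full response text.
// `askText` is an illustrative helper; treating the client as an EventEmitter
// is an assumption for this example.
function askText(
  client: EventEmitter & { sendText(text: string, textOnly?: boolean): void },
  prompt: string,
): Promise<string> {
  return new Promise((resolve) => {
    const onResponse = (r: { text: string; isFinal: boolean }) => {
      if (r.isFinal) {
        client.off('botResponse', onResponse); // stop listening once complete
        resolve(r.text); // `text` holds the full accumulated response
      }
    };
    client.on('botResponse', onResponse);
    client.sendText(prompt, true); // true = text-only, no TTS audio
  });
}
```

Production code would likely also filter by `messageId` and skip chunks with `isInterjection: true`, so an unprompted message cannot resolve the wrong request.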

Interrupting a Response

You can interrupt an in-progress response with interrupt(). This tells the server to stop generating and clears any queued audio playback:

// Interrupt the current response
client.interrupt();

// Optionally specify which message to interrupt
client.interrupt(response.messageId);

Listen for the server's confirmation:

client.on('interrupt', (data) => {
  console.log('Response interrupted:', data.messageId);
});
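To interrupt a specific response, the client code needs the in-flight messageId, which only arrives on botResponse chunks. A small tracker can remember it. This sketch follows the `on`/`interrupt` surface from the docs; the `createInterruptTracker` helper and its behavior are illustrative assumptions.

```typescript
// Sketch: remember the most recent in-flight messageId so interrupt() can
// target it explicitly. `createInterruptTracker` is an illustrative helper.
function createInterruptTracker(client: {
  on(event: string, cb: (data: any) => void): void;
  interrupt(messageId?: string): void;
}) {
  let currentId: string | null = null;
  client.on('botResponse', (r) => {
    // Track the id while streaming; clear it once the response completes.
    currentId = r.isFinal ? null : r.messageId;
  });
  return {
    interruptCurrent() {
      if (currentId !== null) client.interrupt(currentId);
    },
  };
}
```

Wiring `interruptCurrent()` to a "stop" button then interrupts only when a response is actually streaming, and is a no-op otherwise.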

Interjections

Sometimes the character sends a message without being prompted -- for example, a greeting when you first connect, or a follow-up question. These are marked with isInterjection: true:

client.on('botResponse', (response) => {
  if (response.isInterjection && response.isFinal) {
    console.log('Character said (unprompted):', response.text);
  }
});

Example: Chat Loop

Here is a complete example of a text chat loop using Node.js readline:

import { EstuaryClient } from '@estuary-ai/sdk';
import * as readline from 'readline';

const client = new EstuaryClient({
  serverUrl: 'https://api.estuary-ai.com',
  apiKey: 'est_your_api_key',
  characterId: 'your-character-uuid',
  playerId: 'user-123',
});

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
  prompt: 'You: ',
});

client.on('botResponse', (response) => {
  if (response.isFinal) {
    console.log(`\nBot: ${response.text}\n`);
    rl.prompt();
  }
});

async function main() {
  await client.connect();
  console.log('Connected! Type a message and press Enter.\n');
  rl.prompt();

  rl.on('line', (line) => {
    const text = line.trim();
    if (text) {
      client.sendText(text);
    } else {
      rl.prompt();
    }
  });

  rl.on('close', () => {
    client.disconnect();
    process.exit(0);
  });
}

main().catch(console.error);

Next Steps