# Building Chat Interfaces

Learn how to create interactive chat applications with Lumia AI.
## Step 1: Setup

Set up your development environment. First, install the required dependencies:

```bash
npm install @ai-sdk/lumia react @types/react
```

Then create a new component file for your chat interface:

```bash
touch components/ChatInterface.tsx
```
## Step 2: Basic Structure

Create the chat interface component. Set up the basic structure with state management:
```tsx
import { useState } from 'react';
import { Lumia } from '@ai-sdk/lumia';

export default function ChatInterface() {
  // Each message carries a role ('user' or 'assistant') and its text.
  const [messages, setMessages] = useState<{ role: string; content: string }[]>([]);
  const [input, setInput] = useState('');

  return (
    <div className="flex flex-col h-[600px]">
      <div className="flex-1 overflow-y-auto">
        {/* Messages will go here */}
      </div>
      <div className="p-4">
        {/* Input form will go here */}
      </div>
    </div>
  );
}
```
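Before wiring up behavior, it can help to pin down the shape of the objects held in `messages`. Here is a minimal sketch; the `Role`, `ChatMessage`, and `makeMessage` names are our own illustrations, not part of the Lumia SDK:

```typescript
// Illustrative types for the chat state (not part of the Lumia SDK).
type Role = 'user' | 'assistant';

interface ChatMessage {
  role: Role;
  content: string;
}

// Small factory so every message entering state has the same shape.
function makeMessage(role: Role, content: string): ChatMessage {
  return { role, content };
}
```

Typing the state up front means TypeScript can catch a misspelled role or a missing `content` field at compile time rather than at render time.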
## Step 3: Message Handling

Implement message display and input handling. Add a message component and handle user input:
```tsx
function Message({ role, content }: { role: string; content: string }) {
  return (
    <div className={`flex ${role === 'user' ? 'justify-end' : 'justify-start'}`}>
      <div className={`rounded-2xl px-4 py-2 ${
        role === 'user' ? 'bg-blue-100' : 'bg-gray-100'
      }`}>
        {content}
      </div>
    </div>
  );
}
```
```tsx
// Add to your component. The handler is async because the AI call
// added in Step 4 is awaited inside it:
const handleSubmit = async (e: React.FormEvent) => {
  e.preventDefault();
  if (!input.trim()) return;
  setMessages(prev => [...prev, { role: 'user', content: input }]);
  setInput('');
  // We'll add AI response handling next
};
```
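The state update inside the submit handler can be factored into a pure function, which makes the trim-and-append behavior easy to unit-test on its own. This helper is a sketch with names of our choosing, not an SDK API:

```typescript
interface ChatMessage {
  role: 'user' | 'assistant';
  content: string;
}

// Pure version of the submit-time update: ignores blank input and
// returns a new array instead of mutating the previous one.
function appendUserMessage(messages: ChatMessage[], input: string): ChatMessage[] {
  if (!input.trim()) return messages;
  return [...messages, { role: 'user', content: input }];
}
```

Inside the component you would call it as `setMessages(prev => appendUserMessage(prev, input))`.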
## Step 4: AI Integration

Connect to Lumia AI and stream the response into the chat:
```tsx
import { streamText } from '@ai-sdk/lumia';

// Add to your handleSubmit function, after the user message is stored.
// First append an empty assistant message for the stream to fill:
setMessages(prev => [...prev, { role: 'assistant', content: '' }]);

await streamText({
  model: Lumia('Lumia-V2-Pro'),
  prompt: input,
  system: 'You are a helpful assistant.',
  onChunk: ({ chunk }) => {
    if (chunk.type === 'text-delta') {
      // Replace the last message immutably so React re-renders.
      setMessages(prev =>
        prev.map((m, i) =>
          i === prev.length - 1 ? { ...m, content: m.content + chunk.text } : m
        )
      );
    }
  }
});
```
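Streamed deltas should update the last message immutably rather than editing it in place, since React only re-renders when it sees a new array. The core of that update can be isolated as a pure function; `applyTextDelta` is an illustrative name, not part of the Lumia SDK:

```typescript
interface ChatMessage {
  role: 'user' | 'assistant';
  content: string;
}

// Appends a streamed text delta to the last (assistant) message,
// returning a fresh array and fresh message object so React detects
// the change. The original array and its messages are untouched.
function applyTextDelta(messages: ChatMessage[], delta: string): ChatMessage[] {
  if (messages.length === 0) return messages;
  return messages.map((m, i) =>
    i === messages.length - 1 ? { ...m, content: m.content + delta } : m
  );
}
```

In the `onChunk` callback this becomes `setMessages(prev => applyTextDelta(prev, chunk.text))`, keeping the streaming logic testable outside the component.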