# Getting Started with Lumia AI

Learn how to set up and start using Lumia AI in your projects.
## Step 1: Installation

Set up Lumia AI in your project by installing the SDK with npm:

```bash
npm install @ai-sdk/Lumia
```

Or using yarn:

```bash
yarn add @ai-sdk/Lumia
```
## Step 2: Configuration

Set up your environment and API key. First, create a configuration file:

```ts
// config/Lumia.ts
import { LumiaConfig } from '@ai-sdk/Lumia';

export const config: LumiaConfig = {
  apiKey: process.env.Lumia_API_KEY,
  defaultModel: 'Lumia-V2-Pro',
  // Optional: configure default generation parameters
  defaultParams: {
    temperature: 0.7,
    maxTokens: 1000
  }
};
```
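A detail worth knowing about `defaultParams`: per-call parameters conventionally override configured defaults. A minimal, self-contained sketch of that precedence (the `mergeParams` helper is illustrative and not part of the SDK; it only mirrors the parameter shape above):

```typescript
interface GenerationParams {
  temperature?: number;
  maxTokens?: number;
}

// Illustrative helper: with object spread, later spreads win,
// so per-call overrides take precedence over configured defaults.
function mergeParams(
  defaults: GenerationParams,
  overrides: GenerationParams
): GenerationParams {
  return { ...defaults, ...overrides };
}
```

So a call that sets only `temperature` keeps the configured `maxTokens` while replacing the default temperature.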
Then set up your environment variables with your API key:
```bash
# .env.local
Lumia_API_KEY=your_api_key_here
```
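Because the configuration reads the key from `process.env`, a missing variable only surfaces when the first request fails. As a sketch, you can fail fast at startup instead (the `requireEnv` helper is illustrative, not part of the SDK):

```typescript
// Illustrative helper: read a required environment variable or throw.
// Failing at startup gives a clearer error than an unauthenticated
// request failing later.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === '') {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: resolve the key once at startup.
// const apiKey = requireEnv('Lumia_API_KEY');
```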
## Step 3: Basic Usage

Create your first AI interaction with a simple text-generation example:

```ts
import { generateText } from '@ai-sdk/Lumia';

async function generateResponse() {
  try {
    const response = await generateText({
      prompt: 'Tell me about artificial intelligence',
      system: 'You are a helpful AI assistant.'
    });
    console.log(response.text);
  } catch (error) {
    console.error('Error:', error);
  }
}

generateResponse();
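Network calls like this can fail transiently, which is why the example wraps the request in `try`/`catch`. A common next step is a small retry wrapper; here is a self-contained sketch (the `withRetry` helper is illustrative and not part of the SDK):

```typescript
// Illustrative retry helper: re-invoke an async operation up to
// `attempts` times, rethrowing the last error if every attempt fails.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
    }
  }
  throw lastError;
}

// Usage with the example above (sketch):
// const text = await withRetry(() => generateText({ prompt: '...' }));
```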
## Step 4: Project Structure

Organize your Lumia AI project. A recommended layout:

```text
your-project/
├── src/
│   ├── config/
│   │   └── Lumia.ts
│   ├── services/
│   │   └── ai.service.ts
│   └── components/
│       └── AI/
│           ├── TextGeneration.tsx
│           └── ChatInterface.tsx
├── .env.local
└── package.json
```
Example AI service:

```ts
// services/ai.service.ts
import { generateText, streamText } from '@ai-sdk/Lumia';
import { config } from '../config/Lumia';

export class AIService {
  static async generate(prompt: string) {
    return generateText({
      ...config,
      prompt
    });
  }

  static async stream(prompt: string, onChunk: (chunk: string) => void) {
    return streamText({
      ...config,
      prompt,
      onChunk: ({ chunk }) => {
        if (chunk.type === 'text-delta') {
          onChunk(chunk.text);
        }
      }
    });
  }
}
```
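To make the streaming flow concrete, here is a self-contained sketch with a stubbed `streamText` (the stub only mimics the chunk shape used above, `{ type: 'text-delta', text }`, and stands in for the real SDK call):

```typescript
type Chunk = { type: 'text-delta'; text: string } | { type: 'finish' };

// Stub standing in for the SDK's streamText: emits a fixed chunk
// sequence to the onChunk callback.
async function streamTextStub(
  onChunk: (event: { chunk: Chunk }) => void
): Promise<void> {
  const chunks: Chunk[] = [
    { type: 'text-delta', text: 'Hello' },
    { type: 'text-delta', text: ', world' },
    { type: 'finish' },
  ];
  for (const chunk of chunks) {
    onChunk({ chunk });
  }
}

// Accumulate only text deltas, the same filtering AIService.stream does.
async function collectText(): Promise<string> {
  let text = '';
  await streamTextStub(({ chunk }) => {
    if (chunk.type === 'text-delta') {
      text += chunk.text;
    }
  });
  return text;
}
```

Filtering on `chunk.type` matters because a stream can carry non-text events (such as a finish signal); only text deltas should reach the UI callback.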