Guides

Building Next.js App Router patterns with Vercel and Clou...

Integrate AI SDKs into Next.js applications while securing API keys and managing rate limits through server-side handling and middleware.

1-2 hours · 5 steps
1

Configure environment variables

Store AI API keys in .env.local. Next.js exposes only variables prefixed with NEXT_PUBLIC_ to client-side code; any variable without that prefix stays server-only (the NEXT_PRIVATE_ name below is a naming convention for clarity, not a special mechanism).

NEXT_PRIVATE_AI_API_KEY=your-secure-key
NEXT_PUBLIC_AI_API_ENDPOINT=https://api.example-ai.com/v1
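A missing variable in process.env silently becomes undefined, which surfaces later as a confusing SDK auth error. A small guard can fail fast instead; this is a hypothetical helper (the lib/env.js path and function name are illustrative, not part of the guide's code):

```javascript
// lib/env.js (hypothetical helper) - read a required server-only variable,
// throwing a clear error instead of passing `undefined` to the AI SDK.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Server-side usage:
// const apiKey = requireEnv('NEXT_PRIVATE_AI_API_KEY');
```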
2

Create server-side API route

Implement a protected API route to handle AI SDK requests, preventing direct client-side key exposure.

app/api/ai/route.js
import { NextResponse } from 'next/server';
import OpenAI from 'openai';

// This module only runs on the server, so the key never reaches the browser.
const openai = new OpenAI({
  apiKey: process.env.NEXT_PRIVATE_AI_API_KEY,
  baseURL: process.env.NEXT_PUBLIC_AI_API_ENDPOINT,
});

export async function POST(request) {
  const { prompt } = await request.json();
  if (!prompt) {
    return NextResponse.json({ error: 'Missing prompt' }, { status: 400 });
  }
  try {
    const response = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: prompt }],
    });
    return NextResponse.json(response.choices[0].message);
  } catch (err) {
    // Avoid leaking provider error details to the client.
    return NextResponse.json({ error: 'Upstream AI request failed' }, { status: 502 });
  }
}
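Beyond checking that a prompt exists, the route can cheaply reject bad input before spending provider tokens. A hypothetical validation helper (the function name and 4000-character limit are illustrative assumptions):

```javascript
// Hypothetical server-side guard for the route above: reject empty or
// oversized prompts before forwarding them to the AI provider.
function validatePrompt(prompt, maxLength = 4000) {
  if (typeof prompt !== 'string' || prompt.trim().length === 0) {
    return { ok: false, error: 'Prompt must be a non-empty string' };
  }
  if (prompt.length > maxLength) {
    return { ok: false, error: `Prompt exceeds ${maxLength} characters` };
  }
  return { ok: true };
}

// In POST:
// const check = validatePrompt(prompt);
// if (!check.ok) return NextResponse.json({ error: check.error }, { status: 400 });
```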

⚠ Common Pitfalls

  • Prefixing a secret with NEXT_PUBLIC_, which inlines it into the client bundle
  • Forgetting to add .env.local to .gitignore
3

Implement rate limiting middleware

Add middleware to restrict API call frequency per user session using Redis or in-memory tracking.

middleware.js
import { NextResponse } from 'next/server';
import { ratelimit } from '@/lib/ratelimit';

export async function middleware(request) {
  // request.ip is populated on Vercel; fall back to the forwarded header
  // (taking the first hop) when running locally or behind other proxies.
  const ip =
    request.ip ??
    request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ??
    '127.0.0.1';
  const { success } = await ratelimit.limit(ip);
  if (!success) {
    return NextResponse.json({ error: 'Rate limit exceeded' }, { status: 429 });
  }
  return NextResponse.next();
}

// Only run the limiter on the AI route, not on every request.
export const config = { matcher: '/api/ai/:path*' };
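The @/lib/ratelimit module imported above is not shown in the guide. A minimal in-memory fixed-window sketch matching the ratelimit.limit(key) shape used by the middleware (the window size and quota are assumptions; per-instance in-memory state does not survive serverless cold starts or scale across instances, so back it with Redis in production):

```javascript
// lib/ratelimit.js (sketch) - fixed-window counter keyed by client IP.
// State lives in this instance's memory only; use Redis for multi-instance
// deployments.
const WINDOW_MS = 60_000;  // assumed: 1-minute window
const MAX_REQUESTS = 10;   // assumed: 10 requests per window per key

const hits = new Map();    // key -> { count, windowStart }

const ratelimit = {
  limit(key) {
    const now = Date.now();
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= WINDOW_MS) {
      // First request in a fresh window: reset the counter.
      hits.set(key, { count: 1, windowStart: now });
      return { success: true, remaining: MAX_REQUESTS - 1 };
    }
    entry.count += 1;
    return {
      success: entry.count <= MAX_REQUESTS,
      remaining: Math.max(0, MAX_REQUESTS - entry.count),
    };
  },
};

// In lib/ratelimit.js this would be exported: export { ratelimit };
```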
4

Build client-side interaction component

Create a React component that sends user input to the protected API route with proper error handling.

components/AIChat.js
'use client';

import { useState } from 'react';

export default function AIChat() {
  const [input, setInput] = useState('');
  const [response, setResponse] = useState('');

  const handleSubmit = async () => {
    try {
      const res = await fetch('/api/ai', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt: input }),
      });
      if (!res.ok) {
        // Surface rate-limit (429) and server errors instead of failing silently.
        const { error } = await res.json();
        setResponse(error ?? `Request failed (${res.status})`);
        return;
      }
      const data = await res.json();
      setResponse(data.content);
    } catch {
      setResponse('Network error - please try again.');
    }
  };

  return (
    <div>
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button onClick={handleSubmit}>Send</button>
      <div>{response}</div>
    </div>
  );
}
5

Set up logging and monitoring

Integrate with Vercel's logging or external services to track API usage patterns and detect anomalies.

Vercel CLI: vercel logs --type=error
Third-party: Sentry or Datadog integration
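Vercel captures console output from route handlers, so a structured JSON log line per request is enough for basic usage tracking and anomaly detection. A hypothetical helper (the function and field names are illustrative):

```javascript
// Hypothetical helper: build a structured log record for each AI request,
// so Vercel's log view or a log drain can filter on JSON fields.
function aiRequestLog({ ip, model, status, durationMs }) {
  return JSON.stringify({
    event: 'ai_request',
    ip,
    model,
    status,
    durationMs,
    ts: new Date().toISOString(),
  });
}

// In the route handler:
// const start = Date.now();
// ... call the provider ...
// console.log(aiRequestLog({ ip, model: 'gpt-3.5-turbo', status: 200,
//                            durationMs: Date.now() - start }));
```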

What you built

This implementation keeps AI API keys on the server behind a protected route, throttles abusive clients with rate-limiting middleware, and gives the client a structured request path with explicit handling for rate-limit and network failures.