Serverless AI: Using Vercel + OpenAI in Real Projects

Subin S K

Artificial Intelligence is reshaping how we build applications, and the rise of serverless platforms like Vercel makes it easier than ever to integrate AI capabilities into real projects—without managing servers.

In this post, I’ll walk you through how I combine Vercel’s serverless functions with OpenAI’s API to build fast, scalable AI-powered applications.

What Is Serverless AI?

Serverless means you don’t worry about provisioning or managing servers. Instead, your code runs in response to events, such as an HTTP request. This approach fits AI workloads perfectly since:

  • AI inference happens on demand

  • You only pay for what you use

  • It scales automatically with traffic

Using Vercel, a popular serverless platform optimized for frontend and backend workloads, makes deploying AI-powered apps seamless.

Why Vercel + OpenAI?

  • Vercel offers fast deployments, instant scaling, and seamless integration with frontend frameworks like Next.js.

  • OpenAI’s API provides access to powerful AI models (like GPT-4, Codex, DALL·E) via simple REST endpoints.

  • Together, they let you build AI features without maintaining backend infrastructure.

Setting Up a Serverless AI Endpoint on Vercel

Here’s a quick example to create a serverless API route on Vercel that talks to OpenAI.

  1. Create a Next.js app (or use your existing one):
npx create-next-app my-ai-app
cd my-ai-app
  2. Install the OpenAI Node.js client:
npm install openai
  3. Add your OpenAI API key as an environment variable in Vercel or your local .env.local file:
OPENAI_API_KEY=your_openai_api_key_here
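
If you deploy through the Vercel CLI, you can also store the key as a project environment variable instead of pasting it into the dashboard (a quick sketch; the CLI prompts you for the value and the environments to apply it to):

vercel env add OPENAI_API_KEY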
  4. Create an API route:

In pages/api/generate.js (or .ts if using TypeScript), add:

import OpenAI from "openai";

// The client reads the API key from the environment, so it never
// reaches the browser.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

export default async function handler(req, res) {
  // Only accept POST requests.
  if (req.method !== "POST") {
    return res.status(405).json({ error: "Method not allowed" });
  }

  const { prompt } = req.body;

  if (!prompt) {
    return res.status(400).json({ error: "Prompt is required" });
  }

  try {
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // swap in whichever model your account has access to
      messages: [{ role: "user", content: prompt }],
      max_tokens: 150,
    });

    res.status(200).json({ text: completion.choices[0].message.content.trim() });
  } catch (error) {
    res.status(500).json({ error: error.message || "OpenAI request failed" });
  }
}
  5. Test your endpoint locally:

Use tools like Postman or a simple frontend fetch to send a POST request with JSON { "prompt": "Hello AI" }.
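
For reference, here's a minimal fetch sketch you could drop into a test script or a frontend component (it assumes the dev server is running locally on port 3000 via npm run dev):

// Send a prompt to the serverless route and log the generated text.
const res = await fetch("http://localhost:3000/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ prompt: "Hello AI" }),
});

const data = await res.json();
console.log(data.text);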

How This Helps in Real Projects

  • Rapid Prototyping: Spin up AI features quickly without backend setup

  • Scalability: Vercel handles traffic spikes seamlessly

  • Cost Efficiency: Pay only for actual usage, no idle server costs

  • Security: API keys stay inside serverless functions and are never exposed to the frontend

Next Steps

  • Build a chat interface on top of this API

  • Add streaming responses for real-time interaction (see the sketch after this list)

  • Combine with database integrations (e.g., Prisma, PlanetScale) for personalized AI

  • Experiment with other OpenAI models like GPT-4 or DALL·E for images
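
For the streaming idea above, here's a rough sketch of how the handler from step 4 could stream tokens back as they arrive, reusing the same SDK and openai client (the model name is just an example, and request validation is omitted for brevity):

export default async function handler(req, res) {
  // Ask the API for a streamed completion instead of a single response.
  const stream = await openai.chat.completions.create({
    model: "gpt-4o-mini", // example model; use whichever you have access to
    messages: [{ role: "user", content: req.body.prompt }],
    stream: true,
  });

  res.writeHead(200, { "Content-Type": "text/plain; charset=utf-8" });

  // Forward each incremental chunk of text to the client as it arrives.
  for await (const chunk of stream) {
    res.write(chunk.choices[0]?.delta?.content || "");
  }

  res.end();
}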

Final Thoughts

Using Vercel and OpenAI together makes adding powerful AI capabilities to your apps easier than ever. You get all the benefits of serverless — zero maintenance, automatic scaling, and great developer experience — while tapping into cutting-edge AI models.

If you haven’t tried building an AI-powered feature with serverless functions yet, now’s a great time to start.

