I Built a Full-Stack AI App in 2025: From Idea to Insight with Next.js and Genkit

Sumant Jadhav

Tags: ai, nextjs, typescript, firebase, genkit


Hey everyone! We've all been there: staring at a giant spreadsheet of user feedback, knowing there are gold nuggets of insight inside, but having no time to dig for them. This manual, tedious process is where great feedback goes to die.

So, I decided to tackle this problem head-on by building "Feedback Lens," a full-stack AI app that turns a boring CSV into an interactive, strategic dashboard that can even compare feedback over time.


🔎 The Idea: From CSV to Actionable Insights

My goal was to create a tool where a user could upload a file and instantly understand:

  • What's the overall sentiment?

  • What specific topics are people talking about?

  • How are these trends changing over time?

This meant I needed:

  • A front-end for the UI

  • A database to save results

  • An AI "brain" to do the heavy lifting

I chose Next.js for the frontend, Firestore for the database, and Google's new Genkit for orchestrating the AI calls.
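
If you're curious how those pieces connect, here's a minimal sketch of the Genkit setup, assuming the Genkit 1.x API and the Google AI plugin (the file name and exports are illustrative, not necessarily how Feedback Lens is organized):

// genkit.ts: shared AI setup (illustrative sketch)
import { genkit } from 'genkit';
import { googleAI, gemini15Flash } from '@genkit-ai/googleai';

// One Genkit instance, configured with Google's Gemini models,
// that the Next.js server code can import wherever it needs the model.
export const ai = genkit({
  plugins: [googleAI()],
  model: gemini15Flash, // default model for ai.generate() calls
});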


๐Ÿง The AI "Brain": It's All in the Prompt

The core of the application is the AI analysis. Getting this right was key. The biggest lesson? Strict prompting. Giving the AI a controlled set of options ensured the output was clean and consistent.

// A snippet from my Genkit flow prompt
const BATCH_READY_PROMPT = `
...
2.  Topic Extraction: Assign one or more topics from this SPECIFIC list only: ["UI/UX", "Billing", "Performance", "Customer Support", "Feature Request", "Mobile App", "API", "General"].
...
`;
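
Strict wording in the prompt is half the story; Genkit can also enforce structure on the way out with a Zod output schema. Here's a sketch of how that controlled topic list could be mirrored at the schema level (the schema shape and sentiment labels are my assumptions, not the project's exact definitions):

// A structured-output schema for one analyzed feedback row (illustrative)
import { z } from 'genkit';

const TOPICS = [
  'UI/UX', 'Billing', 'Performance', 'Customer Support',
  'Feature Request', 'Mobile App', 'API', 'General',
] as const;

export const FeedbackAnalysisSchema = z.object({
  // Assumed labels: the real app may use a different sentiment scale
  sentiment: z.enum(['positive', 'neutral', 'negative']),
  // The model may only pick topics from the SPECIFIC list in the prompt
  topics: z.array(z.enum(TOPICS)).min(1),
});

export type FeedbackAnalysis = z.infer<typeof FeedbackAnalysisSchema>;

Passing a schema like this via ai.generate({ output: { schema } }) means a malformed response fails loudly instead of quietly polluting the dashboard.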

⚡ Hitting a Wall: The Inevitable API Rate Limit

My first prototype worked great for 5 rows. But when I tried 100 rows, I hit the API's rate limit almost immediately.

My app was making a separate AI call for every single row. I was spamming the AI service, and it rightfully told me to slow down.


🧰 The Solution: Batch, Don't Spam

The fix? Batching: instead of making one API call per row, group the rows and analyze each group in a single call.

// Simplified logic for creating batches
const BATCH_SIZE = 15;
const analyzedRows = [];

for (let i = 0; i < allRows.length; i += BATCH_SIZE) {
  const batch = allRows.slice(i, i + BATCH_SIZE);
  // Now, make ONE API call for the entire batch...
  const analyzedBatch = await analyzeBatchOfFeedback(batch);
  // ...and collect the results for the dashboard
  analyzedRows.push(...analyzedBatch);
}

This change:

  • Reduced 100 API calls to just 7

  • Solved the rate limit problem

  • Improved performance massively
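
If you're wondering what "ONE API call for the entire batch" looks like in practice, here's a rough sketch of a batch-analysis helper, again assuming the Genkit 1.x API. The function name matches the loop above, but the body and schema are illustrative rather than the project's actual code:

// A sketch of analyzing a whole batch with a single model call
import { genkit, z } from 'genkit';
import { googleAI, gemini15Flash } from '@genkit-ai/googleai';

const ai = genkit({ plugins: [googleAI()], model: gemini15Flash });

// One result per input row, tied back to its index in the batch
const BatchResultSchema = z.array(
  z.object({
    row: z.number(),
    sentiment: z.enum(['positive', 'neutral', 'negative']),
    topics: z.array(z.string()).min(1),
  })
);

export async function analyzeBatchOfFeedback(batch: string[]) {
  const { output } = await ai.generate({
    prompt:
      'Analyze each feedback item below and return one result per item.\n' +
      batch.map((text, i) => `${i}. ${text}`).join('\n'),
    output: { schema: BatchResultSchema },
  });
  return output ?? []; // output can be null if no structured result was produced
}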


🔢 The Final Product: A Strategic Comparison Tool

Once the app could handle large data sets, I added the killer features:

  • Saving report history to Firestore (a quick sketch of this is below)

  • A Comparison Dashboard that shows changes over time
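
For the Firestore side, persisting a finished report is only a few lines with the modular web SDK. A rough sketch (the collection name and document fields are placeholders I made up):

// Persisting one analysis run so the comparison dashboard can diff runs later
import { initializeApp } from 'firebase/app';
import {
  getFirestore, collection, addDoc, serverTimestamp,
} from 'firebase/firestore';

const app = initializeApp({ /* your Firebase config */ });
const db = getFirestore(app);

export async function saveReport(fileName: string, results: unknown[]) {
  // Each upload becomes one timestamped document in a "reports" collection
  return addDoc(collection(db, 'reports'), {
    fileName,
    results,
    createdAt: serverTimestamp(),
  });
}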


🔬 What I Learned

  • AI isn't magic. Good prompts = good results.

  • Scalability is not optional. Batching was the biggest unlock.

  • Ship in layers. Start simple, then add polish.

This project pushed me to think like a developer, a product thinker, and a data storyteller.


🚀 Check It Out

Live Demo


Let me know what you think in the comments!
