Asking an AI to Beat Writer’s Block?

Nityaa Kalra

For some months now, I’ve been immersed in wrapping up my master’s thesis, which involved scraping thousands of book descriptions from various sources (a tedious process that still gives me nightmares!). While the work was exhaustive, a thought kept nudging me: “Did I do all this work just for one project? Perhaps we could do something more?”

So last weekend, I did something about it. I repurposed my lovingly scraped book descriptions and built a little RAG-based application: a story blueprint generator that turns your vibe into a fictional universe. Still unnamed, but the concept is simple: describe the feeling or setting you’re going for, and get a scene sketch or story foundation that matches that mood.

But first things first, what’s all the hype about Retrieval-Augmented Generation, or RAG? I could have simply asked a standard Large Language Model (LLM) to give me a story blueprint, and it likely would have worked just fine, right? What’s the real need for a vector search when a simple database could store all my book descriptions? And most importantly, the question I always love to delve into: Is it even ethical to create a blueprint from existing stories? We shall answer all these critical questions below, revealing how this Vibe Story Creator navigates these waters to become an ethical and powerful creative partner.

The RAG Hype

At its core, Retrieval-Augmented Generation combines the power of large language models (LLMs) with an external knowledge base. Traditional LLMs are trained on vast datasets but can sometimes hallucinate information or provide generic, ungrounded responses. This is where RAG steps in.

What is RAG? RAG enhances an LLM’s capabilities by allowing it to retrieve relevant information from a specific, authoritative source before generating a response. Think of it as giving an incredibly intelligent student access to a meticulously organised library right before they answer a complex question. Instead of relying solely on their internal memory, they can look up the most accurate details and ensure their answer is both comprehensive and factual.

But why not just ask ChatGPT to generate a story blueprint for you?

Fair question. You totally can just ask ChatGPT to generate a story idea from a vibe. And it would probably do a decent job. But here’s the thing: generic prompts give generic answers. When you use RAG, you’re giving the LLM a context window full of curated knowledge. Instead of pulling from its entire (possibly outdated or biased) training data, it’s focused on the content you chose. In our application, that means feeding the model book descriptions that match the tone or theme of your prompt. So instead of a vague, overused sci-fi plot, you get a blueprint shaped by actual literary patterns from the kind of books that inspired your prompt in the first place. That makes the generation both more unique and more aligned with real-world narrative structures.
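To make the difference concrete, here is a minimal sketch of the two prompting strategies in Python. The function names and prompt wording are illustrative only, not the app’s actual code; the point is simply that the RAG version packs retrieved book descriptions into the context window before generation.

```python
# Minimal sketch, not the app's actual code: plain prompting vs. RAG prompting.

def build_plain_prompt(vibe: str) -> str:
    # Without retrieval, the model has only the user's vibe (and its own
    # training data) to work from.
    return f"Write a story blueprint for this vibe: {vibe}"

def build_rag_prompt(vibe: str, retrieved_descriptions: list[str]) -> str:
    # With retrieval, curated book descriptions matching the vibe are placed
    # in the context window, so the generation is grounded in them.
    context = "\n\n".join(retrieved_descriptions)
    return (
        "Using the following book descriptions as thematic inspiration, "
        f"write a story blueprint for this vibe: {vibe}\n\n"
        f"Inspiration:\n{context}"
    )
```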

The Indispensable Role of a Vector Store

At first glance, this might feel like over-engineering. Why use a vector database when I could’ve stored my book descriptions in a regular SQL table?

But for our story blueprints, we are not searching for exact keywords; we are searching for vibes. A vector store doesn’t just index your text; it embeds it in a high-dimensional space using something like sentence-transformers. This means that when a user types “a quiet but eerie village with forgotten secrets”, we can find summaries that are semantically similar, even if they use totally different words like “remote town”, “haunted past”, or “mysterious history”. This is the power of semantic search, and it’s the beating heart of any good RAG app.
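Here is a small sketch of that idea using the sentence-transformers package (the model name is a common default, not necessarily the one this app uses): the haunted-town blurb scores far higher against the eerie-village query than an unrelated one, even though they share almost no words.

```python
# Small sketch of semantic similarity; the model choice is an assumption.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "a quiet but eerie village with forgotten secrets"
descriptions = [
    "A remote town with a haunted past and a mysterious history.",
    "A high-stakes corporate thriller set in a gleaming megacity.",
]

# Embed the query and the candidate descriptions into the same vector space.
query_vec = model.encode(query, convert_to_tensor=True)
desc_vecs = model.encode(descriptions, convert_to_tensor=True)

# Cosine similarity: the first description wins despite the different wording.
print(util.cos_sim(query_vec, desc_vecs))
```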

Tech Stack & Methods

Here’s what powers this weekend project:

  • LLM: Groq LLM API

  • Vector Database: Qdrant

  • Corpus: scraped book descriptions

  • Frontend: Streamlit, hosted publicly on Streamlit Cloud

Live Demo: https://vibe-story-creator.streamlit.app/ (You may need to wake the app up if it has gone to sleep.)

Source code: https://github.com/nityaak5/vibe-story-creator?tab=readme-ov-file#vibe-story-creator
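For the curious, here is a rough sketch of how these pieces could be wired together end to end. The collection name, payload field, embedding model, and Groq model are assumptions for illustration; the real implementation lives in the repo linked above.

```python
# Rough sketch of the retrieval + generation flow; names marked below are
# assumptions, not taken from the actual repository.
from groq import Groq
from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")   # assumed embedding model
qdrant = QdrantClient(url="http://localhost:6333")   # or a Qdrant Cloud URL
groq = Groq()                                        # reads GROQ_API_KEY

def generate_blueprint(vibe: str) -> str:
    # 1. Embed the user's vibe and retrieve semantically similar descriptions.
    hits = qdrant.search(
        collection_name="book_descriptions",          # assumed collection name
        query_vector=embedder.encode(vibe).tolist(),
        limit=5,
    )
    # Assumed payload field holding each book's description text.
    context = "\n\n".join(hit.payload["description"] for hit in hits)

    # 2. Ask the LLM for a blueprint grounded in the retrieved context.
    response = groq.chat.completions.create(
        model="llama-3.1-8b-instant",                 # assumed Groq model
        messages=[{
            "role": "user",
            "content": (
                f"Vibe: {vibe}\n\n"
                f"Related book descriptions:\n{context}\n\n"
                "Write a story blueprint that matches this vibe."
            ),
        }],
    )
    return response.choices[0].message.content
```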

Ethics: Is This Plagiarism?

This question matters to me, deeply. As someone researching bias and responsible AI, I’m careful about how AI uses existing creative work. But this is not plagiarism, and here’s why.

The book descriptions used are publicly available overviews and not the stories themselves. They don’t contain prose, dialogue, or narrative structure, just thematic scaffolding. The LLM uses these as inspiration, not source material. It’s akin to a writer reading widely to get ideas, then synthesising those ideas into something entirely new. The tool acts as a creative partner, designed to overcome common hurdles like writer’s block and suggest directions that a human might not immediately consider. It augments the human creator’s imagination. The ultimate story is still crafted, developed, and refined by the human user.

This web app is a sandbox: part thesis detour, part narrative experiment. And honestly? It’s been fun to see where vibes can take us. If you’ve made it this far, thank you, and if you have a better name than vibe-story-creator, I’m all ears :)
