How I built an in-game AI chatbot/wiki overlay in a month

Weizhen Chu
2 min read

I spent a month building GameWiki, an in-game AI assistant that chooses a game-specific knowledge base based on the active window title, then answers either “wiki” lookups (opening a specific wiki page) or “strategy” questions using a two-stage API flow plus RAG (vector search + keyword search). See the GameWiki repo on GitHub for code and a demo.


Why

LLMs often give confident but incorrect game tips, and watching YouTube walkthroughs takes time. A game-specific local knowledge base grounds answers and speeds up finding reliable guides.

As a player, I also hate Alt+Tabbing mid-game. I wanted an assistant that:

  • stays on top of the game (overlay / pinned browser),

  • uses game-specific knowledge to answer questions or surface information.

Google’s free-tier APIs made running embedding- and LLM-based RAG feasible for an OSS hobby project.

What it does

When the user asks something, the client makes a two-stage API call:

    1. Router call (LLM-lite) — rewrite & route:

      • Rewrites the raw query (shortens, clarifies, extracts intent).

      • Decides route: wiki (open wiki page) or strategy (answer using KB).

      • Chooses which KB to query by matching the current window title to KB metadata.

    2. Answer call:

        • If wiki: map game → canonical wiki domain, perform a targeted search/open a URL and return the page preview/link.

        • If strategy: perform hybrid RAG:

          • Vector search (embedding similarity) to get semantically relevant docs.

          • BM25s keyword search to grab exact-match passages (good for names/stats).

          • Merge the top hits, build a compact context, and call the downstream LLM with a prompt containing the context plus the rewritten query and answer-style instructions (e.g., “Answer concisely, with steps and links”).


Pseudocode

# map active window to KB
game_kb = map_window_title(active_window_title)

# stage 1: intent + rewrite
intent, q = intent_and_rewrite(raw_user_query)

if intent == "wiki":
    # derive the wiki URL from the matched game + rewritten query
    wiki_url = build_wiki_url(game_kb, q)
    page = open_wiki_page(wiki_url)
else:
    vec_hits = vector_search(game_kb, q, k=10)
    bm25_hits = bm25_search(game_kb, q, k=10)
    hits = fuse_rank(vec_hits, bm25_hits)[:5]
    context = concat_passages(hits)
    answer = llm_answer_with_context(context, q, instruct_grounded=True)
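One common way to implement the `fuse_rank` step above is reciprocal rank fusion (RRF), which merges the vector and BM25 rankings without needing their scores to be comparable. This is an assumed implementation for illustration, not necessarily what the repo uses:

```python
def fuse_rank(vec_hits: list[str], bm25_hits: list[str], k: int = 60) -> list[str]:
    """Merge two ranked lists with reciprocal rank fusion:
    score(doc) = sum over lists of 1 / (k + rank)."""
    scores: dict[str, float] = {}
    for hits in (vec_hits, bm25_hits):
        for rank, doc in enumerate(hits, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    # Highest fused score first; docs in both lists naturally rise to the top.
    return sorted(scores, key=scores.get, reverse=True)
```

RRF only looks at ranks, so a passage that appears in both result lists (semantically relevant *and* an exact keyword match) outranks one that appears in only one.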

Repo & try it

Code, indexer scripts, and a demo overlay are on GitHub. The project uses Google Gemini (free-tier) for the AI features and supports quick wiki access + AI Q&A.

