Step Back Prompting Algorithm

Jaskamal Singh
4 min read

Step-back prompting is a technique in prompt engineering that encourages language models to first consider high-level concepts and principles before diving into the specific details of a problem.
It involves the model posing a more abstract, broader question related to the original query, then using the insights gained from the abstract question to guide its reasoning on the original problem.

Now let's understand in simple terms what happens in the step-back prompting technique:

🔄 What is "Step-back Prompting"?

Suppose you want to ask an expert AI something, like:

"Tell me, what's the deal with useEffect in React?"

If you hand this query to the AI directly, it will dig some information out of its brain (and its vector DB), but what guarantees that your query was in the best possible form? The information it finds might be incomplete or not quite relevant.

This is where "Step-back Prompting" comes in: a clever little hack!


🤔 Understand with an Example

Imagine you have a friend who knows how to search Google, but your questions tend to be a bit confusing. So what does he do?

  1. First, he understands your question
    ("What is useEffect in React?")

  2. Then he forms another, more direct question
    ("What is useEffect in React and how is it used?")

  3. Then he searches Google with that new question
    (This is the retrieval step)

  4. And finally, he gives you a simple explanation
    (The generation step)

The AI does exactly the same thing:


🧠 Steps of Step-back Prompting

  1. Take the user's original query

    "What does useEffect do in React?"

  2. From that query, create a "stepped-back", generalized query

    "What does the useEffect hook do in the React framework, and how is it used?"

    The AI performs this step using prompt engineering. It produces a broader, more context-rich query.

  3. Fetch context from the knowledge base (such as Qdrant or Pinecone) using this new query

    This retrieves more relevant information.

  4. Generate the final answer using the original query, now with better context

    The answer you get will be more relevant, accurate, and helpful.
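The four steps above can be sketched as a tiny, runnable pipeline. This is only a minimal sketch: `generalize`, `retrieve`, and `answer` are toy stand-in functions (in a real pipeline they would be LLM calls and a vector DB lookup), included just to show how data flows through the steps:

```python
def step_back_answer(question, generalize, retrieve, answer):
    # Step 1: take the user's original query (`question`)
    # Step 2: build a broader, generalized "step-back" query
    broader_query = generalize(question)
    # Step 3: fetch context from the knowledge base using the broader query
    context = retrieve(broader_query)
    # Step 4: answer the ORIGINAL query, now armed with richer context
    return answer(question, context)

# Toy stand-ins so the flow runs without an LLM or a vector DB:
generalize = lambda q: "What is useEffect in React and how is it used?"
retrieve = lambda q: "useEffect lets you run side effects after a component renders."
answer = lambda q, ctx: f"Using context '{ctx}', the short answer: useEffect handles side effects."

print(step_back_answer("What does useEffect do in React?", generalize, retrieve, answer))
```

Notice that the broader query is only used for retrieval; the final answer is still generated for the original question.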


💡 Why Does Step-back Prompting Matter?

  • A normal query can be narrow, like a small alley.

  • A step-back query opens up a wide highway that brings in much more relevant information.

  • It makes the RAG pipeline smarter and more context-aware.


📦 A Small RAG Flow with Step-back Prompting

  1. User Query →
    "What is useEffect used for in React?"

  2. Step-back Prompting →
    "What is the role of useEffect among React's lifecycle hooks?"

  3. Context retrieval from the vector DB

  4. Final answer generation →
    A detailed, relevant, and context-rich explanation.


Think of it like stepping back a little so the road ahead becomes clear before you move forward; that's why it's called "step-back prompting".

Let's get into the code:

from openai import OpenAI
from dotenv import load_dotenv
import json
import os

load_dotenv()

# Step 1 - Get user question
question = input("Please enter your question: >  ")

# Step 2 - Generate step-back prompt
step_back_sys_prompt = """
You are a helpful AI assistant that is a master at creating step-back prompts.

Rules:
1. Follow the strict JSON output as per the output schema.
2. Abstract the key concepts and principles relevant to the question.
3. Use the abstraction to reason through the question.

Output Format:
{
    "prompt": "string"
}

Examples:
User prompt - Which is the best framework to create REST APIs in Python?
Step-back prompt - What is a framework, and which frameworks are available in Python?

User prompt - How to create REST APIs in FastAPI?
Step-back prompt - What are REST APIs, and what steps are involved in creating REST APIs?

User prompt - How to do performance testing in Locust?
Step-back prompt - What is performance testing, and what are the requirements and steps involved in performance testing?
"""

client = OpenAI(
    api_key=os.environ['GEMINI_API_KEY'],
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/"
)

msgs = [
    {"role": "system", "content": step_back_sys_prompt},
    {"role": "user", "content": question}
]

step_back_resp = client.chat.completions.create(
    model="models/gemini-1.5-flash-001",
    response_format={"type": "json_object"},
    messages=msgs
)

step_back_prompt = json.loads(step_back_resp.choices[0].message.content)
print("\n🧠 LLM Thinking...")
print("LLM created this step back prompt:")
print(step_back_prompt)

# Step 3 - Generate context using step-back prompt
context_sys_prompt = "You are an AI assistant that answers the user's question."

context_resp = client.chat.completions.create(
    model="models/gemini-1.5-flash-001",
    messages=[
        {"role": "system", "content": context_sys_prompt},
        {"role": "user", "content": step_back_prompt["prompt"]}
    ]
)

context = context_resp.choices[0].message.content

# Step 4 - Use context to answer original question
answer_sys_prompt = f"""
You are an AI assistant that answers the question based on the given context.

Refer to the following context:
{context}
"""

final_resp = client.chat.completions.create(
    model="models/gemini-1.5-flash-001",
    messages=[
        {"role": "system", "content": answer_sys_prompt},
        {"role": "user", "content": question}
    ]
)

print("\nLLM Thinking...")
print("🧠: ", final_resp.choices[0].message.content.replace("*", "").replace("`", ""))
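One thing worth noting: in the script above, Step 3 gets its "context" from the model itself rather than from a vector database. If you want the retrieval step to hit a real knowledge base like Qdrant, as described earlier, a sketch could look like the function below. This is an assumption-heavy sketch: the collection name `react_docs` and the local URL are placeholders, and it needs a running Qdrant instance with an already-populated collection, plus the `langchain-qdrant` and `langchain-google-genai` packages.

```python
def fetch_step_back_context(step_back_query: str, k: int = 4) -> str:
    """Retrieve context for a step-back query from an existing Qdrant collection.

    Assumes a running Qdrant instance at localhost:6333 and a collection
    named "react_docs" that was already populated with embedded documents.
    """
    # Imports live inside the function so the sketch only needs the
    # packages when you actually call it.
    from langchain_google_genai import GoogleGenerativeAIEmbeddings
    from langchain_qdrant import QdrantVectorStore

    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    store = QdrantVectorStore.from_existing_collection(
        embedding=embeddings,
        collection_name="react_docs",   # placeholder collection name
        url="http://localhost:6333",    # placeholder Qdrant URL
    )
    docs = store.similarity_search(step_back_query, k=k)
    return "\n\n".join(doc.page_content for doc in docs)
```

You could then use `fetch_step_back_context(step_back_prompt["prompt"])` as the context in Step 4 instead of the LLM-generated one.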


So that's all! 🙌 That was everything about the Step-back Prompting technique.
Hope I made it easier for you to understand 😊

If you've read this far:
🙏 Thank you, friends!

See you in the next post with a new concept.
Happy learning! 🚀

#ChaiCode
#GenAI
