Whts th Cptl of Frnce? ChatGPT Still Knows It’s Paris.

Have you ever wondered how well ChatGPT can handle really messy questions? Like, questions with typos, weird abbreviations, and random punctuation thrown in just to see if it breaks? Well, we put it to the test with a classic: “What is the capital of France?”, but with a twist.
Objective
The goal here was simple: check if ChatGPT is robust enough to understand and answer questions correctly even when they’re distorted by typos, slang, abbreviations, and unconventional phrasing. Could it still get the right answer without missing a beat?
Methodology
We threw ten intentionally “broken” versions of the question at ChatGPT. Here’s the full lineup:
Whats the capitel of France?
What is the capital of Frans?
Frants capitel?
What hte “capital” of France//
What are the HQ of France?
Cpt of Fr?
Which city is the capital of Franco?
Which cit ishte capital of France
Wchtis thr cspital offrance?
Whats th french cptakl?
Yep, some of these look like someone mashed the keyboard while half-asleep, but that was the point!
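For anyone who wants to reproduce this, the setup can be sketched as a tiny script. This is only a sketch: the `ask_chatgpt` stub stands in for a real chat-completion API call (which is not shown in the original post), and the keyword check on "Paris" is an assumed scoring rule, not the actual method used here.

```python
# Hypothetical harness for the experiment described above.
# ASSUMPTION: `ask_chatgpt` is a placeholder; swap in a real
# chat-completion call (e.g. via the openai client) to run it live.

QUESTIONS = [
    "Whats the capitel of France?",
    "What is the capital of Frans?",
    "Frants capitel?",
    'What hte "capital" of France//',
    "What are the HQ of France?",
    "Cpt of Fr?",
    "Which city is the capital of Franco?",
    "Which cit ishte capital of France",
    "Wchtis thr cspital offrance?",
    "Whats th french cptakl?",
]

def is_correct(answer: str) -> bool:
    """Score a reply as correct if it mentions Paris (case-insensitive)."""
    return "paris" in answer.lower()

def ask_chatgpt(question: str) -> str:
    """Stub response; replace with a real API call."""
    return "The capital of France is Paris."

if __name__ == "__main__":
    results = {q: is_correct(ask_chatgpt(q)) for q in QUESTIONS}
    print(f"{sum(results.values())}/{len(results)} answered correctly")
```

Running it with the stub trivially reports 10/10; the interesting part is swapping the stub for a live model call and seeing whether that score holds.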
Results
Despite the mangled spelling and funky phrasing, ChatGPT nailed every single one. Here’s a quick breakdown:
| # | Question | ChatGPT’s Interpretation | Answer |
|---|----------|--------------------------|--------|
| 1 | Whats the capitel of France? | Classic typo, meant “capital of France” | Paris |
| 2 | What is the capital of Frans? | “Frans” = “France” (typo) | Paris |
| 3 | Frants capitel? | “Frants” = “France” | Paris |
| 4 | What hte “capital” of France// | Extra typos and punctuation | Paris |
| 5 | What are the HQ of France? | “HQ” (headquarters) interpreted as “capital” or main city | Paris |
| 6 | Cpt of Fr? | Abbreviations for “capital” and “France” | Paris |
| 7 | Which city is the capital of Franco? | “Franco” assumed to mean “France” (not the dictator…) | Paris |
| 8 | Which cit ishte capital of France | More typos, but the meaning is clear | Paris |
| 9 | Wchtis thr cspital offrance? | Keyboard smash, but question understood | Paris |
| 10 | Whats th french cptakl? | Slightly jumbled, but asking about France’s capital | Paris |
In all cases, ChatGPT correctly identified the question’s intent and answered: Paris.
Conclusion
This test clearly shows that ChatGPT has some serious natural language understanding skills. Even with all the “noise” in the questions, from typos to slang to weird syntax, it can decode what the user really wants to know.
In other words: ChatGPT doesn’t just read words, it gets you, even when your keyboard is having a bad day.
So next time you’re too tired to spell properly, don’t worry, ChatGPT probably knows what you mean anyway. 😉
Written by George Perdikas