How to Break Into AI Development Without Really Trying

The Rickfactr
3 min read

A week ago, I was assigned a take-home engineering problem in an interview. On one hand, I was excited to have a reason to dive into some code. On the other, the task was to build an AI chat interface using RAG (Retrieval-Augmented Generation) techniques.

The requirements were pretty clear: an AI chat UI to query the content of Paul Graham’s essays using an LLM. Even though I’d never written any code integrating AI functionality, and though my Python skills were shallow at best, this seemed easy enough… Right?

First off, I thought that since I would be writing code for AI, maybe I should use AI to help me write the code. I immediately downloaded Cursor. I won’t review Cursor here, but let me just say, “Wow!” It’s essentially VS Code with an AI assistant that can generate reams of pretty darned good code. Of course, that code needs to be vetted and sometimes corrected (with the aid of the AI assistant?)… If you haven’t already, I suggest you give it a whirl for a free two-week trial. There is an always-free tier in addition to subscription plans.

Next, I set up Supabase locally as suggested in the requirements.

With Cursor and Supabase installed, I asked Cursor to generate a Python script for data ingestion. WOW! I had it “working” in about 30 minutes. Now, granted, I rewrote that script several times as my understanding of the tools and technical issues grew, but the last version still bears a striking resemblance to the AI-generated original.

Cursor introduced me to the Python “requests” module and Beautiful Soup for web scraping. As I dug into the world of tokens, vectors, and embeddings for documents, learning what each of these concepts represents, I discovered — and learned about — the Supabase (PostgreSQL) pgvector extension, Supabase migrations, and LangChain’s WebBaseLoader, RecursiveCharacterTextSplitter, OpenAIEmbeddings, and SupabaseVectorStore components. FWIW, LangChain was a godsend.
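To make the chunking step concrete, here is a toy, standard-library-only sketch of the idea behind LangChain’s RecursiveCharacterTextSplitter: try the largest separator first, fall back to smaller ones, and pack pieces into size-limited chunks with a little overlap carried between them. This is my simplified illustration of the concept, not the real component, which handles far more edge cases.

```python
def split_text(text, chunk_size=200, chunk_overlap=40,
               separators=("\n\n", "\n", " ")):
    """Toy version of recursive character splitting with overlap."""
    if len(text) <= chunk_size:
        return [text]
    # Use the first separator that actually appears in the text.
    for sep in separators:
        if sep in text:
            pieces = text.split(sep)
            break
    else:
        # No separator found: hard-split at chunk boundaries.
        return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    chunks, current = [], ""
    for piece in pieces:
        candidate = (current + sep + piece) if current else piece
        if len(candidate) <= chunk_size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # Carry a tail of the previous chunk forward as overlap,
            # so context isn't lost at chunk boundaries.
            overlap = current[-chunk_overlap:] if (current and chunk_overlap) else ""
            current = (overlap + sep + piece).strip()
    if current:
        chunks.append(current)
    return chunks
```

In the real pipeline, each resulting chunk is then embedded (e.g., with OpenAIEmbeddings) and stored in the pgvector-backed table via SupabaseVectorStore.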

As I moved on to the backend app, Cursor did a yeoman’s job of scaffolding Node.js and Express.js to implement a REST API. That code required some tweaking along the way to make use of the JavaScript versions of LangChain’s components and the Supabase client, but it basically did the job right from the start.

Then I had Cursor scaffold the chat UI using Next.js and Shadcn UI components. The result was a reasonably good UI. I did have to hand-code the integration between the LLM search and the match_documents function that I’d set up early on. This also required some refactoring of match_documents and the Python data ingestion script to ensure that the essay titles were always captured when the essay page did not contain a <title> element.
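That title fix boiled down to a fallback chain. Here is a simplified, standard-library-only sketch (the real script uses Beautiful Soup, and the h1-then-URL-slug fallback order here is my illustration, not necessarily the project’s exact logic):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class _TitleFinder(HTMLParser):
    """Collect the text of the first <title> and first <h1> in the page."""
    def __init__(self):
        super().__init__()
        self._stack = []
        self.title = None
        self.h1 = None

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if not self._stack:
            return
        text = data.strip()
        if not text:
            return
        if self._stack[-1] == "title" and self.title is None:
            self.title = text
        elif self._stack[-1] == "h1" and self.h1 is None:
            self.h1 = text

def essay_title(html, url):
    """Prefer <title>, fall back to the first <h1>, then to the URL slug."""
    finder = _TitleFinder()
    finder.feed(html)
    if finder.title:
        return finder.title
    if finder.h1:
        return finder.h1
    # Last resort: derive a title from the URL path, e.g. /greatwork.html
    slug = urlparse(url).path.rsplit("/", 1)[-1]
    return slug.rsplit(".", 1)[0] or url
```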

Lastly, there was some clean-up and documentation to do. Cursor helped with both.

The final project, PaulGrahamEssays, is publicly available on GitHub. If you feel the need to comment, remember, this was a learning project, not the product of years of experience…

In all, I would say this project was about 60% learning new concepts, tools, and techniques and about 40% writing code.

In the end, even if I don’t get a job offer from this company, I have demystified the basics of developing apps with AI integration for myself. This widens the scope of jobs to which I will be applying. That’s all good!


Written by The Rickfactr

A Principal Software Engineer who still slings code every day to make a living. Brother. Father. Grandpa. Farmer. Biker. Christian. I am a deeply flawed individual who believes in and is betting on the power of God's forgiveness through His Son, Jesus.