Building My First Budget AI Workstation

In late 2024, I decided to build a dedicated AI and deep learning workstation. My goal was to explore GPUs more closely and deepen my understanding of how hardware supports deep learning. I wasn’t aiming for a top-tier rig but rather something affordable and practical—powerful enough for my projects, yet less costly than the high-end setups I often read about.
Before starting, I thought about why building such a workstation might not always make sense. Good components are expensive, and putting a PC together requires technical skills. There are also many free options like Google Colab or Kaggle, and small models can even run on CPUs or Apple silicon without needing a GPU at all. For many people, these options might be more than enough.
Still, I had my reasons. I wanted to learn more about hardware, run and debug code locally, and avoid relying completely on the cloud. I also wanted a setup I could use regularly without worrying about usage limits or costs.
When planning the build, I focused on a few essentials:

- matching GPU memory to the size of the models I'd be working with,
- choosing a CPU that wouldn't bottleneck performance,
- finding a compatible motherboard,
- making sure the case could fit the GPU,
- selecting a power supply strong enough to handle everything, and
- adding storage and cooling.
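Matching GPU memory to model size can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below is my own rough rule of thumb (the `estimate_vram_gb` helper and the 20% overhead factor are assumptions, not from any framework): parameters times bytes per parameter, plus some slack for activations and the CUDA context.

```python
def estimate_vram_gb(num_params: float, bytes_per_param: float = 2,
                     overhead: float = 1.2) -> float:
    """Rough inference VRAM estimate: parameter memory at the given
    precision, plus ~20% overhead for activations and CUDA context."""
    return num_params * bytes_per_param * overhead / 1024**3

# A 7B-parameter model in fp16 (2 bytes/param):
print(f"{estimate_vram_gb(7e9):.1f} GB")  # ~15.6 GB -- too big for 12 GB of VRAM
# The same model quantized to 4 bits (0.5 bytes/param):
print(f"{estimate_vram_gb(7e9, bytes_per_param=0.5):.1f} GB")  # ~3.9 GB -- fits
```

Training needs far more than this (gradients and optimizer state roughly triple or quadruple the parameter memory), so for a 12 GB card the rule of thumb mostly tells you which models are worth attempting at all.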
For my GPU, I picked an NVIDIA GeForce RTX 3060 with 12 GB of VRAM. It isn’t the fastest card, but it’s powerful enough to let me experiment with frameworks like TensorFlow and PyTorch. To keep things simple and save time, I bought a refurbished Dell Optiplex 9020 mini-tower with an Intel Core i7-4770 (3.4 GHz), 32 GB of RAM, and a 512 GB SSD.
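Before running anything heavy, it is worth confirming that a framework actually sees the card. A minimal PyTorch check might look like this (the `gpu_summary` helper is my own wrapper; it degrades gracefully when PyTorch or CUDA is absent):

```python
def gpu_summary() -> str:
    """Report the first CUDA device, or explain why none is visible."""
    try:
        import torch
    except ImportError:
        return "PyTorch not installed"
    if not torch.cuda.is_available():
        return "CUDA GPU not detected"
    props = torch.cuda.get_device_properties(0)
    # e.g. "NVIDIA GeForce RTX 3060, 12 GB VRAM" on this build
    return f"{props.name}, {props.total_memory / 1024**3:.0f} GB VRAM"

print(gpu_summary())
```

Running this right after installing the drivers and CUDA toolkit catches most setup mistakes before they surface as cryptic training errors.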
The system came with a 295 W power supply, which was far too weak for the RTX 3060. To fix that, I replaced it with a Thermaltake Smart 500 W PSU. The GPU draws about 170 W, and the total system requirements came to around 465 W, so a 500 W supply gave me some breathing room.
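A quick way to sanity-check a PSU choice is to compute how much capacity is left at the estimated full load. A minimal sketch using the numbers above (the `psu_headroom` helper is hypothetical):

```python
def psu_headroom(psu_w: int, system_w: int) -> float:
    """Fraction of PSU capacity left unused at the estimated full load."""
    return (psu_w - system_w) / psu_w

# Numbers from this build: 500 W PSU, ~465 W estimated system requirement.
print(f"{psu_headroom(500, 465):.0%} headroom")  # prints "7% headroom"
```

Seven percent is workable but tight; if I had planned to upgrade the GPU later, a larger supply would have been the safer choice.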
While putting it together, I also thought about scaling to multiple GPUs. That’s a much bigger challenge—cards are large, they need big cases and extra PCIe slots, and they draw far more power. Noise and heat become issues too. For now, I realized it makes more sense to stick with a single GPU locally and rely on the cloud if I ever need more.
Looking back, I see this workstation as more than just a budget project. It helps me learn about AI hardware, reduces my dependence on cloud services, and keeps my data local. It may not compete with massive cloud clusters, but it gives me a flexible and affordable way to work and experiment. My advice is to start small, learn as you go, and build up gradually.