Day 1: Python Foundations for Scientific Computing 🤖

Welcome to Day 1 of 100 Days of AI!
Today, we’ll revisit Python — not as beginners, but as scientific programmers. Before diving into Machine Learning or Deep Learning, it’s crucial to understand how numerical computation works under the hood.
🔹 Why Scientific Computing in Python?
AI/ML involves handling large datasets and performing millions of mathematical operations (matrix multiplications, vector dot products, optimization).
👉 Standard Python (for loops, lists) is too slow for this.
👉 Libraries like NumPy make Python nearly as fast as C because:
They store data in contiguous memory blocks (unlike Python lists).
Operations are vectorized (batch operations at once instead of looping).
They use optimized C/Fortran backends (BLAS, LAPACK) under the hood; you can check which backend your install uses, as shown below.
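To see this for yourself, NumPy can report the build configuration it was compiled with. A quick check (the output varies by installation):

```python
import numpy as np

# Prints the BLAS/LAPACK libraries this NumPy build was linked against
np.show_config()
```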
🔹 Step 1: NumPy Internals
Let's look at how Python lists and NumPy arrays differ in memory layout.
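Here's a minimal sketch of that difference using `sys.getsizeof` (exact byte counts vary by Python version and platform):

```python
import sys
import numpy as np

lst = list(range(1000))
arr = np.arange(1000)

# A list stores pointers to boxed int objects scattered around the heap;
# an ndarray stores raw values in one contiguous buffer.
list_bytes = sys.getsizeof(lst) + sum(sys.getsizeof(x) for x in lst)
print("list total size:", list_bytes)            # pointer array + every int object
print("array data size:", arr.nbytes)            # raw buffer only (1000 * itemsize)
print("contiguous:", arr.flags["C_CONTIGUOUS"])  # True: a single memory block
```

The list's footprint includes a separate Python object per element, while the array pays only for its raw data buffer.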
🔹 Step 2: Vectorization vs Loops
Example Problem: Add two arrays of size 1 million.
Using a Python loop:
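A minimal sketch (absolute timings depend entirely on your machine):

```python
import time

n = 1_000_000
a = list(range(n))
b = list(range(n))

start = time.perf_counter()
c = [a[i] + b[i] for i in range(n)]  # one Python-level addition per element
print(f"Python loop: {time.perf_counter() - start:.4f} s")
```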
Using NumPy vectorization:
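The same operation, vectorized (again, timings vary by machine):

```python
import time
import numpy as np

n = 1_000_000
a = np.arange(n)
b = np.arange(n)

start = time.perf_counter()
c = a + b  # one call; the element-wise loop runs in compiled C
print(f"NumPy vectorized: {time.perf_counter() - start:.4f} s")
```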
On typical hardware, the NumPy version is roughly 100x faster. That's the power of vectorization!
🔹 Step 3: Performance Profiling
Sometimes your code is slow and you don't know why. That's where profiling tools help.
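Using timeit, for example (the arrays here are just for illustration):

```python
import timeit
import numpy as np

a = np.arange(1_000_000)
b = np.arange(1_000_000)

# Run the vectorized addition 100 times and report the average
runs = 100
total = timeit.timeit(lambda: a + b, number=runs)
print(f"average per run: {total / runs:.6f} s")
```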
timeit automatically runs the statement multiple times, so you get an average execution time rather than a single noisy measurement.
Using cProfile:
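A minimal sketch (the two functions below are made-up examples for illustration, not part of any library):

```python
import cProfile
import numpy as np

def slow_square_sum(n):
    # deliberately slow pure-Python loop
    total = 0
    for i in range(n):
        total += i * i
    return total

def fast_square_sum(n):
    # vectorized equivalent
    x = np.arange(n, dtype=np.int64)
    return int(np.sum(x * x))

def main():
    slow_square_sum(1_000_000)
    fast_square_sum(1_000_000)

# sort="cumtime" orders the report by cumulative time per function
cProfile.run("main()", sort="cumtime")
```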
👉 This shows a function-level breakdown of where time is spent.
🔹 Step 4: Broadcasting (Bonus Concept)
NumPy doesn’t just do element-wise operations — it can automatically expand arrays when shapes don’t match.
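For example (a small sketch):

```python
import numpy as np

v = np.array([1, 2, 3])
print(v + 5)      # [6 7 8]: the scalar 5 is stretched to match v's shape

m = np.ones((3, 3))
print(m + v)      # v is broadcast across each of the 3 rows
```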
👉 NumPy "broadcast" the scalar 5 across the vector [1, 2, 3], so no explicit loop is needed.
✅ Takeaways from Day 1
NumPy arrays are faster and more memory-efficient than Python lists.
Vectorization replaces loops with bulk operations → huge performance gains.
Profiling tools (timeit, cProfile) help detect bottlenecks.
These optimizations are why AI frameworks (TensorFlow, PyTorch, JAX) are built on top of NumPy-like operations.