🛒 Building a Jumia Flash Sales Web Scraper that Sends You CSV Reports via Email Every 4 hours

🧠 Why I Built This
I wanted to automate the process of monitoring product listings on Jumia Kenya's Flash Sales section. Manually checking prices, discounts, and stock every few minutes isn’t practical. So I built a Python script that scrapes flash sale data, saves it in a CSV file, and sends that file to my Gmail inbox automatically every 4 hours. No need to lift a finger!
🛠️ What the Script Does
Here’s what my project handles:
✅ Scrapes product details including:
Title
New (discounted) price
Old price
Discount percentage
Ratings
Stock left
Time remaining for the flash sale
✅ Automatically clicks through all flash sale pages using Selenium.
✅ Saves the collected data in a CSV file.
✅ Sends the CSV file to my Gmail inbox.
✅ Repeats the whole process every 4 hours using a loop.
⚙️ Tools and Libraries I Used
Python 3
Selenium (for automation and navigating pages)
BeautifulSoup (for parsing HTML)
smtplib (to send emails)
webdriver-manager (to avoid downloading ChromeDriver manually)
csv and time (for data handling and scheduling)
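To show how Selenium and BeautifulSoup fit together: Selenium renders the page, then BeautifulSoup parses the resulting HTML. The sketch below uses illustrative selectors (article.prd, .name, .prc, .old) — Jumia's real class names differ and change over time:

```python
from bs4 import BeautifulSoup

def parse_products(html):
    """Parse flash-sale product cards out of rendered HTML.

    The CSS selectors here are illustrative placeholders, not
    Jumia's actual markup.
    """
    soup = BeautifulSoup(html, "html.parser")
    products = []
    for card in soup.select("article.prd"):  # one card per product (assumed)
        products.append({
            "title": card.select_one(".name").get_text(strip=True),
            "new_price": card.select_one(".prc").get_text(strip=True),
            "old_price": card.select_one(".old").get_text(strip=True),
        })
    return products
```

With Selenium, you would call this as `parse_products(driver.page_source)` after each page loads.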
😩 Problems I Faced (and How I Solved Them)
1. Pagination Was Tricky
The "Next Page" button didn’t always have the same structure.
Sometimes it had a <span> tag; other times it was just an arrow icon with a different class or attribute.
✅ Solution:
I used a flexible XPath expression with Selenium to find the button by its inner text or attributes and made it resilient to slight HTML changes.
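A resilient lookup like that can be sketched as trying several XPath patterns in order until one matches. The patterns and the function name below are my illustrative guesses, not the exact ones from the repo (in real Selenium you would catch NoSuchElementException; By.XPATH is literally the string "xpath", so the call below also works with a real WebDriver):

```python
# Candidate XPath patterns for the "Next Page" button, most specific first.
NEXT_XPATHS = [
    "//a[@aria-label='Next Page']",              # explicit label variant
    "//a[.//span[contains(text(), 'Next')]]",    # <span> text variant
    "//a[contains(@class, 'pg')]",               # icon-only variant (assumed class)
]

def find_next_button(driver):
    """Return the first matching 'next' element, or None on the last page."""
    for xp in NEXT_XPATHS:
        try:
            return driver.find_element("xpath", xp)
        except Exception:  # NoSuchElementException with a real driver
            continue
    return None
```

If `find_next_button` returns None, the pagination loop ends and the scrape for that cycle is done.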
2. CSV Was Not Updating
The CSV only contained data from the first page. It wasn't appending data across all pages.
Also, it seemed like the file wasn’t getting updated on subsequent scrapes.
✅ Solution:
I made sure to place the CSV writing block outside the pagination loop but inside the scraping function.
I used Python’s os.path.isfile() to check whether the file already existed, so the headers aren’t written repeatedly.
I switched the file mode to "a" (append) so new rows are added correctly instead of overwriting.
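That fix looks roughly like this (the filename matches the article; the field names are my assumed column layout):

```python
import csv
import os

CSV_FILE = "jumia_flashsale.csv"
FIELDS = ["title", "new_price", "old_price", "discount",
          "rating", "stock_left", "time_remaining"]

def save_rows(rows):
    """Append scraped rows; write the header only if the file is new."""
    write_header = not os.path.isfile(CSV_FILE)
    with open(CSV_FILE, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerows(rows)
```

Calling `save_rows()` once per page keeps appending across all pages, and repeated scrape cycles grow the same file without duplicating the header.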
3. Sending Email from Gmail
At first, I tried using Yahoo Mail SMTP, but it wasn’t reliable and kept failing.
Then I switched to Gmail, but regular passwords wouldn’t work.
✅ Solution:
I enabled 2FA (Two-Factor Authentication) on my Gmail account.
I generated an App Password and used that in my script.
I used Python's smtplib to send the email, with the CSV file attached, to myself.
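The sending step can be sketched with the standard-library EmailMessage class and Gmail's SSL SMTP endpoint. The function names and parameters here are illustrative, not the exact ones from the repo; the App Password (not the regular account password) goes into `server.login()`:

```python
import os
import smtplib
from email.message import EmailMessage

def build_report(user, csv_path):
    """Build the report email with the CSV attached."""
    msg = EmailMessage()
    msg["Subject"] = "Jumia Flash Sales Data Report"
    msg["From"] = user
    msg["To"] = user
    msg.set_content("Attached is the latest Jumia flash sales report.")
    with open(csv_path, "rb") as f:
        msg.add_attachment(f.read(), maintype="text", subtype="csv",
                           filename=os.path.basename(csv_path))
    return msg

def send_report(user, app_password, csv_path="jumia_flashsale.csv"):
    """Send the report through Gmail over SSL, authenticated with an App Password."""
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
        server.login(user, app_password)
        server.send_message(build_report(user, csv_path))
```

Splitting message-building from sending also makes the attachment logic easy to test without actually hitting Gmail.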
📨 What the Email Looks Like
Every 4 hours, I receive a fresh email like this:
Subject: Jumia Flash Sales Data Report
Attachment: jumia_flashsale.csv
From/To: My Gmail account
Content: The email body has a brief message and includes the full CSV as an attachment.
🧪 How I Made It Run Continuously
For testing, I used an infinite loop that waits 4 hours between scrapes:
while True:
    scrape_jumia()
    print("Waiting 4 hours before next scrape...")
    time.sleep(14400)  # 4 hours = 14400 seconds
🔐 Security Tip
Never hardcode your Gmail username and App Password directly into your script. Store them in environment variables or a separate config file and use os.environ.get() to access them securely.
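A minimal sketch of that pattern, assuming the variables are named GMAIL_USER and GMAIL_APP_PASSWORD (any names work as long as they match your environment):

```python
import os

def load_credentials():
    """Read Gmail credentials from the environment; fail fast if missing."""
    user = os.environ.get("GMAIL_USER")
    app_password = os.environ.get("GMAIL_APP_PASSWORD")
    if not user or not app_password:
        raise RuntimeError("Set GMAIL_USER and GMAIL_APP_PASSWORD first.")
    return user, app_password
```

Failing fast here beats discovering a missing credential only when the first email attempt errors out 4 hours into a run.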
📁 Final Thoughts
This project taught me a lot about real-world web scraping challenges, especially when it comes to:
Handling dynamic content
Managing browser automation with Selenium
Sending emails securely via SMTP
Building tools that can run independently and reliably
Feel free to check it out on GitHub:
👉 My GitHub Repo - https://github.com/philkasumbi/Jumia-flash-sales-tracker-.git