Day 7: Overview of Progress and Future Plans

Today, I'm excited to share the progress made so far on my journey and provide a sneak peek into what’s to come. In this post, we’ll touch on a few key points that illustrate how everything is coming together and where we’re heading next.

Flashcard App

I’m thrilled to share my Flashcard App with you all! Check it out here or open my GitHub repository. This interactive tool helps you master the basics of Python by working through categories like Python syntax, data types, and functions. I built it to make learning Python more accessible and to give you a fun, engaging way to quiz yourself on key programming concepts.

The app presents each flashcard with a question; click to flip it and reveal the answer, along with an explanation and sample code. I’ve focused on keeping the interface simple yet effective, so learners can focus on what matters most: the content.
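To make the flip idea concrete, here is a minimal sketch of how a flashcard could be modeled in Python. The class and field names are illustrative only, not the app's actual code:

```python
class Flashcard:
    """One card: a question on the front, an answer (with extras) on the back."""

    def __init__(self, question, answer, explanation="", sample_code=""):
        self.question = question
        self.answer = answer
        self.explanation = explanation
        self.sample_code = sample_code
        self.flipped = False  # start on the question side

    def flip(self):
        # Toggle between the question side and the answer side.
        self.flipped = not self.flipped

    def shown(self):
        return self.answer if self.flipped else self.question


card = Flashcard(
    question="What does len('data') return?",
    answer="4",
    explanation="len() counts the characters in a string.",
)
print(card.shown())  # question side first
card.flip()
print(card.shown())  # answer side after flipping
```

In a real UI the flip would redraw the card, but the state toggle is the same idea.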

Additional Tools: Math Visualizer and Algorithm Visualizer

In addition to the flashcards, I’ve created two other exciting tools that you can check out through the same link. Both of these tools were developed to enhance the learning experience, particularly for those who enjoy interactive learning:

  • Math Visualizer: This tool lets users plot mathematical functions, visualize equations, and work through problems step by step. Whether you’re solving algebraic equations or graphing functions, it shows the results interactively, in real time. It’s an excellent resource for anyone who wants to strengthen their understanding of mathematics with instant visual feedback.
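The core of a function plotter is small: sample the function over an interval, then draw the points. This is a simplified sketch of that idea (not the visualizer's actual code), using NumPy and Matplotlib:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen backend; a GUI app would render to a canvas
import matplotlib.pyplot as plt


def plot_function(f, x_min, x_max, n=200):
    """Sample f at n points on [x_min, x_max], draw it, return the samples."""
    x = np.linspace(x_min, x_max, n)
    y = f(x)
    plt.plot(x, y)
    plt.xlabel("x")
    plt.ylabel("f(x)")
    plt.grid(True)
    return x, y


# Graph the parabola f(x) = x^2 - 2x + 1 on [-2, 4]:
x, y = plot_function(lambda t: t**2 - 2*t + 1, -2, 4)
```

Increasing `n` smooths the curve; an interactive tool would re-sample as the user pans or zooms.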

  • Algorithm Visualizer: The algorithm visualizer is another fun and educational tool. It helps users understand how sorting algorithms work by showing them in action. You can watch algorithms like QuickSort, MergeSort, and BubbleSort, and see how they organize a list of numbers step by step. This tool is perfect for anyone who wants to dive deeper into computer science and understand the logic behind different algorithms.
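The trick behind animating a sorting algorithm is to record a snapshot of the list after every swap, then play the snapshots back as frames. Here is a minimal sketch of that idea with BubbleSort (illustrative, not the visualizer's actual code):

```python
def bubble_sort_steps(values):
    """Bubble sort that records the list after every swap,
    so each snapshot can be rendered as one animation frame."""
    data = list(values)          # don't mutate the caller's list
    steps = [list(data)]         # frame 0: the unsorted input
    for i in range(len(data) - 1):
        # After pass i, the last i elements are already in place.
        for j in range(len(data) - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                steps.append(list(data))
    return steps


frames = bubble_sort_steps([5, 1, 4, 2])
for frame in frames:
    print(frame)
```

A visualizer would draw each frame as a bar chart instead of printing it; QuickSort and MergeSort can record frames the same way at their swap or merge points.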

Data Collection

Data collection is the cornerstone of data science: it’s where raw data is gathered, whether from public datasets, APIs, or web scraping. Without reliable, accurate data, even the most sophisticated algorithms and models can’t function, and the rest of the data science pipeline breaks down.
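Collecting from an API usually means fetching a JSON payload and reshaping it into records you can analyze. In this sketch an inline sample stands in for the response (in practice it would come from something like `requests.get(url).json()`), so the example is self-contained; the fields are made up for illustration:

```python
import json

# Stand-in for a JSON payload returned by some API:
raw_response = """
{"results": [
    {"city": "Sofia",   "temp_c": 21.5},
    {"city": "Plovdiv", "temp_c": 23.1}
]}
"""

payload = json.loads(raw_response)

# Reshape the nested JSON into flat (city, temperature) records:
records = [(r["city"], r["temp_c"]) for r in payload["results"]]
print(records)
```

The same pattern applies regardless of source: fetch, parse, then flatten into rows before the rest of the pipeline touches the data.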

Python So Far

So far in this journey, I’ve been diving into Python, learning key concepts and tools that will help me in data science. Here’s a brief recap of what we’ve covered so far:

  • Python Syntax: Learning how to write Python code and the key rules behind it.

  • Data Types: Understanding different Python data types like integers, strings, lists, dictionaries, and more, which are the building blocks for any program.

  • Functions: Mastering how to write and use functions, a crucial part of Python programming for code reuse and modularity.

  • Libraries: Getting familiar with essential Python libraries like Pandas for data analysis, Matplotlib and Seaborn for visualization, and NumPy for numerical operations.
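A tiny example tying the recap together: a reusable function (functions), operating on a list of dictionaries (data types), written in plain Python syntax. The data here is made up for illustration:

```python
def average_grade(students):
    """Return the mean of the 'grade' field across a list of records."""
    return sum(s["grade"] for s in students) / len(students)


students = [
    {"name": "Ana",  "grade": 5.5},
    {"name": "Ivan", "grade": 4.5},
]
print(average_grade(students))  # → 5.0
```

With Pandas, the same records would become a DataFrame and the mean a one-liner, which is exactly why the libraries above are worth learning next.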

These foundational concepts will be applied to more advanced projects in the coming weeks, such as building data pipelines, writing web scrapers, and implementing machine learning models.

My Web Scraper

As a hands-on application of my Python and data collection skills, I plan to develop a web scraper to collect job listings from dev.bg. This project not only supports my own job search but also provides valuable insights into the Bulgarian job market. The goal is to scrape relevant data such as job titles, company names, locations, the technology stack required for each offer, and salary ranges (where listed).

The main purpose of this scraper is to collect real-time job data and analyze trends. This includes tracking the number of job openings, average salary ranges, and identifying in-demand skills. By gathering this data, I can create useful insights that could benefit not only me but anyone else looking for job opportunities in Bulgaria.

This project also demonstrates how Python and web scraping can be used to gather useful, real-world data for analysis. The next step will be to clean and preprocess the data, so I can visualize the results and share insights that will help other job seekers understand the trends in the job market.
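As a rough sketch of what the parsing step might look like, here is a BeautifulSoup-based extractor. The CSS classes (`job-card`, `title`, `company`, `location`) are placeholders, not dev.bg's real markup, which would have to be inspected in the browser first; fetching the page would use something like `requests.get(url).text`. An inline sample page lets the sketch run without hitting the real site:

```python
from bs4 import BeautifulSoup


def parse_listings(html):
    """Extract job fields from one listings page (placeholder selectors)."""
    soup = BeautifulSoup(html, "html.parser")
    jobs = []
    for card in soup.select("div.job-card"):
        jobs.append({
            "title": card.select_one("h3.title").get_text(strip=True),
            "company": card.select_one("span.company").get_text(strip=True),
            "location": card.select_one("span.location").get_text(strip=True),
        })
    return jobs


# Inline sample page so the sketch runs without network access:
sample_html = """
<div class="job-card">
  <h3 class="title">Python Developer</h3>
  <span class="company">Acme Ltd</span>
  <span class="location">Sofia</span>
</div>
"""
print(parse_listings(sample_html))
```

Keeping the parser separate from the fetching code makes it easy to test against saved HTML, and easy to fix when the site's markup changes.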

Future Plans

Looking ahead, I’m excited to continue working on the scraper and evolve it into a full-fledged web app. Over the next few weeks, I will:

  • Develop a full-featured data collection tool that scrapes job listings from dev.bg.

  • Clean and preprocess the data to ensure accuracy and consistency.

  • Analyze the data to extract meaningful insights (e.g., average salary, most requested skills, etc.).

  • Visualize the outcomes by creating charts and dashboards that allow users to quickly grasp key trends in the job market.

  • Build a user-friendly web app that allows anyone to access the job data and insights, helping them make informed decisions about their job search.
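The clean-and-analyze steps above can be sketched with Pandas on made-up job data. The columns mirror what the scraper is meant to collect, but every value here is illustrative:

```python
import pandas as pd

# Stand-in for scraped listings (one row has a missing title, one a missing salary):
raw = pd.DataFrame({
    "title": ["Python Developer", "Data Analyst", "Python Developer", None],
    "salary_min": [3000, 2500, None, 4000],
    "skills": ["python,sql", "sql,excel", "python,django", "python"],
})

# Clean: drop rows with no title, fill missing salaries with the median.
clean = raw.dropna(subset=["title"]).copy()
clean["salary_min"] = clean["salary_min"].fillna(clean["salary_min"].median())

# Analyze: count the most requested skills across all listings.
skill_counts = clean["skills"].str.split(",").explode().value_counts()
print(skill_counts)
```

From here, `skill_counts.plot(kind="bar")` is one line away from the kind of chart the dashboard step calls for.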

The goal is to create a web app that not only serves as a tool for me but for anyone interested in exploring job trends in Bulgaria. By the end of this 30-day journey, I’ll have a fully functioning web app that can provide job seekers with data-driven insights into the current job market.


Stay tuned for more updates and be sure to follow along as I work toward creating a complete web app that can help people like me—and many others—navigate the job market with data-driven insights.


Written by

Anastasia Zaharieva