Notes on Notes - Product Analytics

We take notes to remember. To create. To focus. But when it comes to note-taking apps, how do users choose the tools they trust?
This project explores how the top-rated note-taking apps perform across Apple and Google platforms — and what their ratings, reviews, and trends reveal about their audiences.
Whether you're building a product or just love clean design, the data might surprise you.
Introduction
We all take notes — to plan, to create, to process. But what makes one note-taking app thrive while another fades?
I scraped and analyzed data from the Apple App Store and Google Play Store to study the most downloaded and reviewed note-taking apps.
This story uncovers how these apps perform over time and across platforms — and what their users tell us in ratings, volume, and eventually, their words.
Methodology
Using Python and npm scraping modules, I collected 100,019 rows of data from the Google Play Store and the Apple App Store, covering:
Platform (iOS, Android)
App names and versions
Ratings and reviews
Timestamp of reviews
I cleaned and aggregated the data using Microsoft Excel to create pivot tables, time series, and comparative charts.
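The aggregation step can be sketched in plain Python. This is a minimal illustration, not the actual pipeline: the row tuples and sample values here are hypothetical, and the real dataset has ~100k rows grouped in Excel pivot tables.

```python
from collections import defaultdict
from datetime import datetime

# Toy sample of scraped rows: (platform, app, rating, review timestamp).
# Field names and values are illustrative, not from the real dataset.
rows = [
    ("Android", "OneNote", 5, "2024-01-15"),
    ("Android", "OneNote", 3, "2024-01-20"),
    ("iOS", "OneNote", 4, "2024-02-02"),
]

def monthly_averages(rows):
    """Group ratings by (platform, app, year-month) and average them,
    mirroring the pivot-table step."""
    buckets = defaultdict(list)
    for platform, app, rating, ts in rows:
        month = datetime.strptime(ts, "%Y-%m-%d").strftime("%Y-%m")
        buckets[(platform, app, month)].append(rating)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

print(monthly_averages(rows))
# {('Android', 'OneNote', '2024-01'): 4.0, ('iOS', 'OneNote', '2024-02'): 4.0}
```

The same grouping, keyed on platform, app, and month, feeds both the time-series and the comparative charts below.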
Key Findings
Platform Preferences Are Real
Some apps feel at home on one platform — and completely out of place on another.
Take OneNote: on the Google Play Store, it holds a solid 4.1★ average, backed by consistent reviews and a loyal Android user base. But on the Apple App Store, the same app drops to 3.1★ — a full point lower. That’s a significant gap, hinting at possible design friction, performance issues on iOS, or simply different user expectations.
Now flip the story. Evernote, a long-time player in the note-taking space, performs decently on iOS with an average 3.6★, but drops drastically to 2.3★ on Android. That’s not just a drop — that’s a product crisis on one of the world’s most widely used platforms.
These kinds of platform splits are more than just numbers — they’re signals.
Signals that user experience, feature stability, and even update cadence may be diverging across ecosystems. For product teams, this is a reminder that shipping the same app to both stores doesn’t mean users will experience it the same way.
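One quick way to surface these splits is a signed cross-platform gap per app. A tiny sketch, using the averages quoted above (the dictionary here is hand-entered for illustration):

```python
# Average star ratings per platform, taken from the figures in this post.
ratings = {
    "OneNote": {"Android": 4.1, "iOS": 3.1},
    "Evernote": {"Android": 2.3, "iOS": 3.6},
}

def platform_gap(app):
    """Signed gap in stars: positive means the app rates higher on Android."""
    r = ratings[app]
    return round(r["Android"] - r["iOS"], 1)

print(platform_gap("OneNote"))   # 1.0  (Android favors it)
print(platform_gap("Evernote"))  # -1.3 (iOS favors it)
```

Sorting all apps by the absolute value of this gap is a cheap triage list for "which app diverges most across ecosystems."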
Ratings Change Over Time: Average Rating Trends
When you zoom out and track ratings over time, a deeper story begins to take shape — one of stability, recovery, and decline.
Three top apps — OneNote, Notion, and Evernote — tell three very different stories.
OneNote remained remarkably stable from January 2022 to March 2025, holding a consistent average around 4.0★. But starting in April 2025, a subtle downward trend emerges — a signal that something might be slipping. Whether it’s performance, updates, or UX, now is the moment for product teams to listen closer.
Notion took a different path. It started off trending downward from January 2022, hitting a low of 2.6★ by August 2023. But in just four months, it rebounded dramatically to 4.0★ by December 2023 — likely the result of high-impact improvements or long-requested features. Since then, it's held steady near the 4-star mark, with occasional fluctuations.
Evernote, meanwhile, shows a slow and steady decline. From March 2023 to June 2025, the downward slope is unmistakable — pointing to potential issues with feature stagnation, competition, or user dissatisfaction that’s grown over time.
Together, these three trends show how app ratings aren’t just feedback — they’re a living pulse. They reflect not just how users feel today, but how their trust builds — or erodes — over time.
Behavior Beneath the Stars: Month-over-Month (MoM %) Trends
Average rating trends tell one story — but zooming in on month-over-month percentage change uncovers more reactive moments. Here, I overlaid MoM % rating change with review volume to explore how users respond when apps improve… or fail.
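For clarity, the MoM % metric used here is the standard one: the change from last month's average rating to this month's, as a percentage of last month's. A minimal sketch with made-up monthly averages:

```python
def mom_pct_change(series):
    """Month-over-month percentage change for a list of monthly averages:
    (current - previous) / previous * 100."""
    return [
        round((cur - prev) / prev * 100, 1)
        for prev, cur in zip(series, series[1:])
    ]

# Hypothetical monthly average ratings for one app.
print(mom_pct_change([3.0, 2.4, 3.6]))  # [-20.0, 50.0]
```

A swing like Evernote's +110% therefore means the monthly average more than doubled relative to the prior month, which is exactly why MoM % exposes reactive moments that a smoothed trend line hides.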
Evernote: A Sudden Rebound, A Louder Crowd
After a long decline, Evernote showed a 110% increase in MoM rating from May to June 2025 — its first clear upward signal in over two years. This suggests product teams may be making course corrections, and users are beginning to notice.
But go back a few months, and a different story unfolds:
In Dec 2023, Evernote hit its lowest MoM % rating (~−70%), and at the same time, review count spiked to 1,361 — the highest across all months between 2023–2025.
When the experience declined, users made sure their voices were heard.
And as the experience improved, fewer people left reviews — possibly signaling regained satisfaction.
Notion: A More Positive Cycle
Unlike Evernote, Notion showed a more aligned trend between MoM rating and review volume — as ratings improved, so did review sentiment and volume. This could indicate that users are responding positively to product improvements, or that Notion is actively engaging with feedback loops.
But — and this is important — patterns don’t always equal intent.
Correlation here is interesting, but not yet confirmation.
To validate whether reviews became more positive, I’ll be performing sentiment analysis on the actual text — a crucial next step in understanding the "why" behind the stars.
What’s Next
This is just Part 1.
Next, I’ll be running sentiment analysis on review text across platforms. My goal is to extract:
Common complaints (ads, sync issues, bugs)
Loved features (minimalism, voice-to-text, cloud sync)
Differences in tone between iOS and Android users
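As a preview of the approach, here is a toy lexicon-based scorer. The actual analysis in Part 2 will use a proper sentiment model rather than hand-picked keywords, but the core idea is the same: map review text to a signed score.

```python
# Hand-picked keyword sets for illustration only; a real model learns these.
POSITIVE = {"love", "clean", "fast", "great", "simple"}
NEGATIVE = {"ads", "bug", "crash", "slow"}

def tone_score(review):
    """Return (#positive - #negative) keyword hits; > 0 leans positive."""
    words = review.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(tone_score("love the clean design"))  # 2
print(tone_score("slow and full of ads"))   # -2
```

Aggregating scores like this per platform is what will let me test whether iOS and Android reviewers actually differ in tone, not just in stars.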
Stay tuned for Part 2: The Words Behind the Stars.
Project Files & Code
All scraping scripts, cleaned datasets, and chart notebooks live on GitHub → Notes Product Analytics Repo
🔗 Let’s Connect
If you’re a product thinker, data nerd, or just curious about app feedback patterns — I’d love to hear your take.
Also open to feedback if you think I missed something interesting in the data!
Written by Likhitha Anuganti