Keeping an Eye on Production Application Logs with Grafana Loki


Logs are the unsung heroes of system reliability: they tell us what's broken, why it broke, and how to fix it. But traditional logging often feels like searching for a needle in a haystack: endless text files, scattered servers, and manual grepping for clues. While these logs hold critical data, they're hard to parse in real time and even harder to act on before users notice issues.
That's where Grafana steps in. By centralizing logs, visualizing patterns, and pairing with tools like Promtail and Loki, Grafana transforms raw, chaotic data into a clear narrative. No more reactive firefighting; instead, you get proactive insights to spot anomalies, predict failures, and keep your system running smoothly. In this post, I'll share how I built a Grafana-powered logging pipeline that turned our logs from after-the-fact clues into real-time guardians of uptime.
Step 1: Create an Account
Create your Grafana Cloud account at grafana.com.
After creating an account, navigate to the home page, where you can see your account plan. Initially it might be a free trial for a limited number of days.
Step 2: Set Up an Access Policy
In the image above, under your account, there is an option called Access policies; navigate there.
You will see a screen like this for creating access policies; by default there will not be any policies yet.
Clicking Create access policy will open a pop-up like the one below.
Enter the details to create an access policy that can read and write logs, then click Create.
Step 3: Create a Token and Set Up Grafana Cloud
Create a token with a name of your choosing. Setting no expiry is convenient, but obviously not a good practice; your call. Copy the token and keep it somewhere safe.
After creating the token, click on your username in the left nav bar.
The URL will look something like this; it contains your username and your user ID.
A page like the one below will open; click Details under Loki.
You will see a page that provides your user ID and the endpoint that Promtail should push logs to. We will gather the information for our Promtail config file from this page.
Scrolling down, you will come across a section with a sample config for a standalone host. We need to copy the URL and replace the token portion of it with the token we created earlier in this step.
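For reference, once the user ID and token are substituted in, the push URL generally has this shape (the hostname below is only an illustrative placeholder; the actual host depends on your Grafana Cloud stack and region):

```
https://<numeric user ID>:<access policy token>@logs-prod-012.grafana.net/loki/api/v1/push
```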
Step 4: Set Up Promtail in Docker as a Standalone Host
Now we will create a Promtail Docker container that will act as an agent and push our logs to Grafana Cloud. Follow the instructions below to get the container up.
In your project, create a folder; here I have created one named logging. Inside it, create a folder named config.
Create a file named promtail.yaml inside the config directory and a file named docker-compose.yaml inside the logging folder. The directory structure is shown below.
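As a rough sketch, the layout ends up looking like this (src/logs is wherever your application writes its log files, and app.log is just a placeholder name; adjust both to your project):

```
project/
├── logging/
│   ├── config/
│   │   └── promtail.yaml
│   └── docker-compose.yaml
└── src/
    └── logs/
        └── app.log
```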
Now copy the following content into the two files.
promtail.yaml
```yaml
server:
  http_listen_port: 0
  grpc_listen_port: 0

positions:
  # where Promtail keeps track of how far it has read each file
  filename: /tmp/positions.yaml

client:
  # Grafana Cloud Loki push endpoint with basic auth (user ID + API token)
  url: https://<User Id>:<Your Grafana.com API Token>@<URL>

scrape_configs:
  - job_name: application
    static_configs:
      - targets:
          - localhost
        labels:
          job: logs
          __path__: /var/log/*.log
```
Make sure to replace the url value with your own user ID, token, and Loki endpoint from Step 3.
docker-compose.yaml
```yaml
version: "3.8"

services:
  promtail:
    image: grafana/promtail:latest
    container_name: promtail
    volumes:
      - ./config:/mnt/config
      - ../src/logs:/var/log
    command: --config.file=/mnt/config/promtail.yaml
```
You have to make sure to volume-mount the correct directory so your application logs actually get pushed to Grafana Cloud. In my case, the application logs live at ../src/logs relative to the docker-compose.yaml file, so I mounted that folder to /var/log.
Once that is done, just run docker-compose up -d in the folder containing the compose file (for me, logging), and the logs should get pushed to Grafana Cloud; see the commands below.
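For example, assuming the container name promtail from the docker-compose.yaml above (the second command is just an optional sanity check):

```bash
# start the Promtail container in the background
docker-compose up -d

# optional: follow Promtail's own output to confirm it is tailing files and pushing without errors
docker logs -f promtail
```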
Step 5: Finally, Get All Logs in Grafana
You will see all your logs in the Logs view under Explore: from the home page, just click Explore and it will take you to the logs page.
Select Loki from the data source dropdown.
Select the job label and the value you gave it in your promtail.yaml.
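If you prefer typing a query instead of using the dropdowns, the equivalent LogQL would look roughly like this, given that the promtail.yaml above sets the label job: logs. The first query selects every line with that label; the second adds a line filter for the string "error":

```
{job="logs"}
{job="logs"} |= "error"
```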
And that's it, you are now good to go 🚀
Grafana provides many features, like querying logs with multiple filters, viewing live logs, filtering by things like date ranges, and much more 🔥
And that’s it! With Grafana in place, you’ve got a clear way to search logs, track problems as they happen, and filter by what matters (like dates or errors). No more digging through messy files or guessing where things broke.
The best part? This is just the start. Once you’re comfortable, you can tweak it to fit your system even better—maybe add alerts for critical errors or group logs by service. But for now, you’ve got what you need to keep things running smoothly.
Go ahead, try it out—your future self will thank you when things go wrong (and they will, because that’s tech!). Happy troubleshooting!
🚀 Ready to Turn Your Idea into Reality?
At Opengig, we don’t just write code—we build reliable, scalable products that stand the test of time. Whether you’re launching a new app, refining an existing system, or need a team to own the entire process (from design to deployment), we’re here to make it happen—fast.