Automated GitHub Repository Access Checker (Read/Write) via Cron

Last Blog Review →
In the last blog, we learned how to write a shell script that lists users' read/write access to a GitHub repository.
Scenario →
Suppose we have a script that reports users' read and write access to a GitHub repository, but we need this information twice a week to audit whether authorized users have the correct permissions. Running the script manually twice every week is not practical.
Solution →
The solution is to run the script as a cron job: it executes automatically and stores its output in a log file, which can be reviewed on the last weekday. We will run the script twice a week, on Monday and Thursday at 3:00 AM, and store the access details in the log file.
First, let's create a .env file to store the GitHub credentials:
```shell
cat /home/github/github_credentials.env
GITHUB_USERNAME=username
GITHUB_TOKEN=token
```
Next, restrict the permissions on the .env file so that only the owner has read and write access, and nobody else has any:
```shell
chmod u=rw,go= /home/github/github_credentials.env
```
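As a quick sanity check, here is a minimal sketch (using a temporary scratch file rather than the real credentials path) showing that this `chmod` incantation yields owner-only `600` permissions:

```shell
# Demonstrate the permission change on a scratch file; the real
# credentials file lives at /home/github/github_credentials.env.
env_file="$(mktemp)"

# Owner gets read/write; group and others get nothing.
chmod u=rw,go= "$env_file"

# stat -c '%a' prints the octal mode; 600 means owner-only read/write.
stat -c '%a' "$env_file"

rm -f "$env_file"
```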
Now we simply source the .env file in our main shell script so it can reference the GitHub credentials and access the repositories.
```shell
cd /home/scripts/
cat list_github_repo_access.sh
```

```shell
#!/bin/bash

source /home/github/github_credentials.env

# GitHub API URL
API_URL="https://api.github.com"

# GitHub username and personal access token (set in the sourced .env file)
USERNAME=$GITHUB_USERNAME
TOKEN=$GITHUB_TOKEN

# User and Repository information
REPO_OWNER=$1
REPO_NAME=$2

# Function to make a GET request to the GitHub API
function github_api_get {
    local endpoint="$1"
    local url="${API_URL}/${endpoint}"

    # Send a GET request to the GitHub API with authentication
    curl -s -u "${USERNAME}:${TOKEN}" "$url"
}

# Function to list users with read access to the repository
function list_users_with_read_access {
    local endpoint="repos/${REPO_OWNER}/${REPO_NAME}/collaborators"

    # Fetch the list of collaborators on the repository
    collaborators="$(github_api_get "$endpoint" | jq -r '.[] | select(.permissions.pull == true) | .login')"

    # Display the list of collaborators with read access
    if [[ -z "$collaborators" ]]; then
        echo "No users with read access found for ${REPO_OWNER}/${REPO_NAME}."
    else
        echo "Users with read access to ${REPO_OWNER}/${REPO_NAME}:"
        echo "$collaborators"
    fi
}

# Main script
echo "Listing users with read access to ${REPO_OWNER}/${REPO_NAME}..."
list_users_with_read_access
```
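The script above only filters on the `pull` permission (read access), although the title also mentions write access. As a hedged sketch, a companion function (the name `list_users_with_write_access` and its placement are my assumption, not part of the original script) could filter on the `push` permission the same way:

```shell
# Hypothetical companion to the script above: lists collaborators whose
# "push" permission is true, i.e. users with write access. It reuses the
# github_api_get helper and the REPO_OWNER/REPO_NAME variables.
function list_users_with_write_access {
    local endpoint="repos/${REPO_OWNER}/${REPO_NAME}/collaborators"

    # Select logins of collaborators who are allowed to push.
    collaborators="$(github_api_get "$endpoint" | jq -r '.[] | select(.permissions.push == true) | .login')"

    if [[ -z "$collaborators" ]]; then
        echo "No users with write access found for ${REPO_OWNER}/${REPO_NAME}."
    else
        echo "Users with write access to ${REPO_OWNER}/${REPO_NAME}:"
        echo "$collaborators"
    fi
}
```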
Now we will schedule the script to run twice a week, every Monday and Thursday at 3:00 AM, using crontab. Open the crontab for editing and add the entry below to collect the access details for the required GitHub repository:
```shell
crontab -e
```

```shell
0 3 * * 1,4 /home/scripts/list_github_repo_access.sh DevOpsWithMihir Entry_Form >> /home/scripts/repo_access_log.txt
```
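For reference, the five crontab fields are minute, hour, day-of-month, month, and day-of-week; in `0 3 * * 1,4`, the `1,4` selects Monday and Thursday (cron numbers the days 0-6 with Sunday as 0). A quick check with GNU date, whose `%u` format also numbers Monday as 1:

```shell
# Cron's day-of-week field uses 0-6 with Sunday=0, so "1,4" means
# Monday and Thursday. GNU date's %u (Monday=1 .. Sunday=7) agrees
# for these two days:
date -d '2024-01-01' +%u   # 2024-01-01 was a Monday  -> prints 1
date -d '2024-01-04' +%u   # 2024-01-04 was a Thursday -> prints 4
```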
So, the script will run every Monday and Thursday at 3:00 AM, fetch the user access permissions for the required GitHub repository, and append them to the .txt log file. The log can then be reviewed at the end of the week, saving the time it would take to run the script manually; you simply check which user has which permission on the repository.
Conclusion →
In this blog, we learned how to automate running a script with crontab, which saves the time of running it manually and stores the output separately. Cron does the job for you; you just need to check the log file on the last day of the week to confirm everything is in place.
Written by

Mihir Suratwala