How I set up a wiki for "Game of Rôles", the French "Critical Role"

There is a show I love on Twitch, on MisterMV's channel: Game of Rôles. It is an actual play of an original tabletop RPG created by FibreTigre. To help the show during its early days, I decided to create a wiki for it: https://gameofroles.wiki. And here is how I did it.

Prerequisites

  • Docker

  • A server with yourdomain.com already pointing at it

  • 10-15 minutes

Deployment and configuration

Everybody knows Wikipedia. But it is less known that Wikipedia actually runs on open-source software. The Wikimedia Foundation develops its own wiki engine and uses it for all its projects, including Wikipedia, Wiktionary, and others. This engine is named MediaWiki, and that's what we are going to use to set up our own wiki :-)

First, you need a server with Docker installed on it, so I would obviously recommend a Linux server for that matter. For example, DigitalOcean offers a Droplet image with Docker preinstalled on Ubuntu: with a couple of clicks, you get a Droplet ready to go. AWS and other providers likely have similar images in their inventory too.

On Docker Hub, you can find this Docker image: https://hub.docker.com/_/mediawiki. It is the official image from the Wikimedia Foundation itself, so it is definitely what we want to use.

Then, considering your wiki project, decide whether you need a MySQL-like database or the default embedded SQLite. In my case, I decided to use SQLite. Why? I knew the community watching the show was not huge. Hence, I did not need the database to be hugely scalable, and a simple SQLite file is plenty. The show regularly had 2-4K viewers when I created the wiki in 2018. Nowadays, in 2021, it has 16-20K viewers, and the wiki is still standing pretty well.
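For reference, the MediaWiki installer records that database choice in the LocalSettings.php file it generates for you. With SQLite, the generated settings look roughly like this (the database name my_wiki is just the installer's default; yours may differ):

```php
# Database settings generated by the MediaWiki installer (SQLite)
$wgDBtype = "sqlite";
$wgDBname = "my_wiki";

# SQLite-specific: the directory where the database file lives
$wgSQLiteDataDir = "/var/www/data";
```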

Now, a word of warning: there's a little trick to deploying MediaWiki with Docker. First, create a folder wherever you want to store all the files related to the wiki. Then start the wiki, visit it at http://yourdomain.com, go through the setup, and download the LocalSettings.php generated for you.

Here is the code I used to start the wiki:

# Create the folder
mkdir wiki; cd wiki

# Start the wiki setup
docker run --name my-wiki -d \
  -p 80:80 \
  -v /wiki/data:/var/www/data \
  -v /wiki/images:/var/www/html/images \
  -v /wiki/extensions:/var/www/html/extensions \
  mediawiki

Once you have the LocalSettings.php file, upload it into your wiki folder. Finally, delete the current container, start a new one with LocalSettings.php mounted, and you are ready to go.

# Delete previous container
docker rm -f my-wiki

# Create the start script for the wiki
cat > start.sh <<'EOF'
#!/bin/sh
docker run --name my-wiki -d \
  -p 80:80 \
  -v /wiki/data:/var/www/data \
  -v /wiki/images:/var/www/html/images \
  -v /wiki/extensions:/var/www/html/extensions \
  -v /wiki/LocalSettings.php:/var/www/html/LocalSettings.php \
  --restart=always \
  mediawiki
EOF

# Make it executable
chmod +x start.sh 

# Start your wiki
./start.sh

The command to start the container is kinda long, so I suggest saving it in a start.sh to avoid ever typing it again.

Voilà! Visit yourdomain.com and your wiki is ready to use.

Data Backup

Now that you have a very nice wiki, you want to set up a regular backup in case something goes south. Thanks to Docker and the way we mounted our volumes earlier, it will be pretty easy.

To do a good backup of your wiki, we are going to:

  1. Backup the SQLite database

  2. Dump the wiki content as XML

  3. Package the SQLite backup, the dump, and the current extensions and images folder

  4. Send the compressed package to your Google Drive

  5. Clean up the backup files

Let's start.

  • Backup the SQLite database

       docker exec my-wiki php maintenance/sqlite.php --backup-to /var/www/data/backup.sqlite.bak
    
  • Note: this is a hot backup; it can lock your database for a couple of seconds.

  • Dump the wiki content as XML

    The following command will export all the pages into a compressed XML dump.

       docker exec my-wiki php maintenance/dumpBackup.php --full --uploads --output=gzip:/var/www/data/dump.xml.gz
    
  • Package the SQLite backup, the dump, and the current extensions and images folder

    First, we generate a filename with a timestamp, then from the wiki folder, we archive and compress all the backup components.

       FILENAME=backup-$(date +"%Y%m%d%H%M%S").tar.gz
       cd /wiki
       tar cvzf "$FILENAME" data/backup.sqlite.bak data/dump.xml.gz LocalSettings.php extensions images start.sh
    
  • Send the compressed package to your Google Drive

    With the tool rclone, we upload the compressed archive to our Google Drive.

      rclone copy "$FILENAME" gdrive:/Backups
    
  • Note: I'll let you check the rclone documentation to learn how to authenticate and set up your Google Drive access.
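For reference, running rclone config interactively creates a remote in rclone's configuration file (typically ~/.config/rclone/rclone.conf). A Google Drive remote named gdrive, as used above, ends up looking roughly like this (the token values are filled in during the interactive OAuth step):

```ini
[gdrive]
type = drive
scope = drive
token = {"access_token":"...","refresh_token":"...","expiry":"..."}
```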

  • Clean up the backup files

       rm -f "$FILENAME" data/backup.sqlite.bak data/dump.xml.gz
    

Once you have everything working, you can create a script backup.sh with all the above commands, then create a cronjob to trigger the backup regularly.
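For example, backup.sh could look like the following sketch, which assumes the same setup as the rest of this article: the /wiki folder, a container named my-wiki, and an rclone remote named gdrive.

```shell
# Write backup.sh (run this from the wiki folder)
cat > backup.sh <<'EOF'
#!/bin/sh
set -e  # stop at the first failing step
cd /wiki

# 1. Backup the SQLite database (hot backup)
docker exec my-wiki php maintenance/sqlite.php --backup-to /var/www/data/backup.sqlite.bak

# 2. Dump the wiki content as compressed XML
docker exec my-wiki php maintenance/dumpBackup.php --full --uploads --output=gzip:/var/www/data/dump.xml.gz

# 3. Package the database backup, the dump, and the current folders
FILENAME=backup-$(date +"%Y%m%d%H%M%S").tar.gz
tar cvzf "$FILENAME" data/backup.sqlite.bak data/dump.xml.gz LocalSettings.php extensions images start.sh

# 4. Send the package to Google Drive
rclone copy "$FILENAME" gdrive:/Backups

# 5. Clean up the backup files
rm -f "$FILENAME" data/backup.sqlite.bak data/dump.xml.gz
EOF

# Make it executable
chmod +x backup.sh
```

Run it once by hand with ./backup.sh to check that everything works before wiring it into cron.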

For example, I use the following cron file to back up my wiki hourly:

PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin:/root/bin
1 * * * * /wiki/backup.sh

Note: PATH here helps cron know where to find docker, rclone, and other such tools.

Written by Sonny Alves Dias, curious software engineer.