In this blog post, I will do my best to explain how my group got our project autobuilding and autodeploying during one of the Computer Science courses at Aarhus University. If you're taking this course in 2021, you can see our repository at https://gitlab.au.dk/exsys2021/da6/ikke-lige-pa-staende-fod and our website at https://vm33.exsys2021.cs.au.dk/

The Situation

In this course, we have been given two key components for our build setup: a virtual machine, which we use to host our production server, and a GitLab instance, which we use both as a git remote and to build our project.

One thing to note, however, is that while we can connect from the virtual machine to the GitLab instance, the virtual machine itself can only be reached at a specific URL on port 80. This means that we have to get creative with how we deploy updates to our server.

First things first: the building process

Depending on how your project is set up, you will probably need to create a production build of your website first. If you only serve static files (e.g. plain HTML, CSS and JS), you can skip this section.

GitLab lets you configure pipelines that you can choose to run at specific times. A smart thing to do here is to make a pipeline run whenever you push to a specific branch. In our case, we want our ReactJS project to be built when we push to master and packaged into a Docker container for easy deployment. Thus, we created the following .gitlab-ci.yml file:

stages:
  - create-optimized-react-build
  - build-isolated-docker-container

create-optimized-react-build:
  only:
  - master
  stage: create-optimized-react-build
  image: node
  script: 
    - cd student-dashboard
    - echo "Start building App"
    - npm install
    - npm run build
    - echo "Build successfully!"
  artifacts:
    expire_in: 1 hour
    paths:
      - student-dashboard/build/
  cache:
    paths:
      - student-dashboard/node_modules/

build-isolated-docker-container:
  only:
  - master
  stage: build-isolated-docker-container
  image: docker:latest
  services: 
    - name: docker:19.03.8-dind
  before_script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" $CI_REGISTRY
  script:
    - cd student-dashboard
    - docker build . -t "$CI_REGISTRY_IMAGE"
    - docker push "$CI_REGISTRY_IMAGE"

The way this works is as follows: in our repository, we have a folder named student-dashboard where our code lives, and a project folder where things like our time plan are kept. The first job goes into the project directory, installs the required JS packages and runs the React build command, which compiles and minifies our code so it is ready to be served from our production server.

Next, the second job takes the build and packages it into an isolated Docker container. If you don't use React and Docker, don't worry; this is just to show how it can be done. We found the following guide helpful: https://dev.to/christianmontero/gitlab-ci-cd-example-with-a-dockerized-reactjs-app-1cda
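
For the docker build step to work, the student-dashboard folder also needs a Dockerfile. Ours is not shown here, but a minimal sketch of what such a Dockerfile could look like, assuming you let nginx inside the container serve the build folder produced by the previous job, is:

# Minimal sketch, not our exact file: serve the React production build with nginx.
# Assumes the build/ folder from the previous CI job is present next to this Dockerfile.
FROM nginx:stable-alpine
COPY build/ /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]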

The deployment

Here is the tricky part. Since we can only contact our VM on port 80, we might as well use that port to our advantage. The plan is as follows: create a specific URL that, when a request is sent to it, runs a shell command that updates the production files.

Firstly, we are using nginx to serve our static files. This means that we have to tell nginx that there is a specific URL it should pass on to another backend. To do this, we edited the default nginx configuration file to the following:

server {
  listen 80;
  listen [::]:80;
  server_name localhost;
  
  location / {
    root /usr/share/nginx/html;
    index index.html index.htm;
  }
  location /updateContainer/ {
    proxy_pass https://127.0.0.1:8080;
  }
  
  error_page 500 502 503 504 /50x.html;
  location = /50x.html {
    root /usr/share/nginx/html;
  }
}

This file is probably located in /etc/nginx/conf.d/ if you installed nginx on your VM from apt. What it does is: when a request arrives at your website and the path of the request starts with /updateContainer/, nginx passes it on to a web server running locally on port 8080. This is where we need to write a small program.
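
Before relying on the new location block, it is a good idea to check that nginx accepts the configuration and then reload it. Assuming nginx was installed from apt and runs under systemd, something like this should do:

# Check the configuration for syntax errors, then reload nginx
sudo nginx -t
sudo systemctl reload nginx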

We chose to use Node.js for this, but you can use any other language with a web server, such as Go or Python. Install Node.js and npm on your machine (sudo apt install nodejs npm). We made a directory and created a file named index.js with the following code:

const https = require('https');
const shell = require('shelljs');
const fs = require('fs');

const options = {
    // Self-signed certificate generated with the openssl commands shown further down
    key: fs.readFileSync('key.pem'),
    cert: fs.readFileSync('cert.pem')
};

// Respond right away, then run the update shortly after, so the HTTP response is not blocked by the docker commands
const requestListener = function (req, res) {
    console.log('Got request..');
    res.writeHead(200);
    res.end("Updating images...");
    console.log("Queued container update");
    setTimeout(updateImages, 1000);
}

const updateImages = function(){
    // Pull the newest image from the GitLab registry and replace the running container with a fresh one
    shell.exec('docker pull registry.gitlab.au.dk/exsys2021/da6/ikke-lige-pa-staende-fod:latest');
    shell.exec('docker stop student-dashboard || true && docker rm student-dashboard');
    shell.exec('docker run -p 80:80 -p 443:443 --name student-dashboard -v /home/auuser/nginxconf:/etc/nginx/ -d registry.gitlab.au.dk/exsys2021/da6/ikke-lige-pa-staende-fod:latest');
}

console.log('Starting server...');
https.createServer(options, requestListener).listen(8080);

Here, you should especially look at the function called updateImages. When a GET request is proxied to this node server from nginx, we call updateImages, which, in our case, pulls an image and runs it with Docker. You can choose to run whatever commands you need here to update your setup. It could be pulling your repository and copying the contents into /var/www/html, updating your server that way, as sketched below. You probably also want a deploy token to authenticate your VM with GitLab (https://docs.gitlab.com/ee/user/project/deploy_tokens/).
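
For example, a hypothetical variant of updateImages for a plain static-file setup could pull the repository with a deploy token and copy the build output into nginx's web root. The clone path, token placeholders and target directory below are just examples:

// Hypothetical alternative to updateImages for a static-file setup.
// <deploy-token-user> and <deploy-token> are placeholders for a GitLab deploy token
// with read_repository scope; /home/auuser/repo is an example clone location.
const updateStaticFiles = function () {
    shell.exec('git -C /home/auuser/repo pull https://<deploy-token-user>:<deploy-token>@gitlab.au.dk/exsys2021/da6/ikke-lige-pa-staende-fod.git master');
    shell.exec('cp -r /home/auuser/repo/student-dashboard/build/. /var/www/html/');
};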

To get the node server to accept HTTPS requests, you also need to run the following commands in the directory where your index.js file is (they install shelljs and generate a self-signed certificate):

npm install shelljs
openssl genrsa -out key.pem
openssl req -new -key key.pem -out csr.pem
openssl x509 -req -days 9999 -in csr.pem -signkey key.pem -out cert.pem
rm csr.pem

Then, to run the server, you can type "screen", press enter, type "node index.js", and then press "ctrl+a" followed by "d" to detach. Finally, restart nginx with "sudo systemctl restart nginx". Now we should have something that our GitLab instance (and anyone, really) can access, in our case at https://vm33.exsys2021.cs.au.dk/updateContainer/ . In a more professional context you could add some more security to this, for example by requiring a token to be sent in the URL of the request (see the sketch at the end of this post), but for our needs we think this is sufficient. To send an update request from GitLab, we lastly extend our .gitlab-ci.yml file with a new stage and job, so the full file now looks like this:

stages:
  - create-optimized-react-build
  - build-isolated-docker-container
  - send-update-request-to-vm33

create-optimized-react-build:
  only:
  - master
  stage: create-optimized-react-build
  image: node
  script: 
    - cd student-dashboard
    - echo "Start building App"
    - npm install
    - npm run build
    - echo "Build successfully!"
  artifacts:
    expire_in: 1 hour
    paths:
      - student-dashboard/build/
  cache:
    paths:
      - student-dashboard/node_modules/

build-isolated-docker-container:
  only:
  - master
  stage: build-isolated-docker-container
  image: docker:latest
  services: 
    - name: docker:19.03.8-dind
  before_script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" $CI_REGISTRY
  script:
    - cd student-dashboard
    - docker build . -t "$CI_REGISTRY_IMAGE"
    - docker push "$CI_REGISTRY_IMAGE"

send-update-request-to-vm33:
  only:
  - master
  stage: send-update-request-to-vm33
  script:
    - apk add --update curl
    - echo "Sending update request to vm33..."
    - curl 'https://vm33.exsys2021.cs.au.dk/updateContainer/'
    - echo "Update request sent."