Continuous integration and deployment are not just fancy words: efficient, robust pipelines can save a lot of time when developing software and, perhaps more importantly, can make software development more pleasant. Recently I had to deploy a small REST API written in PHP, so I needed to quickly implement a deployment strategy using GitLab's CI/CD features. Here's a first-person account of how I enhanced our deployment workflow, focusing on a pipeline setup designed for deploying PHP applications via FTP.
The API needed only minimal changes to automate its deployment. We wanted to streamline the process so updates could be deployed quickly and without manual intervention. By setting up a GitLab CI/CD pipeline, I ensured that any updates were automatically pushed to production, saving time, reducing errors, and making our deployment process more efficient and reliable.
I started by setting the following environment variables in the GitLab web interface, under the project's Settings > CI/CD > Variables: `FTP_USERNAME`, `FTP_PASSWORD`, `FTP_HOST` and `FTP_DESTINATION`. This stores the credentials securely, keeps them out of the repository, and makes it easy to change the deployment destination by simply changing the variables.
Here's a look at the specific GitLab CI/CD configuration we employed for the deployment:
```yaml
deploy:
  script:
    - echo $FTP_USERNAME $FTP_HOST $FTP_DESTINATION
    - apt-get update -qq && apt-get install -y -qq lftp
    - |
      lftp -c "set xfer:log true; \
      set xfer:log-file ftp_detailed_transfer.log; \
      set ftp:ssl-allow no; \
      open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; \
      mirror -v ./php/ $FTP_DESTINATION --reverse --delete --ignore-time --parallel=10 \
      --exclude-glob .git/ --exclude-glob .git/* --exclude-glob .gitignore \
      --exclude-glob .gitlab-ci.yml \
      --exclude-glob .ftpquota \
      --exclude-glob ftp_transfer.log --exclude-glob ftp_error.log --exclude-glob ftp_detailed_transfer.log" \
      > ftp_transfer.log 2> ftp_error.log || LFTP_EXIT_CODE=$?
    - cat ftp_transfer.log
    - cat ftp_error.log
    - if [ -f ftp_detailed_transfer.log ]; then cat ftp_detailed_transfer.log; else echo "Detailed transfer log not created. Probably no files needed to transfer."; fi
    - if [ "$LFTP_EXIT_CODE" != "" ]; then exit $LFTP_EXIT_CODE; fi
  # environment:
  #   name: production
  artifacts:
    paths:
      - ftp_transfer.log
      - ftp_error.log
      - ftp_detailed_transfer.log
  only:
    - master
```
A few notes on the configuration:

- `lftp` is a sophisticated FTP client built on top of the FTP protocol; among other things, its mirroring can delete files on the destination that no longer exist on the source.
- `|` (the pipe character) marks the start of a multi-line block in YAML, and line continuation with `\` splits the lftp command across several lines to improve readability.
- `xfer:log` and `xfer:log-file` produce a detailed log file that we print to the console.
- `mirror -v ./php/ $FTP_DESTINATION --reverse` mirrors from local to remote (without `--reverse`, it mirrors from remote to local).
- `--exclude-glob` excludes files and directories we don't want uploaded.
- `> ftp_transfer.log 2> ftp_error.log` redirects the console output to ftp_transfer.log and the errors to ftp_error.log.
- `|| LFTP_EXIT_CODE=$?` captures the exit status; the special variable `$?` holds the exit code of the last executed command in Unix-like operating systems, which in this case is lftp. We prevent lftp from propagating an error exit code directly, because the pipeline script stops at the first error, but we want to display and record the result first. So, after printing everything, we check the captured code and, if lftp exited with an error, we exit the script with that error code.
- The `if [ -f ... ]` guard around the detailed log is needed because `cat` on a non-existent file would itself throw an error.

This pipeline significantly improved our deployment process, making it faster and more error-resistant. Automating the upload via FTP minimized human error, and using lftp provided robustness against connectivity issues. Most importantly, configuring the pipeline to run only on the master branch meant that our production environment was always in sync with our most stable codebase.
Automating repetitive tasks like deployments not only saves time but also reduces the chance of errors and makes day-to-day software development more enjoyable. Additionally, detailed logs provide visibility, which is key to maintaining operational stability.
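The stdout/stderr split that produces those separate logs can be shown in isolation; here the braced group is just a stand-in producing one line on each stream:

```shell
#!/bin/sh
# Sketch of the redirection used around the lftp call: regular output and
# errors land in separate log files that can later be printed and archived.
{ echo "transferred index.php"; echo "transfer warning" >&2; } \
  > ftp_transfer.log 2> ftp_error.log
cat ftp_transfer.log
cat ftp_error.log
```

Keeping the two streams apart is what lets the job archive a clean error log as an artifact even when the transfer itself succeeds.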
An important aspect is making sure sensitive information is handled securely in CI/CD pipelines, and using environment variables for FTP credentials is the norm. It keeps those details out of the source code, which goes a long way toward staying safe.