Improve the local dev environment

Make your CI pipeline match your local development environment by using a tool installer.

Aligning local development with the CI environment.

Introduction

In my development career, I’ve spent countless hours troubleshooting failing CI pipelines. Through this experience, I’ve learned that even if you don’t write your CI pipeline as shell scripts, you still need to install various build tools and set up their environments.

For instance, when working with Scala, you need to install a JDK. Since the JDK has multiple incompatible versions, it’s crucial to ensure you’re using the correct one. This problem only grows when you consider microservices where each service might use a different version of the JDK.

Taking Control of the Environment

Most of these issues stem from different versions of CLI tools between your local machine and the CI system. For example, you might still use an older version of golangci-lint locally and then discover that the build fails on the CI system because a newer version of golangci-lint reports new errors.

Another common issue arises when building CI pipelines with shell scripts across different operating systems. Standard tools like grep can behave differently depending on the OS: what works on Linux might fail on macOS, because GNU grep and BSD grep support slightly different flags.
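
For example, GNU grep (the default on most Linux distributions) accepts Perl-style patterns with -P, while the BSD grep that ships with macOS does not; a small illustration (the exact error output depends on the grep build):

# works with GNU grep on Linux, prints 1.2.3
echo "build-1.2.3" | grep -oP '\d+\.\d+\.\d+'
# on macOS the same command typically fails, e.g. "grep: invalid option -- P"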

So to avoid these issues, we need to ensure that the same versions of the tools run locally as in the CI system. Let’s look at some strategies to achieve that consistency.

A First try: A setup script

As a first step you might consider adding a setup script that installs the components system-wide. This is useful for your personal projects, but when you scale up to multiple projects you will have to manually install the tools and make sure the versions match, or make the script smart enough to set up the right versions per project (see the sketch after the example below). Any time you switch projects you will have to run the setup script again.

Since developer systems are vastly different and opinionated, it is hard to write a setup script that works on every machine.

Here’s an example:

# macOS example using Homebrew (but system-wide)
brew install x

# Redownload each time (less efficient)
curl z > bin/z

# Ubuntu/Debian example (requires sudo, system-wide)
sudo apt-get install ...
sudo snap install ...
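
To make such a script “smart enough” per project, you typically end up adding version guards. Below is a minimal, hypothetical sketch that checks the Go toolchain; the expected version string matches the go version output shown later in this post:

#!/bin/bash
set -euo pipefail
# hypothetical guard: fail early when the installed Go toolchain drifts
# from the version this project expects
required="go1.21.3"
current="$(go version | awk '{print $3}')"
if [ "$current" != "$required" ]; then
  echo "expected $required but found $current - rerun the setup script" >&2
  exit 1
fi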

Run containers in scripts

A next step for many is to start pulling Docker images for a single CLI tool, like:

#!/bin/bash
echo '{"msg": "hello"}' | docker run -i badouralix/curl-jq jq .msg

This method ensures you control the exact version of the tool and even the operating system, leading to a much more predictable environment.

But this quickly falls apart if your CI system itself runs inside Docker, because it then has to support Docker-in-Docker. You also have a mismatch between the OS of each container. Some images are not optimized and are hundreds of MB in size, quickly eating your CI cache.

It is also hard to get Docker volume mounts right: if you mount a folder that doesn’t exist, the Docker daemon creates it for you, but owned by root. Then there is the user inside the container versus your local user, again causing permission issues between the two systems.
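
A typical workaround is to create the mount point yourself and run the container with your own UID/GID; a hedged sketch reusing the image from above (it assumes the image ships a POSIX shell):

# create the folder up front, otherwise the docker daemon creates it owned by root
mkdir -p "$PWD/out"
# pass your own user/group so files written into the mount belong to you
echo '{"msg": "hello"}' | docker run --rm -i \
  -u "$(id -u):$(id -g)" \
  -v "$PWD/out:/out" \
  badouralix/curl-jq sh -c 'jq .msg > /out/msg.txt'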

Advantages:

  • Truly control the version of the CLI tool used and the OS
  • Have isolation of the commands
  • Makes it easy to install new tools

Disadvantages:

  • You lose the interactive CLI interface, which removes the ability to use autocompletion.
  • Rate limiting of Docker Hub pulls.
  • Docker images can be many times bigger than the CLI tool they package.
  • Docker-in-Docker hell.

Dev containers

Dev containers take the use of Docker for development a step further. Rather than setting up individual containers for specific applications, the entire development environment can be packed into a single Docker image. The project directory is mounted inside the container, allowing IDEs like Visual Studio Code to directly interact with the container. This method ensures consistency between local and CI environments.

This has become very hyped because of native IDE support in Visual Studio Code, which lets you debug the code directly inside the container as if you were developing locally. Because support for dev containers in IntelliJ IDEA is still experimental, I don’t have much experience setting it up there.

The base containers often don’t ship the tools you need for the project, so you still need to manage the tools themselves.
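
For reference, a minimal dev container definition is just a JSON file checked into the repository; the sketch below uses the generic base image Microsoft publishes and a hypothetical setup.sh, purely as an illustration:

// .devcontainer/devcontainer.json
{
  "name": "example",
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  // the base image has none of the project tooling, so you still
  // install the tools yourself, e.g. in a post-create hook
  "postCreateCommand": "./setup.sh"
}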

Advantages

  • Consistency for all people working on the project
  • The dev environment can be the same OS as the production runtime.

Disadvantages

  • IDE support isn’t stable yet
  • Configuring a dev container is hard with private repositories
  • Setting up the actual tools you need isn’t trivial.

Solving the Tool Installation Issue

A tool installer does two things: it installs tools, much like brew does, and it automatically switches the ‘activated’ tools depending on the directory you are in.

Many modern tools, particularly those written in languages like Rust and Go, compile into static binaries, making it easier to work with installers like Aqua and Mise.

A tool installer makes it possible to install tools in parallel, the downloads are easy to cache, and it manages the environment per folder. So when you do:

cd ~/projects/projectA
go version                          
go version go1.21.3 linux/amd64

# checkout another project
cd ~/projects/projectB
go version
go version go1.23.2 linux/amd64
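
Behind the scenes this switching is driven by a small config file checked into each project; a sketch in the format used by Mise (covered below), with versions matching the output above:

# ~/projects/projectA/.mise.toml
[tools]
go = '1.21.3'

# ~/projects/projectB/.mise.toml
[tools]
go = '1.23.2'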

Mise

A rewrite of asdf in Rust, this tool is more in the ‘maximalist’ camp, since it installs tools (asdf), runs tasks (make, just), and manages environment variables (direnv). Multiple backends are supported, so it isn’t even limited to asdf plugins.

All you have to do to set up the local environment is run mise install -y; it installs all the tools needed and runs hooks to set up the full environment. I suspect this is the more ‘pragmatic’ tool to get started with, and also to standardize a company on the same task runner and environment management. One downside is its reliance on extensive shell integration, which poses security concerns.

But because it implements a task runner and environment management, it can tightly integrate these systems into a single powerful CI tool. You can compare it with ‘systemd’.

Advantages:

  • Each plugin is written in Bash, which makes it very powerful.
  • Can install many tools, including Python, npm, Node.js, and Ruby.
  • Has many extra features to handle your environment.

Disadvantages:

  • Unsafe: arbitrary Bash scripts are run on your system when installing tools.
  • All the asdf plugins are written in Bash.
  • Making a new asdf plugin is not trivial.
  • Because of the huge scope many of the features are still experimental.

How to install

Give Mise a try; it’s easy to install:

curl https://mise.run | sh
# for bash
echo 'eval "$(~/.local/bin/mise activate bash)"' >> ~/.bashrc
# for zsh
echo 'eval "$(~/.local/bin/mise activate zsh)"' >> ~/.zshrc
# for fish
echo '~/.local/bin/mise activate fish | source' >> ~/.config/fish/config.fish

# if you want to be sure all programs you open use the mise shims:
echo 'export PATH="$HOME/.local/share/mise/shims:$PATH"' >> ~/.profile

An example of a config in mise:

# .mise.toml
[env]
# supports arbitrary env vars so mise can be used like direnv/dotenv
NODE_ENV = 'production'

[tools]
# specify single or multiple versions
actionlint = '1.7.3'
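
You don’t have to edit this file by hand; as a quick usage example, mise use pins a tool for the current project and mise install fetches everything the config asks for:

# pin a tool version for the current project (updates the config file)
mise use go@1.23.2
# install everything listed in the config
mise install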

Alternatively, in projects that already use asdf:

# .tool-versions
actionlint 1.7.3

To run the project’s tasks in GitHub Actions you can use the mise-action, which installs the tools for you:

name: test
on:
  pull_request:
    branches:
      - main
  push:
    branches:
      - main
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: jdx/mise-action@v2
      - run: mise run test
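
The mise run test step above assumes a task named test is defined in the project; a hypothetical task definition in .mise.toml could look like this (the go test command is only an example):

# .mise.toml
[tasks.test]
description = "run the unit tests"
run = "go test ./..."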

Aqua

While Mise is a more comprehensive tool with a broad scope, Aqua focuses purely on securely installing CLI tools. Written in Go, Aqua emphasizes security and ease of use but does not attempt to manage tasks or environment variables.

I see great potential in Aqua, particularly due to its focus on security. The team is also considering adding support for tools like Python, which will further enhance its utility.

Advantages

  • Security first, binaries are cryptographically checked.
  • Easy to add new tools to the registry.
  • Installs CLI tools from GitHub release download links.
  • Manages the CLI tools depending on the directory.
  • Uses a simple YAML file to specify which tools to use.
  • Automatic updates via Renovate.
  • Every commit in this project is signed.

Disadvantages:

  • The tool has some rough edges like the install menu.
  • Due to Aqua’s emphasis on security, it currently lacks support for tools that require building from source, such as Python.
  • The ecosystem is still a bit young, but it’s very active.

Here is a sample of an Aqua config, one to lint GitHub Actions files:

registries:
- type: standard
  ref: v4.44.3 # renovate: depName=aquaproj/aqua-registry
packages:
- name: rhysd/actionlint@v1.7.3
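
Locally, setting this up is roughly the following (commands from the aqua CLI; behaviour may differ slightly between versions):

# create an aqua.yaml in the current directory
aqua init
# search the registry interactively and add the chosen package to aqua.yaml
aqua g -i
# install everything listed in aqua.yaml
aqua i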

To use it in GitHub Actions you will have to do:

---
name: aqua-setup
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
permissions: {}
jobs:
  default:
    timeout-minutes: 30
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: aquaproj/aqua-installer@v3.0.1 # installs and caches the tools
        with:
          aqua_version: v2.30.0 # the aqua CLI version to install
      - run: actionlint -ignore 'Useless cat'

My personal opinion

I believe a combination of dev containers and a tool manager like Aqua or Mise is a strong approach. This setup allows developers the flexibility to work either inside or outside of containers, while ensuring that all tools are consistently managed and versioned. This makes setup consistent and avoids the need to maintain a ‘setup’ script.

For Mise it might be a good idea to solve supply-chain security by implementing support for Aqua as a backend. Once the task feature stabilizes, it can become a competitor to other task runners and a one-stop tool to set up builds.

Further Posts

Since this is only the start of how to set up a better environment where the local machine and CI get closer together, I will highlight the following in my next posts about CI:

  • How to set up the ‘surrounding’ environment for testing, like databases, Redis, and Kubernetes.
  • Command runners.
  • Alternatives to bash scripting.