How to Create a Dagger Pipeline on Akamai
Dagger is a free and open source application for automating Continuous Integration/Continuous Delivery (CI/CD) pipelines. It allows administrators and developers to create scripts to assemble, test, build, and even publish a project to a container registry. Dagger includes APIs for several programming languages, providing additional convenience. This guide supplies a brief introduction to Dagger and demonstrates how to create a simple Dagger pipeline.
What is Dagger?
Dagger was originally created by the founder of Docker. It allows users to automate their production pipelines using the language they prefer. The Dagger interface allows users to generate, build, test, and containerize their applications through the use of a detailed SDK. Dagger runs the entire pipeline inside one or more containers, and requires the Docker BuildKit backend to operate. It is designed to be used as part of a CI/CD pipeline, which automates the entire application development life cycle. A mature CI/CD pipeline allows for a quicker, more efficient, and more robust delivery schedule.
The script first imports the Dagger package and opens a session to the Dagger engine. The script then transmits the pipeline requests to the engine using an internal protocol. For each request, the Dagger engine determines the operations required to compute the results. The various tasks are run concurrently for better performance. The results are then processed asynchronously and sent back to the script when everything is resolved. Results can be assigned to a variable and used as inputs to subsequent stages of the pipeline.
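This flow can be illustrated with a minimal sketch using the Dagger Python SDK (the same SDK used later in this guide). The alpine:3.18 image and the command it runs are arbitrary choices for illustration, not part of the pipeline built in this guide:

import sys
import anyio
import dagger

async def main():
    # Open a session to the Dagger engine; pipeline requests travel over this connection.
    async with dagger.Connection(dagger.Config(log_output=sys.stdout)) as client:
        # Each chained call adds to a request; the engine resolves it when the result is awaited.
        release = await (
            client.container()
            .from_("alpine:3.18")                       # arbitrary base image for illustration
            .with_exec(["cat", "/etc/alpine-release"])  # run a command inside the container
            .stdout()                                   # resolve and return the command output
        )
        print(f"Alpine release: {release.strip()}")

anyio.run(main)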
In addition to cross-language support, Dagger provides the following advantages:
- It allows developers to integrate automated tests directly into their pipeline.
- Because it is developed using a standard programming language, a Dagger script is portable and system agnostic. It can run on any architecture.
- Scripts can run locally or remotely.
- Dagger caches results for later use to optimize performance.
- It is fully compatible and thoroughly integrated with Docker. Docker assists Dagger in dependency management. Scripts are cross-compatible with many other CI/CD environments.
- Relatively little code is required to develop a complex pipeline.
- It can optionally use the Dagger CLI extension to interact with the Dagger Engine from the command line.
- Dagger is scalable and can simultaneously handle many highly detailed pipelines.
Because Dagger is a relatively new application, it does not yet have an extensive user base or many avenues for support. Although the Dagger SDK is very powerful, it is also complex and takes some time to learn.
Dagger’s multi-language support lets developers write their pipeline in the language they prefer, potentially the same language used to develop the application itself. The Dagger SDK/API is available in the following languages:
- Python
- Go
- Node.js
- GraphQL
Dagger recommends the Go SDK for those who are unsure which SDK to use. The GraphQL API is language agnostic. It can serve as a low-level framework for those who want to use a language without its own API.
For more background on Dagger, see the Dagger Documentation and the Dagger Cookbook.
Before You Begin
If you have not already done so, create a Linode account and Compute Instance. See our Getting Started with Linode and Creating a Compute Instance guides.
Follow our Setting Up and Securing a Compute Instance guide to update your system. You may also wish to set the timezone, configure your hostname, create a limited user account, and harden SSH access.
To publish the container, you must have access to a container registry. This guide uses the open source Harbor registry to publish the container. However, it is possible to push the container to any container repository. For information on how to create a Harbor registry on a separate Compute Instance, see the guide on Deploying Harbor through the Linode Marketplace. Before using Harbor, it is necessary to create a project named `dagger` to host the example container.
Note: This guide is written for a non-root user. Commands that require elevated privileges are prefixed with `sudo`. If you are not familiar with the `sudo` command, see the Users and Groups guide.

How to Install Dagger
Dagger requires the use of Docker. This guide uses the Python SDK to compose the script. The Dagger module for Python is installed using `pip`, the Python package manager. This guide is written for Ubuntu 22.04 LTS users, but is generally applicable to most recent releases and other Linux distributions. To install Dagger, follow these steps.
Install any updates to ensure the system is up to date:
sudo apt update -y && sudo apt upgrade -y
Afterward, reboot the system if advised to do so.
Ensure `git` is installed on the system:
sudo apt install git
To prepare for the Docker v2 installation, remove any older releases of the application and then install some additional components:
sudo apt remove docker docker-engine docker.io containerd runc
sudo apt install ca-certificates curl gnupg lsb-release
Add the official Docker GPG key to help validate the installation:
sudo mkdir -m 0755 -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
Add the Docker repository to `apt`, then update the package list:
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt update
Install the latest release of Docker Engine and CLI, along with some related packages:
sudo apt install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
To ensure Docker is operating correctly, run the `hello-world` container:
sudo docker run hello-world
If everything is working, the container displays the message `Hello from Docker!`:
Hello from Docker!
This message shows that your installation appears to be working correctly.
Install the Dagger SDK for the appropriate programming language. This guide uses Python to create the example application, so the next steps explain how to download the Python SDK. Use `pip` to install the `dagger-io` module. If `pip` is not yet installed, it can be added using the command `sudo apt install python3-pip`. Python release 3.10 is required.

Required Options: Dagger uses Docker to create a container, but `sudo` is normally required to run Docker. This means the Python `dagger-io` package must be accessible to the root user. Unfortunately, Python installs modules locally by default. There are a couple of ways to work around this problem:

The quickest method is to install the `dagger-io` package globally using `sudo` with the `-H` flag. This ensures it is accessible to all users. Unfortunately, installing packages globally with `pip` can lead to complex and confusing permission issues. This technique should be used with great care, especially in a multi-user environment. To install the package globally, use this command instead of the one below:
sudo -H pip install dagger-io
A better workaround is to add the current user to the `docker` group. This allows the user to access Docker without root access, but this requires a reboot to take effect. This guide proceeds as if the user has been added to the `docker` group and installs `dagger-io` locally. To add the current user to the `docker` group, use the following command syntax, replacing `example-username` with your actual username:
sudo usermod -aG docker example-username
pip install dagger-io
Note: SDKs are also available for Node.js and Go. To install the Go SDK, use the `go mod init main` and `go get dagger.io/dagger` commands. For the Node.js SDK, use `npm install @dagger.io/dagger --save-dev`.
How to Create a Sample Dagger Pipeline
This guide uses the Python SDK to create an example Dagger CI/CD pipeline. To simplify the process, this guide uses the `hello-dagger` demo app to demonstrate the main steps. Dagger recommends using this application when learning how to create a pipeline. However, any application can be used for the demo, provided the appropriate script updates are made.
For more information on the Python SDK or to further customize the application, see the Dagger Python SDK documentation. See the Dagger Quickstart demo for information on how to create this pipeline using either Go or Node.
To create a Dagger pipeline in Python, follow these steps.
Download the Example Application
Dagger has developed a sample React application named `hello-dagger` as a teaching aid. To get started, download the application from GitHub using `git`.
Clone the application from GitHub:
git clone https://github.com/dagger/hello-dagger.git
Change to the new `hello-dagger` directory and create a new `ci` directory to contain the scripts:
cd hello-dagger
mkdir ci
Create and Test a Dagger Pipeline for the Application
The Dagger client enables users to create a multi-stage Python program to define, test, and build an application. This section of the tutorial does not yet publish the application. It is important to test it first to ensure it builds correctly. To create a new pipeline, follow these steps.
Navigate to the `hello-dagger/ci` directory, create a new `main.py` Python project file, and open the file in the `nano` text editor:
cd ~/hello-dagger/ci
nano main.py
At the top of the file, add the required `import` statements, including an `import dagger` directive to import the Dagger SDK:

File: ~/hello-dagger/ci/main.py
import sys
import anyio
import dagger
Define a `main` routine, create a Dagger configuration object, and define `stdout` as the output stream:

File: ~/hello-dagger/ci/main.py
async def main():
    config = dagger.Config(log_output=sys.stdout)
Create a Dagger client using `dagger.Connection`, passing it the `config` object as the default configuration. Create a new client container with the following parameters:
- Base the container on the `node:16-slim` image using the `from_` method. This method also initializes the container.
- Use the `with_directory` method to specify both the directory to use as the source and the mount location inside the container.
- Specify the current directory as the source of the application using the string `client.host().directory(".")`.
- Mount the application inside the `/src` directory of the container.
- Exclude the extraneous `node_modules` and `ci` directories from this process.
Add the following lines to the file to mount the source code at the `/src` directory of a `node:16-slim` container:

File: ~/hello-dagger/ci/main.py
    async with dagger.Connection(config) as client:
        source = (
            client.container()
            .from_("node:16-slim")
            .with_directory(
                "/src",
                client.host().directory("."),
                exclude=["node_modules/", "ci/"],
            )
        )
Note: The Dagger Python SDK makes extensive use of a technique known as method chaining. The methods are processed in the order they appear. Subsequent methods act on the object returned by the previous method.
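If the chained form is unclear, the same step can be written without chaining. This equivalent sketch, using only the methods shown above, stores each intermediate object in its own variable:

        # Equivalent to the chained version above: each method returns a new object,
        # and the next call operates on that returned object.
        base = client.container().from_("node:16-slim")
        source = base.with_directory(
            "/src",
            client.host().directory("."),
            exclude=["node_modules/", "ci/"],
        )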
The next phase of the pipeline uses `npm install` to install the application dependencies inside the container. The `with_workdir` method tells Dagger where inside the container to run the command. The `with_exec` method tells Dagger to run `npm install` at that location. Add the following lines to the script:

File: ~/hello-dagger/ci/main.py
        runner = source.with_workdir("/src").with_exec(["npm", "install"])
The final section of the Python script automatically runs a test suite against the application. This command uses the `with_exec` method again, with `npm test -- --watchAll=false` as the test command. If an error results, details are printed to the console via the `stderr` stream and the pipeline terminates. At the end of the file, add a call to the `main` routine:

File: ~/hello-dagger/ci/main.py
        out = await runner.with_exec(["npm", "test", "--", "--watchAll=false"]).stderr()
        print(out)

anyio.run(main)
The entire file should look like this:
File: ~/hello-dagger/ci/main.py
import sys
import anyio
import dagger

async def main():
    config = dagger.Config(log_output=sys.stdout)

    async with dagger.Connection(config) as client:
        source = (
            client.container()
            .from_("node:16-slim")
            .with_directory(
                "/src",
                client.host().directory("."),
                exclude=["node_modules/", "ci/"],
            )
        )

        runner = source.with_workdir("/src").with_exec(["npm", "install"])

        out = await runner.with_exec(["npm", "test", "--", "--watchAll=false"]).stderr()
        print(out)

anyio.run(main)
Ensure the proper indentation of each line is maintained.
When done, press CTRL+X, followed by Y then Enter to save the file and exit `nano`.

Change back to the main `hello-dagger` project directory and run the Python script using `python3 ci/main.py`:
cd ~/hello-dagger
python3 ci/main.py
No errors should be seen and all `npm` tests should pass:
Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        3.896 s
Ran all test suites.
Note: Because Dagger first has to ask Docker to download the `node:16-slim` container, it might take a couple of minutes before the `npm install` command runs. Subsequent runs of this program take less time.
Add a Build Stage to the Pipeline
After the main components of the pipeline have been created, as described in the previous section, it is time to add the build stage. Most of the `main.py` file remains the same in this version of the file. However, the output of the test phase is no longer sent to standard output. Instead, it feeds into the build stage of the pipeline. To add build directives to the file, follow these steps.
Open the `main.py` file used in the previous section:
nano ~/hello-dagger/ci/main.py
Make the following changes:
- The file remains the same up until the `runner.with_exec` command.
- Remove the command `out = await runner.with_exec(["npm", "test", "--", "--watchAll=false"]).stderr()` and add the following line. This is the same command except the result is assigned to the `test` object.
- Remove the `print(out)` command as this statement is reintroduced later in the new program.

File: ~/hello-dagger/ci/main.py
        test = runner.with_exec(["npm", "test", "--", "--watchAll=false"])
Add new instructions to build the application. These instructions include the following details:
- Use the `with_exec` method to define `npm run build` as the build command.
- Store the outcome in the `/build` directory of the container using the `directory` method. The new directory is assigned to `build_dir`.
- The `export` method writes the contents of the directory back to the `./build` directory on the host. The `await` keyword tells the pipeline to wait for the activity to complete before proceeding.
- The `entries` method extracts the full list of entries from the `build` directory and writes the list back to the console.

File: ~/hello-dagger/ci/main.py
        build_dir = (
            test.with_exec(["npm", "run", "build"])
            .directory("./build")
        )

        await build_dir.export("./build")
        e = await build_dir.entries()
        print(f"build dir contents:\n{e}")

anyio.run(main)
After the new build section is added, the entire file should resemble the following example:
File: ~/hello-dagger/ci/main.py
import sys
import anyio
import dagger

async def main():
    config = dagger.Config(log_output=sys.stdout)

    async with dagger.Connection(config) as client:
        source = (
            client.container()
            .from_("node:16-slim")
            .with_directory(
                "/src",
                client.host().directory("."),
                exclude=["node_modules/", "ci/"],
            )
        )

        runner = source.with_workdir("/src").with_exec(["npm", "install"])

        test = runner.with_exec(["npm", "test", "--", "--watchAll=false"])

        build_dir = (
            test.with_exec(["npm", "run", "build"])
            .directory("./build")
        )

        await build_dir.export("./build")
        e = await build_dir.entries()
        print(f"build dir contents:\n{e}")

anyio.run(main)
Return to the `hello-dagger` directory and run the Python script again:
cd ~/hello-dagger
python3 ci/main.py
The script displays the contents of the build directory:
build dir contents:
['asset-manifest.json', 'favicon.ico', 'index.html', 'logo192.png', 'logo512.png', 'manifest.json', 'robots.txt', 'static']
Note: The first part of the script should run more quickly because the container does not have to be downloaded. However, the build process typically takes about a minute to finish.
Publish the Container to a Registry
At this point, the Dagger pipeline creates, tests, and builds the application. The pipeline is already very useful and could even be considered complete. However, Dagger can also publish the container to a registry to create an even more optimized workflow.
Before publishing the container, the application build is copied into an `nginx` container. Any authentication details must be defined in advance.
This example publishes the container to the Harbor registry. Harbor is a lightweight and easy-to-use container registry platform that can be installed on a separate system. It provides cloud storage, signing and scanning tools, security, access control, audit mechanisms, and container management. It allows administrators to maintain control over their own registry and keep it on the same network as their development systems. For more information on using the registry, see the Harbor documentation.
To publish the application, follow these steps.
Open the `main.py` file again:
nano ~/hello-dagger/ci/main.py
Directly beneath the start of the `async with dagger.Connection(config) as client:` block, add the password details. Use the `set_secret` method to provide the registry password. The parameters must be the string `password` in quotes, followed by the actual password for the Harbor account. The string `password` is the name Dagger uses to identify the secret. Assign the result to the `secret` variable.

File: ~/hello-dagger/ci/main.py
        secret = client.set_secret("password", "HARBORPASSWORD")
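Hard-coding the password works for a quick test, but it can instead be read from the environment. A brief variation, assuming `import os` is added to the imports at the top of the file and a hypothetical `HARBOR_PASSWORD` environment variable has been exported beforehand:

        # Variation: read the Harbor password from the environment rather than
        # hard-coding it. HARBOR_PASSWORD is a hypothetical variable name.
        secret = client.set_secret("password", os.environ["HARBOR_PASSWORD"])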
The next change applies to the build stage. The following changes are required to this section of the pipeline:
- Do not assign the result of the build to the `build_dir` variable. Instead, wait for all build activities, including the directory export back to the host, to complete.
- Replace the command assigning the directory to `build_dir` with the following lines.
- Remove the remainder of the `main` function, up to the command `anyio.run(main)`. Delete the two asynchronous `await` directives and the `print` command.

File: ~/hello-dagger/ci/main.py
        await (
            test.with_exec(["npm", "run", "build"])
            .directory("./build")
            .export("./build")
        )
Define a new container based on the `nginx:1.23-alpine` image and package the application into this container. Add the following details:
- This section creates a new container based on the `nginx:1.23-alpine` image. Use the syntax `client.container().from_("nginx:1.23-alpine")` to instantiate the container.
- Use the `with_directory` method to write the `build` directory to the root `html` NGINX directory inside the container.
- Assign the container to the `ctr` variable.

File: ~/hello-dagger/ci/main.py
        ctr = (
            client.container()
            .from_("nginx:1.23-alpine")
            .with_directory("/usr/share/nginx/html", client.host().directory("./build"))
        )
To publish the container to the registry, use the `with_registry_auth` and `publish` methods. This example uses Harbor as the target registry, but the container can be published to any Docker-compatible registry. Add the following section to the file, accounting for the following changes:
- Enclose the details in an asynchronous `await` call.
- For the first parameter of the `with_registry_auth` method, supply the domain name of the registry in the format `registrydomainname/project/repository:tag`. Replace `registrydomainname` with the name of your Harbor domain, `project` with the project name, and `repository` with the name of the repository to publish to. The `tag` field is optional.
- For the remaining parameters, append a user name for the Harbor account along with the `secret` variable. In this example, the account name is `admin`.
- In this example, `example.com/dagger/daggerdemo:main` means the container is published to the `daggerdemo` repository inside the `dagger` project in the `example.com` registry. The container is tagged with the `main` tag.
- In the `publish` method, indicate where to publish the container. This information follows the same format as the registry information in `with_registry_auth` and should repeat the same details.

File: ~/hello-dagger/ci/main.py
        addr = await (
            ctr
            .with_registry_auth("example.com/dagger/daggerdemo:main", "admin", secret)
            .publish("example.com/dagger/daggerdemo:main")
        )

        print(f"Published image to: {addr}")

anyio.run(main)

Note: Before publishing a container to a Harbor registry, you must create a project to contain the container. This example publishes the container to the `daggerdemo` repository inside the `dagger` project. If `dagger` does not already exist, the request fails.
The entire file should be similar to the following example. Replace `example.com` with the domain name of the Harbor registry and `HARBORPASSWORD` with the actual password for the registry.

File: ~/hello-dagger/ci/main.py
import sys
import anyio
import dagger

async def main():
    config = dagger.Config(log_output=sys.stdout)

    async with dagger.Connection(config) as client:
        secret = client.set_secret("password", "HARBORPASSWORD")

        source = (
            client.container()
            .from_("node:16-slim")
            .with_directory(
                "/src",
                client.host().directory("."),
                exclude=["node_modules/", "ci/"],
            )
        )

        runner = source.with_workdir("/src").with_exec(["npm", "install"])

        test = runner.with_exec(["npm", "test", "--", "--watchAll=false"])

        await (
            test.with_exec(["npm", "run", "build"])
            .directory("./build")
            .export("./build")
        )

        ctr = (
            client.container()
            .from_("nginx:1.23-alpine")
            .with_directory("/usr/share/nginx/html", client.host().directory("./build"))
        )

        addr = await (
            ctr
            .with_registry_auth("example.com/dagger/daggerdemo:main", "admin", secret)
            .publish("example.com/dagger/daggerdemo:main")
        )

        print(f"Published image to: {addr}")

anyio.run(main)
Save the file and exit `nano` when finished.

From the `hello-dagger` directory, run the Python script again. The script builds the container and pushes it out to the registry. The whole process might take a few minutes to complete. When complete, the script displays the name and tag generated for the image. Make note of the full container name and tag for future use.
cd ~/hello-dagger
python3 ci/main.py
Published image to: example.com/dagger/daggerdemo:main@sha256:eb8dbf08fb05180ffbf56b602ee320ef5aa89b8f972f553e478f6b64a492dd50
Confirm the container has been successfully built and uploaded. Use the `docker run` command to pull the container back to the host and run the application. Specify the exact container name and address indicated in the output of the `main.py` script.
docker run -p 8080:80 example.com/dagger/daggerdemo:main@sha256:eb8dbf08fb05180ffbf56b602ee320ef5aa89b8f972f553e478f6b64a492dd50
Navigate to port `8080` of the node, using either the IP address or a fully qualified domain name. The browser should display a “Welcome to Dagger” web page.
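As an optional extra check, the page can also be fetched programmatically. The following is a small sketch using the Python standard library, assuming the container from the previous step is still running and mapped to port 8080 on the local machine:

import urllib.request

# Fetch the page served by the published container (assumes it is running locally
# and mapped to port 8080, as in the docker run command above).
with urllib.request.urlopen("http://localhost:8080") as response:
    print(response.status)      # expect 200
    body = response.read().decode()

print("Received", len(body), "bytes")  # a non-empty body confirms the app is serving content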
Conclusion
Dagger provides a multi-language framework for CI/CD automation in a containerized context. It includes capabilities to construct a pipeline that assembles, tests, builds, and publishes an application using a single script. Dagger includes SDKs for Python, Go, and Node.js, along with a GraphQL API for low-level integration with other languages. For more information on how to use Dagger, consult the Dagger Documentation.
More Information
You may wish to consult the following resources for additional information on this topic. While these are provided in the hope that they will be useful, please note that we cannot vouch for the accuracy or timeliness of externally hosted materials.