TLDR

Jaypore CI


  • Configure pipelines in Python

  • Jobs run in Docker: on your laptop, and in the cloud IF needed.

  • Send status reports anywhere, or nowhere at all: email, a commit in git, a Gitea PR, a Github PR, or write your own class and send them wherever you want.

Getting Started

Installation

You can install Jaypore CI using a bash script. The script only makes changes inside your repository, so you can also do the installation manually if you prefer.

$ cd ~/myrepository
$ curl https://www.jayporeci.in/setup.sh > setup.sh
$ bash setup.sh -y

For a manual install, do the following. The names are conventions; you can call your folders/files anything, but you’ll need to make sure they match everywhere.

  1. Create a directory called cicd in the root of your repo.

  2. Create a file cicd/pre-push.sh

  3. Create a file cicd/cicd.py

  4. Update your repo’s pre-push git hook so that it runs the cicd/pre-push.sh file when you push.
    1. The git hook should call cicd/pre-push.sh.

    2. After setting environment variables, cicd/pre-push.sh calls cicd/cicd.py inside a docker container that has JayporeCI installed. You can use arjoonn/jci if you don’t have anything else ready.

    3. cicd/cicd.py will run your jobs within other docker containers.

Your entire config is inside cicd/cicd.py. Edit it to whatever you like! A basic config would look like this:

from jaypore_ci import jci

with jci.Pipeline(image='mydocker/image') as p:
    p.job("Black", "black --check .")
    p.job("Pylint", "pylint mycode/ tests/")
    p.job("PyTest", "pytest tests/")

This would produce a CI report like:

╔ 🟢 : JayporeCI       [sha edcb193bae]
┏━ Pipeline
┃
┃ 🟢 : Black           [ffcda0a9]   0: 3
┃ 🟢 : Pylint          [2417ad58]   0: 9
┃ 🟢 : PyTest          [28d4985f]   0:15 [Cov: 65%  ]
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
  • edcb193bae is the SHA that the report is for.

  • Pipeline is the default pipeline stage.

  • 🟢 indicates that the job has passed.

  • Black, Pylint, and PyTest are the job names.

  • [ffcda0a9] is the docker container ID for that job.

  • 0: 3 is the time taken by the job.

  • [Cov: 65% ] is custom reporting done by the job (see the sketch after this list).
    • Any job can create a file /jaypore_ci/run/<job name>.txt and the first 5 characters from that file will be displayed in the report.

    • Although this is used for coverage reports here, you could use it for anything you want.

    • You could report error codes here to indicate WHY a job failed.

    • You could report information about artifacts created, like package publish versions.
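For example, a minimal sketch of custom reporting (the job name and command are hypothetical): the job writes its exit code into its own report file so that the first 5 characters show up next to it in the CI report.

from jaypore_ci import jci

with jci.Pipeline(image='mydocker/image') as p:
    # Preserve pytest's exit code, write it to the job's report file, then
    # exit with the same code so the job status stays correct.
    p.job(
        "PyTest",
        "bash -c 'pytest tests/; code=$?; echo E$code > /jaypore_ci/run/PyTest.txt; exit $code'",
    )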

To see the pipelines running on your machine you can use a Dozzle container on localhost to explore CI jobs.

If you don’t want to do this it’s also possible to simply use docker logs <container ID> to explore jobs.

Concepts

Pipeline config

sequenceDiagram
    autonumber
    loop Pipeline execution
        Pipeline ->> Executor: docker run [n jobs]
        Executor -->> Pipeline: docker inspect [k jobs]
        Pipeline ->> Reporter: Pipeline status
        Reporter -->> Pipeline: Rendered report
        Pipeline ->> Remote: Publish report
        Remote -->> Pipeline: ok
    end
  1. A pipeline is defined inside a python file that imports and uses jaypore_ci. It can also import other libraries / configs; do whatever your use case needs.

  2. A config starts with creating a Pipeline instance. Everything happens inside this context.
    • A pipeline has to have one implementation of a Remote, Reporter, Executor, and Repo specified.

    • If you do not specify them then the defaults are Gitea, Text, Docker, and Git.

    • You can specify ANY other keyword arguments to the pipeline and they will be applied to jobs in that pipeline as a default. This allows you to keep your code DRY. For example, we can specify image='some/docker:image' and this will be used for all jobs in the pipeline.

  3. Parts of a pipeline
    1. Repo holds information about the project.
      • You can use this to get information about things like sha and branch.

      • It can also tell you which files have changed using files_changed().

      • Currently only Git is supported.

    2. Executor is used to run the job. Perhaps in the future we might have shell / VM executors as well.

    3. Reporter is given the status of the pipeline and is responsible for creating a human-readable text output. Along with Text, we also have the Markdown reporter, which uses Mermaid graphs to show you pipeline dependencies.

    4. Remote is where the report is published to. Currently we have:
      • GitRemote, which can store the pipeline status in git itself. You can then push the status to Github and share it with others. This works similarly to git-bug.

      • Gitea can open a PR and publish pipeline status as the PR description on Gitea.

      • Github can open a PR and publish pipeline status as the PR description on Github.

      • Email can email you the pipeline status.

  4. Each pipeline can declare multiple stage() sections.
    • Stage names have to be unique. They cannot conflict with job names and other stage names.

    • Stages are executed in the order in which they are declared in the config.

    • The catch-all stage is called Pipeline. Any job defined outside a stage belongs to this stage.

    • Any extra keyword arguments specified while creating the stage are passed to jobs. These arguments override whatever is specified at the Pipeline level.

  5. Finally, any number of job() definitions can be made.
    • Jobs declared inside a stage belong to that stage.

    • Job names have to be unique. They cannot clash with stage names and other job names.

    • Jobs are run in parallel UNLESS they specify depends_on=["other_job"], in which case the job runs after other_job has passed.

    • Jobs inherit keyword arguments from the pipeline first, then the stage, and finally from whatever is specified at the job level (see the sketch after this list).
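For example, here is a minimal sketch of that inheritance (the image names are placeholders):

from jaypore_ci import jci

with jci.Pipeline(image="python:3.11") as p:
    # No image given here, so the pipeline default "python:3.11" is used.
    p.job("Lint", "black --check .")

    # The stage overrides the default image for every job inside it.
    with p.stage("Tests", image="myorg/test-env"):
        p.job("Unit", "pytest -m unit tests/")
        # A single job can override the stage value again.
        p.job("Browser", "pytest -m ui tests/", image="myorg/browser-env")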

Secrets and environment variables

  1. JayporeCI uses SOPS to manage environment variables and secrets.
    • We add secrets/<env_name>.enc to store secrets.

    • We add secrets/<env_name>.key to decrypt the corresponding secret files. This is an AGE key file. Do NOT commit this to git! JayporeCI automatically adds a .gitignore entry to ignore key files.

    • We also add secrets/bin/edit_env.sh and secrets/bin/set_env.sh to help you manage your secrets easily.

  2. It is a good idea to have separate secret files for each developer and for each environment.
    • For example, JayporeCI itself only has a single secret file called ci.

How to

See job logs

  • The recommended way is to have a Dozzle container on your localhost to explore CI jobs.

  • You can also run docker logs <container ID> locally.

  • To debug running containers you can docker exec <container ID> while the job is running.

Build and publish docker images

Environment / package dependencies can be cached in docker easily. Simply build your docker image and then run the job with that built image.

from jaypore_ci import jci

with jci.Pipeline() as p:
    p.job("Docker", "docker build -t myimage .")
    p.job("PyTest", "python3 -m pytest tests/", image="myimage", depends_on=["Docker"])
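For the publish half, a hedged sketch: tag the image with your registry and push it once the tests pass. The registry name is a placeholder, and the sketch assumes the runner is already logged in to that registry.

from jaypore_ci import jci

with jci.Pipeline() as p:
    p.job("Docker", "docker build -t registry.example.com/myimage .")
    p.job("PyTest", "python3 -m pytest tests/", image="registry.example.com/myimage", depends_on=["Docker"])
    # Only push after the tests have passed.
    p.job("Publish", "docker push registry.example.com/myimage", depends_on=["PyTest"])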

Define complex job relations

This config builds a development docker image, then runs linting, unit, integration, and fuzz testing on the codebase.

from jaypore_ci import jci

with jci.Pipeline() as p:

    with p.stage("build"):
        p.job("DockDev", f"docker build --target DevEnv -t {p.repo.sha}_dev .")

    with p.stage("checking", image=f"{p.repo.sha}_dev"):
        p.job("Integration", "bash test.sh integration")
        p.job("Unit", "bash test.sh unit")
        p.job("Linting", "bash lint.sh")
        p.job(
            "Fuzz testing",
            "bash test.sh fuzz",
            depends_on=["Integration", "Unit"],
        )

Run a job matrix

There is no special concept for matrix jobs. Just declare as many jobs as you want in a loop. There is a helper function, env_matrix, to make this easier when you want to run combinations of variables.

from jaypore_ci import jci

with jci.Pipeline() as p:
    # This will have 18 jobs,
    # one for each possible combination of BROWSER, SCREENSIZE, ONLINE
    for env in p.env_matrix(
        BROWSER=["firefox", "chromium", "webkit"],
        SCREENSIZE=["phone", "laptop", "extended"],
        ONLINE=["online", "offline"],
    ):
        p.job(
            f"Test: {env}",
            "pytest --browser=$BROWSER --device=$SCREENSIZE",
            env=env,
        )

The above config generates 3 x 3 x 2 = 18 jobs and sets the environment for each to a unique combination of BROWSER, SCREENSIZE, and ONLINE.

Run on cloud/remote runners

  • Make sure docker is installed on the remote machine.

  • Make sure you have ssh access to the remote machine and that the user you are logging in as can run docker commands.

  • Add to your local ~/.ssh/config an entry for your remote machine. Something like:

    Host my.aws.machine
        HostName some.aws.machine
        IdentityFile ~/.ssh/id_rsa
    
  • Now in your cicd/pre-push.sh file, where the docker run command is mentioned, simply add DOCKER_HOST=ssh://my.aws.machine

  • JayporeCI will then run your jobs on the remote machine.

Use custom services for testing

Some jobs don’t affect the status of the pipeline. They just need to be there while you are running your tests. For example, you might need a DB to run API testing, or you might need both the DB and API as a service to run integration testing.

To do this you can add is_service=True to the job / stage / pipeline arguments.

Services are only shut down when the pipeline is finished.

from jaypore_ci import jci

# Services immediately return with a PASSED status.
# If they exit with a non-zero code they are marked as FAILED, otherwise
# they are assumed to have PASSED.
with jci.Pipeline() as p:

    # Since we define all jobs in this section as `is_service=True`, they will
    # keep running for as long as the pipeline runs.
    with p.stage("Services", is_service=True):
        p.job("Mysql", None, image="mysql")
        p.job("Redis", None, image="redis")
        p.job("Api", "python3 -m src.run_api", image="python:3.11")

    with p.stage("Testing"):
        p.job("Unit", "pytest -m unit_tests tests")
        p.job("Integration", "pytest -m integration_tests tests")
        p.job("Regression", "pytest -m regression_tests tests")

Import jobs with pip install

You can also import jobs defined by other people. Some examples of why you might want to do this:

  • A common lint policy for company / clients.

  • Common deploy targets and processes for things like docs / release notes.

  • Common notification targets like slack / telegram / email.

  • Common PR description checklist for company / clients.

  • Common PR merge policies / review policies etc.

Since JayporeCI has a normal programming language as its config language, most things can be solved without too much effort.
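As a hedged sketch, suppose a hypothetical pip-installed package called mycompany_ci_jobs exposes helpers that add your organisation's standard jobs to any pipeline:

from jaypore_ci import jci
from mycompany_ci_jobs import add_lint_jobs, add_slack_notification  # hypothetical package

with jci.Pipeline() as p:
    # Company-wide lint policy, defined once and reused across repositories.
    add_lint_jobs(p)
    # Common notification target shared by all projects.
    add_slack_notification(p, channel="#ci")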

Publish Artifacts / Cache

  • All jobs run in a shared directory /jaypore_ci/run.

  • Anything you write to this directory is available to all jobs so you can use this to pass artifacts / cache between jobs.

  • You can have a separate job that POSTs your artifacts to some remote location / git notes / S3 / gitea (see the sketch below).
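A hedged sketch: one job writes a built wheel into the shared directory and a second job publishes it. The build and upload commands are placeholders, and the publish job assumes the credentials it needs are available to it.

from jaypore_ci import jci

with jci.Pipeline(image="python:3.11") as p:
    # Write the artifact into /jaypore_ci/run so later jobs can see it.
    p.job("Build", "bash -c 'pip install build && python3 -m build && cp dist/*.whl /jaypore_ci/run/'")
    # A separate job picks the artifact up and publishes it.
    p.job(
        "Publish",
        "bash -c 'pip install twine && twine upload /jaypore_ci/run/*.whl'",
        depends_on=["Build"],
    )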

Jobs based on files changed / branch name

Some jobs only need to run when your branch is main or a release branch. At other times you only want to run a job when certain files have changed.

from jaypore_ci import jci


with jci.Pipeline() as p:
    p.job("testing", "bash cicd/lint_test_n_build.sh")
    # This job will only be defined when the branch is main. Otherwise it will
    # not be a part of the pipeline
    if p.repo.branch == "main":
        p.job(
            "publish",
            "bash cicd/publish_release.sh",
            depends_on=["testing"],
        )
    # The following job will only be run when documentation changes.
    if any(path.startswith("docs") for path in p.repo.files_changed("develop")):
        p.job(
            "build_docs",
            "bash cicd/build_docs.sh",
            depends_on=["testing"],
        )

Test your pipeline config

Mistakes in the pipeline config can take a long time to catch if you are running a large test harness.

With Jaypore CI it’s fairly simple. Just write tests for your pipeline since it’s normal Python code!

To help you do this there are mock executors/remotes that you can use instead of Docker/Gitea. Jaypore CI’s own tests use them to make sure that jobs run in the expected order.

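A hedged sketch of what such a test could look like; the mock class names and keyword arguments used below are assumptions, so check Jaypore CI’s own test suite for the real ones:

from jaypore_ci import jci, executors, remotes, repos


def test_jobs_run_in_order():
    # NOTE: executors.Mock and remotes.Mock are assumed names for the mock
    # implementations mentioned above; the real names may differ.
    repo = repos.Git.from_env()
    executor = executors.Mock()
    remote = remotes.Mock(repo=repo)
    with jci.Pipeline(repo=repo, executor=executor, remote=remote) as p:
        p.job("lint", "bash cicd/lint.sh")
        p.job("test", "bash cicd/test.sh", depends_on=["lint"])
    # Assert against whatever run history the mock executor records,
    # e.g. that "lint" was started before "test".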

Status report via email

You can send pipeline status reports via email if you don’t want to use the PR system for gitea/github etc.

See the Email docs for the environment variables you will have to supply to make this work.

from jaypore_ci import jci, remotes, repos

git = repos.Git.from_env()
email = remotes.Email.from_env(repo=git)

# The report for this pipeline will go via email.
with jci.Pipeline(repo=git, remote=email) as p:
    p.job("hello", "bash -c 'echo hello'")

Run selected jobs based on commit message

Sometimes we want to control when some jobs run, for example build/release jobs or intensive testing jobs. A simple way to do this is to read the commit message and see if the author asked us to run these jobs. JayporeCI itself only runs release jobs when the commit message contains jci:release as one of its lines.

from jaypore_ci import jci

with jci.Pipeline() as p:
    p.job("build", "bash cicd/build.sh")

    # The job only gets defined when the commit message contains 'jci:release'
    if "jci:release" in p.repo.commit_message:
        p.job("release", "bash cicd/release.sh", depends_on=["build"])

💬: Select remote based on job status / branch / authors

Note

If you want this feature please go and vote for it on the github discussion.

At times it’s necessary to inform multiple people about CI failures / passes.

For example

  • Stakeholders might need notifications when releases happen.

  • People who wrote code might need notifications when their code breaks in a more intensive test suite / fuzzing run.

  • Perhaps you have downstream codebases that need to get patched when you do bugfixes.

  • Or perhaps a failure in the build section of the pipeline needs one set of people to be informed and a failure in the user documentation building needs another set of people.

While all of this is already possible with JayporeCI (a sketch follows below), if this is a common workflow you can vote for it and we can implement an easier way to declare this configuration.
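For example, a hedged sketch of picking the remote based on the branch; remotes.Gitea.from_env is assumed to exist here by analogy with the Email and Github remotes shown elsewhere in these docs:

from jaypore_ci import jci, repos, remotes

repo = repos.Git.from_env()
# Report main-branch pipelines via email to stakeholders, everything else on Gitea PRs.
remote = (
    remotes.Email.from_env(repo=repo)
    if repo.branch == "main"
    else remotes.Gitea.from_env(repo=repo)  # assumed constructor
)

with jci.Pipeline(repo=repo, remote=remote) as p:
    p.job("Pytest", "pytest tests/")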

Run multiple pipelines on every commit

You can modify cicd/pre-push.sh so that instead of creating a single pipeline it creates multiple pipelines. This can be useful when you have a personal CI config that you want to run and a separate team / organization pipeline that needs to be run as well.

This is not the recommended way, however; it is usually easier to make cicd/cicd.py a proper python package and keep both configs there (see the sketch below).
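A hedged sketch of that approach, assuming that running two pipeline contexts one after another from cicd/cicd.py behaves the same as running one; whether you keep both in one module or split them across a package is up to you.

from jaypore_ci import jci


def personal_pipeline():
    with jci.Pipeline() as p:
        p.job("Lint", "black --check .")


def team_pipeline():
    with jci.Pipeline() as p:
        p.job("Tests", "pytest tests/")


# Run both pipelines on every push.
personal_pipeline()
team_pipeline()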

Passing extra_hosts and other arguments to docker

Often you want to configure extra options for the docker run command that will be used to run your job, for example passing extra_hosts or device_requests to the container.

To do such things you can use the executor_kwargs argument while defining the job using job(). Anything that you pass to this dictionary will be handed off to Docker-py and so you can use anything that is mentioned in that documentation.

from jaypore_ci import jci

with jci.Pipeline() as p:
    p.job(
        "Pytest",
        "pytest",
        executor_kwargs={
            "extra_hosts": {
                # Access machines behind VPNs
                "machine.behind.vpn": "100.64.0.12",
                # Redirect localhost addresses to the docker gateway
                "dozzle.localhost": "172.0.0.1",
                # Replace production APIs with locally mocked APIs
                "api.myservice.com": "127.0.0.1",
            }
        },
    )

Using a github remote

If you want to use Github instead of Gitea, it’s very simple.

from jaypore_ci import jci, repos, remotes

repo = repos.Git.from_env()
# Specify JAYPORE_GITHUB_TOKEN in your secrets file
remote = remotes.Github.from_env(repo=repo)

with jci.Pipeline(repo=repo, remote=remote) as p:
    p.job("Pytest", "pytest")

Contributing

  • Development happens on a self-hosted gitea instance and the source code is mirrored on Github.

  • If you are facing issues please file them on github.

  • Please use Github discussions for describing problems / asking for help / adding ideas.

  • Jaypore CI is open source but not openly developed yet, so instead of submitting PRs, please fork the project and start a discussion.

Reference

Changelog

0.2.31

  • 🎁: Old networks will also be removed automatically for jobs that are older than a week.

0.2.30

  • 🎁: You can pass arbitrary arguments to the docker run command simply by using the executor_kwargs argument while defining the job. Read more in Passing extra_hosts and other arguments to docker.

  • 🎁: SSH remotes are now compatible with Jaypore CI.

0.2.29

  • 🐞: When the gitea token does not have enough scope, log the error correctly and exit.

0.2.28

  • 🐞: When there are multiple (push) remotes, Jaypore CI will pick the first one and use that.

0.2.27

  • 🎁: Jobs older than 1 week will be removed before starting a new pipeline.

0.2.26

  • ⚙️: The Dockerfile inside cicd/Dockerfile now requires a build arg that specifies the version of Jaypore CI to install.

0.2.25

  • 🎁: A dockerfile is now used to send the context of the codebase to the docker daemon instead of directly mounting the code. This allows us to easily use remote systems for jobs.