[SOLVED] Pipeline structuring and deduplication

Hey folks, just installed Drone today and am working on porting over a GitHub Actions workflow. I’m a bit confused about how to model some things, and whether or not I can make my work any easier. :slight_smile:
I’m building a game. My current workflow looks like this:

  • On push to master, run cargo check to make sure the code is at least valid. Run the check on Linux, Windows, and macOS.
  • On push to release, create release builds on Linux, Windows, and macOS. Then, only if those three release builds finish, publish all three artifacts at once.

I recognize there’s a bit of friction with caching pipeline artifacts and passing them to the publish pipeline, but I want to check my understanding of how Drone is structured. For the master check, I’m guessing I’ll need three separate pipelines. Linux will run in Docker, while Windows/macOS will run in exec runners. The release step will also need three pipelines, and then a fourth to aggregate the results of the first three. Is that accurate? I was initially hoping I could somehow conditionally execute pipeline steps based on platform, to run some things in Docker or on exec runners and just swap out the steps. But that doesn’t look possible.
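For concreteness, here's roughly what I imagine one of the exec check pipelines would look like (I'm guessing at the exact trigger/platform syntax from the docs, so treat this as a sketch):

```yaml
kind: pipeline
type: exec
name: check windows

platform:
  os: windows
  arch: amd64

trigger:
  branch:
    - master

steps:
  - name: check
    commands:
      - cargo check
```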

Assuming I’m correct, how do I perform the release aggregation such that the individual platform release pipelines are all or nothing? Essentially, each platform release build would create and cache its release, and the final pipeline would retrieve that cache and publish.

Finally, I have lots of duplication in these steps. In particular, installing Linux dependencies is identical in all pipelines, as is the basic build/check procedure. Is there any way to simplify this with anchors? For instance:

kind: pipeline
type: docker
name: check linux

trigger:
  branch: master

clone:
  disable: true

steps:
  - name: clone
    image: alpine/git
    commands:
      - apk add --no-cache git-lfs
      - git clone --recursive $DRONE_REPO_LINK .
      - git checkout $DRONE_COMMIT
Those clone steps are identical in every pipeline, and the Linux installation steps are needed across all the Linux pipelines. Can I break those out into anchors so I don't have to fix these processes in half a dozen places if they ever change?

Thanks a bunch.

In Drone, each pipeline is a separate YAML document, and per the YAML specification anchors cannot be used across documents. So unfortunately this would not be an option for you. That being said, Drone does have support for Jsonnet, a configuration language from Google that can be used to generate YAML files and has some great features that allow de-duping. See Jsonnet | Drone

See also: How to reduce Yaml boilerplate
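To make the de-duplication concrete, here's a minimal Jsonnet sketch; the step contents and names are placeholders, not your actual commands:

```jsonnet
// One function stamps out all three near-identical check pipelines.
local check(name, type) = {
  kind: "pipeline",
  type: type,
  name: "check " + name,
  trigger: { branch: ["master"] },
  steps: [
    { name: "check", commands: ["cargo check"] },
  ],
};

// The file evaluates to a list of pipelines, one per platform.
[
  check("linux", "docker"),
  check("windows", "exec"),
  check("macos", "exec"),
]
```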

Nice. I eventually stumbled across pipeline dependencies, which answered one of my questions. I’ve been trawling Jsonnet tutorials for this and not finding the answer, though. Say I have something like:

local linux_dependencies = [
    "apt-get update -qq",
    "apt-get install -qqy llvm-dev libclang-dev clang libspeechd-dev pkg-config libx11-dev libasound2-dev libudev-dev zip",
];

And I want to plunk that array into various other arrays of step commands, at the top level, i.e.:

local clone_commands = [
    "git clone --recursive $DRONE_REPO_LINK .",
    "git checkout $DRONE_COMMIT",
];

local clone = {
    name: "clone",
    commands: clone_commands,
};

local linux_clone = {
    name: "clone",
    image: "alpine/git",
    commands: [
        "apk add --no-cache git-lfs",
        clone_commands, // nested array, not flattened
    ],
};

{
    kind: "pipeline",
    name: "linux",
    type: "docker",
    clone: {
        disable: true,
    },
    steps: [
        linux_clone, // Wrong because of embedded array, but `clone` works
    ],
}
How do I do that?
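My best guess is that Jsonnet's `+` operator concatenates arrays, so something like the following, though I haven't actually run it:

```jsonnet
local clone_commands = [
    "git clone --recursive $DRONE_REPO_LINK .",
    "git checkout $DRONE_COMMIT",
];

// `+` on two arrays should concatenate them into one flat array
local linux_clone = {
    name: "clone",
    image: "alpine/git",
    commands: ["apk add --no-cache git-lfs"] + clone_commands,
};
```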


Oh, one more thing. I just looked at the Drone Jsonnet docs, and it looks like only a single pipeline per file is supported? Is that accurate? I just switched to having three platform-specific pipelines and one publish pipeline that depends on all of them. When I added a second pipeline to test, it seems like the last one wins. Returning an array of pipelines doesn't work either.

I’m also getting a:

  platform:
    os: linux
    arch: amd64

in my pipeline definitions. I didn’t insert this.


Pipelines usually don't share any state, so your choices are:

  • publish build artifacts to some staging environment and promote them from there in your final pipeline, or
  • simply run each pipeline once without publishing, and if all pass, run each again with a final publishing step at the end, so each pipeline publishes its own os/arch-dependent artifacts.
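Either way, the final publish pipeline can be gated on the platform builds with `depends_on`; the pipeline names below are placeholders for whatever you call yours:

```yaml
kind: pipeline
type: docker
name: publish

trigger:
  branch:
    - release

depends_on:
  - release linux
  - release windows
  - release macos

steps:
  - name: publish
    image: alpine
    commands:
      - echo "retrieve the three cached artifacts and publish them here"
```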

I finally figured this out.

One thing that might be incredibly useful to mention on the jsonnet reference page: if your file returns a list of multiple pipelines, failing to provide --stream gives you this cryptic error:

 line 1: cannot unmarshal !!seq into yaml.RawResource

Whereas drone jsonnet --stdout --stream works as expected.

I lost days to thinking I either didn't understand jsonnet or didn't understand how to structure pipelines. I just saw that option, wondered WTF it did, tried it, and was surprised to see it work. :slight_smile: Maybe the CLI could be made to handle this case by default? Presumably the server is smart enough to determine whether it needs to stream or do whatever non-streamed is…

We do mention this in the command line tool reference documentation, but perhaps we need to make this more prominent: drone jsonnet | Drone

I think we could probably have saved you some time if we had known you were using the command line tools; I assumed you were hitting these issues with the server. This is probably a good time to remind everyone that when posting to the forums, providing steps to reproduce is helpful so that we have a more complete picture of the issue.