How to share files between Pipeline steps

Drone automatically creates a shared volume for each Pipeline that is mounted to every Step at /drone/src. This shared volume is known as the Workspace. The Workspace is the working directory of every Step and the location to which your repository is cloned.

The shared Workspace allows you to generate files and artifacts and share them between Steps in your Pipeline. For example, when you run npm install it creates a node_modules directory in the working directory, which is then available to subsequent Steps in your Pipeline.
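As a minimal sketch of that scenario, assuming a Node.js project with an npm test script (the step names and image tag are illustrative):

kind: pipeline
name: default

steps:
- name: install
  image: node:12
  commands:
  - npm install

- name: test
  image: node:12
  commands:
  - npm test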

You can verify this behavior with the below configuration:

kind: pipeline
name: default

steps:
- name: foo
  image: alpine:3.8
  commands:
  - touch foo.txt

- name: bar
  image: alpine:3.8
  commands:
  - ls foo.txt

Conceptually you can visualize this as Drone executing the below Docker commands. Note that these are not the exact commands or parameters that Drone is executing, and are only meant to help visualize how this works.

docker volume create workspace

docker run --volume=workspace:/drone/src alpine:3.8 /bin/sh -c "touch foo.txt"
docker run --volume=workspace:/drone/src alpine:3.8 /bin/sh -c "ls bar.txt"

Temporary Volumes

Sometimes you need to share files or artifacts that live outside of your Workspace among Steps. For example, you may want to share the /go directory or /root/.m2. This can be achieved with temporary volumes:

kind: pipeline
name: default

steps:
- name: foo
  image: alpine:3.8
  volumes:
  - name: m2
    path: /root/.m2
  commands:
  - touch /root/.m2/foo.txt

- name: bar
  image: alpine:3.8
  volumes:
  - name: m2
    path: /root/.m2
  commands:
  - ls /root/.m2/foo.txt

volumes:
- name: m2
  temp: {}
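As with the Workspace, you can conceptually visualize this as Drone creating an additional named volume and mounting it into both Steps. Again, these are not the exact commands or parameters that Drone executes; they are only meant to illustrate how the temporary volume is shared:

docker volume create workspace
docker volume create m2

docker run --volume=workspace:/drone/src --volume=m2:/root/.m2 alpine:3.8 /bin/sh -c "touch /root/.m2/foo.txt"
docker run --volume=workspace:/drone/src --volume=m2:/root/.m2 alpine:3.8 /bin/sh -c "ls /root/.m2/foo.txt"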

thank you for this faq entry :heart: is there also a way to share a temporary volume between two pipelines, instead of only between the steps of one pipeline?

is there also a way to share a temporary volume between two pipelines, instead of only between the steps of one pipeline?

This is not possible, because pipelines fan out across multiple machines and there is no guarantee that two pipelines will run on the same machine. There are plugins that cache resources in remote storage (such as S3) that you could use, or you could create your own.
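For example, a rough sketch using the meltwater/drone-cache plugin with an S3 backend might look like the following. The bucket name, region, secret names, and mounted directory are all illustrative and would need to match your own setup:

kind: pipeline
name: default

steps:
- name: restore-cache
  image: meltwater/drone-cache
  settings:
    backend: s3
    restore: true
    bucket: my-build-cache
    region: us-east-1
    mount:
    - node_modules
  environment:
    AWS_ACCESS_KEY_ID:
      from_secret: aws_access_key_id
    AWS_SECRET_ACCESS_KEY:
      from_secret: aws_secret_access_key

- name: build
  image: node:12
  commands:
  - npm install
  - npm test

- name: rebuild-cache
  image: meltwater/drone-cache
  settings:
    backend: s3
    rebuild: true
    bucket: my-build-cache
    region: us-east-1
    mount:
    - node_modules
  environment:
    AWS_ACCESS_KEY_ID:
      from_secret: aws_access_key_id
    AWS_SECRET_ACCESS_KEY:
      from_secret: aws_secret_access_key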


aaah, i see. i was not aware of this, so that makes total sense. thanks for elaborating!