I’m trying to run a Node.js script wrapped in a Docker container as a build step of a Drone pipeline, but I keep getting Error: Cannot find module '/drone/src/index.js' (indicating that it didn’t manage to find my Node.js app). I tried executing “ls” as a command of this build step and noticed that it displays the files and folders of the repository itself, not those of the step’s container.
The step is super simple; internally it just runs node index.js:
- name: create tag
  image: someprivaterepo/tag-manager:1.0.1
  environment:
    COMMIT: ${DRONE_COMMIT}
    REPOSITORY: ${DRONE_REPO}
  commands:
    - ls
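
The variant that actually fails is essentially the same step with the script invoked instead of the debug ls, something like:

- name: create tag
  image: someprivaterepo/tag-manager:1.0.1
  environment:
    COMMIT: ${DRONE_COMMIT}
    REPOSITORY: ${DRONE_REPO}
  commands:
    # this is what produces Error: Cannot find module '/drone/src/index.js'
    - node index.js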
I probably misunderstand how pipelines work, but my assumption was that specifying an image for a pipeline step means that this image is executed with its own files. However, it seems like it gets executed with the files of the GitHub repository containing the drone.yml definition.
So my question is: is it possible to run a Docker container containing a Node.js/shell script as a pipeline step?
specifying an image for a pipeline step means that this image is executed with its own files. However, it seems like it gets executed with the files of the GitHub repository containing the drone.yml definition.
When you specify the image for a pipeline step, the commands are executed inside the image you specify. You can read more about this here:
The image still has all of its own files; however, in addition to these files it has your source code mounted as a volume. Without access to your source code, you would be unable to compile, run tests, etc. You can read more about how this works here:
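
So if the goal is to run the script that ships inside the image rather than something from the repository, one option is to reference it by its absolute path inside the image. A minimal sketch, assuming the image keeps the app at /app/index.js (that path is a guess about your Dockerfile):

- name: create tag
  image: someprivaterepo/tag-manager:1.0.1
  environment:
    COMMIT: ${DRONE_COMMIT}
    REPOSITORY: ${DRONE_REPO}
  commands:
    # the working directory is the mounted repository (/drone/src),
    # so point node at the path the script has inside the image
    - node /app/index.js

Alternatively, omit commands entirely and let the image's own entrypoint run, as long as that entrypoint does not locate the script relative to the working directory.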
I was so confused to see files from the actual repository that I didn’t notice it also has the files from the step’s image. I just rechecked and can confirm that it works as expected.