How do I get logs for a detached step in docker pipeline?
Also asked on Stack Overflow.
The logs for detached steps are displayed in the user interface. Here is a real-world example:
Sample YAML:
https://github.com/drone/hello-world/blob/0041a84cf0291468f953f45e42d6859ee15b2c74/.drone.yml
Sample execution:
https://cloud.drone.io/drone/hello-world/291/1/2
I am only executing a Python script which has a print statement, if that makes a difference.
I took a look at your screenshot and, although I have limited information, I do not think you should be using detached steps.
The purpose of a detached step (which is the same as a service) is to launch a long-running process (such as postgres, redis, etc) for the duration of the pipeline without blocking the pipeline. For example, you want to start a postgres database as a step in your pipeline, but since postgres will run indefinitely, you do not want it to block subsequent steps. So when you define a detached step, Drone will start the container and immediately move on to the next step in the pipeline. And at the end of the pipeline, Drone will kill any detached steps and services that are still running.
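To make that concrete, here is a minimal sketch of the intended pattern, using hypothetical step names and images (a postgres service started as a detached step, with a test step running alongside it):

```yaml
kind: pipeline
name: default

steps:
  # Detached step: Drone starts this container and immediately
  # moves on to the next step, without waiting for it to exit.
  - name: database
    image: postgres:11
    detach: true
    environment:
      POSTGRES_PASSWORD: password

  # Regular steps run while the detached container is still up;
  # Drone kills the detached container when the pipeline ends.
  - name: test
    image: golang:1.12
    commands:
      - go test ./...
```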
In your example, the bigquery step is both detached and the last step in your pipeline. As a result, Drone will start the bigquery container and then proceed to the next step. Since it is the very last step, the pipeline will terminate and the bigquery container will be killed. My guess is that the bigquery container is started and then immediately stopped before Python is even able to initialize, which is why you don’t see any logs.
So based on the screenshots you have provided and the behavior you are describing, I am not sure you should be using a detached step.
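If the goal is simply to run the script and see its output, the simplest fix is probably to make it a regular (non-detached) step, so the pipeline waits for it to finish and streams its logs to the UI. A rough sketch, with a hypothetical image and script name:

```yaml
steps:
  # No detach: the pipeline blocks until this container exits,
  # and its stdout/stderr appear as step logs in the UI.
  - name: bigquery
    image: python:3
    commands:
      - python load_bigquery.py  # hypothetical script name
```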