Pass variables between workflows in K8s pipeline deployment

I want to pass a variable from one workflow to another in a Harness pipeline.

The case:

My first workflow deploys an API and creates a LoadBalancer Kubernetes service. This service has a public IP, which is the value I want to pass to my next workflow. The next workflow in the pipeline deploys a web application that consumes the API; it needs an environment variable called API_URL in its Kubernetes deployment.

How can I solve it?

1. Add a bash script step after API service deployment

We add a Bash script step after the API Rollout Deployment step; this step has the following settings:

  • Name: Get LoadBalancer IP (whatever you want)
  • Script Type: BASH
  • Script: export URL=$(kubectl get svc exemplar-api-svc -n development --template="{{range .status.loadBalancer.ingress}}{{.ip}}{{end}}") # Get the load balancer from k8s service
  • Script Output: URL
  • Enable the "Publish output in the context" checkbox
    • Publish Variable Name*: api_service
    • Scope: Pipeline
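Putting those settings together, the script step body can be sketched as a small helper. The retry loop is an addition of mine (cloud providers can take a minute or two to assign an external IP); the service name and namespace come from the example above:

```shell
#!/usr/bin/env bash
# Fetch the external IP of a LoadBalancer service, retrying until the
# cloud provider has assigned one.
get_lb_ip() {
  local svc="$1" ns="$2" retries="${3:-30}" ip=""
  for ((i = 0; i < retries; i++)); do
    ip=$(kubectl get svc "$svc" -n "$ns" \
      --template='{{range .status.loadBalancer.ingress}}{{.ip}}{{end}}')
    if [ -n "$ip" ]; then
      echo "$ip"
      return 0
    fi
    sleep 10   # IP not assigned yet; wait and try again
  done
  echo "ERROR: no external IP for $svc after $retries attempts" >&2
  return 1
}

# In the Harness script step body:
#   export URL=$(get_lb_ip exemplar-api-svc development)
```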

Now we can use the variable ${context.api_service.URL} anywhere in the pipeline.

**Note:** I added some steps in order to test the URL.
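The original post doesn't show those test steps, but as one possible sketch, a script step could poll the published URL with curl until the API answers (`wait_for_url` is a hypothetical helper name, and the endpoint shape is an assumption):

```shell
#!/usr/bin/env bash
# Poll a URL until it responds with an HTTP success status.
wait_for_url() {
  local url="$1" retries="${2:-10}"
  for ((i = 0; i < retries; i++)); do
    # -s silent, -f fail on HTTP error codes, -o discard the body
    if curl -sf -o /dev/null "$url"; then
      echo "OK: $url is reachable"
      return 0
    fi
    sleep 5
  done
  echo "ERROR: $url did not respond after $retries attempts" >&2
  return 1
}

# In a Harness script step after the publish step:
#   wait_for_url "http://${context.api_service.URL}"
```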

2. Add a workflow variable to our web service (the one that consumes the API URL)

We can use Workflow Variables in our workflows. In this case, I added an API_URL variable whose default value was http://${context.api_service.URL}. Remember, the load balancer IP from our API service was stored in `api_service.URL` in the previous step.

3. Use the Workflow Variable in our service automatically

To do this I used the values.yml file that my web service has. I just added a new variable called `apiServiceUrl` with the default value `${workflow.variables.API_URL}`. This works because we can access workflow variables in services via the `workflow.variables` object.
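As a minimal sketch of the two files involved, assuming the service's manifests are rendered with Harness Go templating (the env block and names below are illustrations, not the original service's manifest):

```yaml
# values.yml of the web service — the new variable:
apiServiceUrl: ${workflow.variables.API_URL}
```

```yaml
# In the deployment manifest, the container picks it up via .Values:
env:
  - name: API_URL
    value: {{.Values.apiServiceUrl}}
```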

And that’s it…


Thanks for the contribution!!