Automating Pull Requests

We have several repositories that contain code and documentation. The docs are generated with Sphinx from the code, and sometimes additional Markdown or reST is added.

What we’d like to happen:

  • User writes code/tests/comments/docs
  • User opens a pull request
  • Pull request is reviewed, changes pushed, and ultimately it’s merged
  • Merging should kick off a new pipeline to generate documentation
  • Docs are generated and published to github pages

How we get the docs to GH Pages isn’t set in stone.

Right now, what I was trying to do was have my pipeline trigger builds when a pull request is opened:

trigger:
  event:
    - pull_request
  branch:
    exclude:
      - drone_docs

Then my pipeline would run the generation and open a new PR from the drone_docs branch to the master branch, with GH Pages publishing from the docs folder. This caused an awful loop where Drone started triggering both the push and PR builds on drone_docs, building them one after the other and piling commits onto my drone_docs branch. (I suspect the branch exclude didn’t stop the PR builds because, if I understand correctly, the branch filter matches the target branch for pull request events.) I had to delete the branch and kill the builds several times before it stopped. This is obviously my fault and not Drone’s; I gave it some bad configuration.

Does anyone else build docs in their pipelines and publish them to github pages? There are other ways to handle docs. I could build the HTML and put it on the gh-pages branch, for example. Then maybe exclude all builds from that branch except the ones for docs?

Any examples, workflows, or ideas would be great!


we have a github pages plugin that you can use:
http://plugins.drone.io/drone-plugins/drone-gh-pages/

in the below example we always build the static website, but we only publish to github pages on push (when you merge a pull request it triggers a github push webhook). the trigger also excludes the gh-pages branch so that the commit the plugin pushes does not kick off another build.

kind: pipeline
name: default

steps:
- name: build
  image: node
  commands:
  - npm install
  - npm run build

- name: publish  
  image: plugins/gh-pages
  settings:
    username: octocat
    password: p455w0rd
    pages_directory: public/
  when:
    event:
    - push
    branch:
    - master

trigger:
  branch:
    exclude:
    - gh-pages

the above example uses hard-coded credentials for demo purposes only. you will want to use secrets to pass sensitive data to the plugin, and in this case you may want to use an ssh key instead of a username and password (see the plugin docs).
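for example, with secrets the publish step could look something like this (the secret names here are arbitrary; create them in the repository settings or with the drone CLI first):

- name: publish
  image: plugins/gh-pages
  settings:
    username:
      from_secret: gh_pages_username
    password:
      from_secret: gh_pages_password
    pages_directory: public/
  when:
    event:
    - push
    branch:
    - master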


Wow, I worked way too hard on this 🙂. Thanks for the answer @bradrydzewski. This works perfectly and is exactly what I wanted.

Here’s the gist of what I’m doing, minus some things specific to our use case.

---
kind: pipeline
name: all

steps:
  - name: test
    image: base/python:alpine
    commands:
      - pip install -r requirements-test.pip
      - make test

trigger:
  event:
    - push
  branch:
    exclude:
      - gh-pages
---
kind: pipeline
name: docs

steps:
  - name: build-docs
    image: base/python:alpine
    commands:
      - make docs
  - name: publish
    image: plugins/gh-pages

trigger:
  event:
    - pull_request

depends_on:
  - all

This uses the plugin defaults. My docs are built with Sphinx: the sources live in /docs/src and the generated HTML ends up in /docs. My .gitignore has (among other things):

docs/*
!docs/src

And my make docs command runs my Sphinx build. The static output is then pushed to gh-pages automatically, per the plugin’s defaults, and my site is published from that branch.
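In case it helps, here is roughly what the build-docs step boils down to with make docs expanded inline (just a sketch: the sphinx-build paths match my layout, and installing Sphinx directly instead of via a requirements file is only for illustration):

- name: build-docs
  image: base/python:alpine
  commands:
    - pip install sphinx
    # equivalent of `make docs`: Sphinx sources in docs/src, HTML written to docs/
    - sphinx-build -b html docs/src docs

Hopefully that helps someone else.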
