Starlark: maximum file size exceeded for large amount of pipelines

Hi! I’m generating lots of pipelines with Starlark to run some tests in parallel, and I’ve run into a hardcoded limit on the size of the resulting YAML file here.
For my scenario 1MB is not enough. Would it be possible to make this value configurable, similar to DRONE_STARLARK_STEP_LIMIT?

hey @Andrii_Kasparevych, yes, we would accept a pull request that makes the size limit configurable, similar to DRONE_STARLARK_STEP_LIMIT.

Thanks for the reply! I’ll try to spare some time to implement that, but I’d be grateful if it could be implemented on your side in the meantime :slight_smile:

Hey @brad, the PR is here. Please take a look.