Forum Discussion
gmcclintock
5 years ago · New Contributor
What we’ve done is build “parent” pipelines to generate this type of thing, then pass it to the child so it’s all available as a pipeline parameter for reuse. This also lets us parallelize multiple jobs.
e.g.: we have one parent and one child pipeline for Salesforce CDC. The parent grabs the “job” record from our DB, which holds the Case object, which connection to use, the path for storage, and the date-range filters, and then passes all of those to a single child pipeline so the jobs can execute at the same time.
We’ve done this as well for other cases like yours, just to keep it clean and know where to look, and we’ve found it a bit easier to manage than chasing the JSON through passthrough.
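As a rough sketch of that fan-out pattern (the job records, field names, and `run_child_pipeline` stand-in are all hypothetical; on a real platform the function body would be the “execute pipeline” task that takes pipeline parameters):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical job records, like the ones the parent pipeline reads from the DB:
# each carries the object name, the connection to use, the storage path,
# and the date-range filter.
jobs = [
    {"object": "Case", "connection": "sf_prod", "path": "/cdc/case", "since": "2020-01-01"},
    {"object": "Account", "connection": "sf_prod", "path": "/cdc/account", "since": "2020-01-01"},
]

def run_child_pipeline(params):
    # Stand-in for invoking the single child pipeline with these values
    # passed in as pipeline parameters.
    return f"ran child for {params['object']} -> {params['path']}"

# The parent fans the job records out to the child pipeline in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_child_pipeline, jobs))

for r in results:
    print(r)
```

The nice part is the child never has to dig values out of a passthrough document; everything it needs arrives as explicit parameters.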