09-02-2022 01:53 PM
Is there a way that I can trigger a task from a pipeline?
I currently have a pipeline that passes a file path to a Pipeline Execute snap. There might be over 2,500 file paths that I need to pass into the Pipeline Execute, which will clearly exceed the snap's 64 depth limitation. I hesitate to use the batch size feature of the Pipeline Execute because each file can contain over 1 million records to process. My gut tells me that this amount of throughput through a single instance of a pipeline would be too much.
Rather than using the Pipeline Execute, I’d like to pass my file paths to a task, essentially turning my child pipeline into a parent pipeline and enabling me to get around the 64 depth limitation.
Is this possible?
09-02-2022 01:57 PM
You can call the task URL: Invoking a SL pipeline with a REST call
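For illustration, here is a minimal sketch of invoking a triggered task over REST, assuming a task URL and bearer token copied from the task details in Manager (the URL, token, and the file_path parameter name below are placeholders, not values from this thread):

import requests

# Triggered task URL and authorization token copied from SnapLogic Manager (hypothetical values)
TASK_URL = "https://elastic.snaplogic.com/api/1/rest/slsched/feed/MyOrg/MyProject/MyTriggeredTask"
TOKEN = "my-task-bearer-token"

def run_task(file_path):
    # Query-string parameters whose names match pipeline parameters are passed to the pipeline
    resp = requests.post(
        TASK_URL,
        params={"file_path": file_path},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=900,
    )
    resp.raise_for_status()
    return resp

run_task("/data/incoming/file_0001.csv")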
09-03-2022 05:20 PM
Hi Kuya,
Good day. I had a similar requirement before, and I took the approach of using the Ultra Task high-availability API:
The Snaplex must have at least one node set up as a FeedMaster
The requirement was to send a JSON payload; the wrapper pipeline accepts a JSON content type and is invoked with a REST Post from the runner pipeline
With an Ultra Task, this pipeline can be scaled up depending on your requirements (Ultra Task instances), with high availability so requests can be served (max in-flight). A rough sketch of the call the Runner makes is shown below.
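Here is a minimal sketch of the kind of REST Post the Runner pipeline would make against the Ultra Task URL served by the FeedMaster (the host, port, token, and payload fields are assumptions for illustration, not values taken from the attached pipelines):

import requests

# Ultra Task URL exposed by the FeedMaster node (hypothetical host/port; check your Snaplex settings)
ULTRA_URL = "https://my-feedmaster.example.com:8081/api/1/rest/feed/MyOrg/MyProject/open_api_save_random_image"
TOKEN = "my-ultra-task-bearer-token"

# The JSON request body becomes the input document of the ultra pipeline
payload = {"image_name": "sample.png"}  # hypothetical payload

resp = requests.post(
    ULTRA_URL,
    json=payload,  # sent with Content-Type: application/json
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=60,
)
resp.raise_for_status()
print(resp.status_code, resp.text)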
Attached PoC pipelines
Runner - main pipeline that will invoke the ultra task pipeline
Runner_2022_09_04.slp (26.1 KB)
Do Something - the child pipeline that will be executed
Do Something_2022_09_04.slp (5.3 KB)
open_api_save_random_image - the ultra pipeline (can also be the child pipeline)
open_api_save_random_image_2022_09_04.slp (22.5 KB)
The Ultra Task instance count was set to 3
Hope this helps
Thanks,
EmEm
09-03-2022 09:31 PM
Hi EmEm, so happy to connect with you again. Hope you’re doing well. Thank you, Kuya, for the nice wrapper solution.
My plan next week is to discontinue using my task’s CloudURL and begin using the SecuredURL that points to our Groundplex Snaplex. I believe that by doing this, I can resolve the issue of being limited to 10 concurrently running tasks. Based on what I’ve been reading, I shouldn’t have concurrency limitations when I’m running on the Groundplex. Do you have any experience using this method?
If this method works, that will be great. I’d like to confirm it before introducing Ultra Pipeline Tasks. We’re not using those right now, so I’d probably have to present a really good use case for them and (maybe?) the added expense that comes along with them.
I’ll let you know how it goes. Btw, I like your description label, MVP Rock Star! That’s so cool!
09-04-2022 12:11 AM
Only the cloud URL of the triggered task has this limitation.
There are no limits when invoking the on-premise URL.
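Tying this back to the original question, a rough sketch of fanning the 2,500 file paths out to the task's on-premise URL with a bounded number of concurrent requests (the URL, token, parameter name, worker count, and file paths are all assumptions for illustration, not values from this thread):

from concurrent.futures import ThreadPoolExecutor
import requests

# On-premise (Groundplex) triggered task URL and token (hypothetical; host and port depend on your node setup)
GROUND_URL = "https://my-groundplex-node.example.com:8081/api/1/rest/slsched/feed/MyOrg/MyProject/MyTriggeredTask"
TOKEN = "my-task-bearer-token"

def run_task(file_path):
    # The query parameter name must match a pipeline parameter defined in the child pipeline
    resp = requests.post(
        GROUND_URL,
        params={"file_path": file_path},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=3600,
    )
    resp.raise_for_status()
    return file_path, resp.status_code

file_paths = [f"/data/incoming/file_{i:04d}.csv" for i in range(2500)]  # placeholder paths

# Bound client-side concurrency so all 2,500 requests are not fired at once
with ThreadPoolExecutor(max_workers=20) as pool:
    for path, status in pool.map(run_task, file_paths):
        print(path, status)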