06-23-2023 10:12 AM
Hello All,
I have a use case where I have to schedule my pipeline dynamically based on entries from an input table. What's the best way to implement this? For example: if I have a pipeline loadSalesTable, I have to look at my input scheduling table to decide what day and time to trigger the pipeline, and schedule it accordingly.
I don't think there is a straightforward approach using Scheduled or Triggered Tasks, so I'm looking for a workaround. Thanks in advance.
Solved! Go to Solution.
06-26-2023 11:34 PM
As @bojanvelevski mentioned, you can use the Metadata Snap Pack to achieve this.
Solution Approach:
- Create all tasks that you want to run dynamically as Scheduled Tasks.
- Read data from the source table with info on when to run a pipeline.
- Read the specific Task with the help of the Metadata snap pack.
- Assign new values and update the Task with the Metadata snap pack.
Set the values below as pipeline parameters:
source_proj: orgName/ProjectPath
target_proj: orgName/ProjectPath
taskToUpdate: Name of the task
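For illustration, here is a rough Python sketch of the transformation the pipeline performs between reading the scheduling table and updating the task: one scheduling row is mapped to new schedule values for the task. The input field names (pipeline_name, run_day, run_time) and the shape of the output are assumptions for this example, not the exact schema the Metadata snaps expect; in the pipeline itself this mapping would typically be done in a Mapper snap feeding the task update.

```python
# Rough sketch: map one row of a (hypothetical) scheduling table to the
# schedule values that will be written to the Scheduled Task.
# Field names pipeline_name / run_day / run_time are assumptions.

DAYS = ["sunday", "monday", "tuesday", "wednesday",
        "thursday", "friday", "saturday"]

def build_schedule_update(row: dict) -> dict:
    """Map a scheduling-table row to new schedule values for a task."""
    hour, minute = (int(part) for part in row["run_time"].split(":"))
    return {
        "task_name": row["pipeline_name"],             # e.g. "loadSalesTable"
        "day_of_week": DAYS.index(row["run_day"].lower()),
        "hour": hour,
        "minute": minute,
    }

# Example: a row saying loadSalesTable should run on Mondays at 09:30
print(build_schedule_update(
    {"pipeline_name": "loadSalesTable", "run_day": "Monday", "run_time": "09:30"}
))
```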
Sample pipeline:
UpdateTaskInfo.slp (3.7 KB)
Let us know if this worked.
Cheers 🙂

06-24-2023 02:12 PM
Hi there @Senthilnaga91,
The SnapLogic Metadata Snaps are suitable for this use case; they give you all CRUD operations on tasks, so you can read a task's schedule and update it from a pipeline. Let me know if you still need help with this.
Regards,
Bojan
06-26-2023 09:39 AM
I have provided more details in my previous reply. Please refer to it.
06-26-2023 12:32 AM
I think your requirement is to trigger a pipeline whenever a record changes in the source table.
For that, you would create a pipeline that saves the current record count to SLDB and compares it against the count of records in the current execution. Schedule this comparison pipeline to run every 15 or 30 minutes; if the counts don't match, call the pipeline that loads data into the Sales table. A rough sketch of that comparison logic is shown below.
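A minimal Python sketch of that compare-and-trigger logic, purely for illustration; in SnapLogic this would be the scheduled comparison pipeline itself. Here get_source_row_count and trigger_load_pipeline are hypothetical stand-ins for the database count query and the Pipeline Execute step, and the JSON file stands in for the count saved in SLDB.

```python
import json
from pathlib import Path

STATE_FILE = Path("sales_row_count.json")    # stands in for the count saved in SLDB

def get_source_row_count() -> int:
    """Hypothetical stand-in for a SELECT COUNT(*) on the source table."""
    raise NotImplementedError

def trigger_load_pipeline() -> None:
    """Hypothetical stand-in for the Pipeline Execute call to loadSalesTable."""
    raise NotImplementedError

def check_and_trigger() -> None:
    """Scheduled every 15-30 minutes: trigger the load when the count changes."""
    previous = (json.loads(STATE_FILE.read_text())["count"]
                if STATE_FILE.exists() else None)
    current = get_source_row_count()

    if previous is None or current != previous:
        trigger_load_pipeline()              # counts differ, so reload the Sales table

    STATE_FILE.write_text(json.dumps({"count": current}))   # remember the new count
```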
