Forum Discussion
sravan - I'm sure others have their own ways to do things like this, but I'm partial to expression libraries, especially when you have multiple definitions to work with. Download the attached zip file and import the pipeline and file to your project.
In the file, you will notice that it is named after the table, in this case "test_table", and that its contents define an "xref" object that maps the source key names to the desired key names.
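For illustration, a per-table definition file might look like the sketch below. The column names here are hypothetical; only the "xref" shape comes from the post:

    // test_table.expr -- hypothetical contents
    {
      xref: {
        "CUST_ID":  "customer_id",    // source key name -> desired key name
        "CUST_NM":  "customer_name",
        "ORDER_DT": "order_date"
      }
    }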
In the pipeline, the Mapper uses the Object.mapKeys() method to translate the key names. For each key, it makes an Object.get() call to check whether the source key name is defined in the expression file, falling back to the source key name from the input view if it isn't.
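As a rough sketch, the Mapper expression could look like the following, assuming the library is aliased as "tbldef" (see the next paragraph):

    // For every key in the input document, look it up in the xref object;
    // .get(key, key) falls back to the original key when no mapping exists
    $.mapKeys((value, key) => lib.tbldef.xref.get(key, key))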
Also in the pipeline is a dynamic reference to the expression library file, built from the table_name pipeline parameter value and aliased as "tbldef" so it can be referenced consistently in expressions regardless of which table is being processed. This lets you keep one definition file per table, which makes maintenance easier. Assuming you are using a parent pipeline that calls a child and passes the name of the table to be processed as a pipeline parameter, this should be easy for you to incorporate.
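A sketch of that pipeline property, assuming the parameter is named table_name and the files follow a <table_name>.expr convention:

    Pipeline Properties -> Expression Libraries
      Library: _table_name + '.expr'   // expression-enabled; resolves to e.g. test_table.expr
      As:      tbldef                  // referenced in expressions as lib.tbldef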
Hope this helps!
- jason_steindorf · 3 years ago · New Contributor III
The first screenshot shows that data flows through when 'last_hours' is set to 40.
The second screenshot shows that data STOPS flowing when it is set to 45; ideally I want to go back one week (168 hours).
- koryknick · 3 years ago · Employee
If you execute the pipeline rather than using Validate Pipeline to see the data previews, can you check whether any documents pass your "Remove child pipelines" filter?
My guess is that you're seeing an artifact of a small value in your User Settings "Preview Document Count" (default is 50). You could also try setting that to a higher value and see if your preview passes any records past the filter.
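For reference, a "Remove child pipelines" filter is often just a check on the parent-run field. This is only a guess at the field name, assuming the Pipeline Monitoring API tags child executions with a parent_ruuid:

    // Filter snap expression (hypothetical field name): keep only root executions
    $parent_ruuid == null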
Even if this is the case, you may run into the 1000-record limit of the first API call that gets the list of pipeline executions. There are two settings on the REST Get snap (Has Next and Next URL) that will allow you to paginate through the responses. I have attached a sample pipeline that shows how to use these settings, specifically for the Pipeline Monitoring API.
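The general idea, hedged since the exact response shape may differ from what is shown here: assuming the Pipeline Monitoring API returns a response_map object containing an entries list plus limit and offset counters, Has Next checks whether a full page came back, and Next URL re-issues the call with the offset advanced:

    Has next: $entity.response_map.entries.length == $entity.response_map.limit
    Next URL: 'https://elastic.snaplogic.com/api/1/rest/public/runtime/' + _org
              + '?last_hours=' + _last_hours
              + '&limit=' + $entity.response_map.limit
              + '&offset=' + ($entity.response_map.offset + $entity.response_map.limit)

The attached pipeline is the authoritative version; the field names and the _org and _last_hours parameters above are placeholders.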
Pipeline Monitoring Pagination_2022_09_08.slp (8.6 KB)
- jason_steindorf · 3 years ago · New Contributor III
I added in the hasNext and nextUrl settings but still no luck. Can we set up a Zoom meeting for next week?
- jason_steindorf · 3 years ago · New Contributor III
I was able to modify your pipeline to my needs. Thanks again, I greatly appreciate your help.
- koryknick · 3 years ago · Employee
Great! Happy I could help!
- koryknick · 3 years ago · Employee
@jason.steindorf - when you say it stops returning data, do you mean it returns data but nothing past about 40 hours? Or if you specify a last_hours value greater than 40, the call doesn’t return any records?