kandulah
3 years ago · New Contributor II
S3 bucket creation
Hi
Is there a way to create an S3 bucket through a pipeline using any of the existing Snaps offered by SnapLogic?
Thanks
NK
Hi @mitsos,
I experimented and put together a couple of pipelines where I think a solution to your problem might lie. First I created some dummy data in the AzureDynamicTable pipeline for bulk load with an Azure SQL - Bulk Load Snap (located in the child pipeline, AzureChildDynamicTable). The child pipeline is needed because the value for the Table Name in the Snap Settings can only be passed as a pipeline parameter.
In the parent pipeline, AzureDynamicTable, I add an additional field to the documents for bulk load. The field's name is $tableName, and I assign its value with the expression: "dbo.temp_dailies_" + $_time.toString().substring(0,10). I then pass it as the value of the child pipeline's tableName parameter.
Finally, a Mapper in the child pipeline removes the extra field before the bulk load.
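For anyone reading along, here is a rough Python sketch (not SnapLogic code) of what that table-name expression does. The field name `_time` and the sample document are assumptions for illustration; the prefix `dbo.temp_dailies_` comes from the expression above.

```python
def table_name(doc):
    """Mirror the SnapLogic expression
    '"dbo.temp_dailies_" + $_time.toString().substring(0,10)'.
    Assumes doc["_time"] is an ISO-8601 timestamp string, so the first
    10 characters are the date portion (YYYY-MM-DD)."""
    return "dbo.temp_dailies_" + str(doc["_time"])[:10]

doc = {"_time": "2020-06-04T12:30:00Z", "value": 42}
print(table_name(doc))  # → dbo.temp_dailies_2020-06-04
```

The effect is one target table per calendar day, derived from each document's timestamp.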
Here are the pipelines:
AzureChildDynamicTable_2020_06_04.slp (3.9 KB) AzureDynamicTable_2020_06_04.slp (8.1 KB)
I hope you can find your answer in there.
BR
Dimitri
@dimitri.hristovski thank you so much for taking the time - I think the clue that I needed was in the Mapper snap. I think I have figured it out, but I need to check a couple more things tomorrow.
This works! Thank you!!!
It is a bit slow, but I think it will be fine. There are about 100k rows in each pipeline.
So, _tableName → the underscore means what, exactly?
It has to be a pipeline parameter, right? How does the Execute Pipeline snap “know” to take only one value from the $tableName column?
Correct, _tableName is a pipeline parameter.
The Pipeline Execute Snap takes the $tableName field of each document it receives at the input view, and passes it to the child pipeline as the tableName pipeline parameter.
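To make the mechanics concrete, here is a hypothetical Python model of that behavior (the function names and the stand-in child pipeline are my own inventions, not SnapLogic APIs): each incoming document's $tableName field becomes the child's tableName parameter, and the Mapper step in the child drops the extra field.

```python
def run_child(doc, table_name):
    # Stand-in for the child pipeline: just record where the row would go.
    return {"row": doc, "target": table_name}

def pipeline_execute(docs):
    """Model of Pipeline Execute WITHOUT Reuse: one child invocation per
    input document, with the document's tableName field passed through
    as the child's tableName pipeline parameter."""
    results = []
    for doc in docs:
        param = doc["tableName"]  # read $tableName from this document
        # Mimic the Mapper in the child that removes the extra field.
        payload = {k: v for k, v in doc.items() if k != "tableName"}
        results.append(run_child(payload, param))
    return results
```

One child execution per document is exactly why the non-reuse path can be slow for 100k rows.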
One thing to note that I may have inadvertently omitted: if you are certain that all of the documents arriving at the input view of the Pipeline Execute Snap will have the same value for $tableName, you can check the Reuse checkbox in the Snap's settings.
That way all of the documents are sent to the same instance of the child pipeline, which will dramatically improve the execution time. Perhaps this was obvious from your description of the problem, but I mention it just in case.
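A rough sketch of why Reuse helps, again in illustrative Python rather than SnapLogic code (the grouping function is an assumption about the behavior, not an official API): child instances are keyed by the parameter value, so with a single tableName all rows stream through one long-lived child instead of spawning one child per row.

```python
def execute_with_reuse(docs):
    """Sketch of the Reuse behavior: documents are routed to a child
    instance per distinct tableName value, so uniform input yields
    exactly one child instance handling every row."""
    instances = {}
    for doc in docs:
        instances.setdefault(doc["tableName"], []).append(doc)
    return instances

rows = [{"tableName": "dbo.temp_dailies_2020-06-04", "v": i} for i in range(5)]
print(len(execute_with_reuse(rows)))  # → 1  (one child instance for all rows)
```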
BR,
Dimitri
Thank you very much - this has been very helpful!!