
Dynamically taking S3 path from metadata

Shrabanti
New Contributor II

Hi All,
I need to supply an S3 path dynamically, read from a Redshift metadata table, to the S3 File Writer snap.
Because there is a CSV Formatter before it, I am not able to use a Redshift Execute and a Mapper to pass the value.
Any suggestions?


Regards,
Shrabanti

5 REPLIES

Minovski
New Contributor II

Hi Shrabanti,

You can create a child pipeline that contains at least the CSV Formatter and the S3 File Writer snaps. From the parent, send the path value as a Pipeline Parameter, and read that parameter dynamically in the child pipeline.
image
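To illustrate, a minimal sketch, assuming the parameter is named ac_feed_path: define it in the child pipeline under Edit Pipeline → Parameters, then reference it in the S3 File Writer's File name field with the expression toggle (=) enabled. In SnapLogic expressions, pipeline parameters are referenced with a leading underscore, while document fields use $:

```
// Child pipeline – S3 File Writer "File name" field (expression enabled)
_ac_feed_path
```

In the parent, the Pipeline Execute snap's Pipeline Parameters section maps the value from the Redshift metadata query (e.g. $s3_path from an upstream Mapper; the field name here is hypothetical) onto the child's ac_feed_path parameter.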

Please let me know if you have any questions.

Regards,
Nikola Minovski

Shrabanti
New Contributor II
  1. I have passed the value from the parent. I can see the value in the JSON.
    image

  2. I have declared the pipeline parameter in the child pipeline.
    image

  3. But in the S3 File Writer I am not getting the value I am passing.

image

Am I doing something wrong here?

Thanks & Regards,
Shrabanti

winosky
New Contributor III

Hi Shrabanti, I would use the dropdown on the right and click the button in the upper right of the dropdown to visually check whether the parameter(s) are showing values.
If you still can't see the value, check the parent Pipeline Execute snap and the child parameter to verify that there aren't any typos.

@Shrabanti As per your step 1, you have only mentioned the directory; it should include the filename as well (because in your file writer you didn't add any filename).
In your step 2, don't use $ac_feed_path (even though you didn't change it, technically it shouldn't cause an issue).
Before step 3, use a Pipeline Execute snap and pass the ac_feed_path value from the pipeline input parameters.
Then it should work.
You can also pass ac_feed_path to the execution label, so you can track which file has been processed in the Dashboard.

image
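Putting the steps above together, a sketch of the expressions involved. The parameter name ac_feed_path comes from this thread; the file name "feed.csv" and the field name $s3_path are made-up examples for illustration:

```
// Parent pipeline – Pipeline Execute snap
//   Pipeline Parameters:  ac_feed_path  =  $s3_path   (field from the Redshift metadata query; name assumed)
//   Execute label (expression enabled):  _ac_feed_path (so the Dashboard shows which file was processed)

// Child pipeline – S3 File Writer "File name" field (expression enabled),
// appending a file name because the parameter holds only the directory:
_ac_feed_path + "feed.csv"
```

If ac_feed_path may or may not end with a slash, normalizing it in a Mapper before the writer avoids doubled or missing separators.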