Upload files to ADLS Gen2 Using REST API

Our requirement is to write CSV files to ADLS Gen2 blob storage. We do not have Hadoop Snap Pack licences, so we cannot use the HDFS Writer, and we are checking other options to write files to Azure ADLS Gen2. Are there any sample pipelines using REST Put that upload files to Azure ADLS Gen2?
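For reference, an upload through the ADLS Gen2 REST interface is a three-step sequence against the Data Lake Storage Gen2 path endpoints: PUT to create the file, PATCH to append the bytes, PATCH to flush (commit) them. The Python sketch below walks through that sequence with a service principal token. The tenant/client credentials and the target path are placeholders; test123 and aus are borrowed from the later posts in this thread.

    # Minimal sketch of uploading a CSV to ADLS Gen2 via the Data Lake
    # Storage Gen2 REST API (Path Create / Append / Flush), assuming a
    # service principal with Storage Blob Data Contributor on the container.
    import requests

    TENANT_ID = "<tenant-id>"          # placeholder
    CLIENT_ID = "<client-id>"          # placeholder
    CLIENT_SECRET = "<client-secret>"  # placeholder
    ACCOUNT = "test123"                # storage account name
    FILESYSTEM = "aus"                 # container (filesystem)
    PATH = "ebaw/sample.csv"           # hypothetical target path

    # 1. Get an OAuth 2.0 bearer token for the storage resource.
    token_resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://storage.azure.com/.default",
        },
    )
    token_resp.raise_for_status()
    headers = {
        "Authorization": f"Bearer {token_resp.json()['access_token']}",
        "x-ms-version": "2021-06-08",  # bearer auth needs a recent API version
    }

    url = f"https://{ACCOUNT}.dfs.core.windows.net/{FILESYSTEM}/{PATH}"
    data = b"id,amount\n1,100\n2,200\n"

    # 2. Create the (zero-length) file.
    requests.put(url, params={"resource": "file"},
                 headers=headers).raise_for_status()

    # 3. Append the bytes at offset 0.
    requests.patch(url, params={"action": "append", "position": "0"},
                   headers=headers, data=data).raise_for_status()

    # 4. Flush (commit) the appended data.
    requests.patch(url, params={"action": "flush", "position": str(len(data))},
                   headers=headers).raise_for_status()

In a pipeline, these three calls should map onto a REST Put snap (create) followed by two REST Patch snaps (append and flush), with the bearer token supplied by an OAuth2 account on the snaps.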
Oracle Execute with setting session variables

I would like to set the date format and timestamp format for the session and then run the select statement in an Oracle Execute snap, like this:

    begin
      execute immediate 'alter session set nls_date_format = ''MM/DD/YYYY''';
      execute immediate 'alter session set nls_timestamp_format = ''MM/DD/YYYY HH:MI:SS''';
    end;

    "Select * from " + $tableName + " Where updateDateYear >= " + $year

If I do this in two separate Oracle Execute snaps, it opens two separate sessions. I want to run this in a single snap.

Re: Oracle Table Partition Name as parameter

eval did not work for me. I solved it by adding a Mapper in front of the Oracle Execute snap, mapping the pipeline parameters to Mapper outputs, and using the Mapper outputs in the SQL. My new select statement is:

    "select * from obia1." + $tableName + " partition (" + $partName + ") where rownum < 11"

$tableName and $partName are outputs from the Mapper.

Oracle Table Partition Name as parameter

I have a child pipeline, called from a Pipeline Execute snap, which passes a partition name as a pipeline parameter to an Oracle Execute snap. The Oracle Execute SQL is:

    "select * from obia1.sales_invoice_lines partition (" + _partName + ") where rownum < 11"

The value of the parameter is not substituted into the SQL statement before it is executed. It fails with this error:

    Failure: SQL operation failed, Reason: error occurred during batching: ORA-00972: identifier is too long, error code: 17081, Resolution: Please check for valid Snap properties and input data.

Re: HDFS Writer Writing to ADLS Gen2 (Solved)

This is resolved. The issue was the role assignments for the service principal on the storage account: the service principal must be Storage Blob Data Contributor at the container level.

Re: HDFS Writer Writing to ADLS Gen2

I used abfs://aus@test123.dfs.core.windows.net/ebaw and I get a new error:

    REST API service endpoint returned error result: status code = 403, reason phrase = This request is not authorized to perform this operation using this permission., refer to the error_entity field in the error view document for more details. Resolution: Please check the values of Snap properties.

HDFS Writer Writing to ADLS Gen2

I am trying to upload a file to ADLS Gen2 storage, using the OAuth method to connect. The account creation and validation were successful, but the Writer fails with an error. The Application ID/Client ID used is assigned as Contributor on the storage account. In the HDFS Writer config, test123 is the storage account and aus is the directory within it. I tried adl://test123.dfs.core.windows.net/aus and it did not work. Any ideas?

Thanks,
-Venkat

Re: Pipeline To Output one File per Date (Solved)

Hi tlikarish, the samples helped in understanding how to create and pass parameters. Thank you.
GV

Pipeline To Output one File per Date

I need to get the distinct update_dt values from an Oracle table and write one file per date, suffixing the file name with the date value. Can you provide a sample pipeline with similar functionality? I have created two pipelines.

Pipeline 1: Oracle Execute (select trunc(update_dt) from inv_lines_stage group by trunc(update_dt)) -> Mapper -> ForEach

Pipeline 2: Oracle Select (select * from inv_lines_stage where trunc(update_dt) = $_updateDt), with _updateDt defined as a parameter in the pipeline settings -> Mapper -> CSV Formatter -> File Writer

My problem: I am not able to join these pipelines and pass the update_dt as a parameter to the second pipeline.
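The pattern being asked for here is the parent/child fan-out that Pipeline Execute enables: the parent pipeline emits one document per distinct date, and each document drives one run of the child pipeline with the date passed in as the _updateDt pipeline parameter. To make that flow concrete, here is a rough plain-Python rendering of the two pipelines; python-oracledb, the connection placeholders, and the file-name pattern are illustrative assumptions, not part of the original thread.

    # Sketch of the same fan-out outside SnapLogic: the parent query drives
    # one child call per distinct date, with the date passed as a parameter.
    import csv
    import oracledb

    conn = oracledb.connect(user="<user>", password="<password>",
                            dsn="<host>/<service>")  # placeholders

    def export_one_date(update_dt):
        """Child step: select one day's rows and write them to a dated CSV."""
        with conn.cursor() as cur:
            cur.execute(
                "select * from inv_lines_stage where trunc(update_dt) = :dt",
                dt=update_dt,
            )
            cols = [d[0] for d in cur.description]
            # Suffix the file name with the date value,
            # e.g. inv_lines_2023-01-15.csv (hypothetical naming).
            with open(f"inv_lines_{update_dt:%Y-%m-%d}.csv", "w",
                      newline="") as f:
                writer = csv.writer(f)
                writer.writerow(cols)
                writer.writerows(cur)

    # Parent step: one distinct date per row drives one child call.
    with conn.cursor() as cur:
        cur.execute("select trunc(update_dt) from inv_lines_stage "
                    "group by trunc(update_dt)")
        for (dt,) in cur.fetchall():
            export_one_date(dt)

The point mirrored here is that the date travels into the child step as a parameter; the two pipelines are never joined on data, which is why a plain join between them does not work.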