Salesforce to Azure Data Lake Storage
Submitted by @stodoroska from Interworks

This pipeline reads Opportunity records from Salesforce, matches them by opportunity type (via the Join and Filter Snaps), and stores the result as a CSV file in Azure Data Lake Storage.

Configuration: In Salesforce, you need a configured account. In the File Writer, define the Azure Data Lake Storage location and the Azure account.

Sources: Salesforce Opportunity
Targets: File (Azure Data Lake Storage)
Snaps used: Salesforce Read, Join, Filter, CSV Formatter, File Writer

Downloads: SFDC Joined Data to ADLS.slp (9.9 KB)
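For anyone who wants to prototype the same flow outside SnapLogic, here is a minimal Python sketch. It is an approximation, not the pipeline itself: it assumes the simple-salesforce and azure-storage-file-datalake packages; the credentials, account URL, container, field list, and opportunity types are placeholders; and the pipeline's Join/Filter matching is reduced to a plain type filter.

```python
from io import StringIO
import csv

from simple_salesforce import Salesforce
from azure.storage.filedatalake import DataLakeServiceClient

# Salesforce Read: query Opportunity records (credentials are placeholders).
sf = Salesforce(username="user@example.com", password="<password>",
                security_token="<token>")
records = sf.query_all("SELECT Id, Name, Type, Amount FROM Opportunity")["records"]

# Join/Filter: the pipeline matches opportunities by type; here that is
# simplified to keeping only a hypothetical set of types of interest.
wanted_types = {"New Customer", "Existing Customer - Upgrade"}
rows = [r for r in records if r.get("Type") in wanted_types]

# CSV Formatter: render the documents as CSV in memory.
buf = StringIO()
writer = csv.DictWriter(buf, fieldnames=["Id", "Name", "Type", "Amount"],
                        extrasaction="ignore")  # drop Salesforce metadata keys
writer.writeheader()
writer.writerows(rows)

# File Writer: upload the CSV to ADLS Gen2 (account/container are placeholders).
service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",
    credential="<account-key>",
)
file_client = (service.get_file_system_client("<container>")
                      .get_file_client("demo/opportunities.csv"))
file_client.upload_data(buf.getvalue(), overwrite=True)
```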
MQTT to Azure Data Lake

Submitted by @stodoroska from Interworks

This pipeline listens for messages on an MQTT queue, converts them into JSON documents, and writes them to Azure Data Lake.

Configuration: Configure the MQTT account with the broker server and queue, as well as the Azure account and the target ADLS location.

Sources: MQTT queue
Targets: Azure Data Lake
Snaps used: MQTT Consumer, JSON Parser, JSON Formatter, File Writer

Downloads: MQTT Consumer to Azure.slp (5.8 KB)
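As a rough non-SnapLogic illustration of the same flow, the sketch below uses paho-mqtt to consume messages and the ADLS Gen2 SDK to land each one as a JSON file. The broker host, topic, Azure account, container, and per-message file naming are all assumptions.

```python
import json
import time

import paho.mqtt.client as mqtt
from azure.storage.filedatalake import DataLakeServiceClient

# Azure account, container, and folder are placeholders.
service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",
    credential="<account-key>",
)
fs = service.get_file_system_client("<container>")

def on_message(client, userdata, msg):
    # JSON Parser + JSON Formatter: decode the payload and re-serialize it.
    doc = json.loads(msg.payload)
    data = json.dumps(doc) + "\n"
    # File Writer: one file per message; timestamp naming is a made-up scheme.
    fs.get_file_client(f"mqtt/{int(time.time() * 1000)}.json") \
      .upload_data(data, overwrite=True)

# MQTT Consumer: paho-mqtt 1.x style; 2.x also takes a CallbackAPIVersion arg.
client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.subscribe("demo/queue")
client.loop_forever()
```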
Read and Write to Azure DataLake Store (ADL)

A simple pipeline that reads from and writes to Azure Data Lake Store using the HDFS Reader and HDFS Writer Snaps. Both HDFS Snaps are configured to use the Azure Data Lake Store account. The HDFS Writer writes to an ADL URI of the following format, where "fielddatalakestore" is the name of the Data Lake Store:

adl://fielddatalakestore/data/demo/refined/output_csv/

Attached is the sample pipeline.

Downloads: Read Write to Azure DataLake Store (ADL)_2018_06_25.slp (7.1 KB)
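Since adl:// is the Azure Data Lake Store Gen1 scheme, an equivalent read/write outside the pipeline can be sketched with the azure-datalake-store Python package. The tenant/client credentials and the file name are placeholders; the store name and directory come from the URI above.

```python
from azure.datalake.store import core, lib

# Service-principal auth for the Gen1 store; credential values are placeholders.
token = lib.auth(tenant_id="<tenant-id>", client_id="<app-id>",
                 client_secret="<secret>")
adl = core.AzureDLFileSystem(token, store_name="fielddatalakestore")

path = "/data/demo/refined/output_csv/sample.csv"  # file name is hypothetical

# HDFS Writer equivalent: write a small CSV into the store.
with adl.open(path, "wb") as f:
    f.write(b"id,name\n1,demo\n")

# HDFS Reader equivalent: read it back.
with adl.open(path, "rb") as f:
    print(f.read().decode())
```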