Salesforce to Azure Data Lake Storage
Submitted by @stodoroska from Interworks

This pipeline selects opportunities from Salesforce, joins and filters them by opportunity type, and stores the result as a CSV file in Azure Data Lake Storage.

Configuration
Configure a Salesforce account for the Salesforce Read Snap. In the File Writer, define the Azure account and the Azure Data Lake Storage target location.

Sources: Salesforce Opportunity
Targets: File (Azure Data Lake Storage)
Snaps used: Salesforce Read, Join, Filter, CSV Formatter, File Writer

Downloads
SFDC Joined Data to ADLS.slp (9.9 KB)
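Outside SnapLogic, the same flow can be approximated in a few lines of Python. The sketch below is illustrative only and assumes the simple-salesforce and azure-storage-file-datalake packages; the credentials, container, file path, and opportunity types are hypothetical placeholders, and the Join/Filter step is collapsed into a single in-memory filter.

```python
# Illustrative sketch only (not the pipeline itself): the credentials, container,
# file path, and opportunity types below are hypothetical placeholders, and the
# Join/Filter Snaps are collapsed into a single in-memory filter.
import csv
import io

from simple_salesforce import Salesforce                      # pip install simple-salesforce
from azure.storage.filedatalake import DataLakeServiceClient  # pip install azure-storage-file-datalake

# 1. Read opportunities from Salesforce (Salesforce Read Snap equivalent).
sf = Salesforce(username="user@example.com", password="<password>", security_token="<token>")
records = sf.query_all("SELECT Id, Name, Type, Amount FROM Opportunity")["records"]

# 2. Keep only the opportunity types of interest (Join/Filter step, simplified).
wanted_types = {"New Business", "Renewal"}          # hypothetical types
matched = [r for r in records if r.get("Type") in wanted_types]

# 3. Format the matched records as CSV (CSV Formatter equivalent).
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["Id", "Name", "Type", "Amount"], extrasaction="ignore")
writer.writeheader()
writer.writerows(matched)

# 4. Write the CSV file to Azure Data Lake Storage (File Writer equivalent).
adls = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential="<account-key>",
)
file_client = adls.get_file_system_client("raw").get_file_client("salesforce/opportunities.csv")
file_client.upload_data(buffer.getvalue().encode("utf-8"), overwrite=True)
```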
MQTT to Azure Data Lake

Submitted by @stodoroska from Interworks

This pipeline listens for messages on an MQTT queue, converts them into JSON documents, and writes them to Azure Data Lake.

Configuration
Configure the MQTT account with the queue and server details, as well as the Azure account and the ADLS target location.

Sources: MQTT queue
Targets: Azure Data Lake
Snaps used: MQTT Consumer, JSON Parser, JSON Formatter, File Writer

Downloads
MQTT Consumer to Azure.slp (5.8 KB)
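For comparison, here is a minimal Python sketch of the same listen-parse-write loop, assuming the paho-mqtt and azure-storage-file-datalake packages. The broker address, topic, storage account, and paths are hypothetical placeholders, not values from the pipeline.

```python
# Illustrative sketch only: broker, topic, storage account, and paths are
# hypothetical placeholders. The constructor shown is paho-mqtt 1.x style; 2.x
# also expects mqtt.CallbackAPIVersion as the first argument to mqtt.Client().
import json
import time

import paho.mqtt.client as mqtt                               # pip install paho-mqtt
from azure.storage.filedatalake import DataLakeServiceClient  # pip install azure-storage-file-datalake

adls = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential="<account-key>",
)
filesystem = adls.get_file_system_client("landing")

def on_message(client, userdata, msg):
    # Parse the MQTT payload as JSON (JSON Parser equivalent) ...
    document = json.loads(msg.payload.decode("utf-8"))
    # ... then write each message out as its own JSON file
    # (JSON Formatter + File Writer equivalent).
    path = f"mqtt/{msg.topic}/{int(time.time() * 1000)}.json"
    filesystem.get_file_client(path).upload_data(json.dumps(document).encode("utf-8"), overwrite=True)

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)  # server from the MQTT account settings
client.subscribe("sensors/#")               # hypothetical topic
client.loop_forever()                       # keep listening, like the MQTT Consumer Snap
```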
Read and Write to Azure DataLake Store (ADL)

A simple pipeline that reads from and writes to Azure Data Lake Store using the HDFS Reader and HDFS Writer Snaps. The HDFS Snap is configured to use the Azure Data Lake Store account.

The HDFS Writer is configured to write to ADL using an ADL URI of the following format, where "fielddatalakestore" is the name of the container:

adl://fielddatalakestore/data/demo/refined/output_csv/

Attached is the sample pipeline: Read Write to Azure DataLake Store (ADL)_2018_06_25.slp (7.1 KB)
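The same read and write against the store named in the ADL URI can be sketched with the azure-datalake-store (Gen1) Python package. The service-principal credentials and file name below are placeholders; only the store name and directory path are taken from the URI above.

```python
# Illustrative sketch only: the service-principal credentials and file name are
# placeholders; only the store name and directory path come from the ADL URI above.
from azure.datalake.store import core, lib  # pip install azure-datalake-store (ADLS Gen1)

# Authenticate with the same Tenant ID / Access (client) ID / Secret Key used by
# the SnapLogic Azure Data Lake Store account.
token = lib.auth(tenant_id="<tenant-id>",
                 client_id="<access-id>",
                 client_secret="<secret-key>")

# adl://fielddatalakestore/... maps to store_name="fielddatalakestore".
adl = core.AzureDLFileSystem(token, store_name="fielddatalakestore")

# Write a file under the path targeted by the HDFS Writer ...
with adl.open("/data/demo/refined/output_csv/sample.csv", "wb") as f:
    f.write(b"id,name\n1,example\n")

# ... and read it back (HDFS Reader equivalent).
with adl.open("/data/demo/refined/output_csv/sample.csv", "rb") as f:
    print(f.read().decode("utf-8"))
```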
Reading and Writing Data to Azure Blob Storage and Azure Data Lake Store

Organizations have been increasingly adopting cloud data and analytics platforms like Microsoft Azure. In this first post in a series on the Azure Data Platform, I'll get you on your way to making your adoption of cloud platforms and data integration easier. In this post, I focus on ingesting data into the Azure Cloud Data Platform and demonstrate how to read and write data to Microsoft Azure Storage using SnapLogic. For those who want to dive right in, my 4-minute step-by-step video "Building a simple pipeline to read and write data to Azure Blob storage" shows how to do it without writing any code.

What is Azure Storage?
Azure Storage enables you to store terabytes of data to support small to big data use cases. It is highly scalable, highly available, and can handle millions of requests per second on average. Azure Blob Storage is one of the services provided by Azure Storage. Azure provides two key types of storage for unstructured data: Azure Blob Storage and Azure Data Lake Store.

Azure Blob Storage
Azure Blob Storage stores unstructured object data. A blob can be any type of text or binary data, such as a document or a media file. Blob storage is also referred to as object storage.

Azure Data Lake Store
Azure Data Lake Store provides what enterprises look for in storage today. It:

- Provides additional enterprise-grade security features, such as encryption, and uses Azure Active Directory for authentication and authorization.
- Is compatible with the Hadoop Distributed File System (HDFS) and works with the Hadoop ecosystem, including Azure HDInsight.
- Can be accessed directly by Azure HDInsight clusters that are provisioned and configured to use Data Lake Store.
- Allows data stored in Data Lake Store to be analyzed easily with Hadoop analytic frameworks such as MapReduce, Spark, or Hive.

How do I move my data to the Azure Data Platform?
Let's look at how you can read and write to the Azure Data Platform using SnapLogic. SnapLogic Snaps that support Azure accounts let you choose between an Azure Storage account and an Azure Data Lake Store account.

The Azure Storage account is configured in SnapLogic with the Azure storage account name and access key you get from the Azure Portal.

The Azure Data Lake Store account is configured in SnapLogic with the Azure Tenant ID, Access ID, and Secret Key you get from the Azure Portal.

Put together, you've got a simple pipeline that illustrates how to read and write to Azure Blob Storage.

In the next post in this series, I will describe approaches for moving data from your on-premises databases to Azure SQL Database.
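As a footnote to the Blob Storage half of this post, the snippet below sketches how the same account settings (storage account name plus access key) translate to a round-trip read and write with the azure-storage-blob Python package. The storage account, access key, container, and blob names are placeholders.

```python
# Illustrative sketch only: the storage account, access key, container, and blob
# names are placeholders. It mirrors the Azure Storage account settings above
# (account name + access key) using the azure-storage-blob package.
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

blob_service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential="<access-key>",  # access key from the Azure Portal
)
container = blob_service.get_container_client("demo")

# Write a small CSV blob (the "write" half of the pipeline) ...
container.upload_blob(name="input/sample.csv", data="id,name\n1,example\n", overwrite=True)

# ... and read it back (the "read" half).
print(container.download_blob("input/sample.csv").readall().decode("utf-8"))
```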