Creating a zip folder in Azure blob storage
I have a requirement where I receive a request with paths to many documents that are stored in Azure. My goal is to copy all these files to a temp folder in Azure and then zip them. (I am not able to figure out this part: how can I zip the folder in Azure via SnapLogic?) I can later download this zip file. Can someone please guide me?
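One general-purpose way to handle the zip step (for example in a scripting step or an external service, outside of any particular Snap) is to stream each document's bytes into a single in-memory zip archive and then upload or return the result. This is a minimal sketch of only the zipping part; the `blobs` mapping and the document names are hypothetical stand-ins for content you would first download from the temp folder in Blob Storage:

```python
import io
import zipfile

def zip_blobs(blobs):
    """Pack a mapping of {blob_name: bytes} into one zip archive in memory.

    Returns the raw bytes of the zip file, ready to be uploaded back to
    Blob Storage or returned for download.
    """
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        for name, data in blobs.items():
            archive.writestr(name, data)
    return buffer.getvalue()

# Hypothetical stand-ins for documents fetched from the temp folder in Azure.
docs = {
    "report.pdf": b"%PDF-1.4 ...",
    "notes.txt": b"meeting notes",
}
zipped = zip_blobs(docs)
```

Because the archive is built in memory, no local temp directory is needed; for very large document sets you would want a streaming or disk-backed variant instead.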
Read Write to Azure Blob Storage (wasb)

Here is a simple pipeline that performs a read/write to Azure Blob Storage, also referred to as WASB. This is done using the HDFS Snap, which has an Azure Storage Account associated with it. Below is a screenshot of how the HDFS Writer is configured. The URI to access the file on WASB has the format below, where “field-storage-account-container” is the name of the container:

wasb:///field-storage-account-container/data/demo/raw/

Attached is a sample pipeline: Read and Write to Azure Blob Storage_2018_06_25.slp (6.8 KB)
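In the short form shown above, the host part of the URI is left empty because the HDFS Snap resolves the storage account from its configured Azure Storage Account; only the container and path appear. There is also a fully qualified form that names the account explicitly. A small helper (hypothetical, for illustration only) that builds both shapes:

```python
def wasb_uri(container, path, account=None, secure=False):
    """Build a WASB URI.

    With no account, the host part is left empty and the storage account
    configured on the Snap is used (wasb:///container/path). With an
    account, the fully qualified form is produced.
    """
    scheme = "wasbs" if secure else "wasb"
    path = path.lstrip("/")
    if account is None:
        return f"{scheme}:///{container}/{path}"
    return f"{scheme}://{container}@{account}.blob.core.windows.net/{path}"

# Matches the short-form URI shown above.
uri = wasb_uri("field-storage-account-container", "data/demo/raw/")
```

The `wasbs` scheme is the TLS-secured variant of `wasb`; which one the endpoint accepts depends on the storage account's transfer settings.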
Reading and Writing Data to Azure Blob Storage and Azure Data Lake Store

Organizations have been increasingly moving toward and adopting cloud data and analytics platforms like Microsoft Azure. In this first post in a series on the Azure Data Platform, I'll get you on your way to making your adoption of cloud platforms and data integration easier. In this post, I focus on ingesting data into the Azure Cloud Data Platform and demonstrate how to read and write data to Microsoft Azure Storage using SnapLogic. For those who want to dive right in, my 4-minute step-by-step video "Building a simple pipeline to read and write data to Azure Blob storage" shows how to do what you want, without writing any code.

What is Azure Storage?

Azure Storage enables you to store terabytes of data to support small to big data use cases. It is highly scalable, highly available, and can handle millions of requests per second on average. Azure Blob Storage is one of the types of services provided by Azure Storage. Azure provides two key types of storage for unstructured data: Azure Blob Storage and Azure Data Lake Store.

Azure Blob Storage

Azure Blob Storage stores unstructured object data. A blob can be any type of text or binary data, such as a document or media file. Blob storage is also referred to as object storage.

Azure Data Lake Store

Azure Data Lake Store provides what enterprises look for in storage today. It:

- Provides additional enterprise-grade security features like encryption, and uses Azure Active Directory for authentication and authorization.
- Is compatible with the Hadoop Distributed File System (HDFS) and works with the Hadoop ecosystem, including Azure HDInsight.
- Allows Azure HDInsight clusters to be provisioned and configured to directly access data stored in Data Lake Store.
- Allows data stored in Data Lake Store to be easily analyzed using Hadoop analytic frameworks such as MapReduce, Spark, or Hive.

How do I move my data to the Azure Data Platform?
Let’s look at how you can read and write to the Azure Data Platform using SnapLogic. For SnapLogic Snaps that support Azure accounts, you can choose either an Azure Storage Account or an Azure Data Lake Store account.

Configuring the Azure Storage Account in SnapLogic is done as shown below, using the Azure storage account name and access key you get from the Azure Portal.

Configuring the Azure Data Lake Store Account in SnapLogic, as shown below, uses the Azure Tenant ID, Access ID, and Secret Key that you get from the Azure Portal.

Put together, you've got a simple pipeline that illustrates how to read and write to Azure Blob Storage.

In the next post in this series, I will describe approaches to move data from your on-prem databases to Azure SQL Database.
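The two account types described above need different credentials, which is a common source of setup errors. As a quick reference, the field sets can be captured in a small validation helper (a hypothetical sketch using the field names from this post, not SnapLogic's API):

```python
# Required credential fields for each Azure account type, per the post above.
REQUIRED_FIELDS = {
    "azure_storage": {"account_name", "access_key"},
    "azure_data_lake_store": {"tenant_id", "access_id", "secret_key"},
}

def missing_fields(account_type, config):
    """Return the credential fields still missing from a config dict, sorted."""
    return sorted(REQUIRED_FIELDS[account_type] - config.keys())

# Example: a storage account config that is missing its access key.
gaps = missing_fields("azure_storage", {"account_name": "mystorageacct"})
```

Checking for missing credentials up front, before running a pipeline, gives a clearer failure than the downstream access errors you would otherwise see.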
Using Azure Storage/Azure Blob Storage

I am trying to create an Azure Blob Reader, but I keep encountering this error:

Error: Directory not found or access denied: wasb:///storage@janoswsl.blob.core.windows.net/; Reason: com.snaplogic.snaps.hadoop.BinaryAzureAccount cannot be cast to com.snaplogic.snaps.hadoop.KerberosAccount; Resolution: Please check if the directory exists, you have access right and all Snap properties and credentials are valid.

Failure: Unable to read from wasb:///janoswsl.blob.core.windows.net/storage/globalweather.xml, Reason: java.io.IOException: Failed to open input stream for WASB path: wasb:///janoswsl.blob.core.windows.net/storage/globalweather.xml, detail: The specifed resource name contains invalid characters., Resolution: Please address the reported issue.

I also tried replacing wasb with wasbs, but I still get the same error. Here are some screenshots of my pipeline, and a Cyberduck screenshot: I used the Cyberduck tool to verify that I can successfully connect to the Azure storage. I am not sure if using a trial account for Azure is causing this.
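One likely culprit is the URI shape itself: in the fully qualified WASB form, the container and account belong in the authority part after exactly two slashes (wasb://container@account.blob.core.windows.net/path), whereas both URIs in the error messages have three slashes, which pushes the host into the path position. A small checker (a hypothetical sketch; the container/account rules here are simplified from Azure's naming restrictions) that flags the malformed URI from the error above:

```python
import re

# Fully qualified WASB form: wasb[s]://<container>@<account>.blob.core.windows.net/<path>
WASB_PATTERN = re.compile(
    r"^wasbs?://(?P<container>[a-z0-9](?:[a-z0-9-]{1,61}[a-z0-9])?)"
    r"@(?P<account>[a-z0-9]{3,24})\.blob\.core\.windows\.net(?P<path>/.*)?$"
)

def check_wasb_uri(uri):
    """Return (container, account, path) if the URI is well-formed, else None."""
    m = WASB_PATTERN.match(uri)
    if m is None:
        return None
    return m.group("container"), m.group("account"), m.group("path") or "/"

# The URI from the error message: three slashes, so no authority is parsed.
broken = "wasb:///janoswsl.blob.core.windows.net/storage/globalweather.xml"
# A corrected form, assuming "storage" is the container on account "janoswsl".
fixed = "wasb://storage@janoswsl.blob.core.windows.net/globalweather.xml"
```

Separately, the ClassCastException in the first error suggests the account attached to the Snap is not the type the Snap expected, so it is worth re-checking the account selection as well as the URI.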