The Databricks ETL Snap Pack lets users write documents from any SnapLogic pipeline into a Delta Lake table, model the data in flight, and orchestrate DBMS operations. Users can quickly create and manage Databricks accounts and build pipelines that replicate data from any SaaS or database source using SnapLogic's library of 600+ Snaps. These capabilities lower time to value and total cost of ownership for enterprises modernizing existing systems as well as for greenfield Databricks projects.
Databricks - Bulk Load
Databricks - Delete
Databricks - Insert
Databricks - Merge Into
Databricks - Multi Execute
Databricks - Select
Databricks - Unload
Ease of use
Automatically configures and manages the Databricks mount point and JDBC driver.
Connect any SnapLogic Snap to a Databricks Snap to flow data into a Delta Lake table.
Using SnapLogic, you can transform data as it moves to Databricks in an ETL fashion, or transform the data after it has landed in a Delta Lake table in an ELT fashion.
Account Storage Integration
Databricks Account (Source: ADLS Gen2)
Databricks Account (Source: Azure Blob Storage)
Databricks Account (Source: AWS S3)
Databricks Account (Source: Google Cloud Storage)
Databricks Account using JDBC (Source: Any SQL Database)
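To illustrate the JDBC-based account option above: a Databricks JDBC connection is typically configured with a URL of the following general shape (this is a sketch based on the standard Databricks JDBC driver, not a value taken from this document; the hostname, HTTP path, and access token are placeholders you would obtain from your Databricks workspace):

```
jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>
```

Here `AuthMech=3` selects username/password authentication, with the literal username `token` and a Databricks personal access token as the password.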