Created by @pkona
Integrations that ingest data into Hive require creating tables in Hive and defining schemas for data formats such as Avro.
This pattern generates the Avro schema by querying the table metadata of an Oracle table, and also generates the corresponding Hive CREATE TABLE statement.
Both the schema file and the DDL file are written to HDFS and the SnapLogic Database, and are returned as the output of the pipeline.
Sources: Oracle Table
Targets: Hive CREATE TABLE statement and Avro schema
Snaps used: JSON Generator, Oracle Execute, Hive Execute, Mapper, Copy, Join, JSON Formatter, File Writer, Pipeline Execute
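To illustrate the transformation the pipeline performs, here is a minimal Python sketch of turning Oracle column metadata (as returned by a query against a view like ALL_TAB_COLUMNS) into an Avro schema and an Avro-backed Hive DDL statement. The type mapping, function names, and paths are illustrative assumptions, not the pipeline's actual logic:

```python
import json

# Assumed mapping from Oracle data types to Avro primitive types;
# the actual pipeline's mapping may differ.
ORACLE_TO_AVRO = {
    "VARCHAR2": "string",
    "CHAR": "string",
    "NUMBER": "double",
    "DATE": "string",
    "TIMESTAMP": "string",
}

def build_avro_schema(table, columns):
    """columns: list of (name, oracle_type) pairs from a metadata query,
    e.g. SELECT column_name, data_type FROM all_tab_columns."""
    fields = [
        # A union with "null" keeps each field optional,
        # mirroring nullable Oracle columns.
        {"name": name, "type": ["null", ORACLE_TO_AVRO.get(otype, "string")]}
        for name, otype in columns
    ]
    return {"type": "record", "name": table, "fields": fields}

def build_hive_ddl(table, schema_url):
    # Avro-backed external table: Hive derives the columns
    # from the schema file referenced on HDFS.
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table}\n"
        "STORED AS AVRO\n"
        f"TBLPROPERTIES ('avro.schema.url'='{schema_url}')"
    )

cols = [("ID", "NUMBER"), ("NAME", "VARCHAR2")]
print(json.dumps(build_avro_schema("EMPLOYEES", cols), indent=2))
print(build_hive_ddl("EMPLOYEES", "hdfs:///schemas/employees.avsc"))
```

In the pattern, the equivalent work is done by the Oracle Execute, Mapper, and JSON Formatter Snaps, with the File Writer Snap persisting both artifacts to HDFS.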
Pattern 3 - Step 1.1 - Ingest - Generate Hive table Avro based on Oracle Table metadata.slp (22.0 KB)
Pattern 3 - Core - Write schema to HDFS.slp (8.2 KB) (child pipeline called by the ingest pipeline)