
2021 Enterprise Automation Award Nomination: Illumina: Scaling up data share across the organization while safeguarding data integrity

Karen
Former Employee

What were the underlying reasons or business implications for the need to automate business processes? Describe the business and technical challenges the company had.

Before onboarding SnapLogic, we were using the Informatica IDQ tool for our Data Quality (DQ) automation and for executing DQ validation rules on our business data sets.

Implementing a new DQ rule or modifying an existing one took considerable effort, for the following reasons:

  1. The Informatica IDQ skillset was limited to specific team members.

  2. Creating a DQ rule in Informatica IDQ based on a SQL statement is time consuming: we cannot reuse the same SQL we use to analyze our data sets, and each DQ rule in Informatica IDQ has to be developed from scratch, essentially reverse engineering the SQL submitted by Data Analysts & Data Engineers.

  3. Maintaining & managing the Informatica IDQ tooling incurs a separate cost.

  4. Onboarding a new data source system in Informatica requires additional configuration effort.

Describe your automation use case(s) and business process(es).

Because of the drawbacks we faced with Informatica IDQ, we wanted to explore how we could leverage the capabilities of SnapLogic to implement a Data Quality framework that is SQL friendly, i.e., everyone who knows SQL can implement DQ validation checks easily. This removes the skill dependency on Informatica IDQ & saves a huge amount on the cost of maintaining Informatica IDQ.

We implemented an Automated Data Quality Validation Framework using SnapLogic & Snowflake (cloud database).
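To illustrate the idea (the table and column names below are hypothetical, not taken from our actual data sets), a DQ validation check in this framework is simply a SQL statement that returns the records violating a rule:

    -- Hypothetical DQ rule: flag orders with a missing or future order date.
    -- Any rows returned by the query are treated as DQ failures.
    SELECT order_id,
           order_date
    FROM   sales.orders
    WHERE  order_date IS NULL
       OR  order_date > CURRENT_DATE();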

Benefits:

  1. Development time significantly reduced.
    We were able to deploy new DQ rules or enhance existing ones for our ever-growing data sets roughly 50 times faster than with Informatica IDQ.

  2. No dependency on an additional skill set (the Informatica IDQ tool).
    Everyone who can write SQL statements can create, modify & deploy DQ rules easily.
    This gives Data Analysts & Data Engineers the flexibility to quickly deploy DQ rules for their data sets, removing any dependency on a separate person or team specializing in Informatica IDQ.

  3. Onboarding new data sources is comparatively easy in SnapLogic;
    i.e., structured or unstructured data can be easily consumed using various prebuilt Snaps.

  4. Flexibility to leverage the best features of both SnapLogic & Snowflake.
    We gained additional capabilities to trigger alerts on DQ failures or DQ event occurrences using the Email Snap Pack in SnapLogic, & there is much more to explore for customizing our framework using the various Snap Packs available.

Describe how you implemented your automations and processes. Include details such as what SnapLogic features and Snaps were used to build out the automations and processes.

Our automation framework is built on two main components:

A. SnapLogic:

The complete DQ framework is created using various capabilities & features provided by the SnapLogic tool.

The framework is built as a template that is scalable & can be modified for future requests with minimal effort.

The following tasks were accomplished using SnapLogic:

  1. Reading data from multiple sources such as HANA, Snowflake, Excel files, etc., using various prebuilt data-reading Snap Packs.

  2. SQL statements created by Data Analysts & Data Engineers act as parameters for our framework; the parameter values change for each DQ scenario, & the framework generates results based on the incoming parameter values.

  3. We make use of the Task Scheduler feature of SnapLogic to schedule our Master Job, which triggers all the DQ checks defined in our config table maintained in Snowflake (see the sketch after the Snap Pack list below).

  4. The Email Snap Pack is used to trigger alerts to users in case of:
    i. a DQ rule failure;
    ii. a desired event occurring in a data set;
    iii. generating a summarized DQ execution report over email.
    iv. The REST & Twilio Snap Packs can also be used to generate text message alerts.

A few examples of the Snap Packs used in this framework are:
• Flow – Pipeline Execute, Router, Join, Sort, Union
• JDBC – Execute, Select
• Transform – Mapper
• Email – Email Sender
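
As a rough sketch of the config-driven approach referenced in point 3 above (the table, columns, and values here are illustrative assumptions, not our exact schema), each DQ rule lives as a row in a Snowflake config table, and the Master Job reads the active rows and hands each SQL statement to a child pipeline via Pipeline Execute:

    -- Illustrative Snowflake config table; the real schema may differ.
    CREATE TABLE IF NOT EXISTS dq_rule_config (
        rule_id        INTEGER,
        rule_name      STRING,
        source_system  STRING,    -- e.g. 'SNOWFLAKE', 'HANA'
        rule_sql       STRING,    -- the DQ check written by the analyst
        alert_emails   STRING,    -- recipients for the Email Snap Pack
        is_active      BOOLEAN
    );

    -- Deploying a new DQ rule is just an INSERT of the analyst's SQL.
    INSERT INTO dq_rule_config
    VALUES (101, 'orders_missing_date', 'SNOWFLAKE',
            'SELECT order_id FROM sales.orders WHERE order_date IS NULL',
            'dq-alerts@example.com', TRUE);

    -- The scheduled Master Job picks up the active rules to execute.
    SELECT rule_id, rule_name, rule_sql, alert_emails
    FROM   dq_rule_config
    WHERE  is_active = TRUE;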

B. Snowflake:

It's the database that stores our results & provides analytical capabilities.

  1. Helps maintain the config data for our SnapLogic framework.

  2. Stores the results of DQ rule executions.

  3. Provides further analytical capabilities by making our automated Data Quality execution results available on visualization dashboards (such as Tableau or Power BI) that are easily accessible to all our users.

Users can get a summarized as well as a detailed view of the Data Quality results on the dashboards &, if required, drill down to check the invalid records for a particular rule.

We are also able to generate historical trend analysis for the Data Quality rules deployed in our prod environment.
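
As a hypothetical example of how that trend analysis can be driven from the stored results (the dq_rule_results table below is an assumed shape, not our exact schema), a dashboard query might aggregate failures per rule per day:

    -- Assumed results table populated by the SnapLogic pipelines:
    -- dq_rule_results(rule_id, rule_name, run_ts, failed_record_count, status)
    SELECT rule_name,
           DATE_TRUNC('day', run_ts)    AS run_day,
           SUM(failed_record_count)     AS total_failed_records,
           COUNT_IF(status = 'FAILED')  AS failed_runs
    FROM   dq_rule_results
    GROUP  BY rule_name, DATE_TRUNC('day', run_ts)
    ORDER  BY run_day, rule_name;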

What were the business results after executing the strategy? Include any measurable metrics (ie. productivity improvement, speed improvement, % reduced manual inefficiencies, etc.)

Implementing the Data Quality framework on SnapLogic provided us great benefits & opened up many possibilities to explore using the various Snap Packs.

  1. Productivity Improvement:
    We can use the same SQL statements for implementing DQ checks that we use for data analysis, saving a lot of time by avoiding the setup time in Informatica IDQ.

  2. Speed Improvement:
    We are no longer dependent on the Informatica IDQ skillset, as we can deploy DQ rules almost 10 times faster using our SQL knowledge.

  3. Reduced Manual Inefficiencies:
    The DQ framework is a templatized model, i.e., a one-time effort is required to set up the pipelines & framework in SnapLogic, after which it is reused to execute multiple DQ rules (SQL queries).
    So unlike Informatica IDQ, where each rule requires manual mapping creation & configuration, we don't spend manual effort on repetitive tasks with our new framework.

  4. Cost Effective:
    We no longer have to spend extra money on maintaining Informatica IDQ servers & IDQ-specific resources to implement our DQ checks.
    We can achieve all Data Quality related requirements using SnapLogic & Snowflake.

  5. Easy Maintenance:
    Modifying an existing DQ rule is as easy as updating a SQL statement in a config table (see the example after this list).

  6. Scalable & Easily Customizable:
    Having the DQ framework on SnapLogic & Snowflake gives us great flexibility to customize it (using various Snap Packs) & scale it to different business requirements.
    We can leverage the features of both SnapLogic & Snowflake in this framework.

  7. Enabling Analytical Capabilities Using Visualization Tools:
    We are able to do trend analysis using the historical executions that are stored, & also create customized dashboards for daily monitoring & executive summary reports.
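
To make the "Easy Maintenance" point above concrete (reusing the hypothetical dq_rule_config table from the earlier sketch), tightening a deployed rule is a single UPDATE rather than rebuilding a mapping:

    -- Hypothetical: extend rule 101 to also flag future-dated orders.
    UPDATE dq_rule_config
    SET    rule_sql = 'SELECT order_id FROM sales.orders '
                   || 'WHERE order_date IS NULL OR order_date > CURRENT_DATE()'
    WHERE  rule_id = 101;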

Who was and how were they involved in building out the solution? (Please include the # of FTEs, any partners or SnapLogic professional services who were involved on the implementation)

This solution was designed & implemented by me (Ruchik Thakkar) with support & guidance from my manager, Jim Ferris. Both of us are FTEs at Illumina.

Anything else you would like to add?

Onboarding SnapLogic was a real game changer.
Not only did it help with our usual data onboarding process, we were also able to implement it successfully to develop a Data Quality framework.
SnapLogic gave us a way to templatize this solution,
so it's easy to reuse for other teams, easy to customize per business requirements, & scalable as needed.

(Attached screenshots: Screen Shot 2021-11-13 at 8.44.35 AM, Screen Shot 2021-11-13 at 8.44.43 AM)
