Recent Discussions
Infosys: Modernising the Client's Landscape Containing 500+ Legacy Interfaces
2022 Partner Innovation Award Nomination

What were the underlying reasons or business implications for the need to innovate and transform the business?
The client organisation was carved out of a large conglomerate and inherited a large set of legacy applications built on disparate legacy technologies. The client wanted to modernise this landscape and adopt a platform-led approach in a new cloud-based setup. The key challenges were:
- A heterogeneous integration landscape with 500+ interfaces spread across legacy technologies such as SAP PI, webMethods and Cast Iron, with end of life approaching fast
- Minimal documentation of the existing landscape
- High maintenance cost of several operative environments for platform and application support
- Limited skill availability on legacy platforms, leading to high cost and business continuity risk
- High licence and infrastructure cost
- Selecting the right platform to meet future needs
- Migrating 500+ interfaces involving multiple internal stakeholders, external vendors and third parties on an aggressive timeline

Which customer did you implement this on?
Siemens Energy Global GmbH

Describe your strategy and execution to solve the business challenge.
Given the business criticality and technical complexity, Infosys and Siemens Energy worked in close collaboration to devise a robust strategy and execution plan.

Strategy
1. After evaluating various platforms, SnapLogic was shortlisted as the platform of choice for its robust capabilities
2. The project timeline was aligned to the licence expiry dates of the legacy integration platforms
3. A wave-wise approach was finalised based on business criticality, application availability and technology grouping
4. Ensure zero business impact, with on-par performance and no end-system impact
5. Feasibility checks to confirm connectivity with the various applications
6. Focus on maximising the use of standard features and reducing customisation for better maintainability
7. Rationalisation of the number of interfaces, in collaboration, for better performance and cost reduction

Execution
- Cloud and on-premise application integrations were built using the various Snaps available
- Built-in Snaps were used to connect to applications such as Workday and SAP HANA
- REST/SOAP connectors were used to integrate non-standard applications securely (see the sketch below)
- Security for internal and third-party applications/vendors was handled using various authentication mechanisms
- Real-time vs. batch processing requirements were met using the appropriate connectors and supporting parameters provided by SnapLogic
- The SnapLogic Dashboard was used extensively for optimised monitoring of pipeline executions
- Threads were processed in parallel for improved performance
- SnapLogic architects and SMEs were consulted on architecture, design and performance
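As a generic illustration of how an external, non-standard application can call a SnapLogic triggered task over HTTPS with token-based authentication — a minimal sketch only; the URL and token below are placeholders, not this project's actual endpoints:

```python
"""Illustrative only: invoking a SnapLogic triggered task over HTTPS.

The task URL and bearer token are placeholders, not the endpoints
used in this migration.
"""
import requests

TASK_URL = "https://elastic.snaplogic.com/api/1/rest/slsched/feed/MyOrg/MyProject/MyTask"
TOKEN = "<bearer-token-issued-for-the-task>"  # placeholder credential

def invoke_task(payload: dict) -> dict:
    """POST a JSON document to the triggered task and return its response."""
    response = requests.post(
        TASK_URL,
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=60,
    )
    response.raise_for_status()  # surface HTTP errors instead of failing silently
    return response.json()

if __name__ == "__main__":
    print(invoke_task({"orderId": "12345", "action": "sync"}))
```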
Who was and how were they involved in building out the solution?
To migrate 500+ legacy interfaces, a peak team of 25 people from Infosys was involved, with support from Siemens Energy integration experts and application teams. When issues required support from the SnapLogic product team, regular interactions were organised to find suitable solutions.

What were the business results after executing the strategy?
Siemens Energy achieved the following business results through the successful execution of the migration programme:
- The number of interfaces was reduced by approximately one third in the SnapLogic landscape
- Technical complexity was reduced by consolidating four platforms into a single platform, leading to significant cost reduction for platform/application support and skills
- A modern cloud-based setup with assured future scalability
- Introduction of APIs
- Improved predictability of business continuity

What was the ROI the customer gained from executing the strategy?
The overall ROI gained by the customer from the success of the migration programme:
- ~50% reduction in licence and infrastructure cost
- ~30% projected cost reduction for platform and application support

Anything else you would like to add?
Key highlights of the migration programme:
- A focus on maximising the use of standard features and reducing customisation for better maintainability
- Even extremely complex requirements were implemented with standard SnapLogic features, eliminating the existing custom code
- A component tracker was introduced: an innovative way to maintain and access information about interfaces in a single place


Hampshire Trust Bank: Building a More Agile Financial Services Organization
2022 Business Impact Award Nomination
Submitted by ⋮IWConnect on behalf of Hampshire Trust Bank.

What were the underlying reasons or business implications for the need to automate business processes?
As a UK-based specialist bank focused on providing asset finance, specialist mortgages, and development finance solutions, Hampshire Trust Bank (HTB) is continuously innovating and implementing sophisticated solutions to optimize internal business processes. These processes range from Business Intelligence (BI) and reporting to screening and payments integrations. HTB has a clear strategy of acquiring best-of-breed applications to maximize business efficiency and value, with new systems that share data and processes within the existing software ecosystem. Each system has its own technology stack on one side and data formats on the other. As a result of these differences, data interoperability became one of the main challenges affecting HTB's operational efficiency. Furthermore, because the data contains personally identifiable information, cryptography solutions such as X.509 and PGP had to be implemented. Once data has been consolidated, screening and monitoring processes are orchestrated to ensure data validity.

Paramount among HTB's concerns were to:
- support the increased processing of payment transactions per month
- improve the process of data screening to ensure data validity
- become more agile and increase customer satisfaction

With process resilience of critical importance for any financial services organization, SnapLogic was used to move workload between data centers or strategically use the cloud depending on availability and requirements.

Describe your automation use case(s) and business process(es).

Use Cases
- Integration of payments, general ledger, direct debit, and direct credit transactions from the asset finance software Alfa to internal and external payment systems
- Integration of more than 100 reports from Alfa into HTB's data warehouse
- Integration of the Finova Core Banking Platform's processes for retail savings customers
- Risk and fraud detection, with transaction screening implemented with ComplyAdvantage

Business Processes
- Fully automated processes for customer and employee screening
- Payments integration with Bottomline and other external providers
- Consolidation of data into a unified Snowflake data warehouse for advanced BI analysis and reporting

Describe how you implemented your automations and processes.

Planning, defining and documenting
The process starts by defining the business flow, requirements, and edge cases. Once these have been defined, the design is documented in detail. A team of business users and technical experts reviews the documentation and approves the design.

Implementation
The implementation of an automation starts by creating pipelines and other assets such as parameterizations or storage locations. Pipelines are executed as Scheduled, Triggered and Ultra tasks. JMS consumers in asynchronous mode are used to read messages from message queues. REST Snaps are used to read data and execute processes against REST endpoints. PGP Snaps are used for cryptography operations, and the Script Snap is used for custom cryptographic operations (see the sketch below). Data is transformed with a combination of Mapper Snaps. Results are written to various destination file systems using File Writer Snaps. Log messages are produced and transferred to Splunk. JIRA tickets are created automatically using the JIRA Create Snap in the case of exceptions. Additionally, email messages and xMatters alerts are sent to the users. Expression libraries are used extensively to parametrize configurations across multiple staging environments. Once development is complete, a detailed quality assurance process runs on multiple environments before the solution is finally deployed to production.
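HTB's PGP handling runs inside SnapLogic's PGP and Script Snaps; purely as an illustration of the kind of operation involved, here is a minimal Python sketch using the python-gnupg package. The file name and recipient are hypothetical and a local GnuPG keyring is assumed:

```python
"""Illustrative sketch of PGP-encrypting a payments file before transfer.

Assumes the python-gnupg package and a local GnuPG keyring; the file
name and recipient ID are hypothetical, not HTB's actual configuration.
"""
import gnupg

gpg = gnupg.GPG()  # uses the default local GnuPG home directory

def encrypt_file(path: str, recipient: str) -> None:
    """Encrypt `path` for `recipient`, writing `<path>.pgp` alongside it."""
    with open(path, "rb") as f:
        result = gpg.encrypt_file(f, recipients=[recipient], output=path + ".pgp")
    if not result.ok:
        raise RuntimeError(f"PGP encryption failed: {result.status}")

encrypt_file("payments_batch.csv", "payments@example-bank.co.uk")
```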
What were the business results after executing the strategy?
Integrations combined with specialized banking systems allow processes to complete faster, more efficiently and with fewer errors across all of those systems, dramatically optimizing business processes. Currently, more than 50 different systems and apps are automated and interoperable in production. Using SnapLogic automations, HTB can support the increased demand and process tens of thousands of payment transactions per month, as well as thousands of screening operations each month. Tens of thousands of pipeline executions produce millions of documents every day. Automation plays a key role both in digital transformation and in developing a more agile business. Team members can now focus on higher-priority tasks such as strategy, security and customer relationship management.

Who was and how were they involved in building out the solution?
The solutions were built by a team composed of internal domain experts, consultants from the external systems, architects, integrators and quality assurance professionals. Five FTEs were engaged in the projects. In a relationship that has lasted three years, ⋮IWConnect acts as the integration implementation partner and assists in the implementation of multiple projects, each lasting from three to 12 months.


Tyler Technologies: Building an Integration Toolkit for their Platform
What were the underlying reasons or business implications for the need of application integrations?
The Courts and Justice division of Tyler Technologies primarily focuses on our Enterprise Justice platform and our connected products, providing first-class business functionality to state and local government court systems. When new clients purchase our software, they often have existing applications that require integration with our platform. To meet these integration requirements, we have built an Integration Toolkit that provides inbound API and outbound configurable publishing support to our clients. Additionally, we provide custom development for integrations that a client may be unable to build through our toolkit because of functionality gaps, knowledge gaps, or lack of development resources. Using this development model, we have compiled over 200 XML/SOAP APIs over the past 15 years, plus several customizable interfaces through our publisher.

As we modernize our platform and develop new connected applications, we must maintain functionality and interfaces for our existing clients while providing new feature sets for modern applications. Recently, we developed a new application using serverless AWS technologies and RESTful APIs to provide browser-based warrant entry functionality for court officials and officers. Delivery timelines and limited customization options in a new product provided a unique opportunity to modernize our integration strategies while maintaining functionality for our existing clients.

Describe your application innovation strategy and how you executed the strategy.
Our long-term modernization strategy involves moving towards cloud-native solutions. We would also like to expand the functionality of our toolkit to allow our clients to be more self-sufficient at building and owning their application integrations. Given our monolithic application, we need mechanisms that provide iterative improvements while maintaining our existing functionality and our expected level of service.

The new warrants application is my division's first foray into a serverless application. While many aspects of serverless environments are compelling, we needed a mechanism to communicate with our existing server-based platform. Traditionally, we would develop a custom .NET solution on our integration framework to merge the two systems. While this approach has served us well in the past, development effort is required to modify or enhance these solutions, which can distract from modernization efforts. Rather than building the required middleware on our platform, we decided this was a great opportunity to leverage an out-of-the-box integration platform, with the hope that SnapLogic could provide a mechanism to build client-specific integrations through configuration and allow development teams to remain focused on modernization.

The first step was to build a custom Snap that would allow us to interface with our existing XML/SOAP API services in a model that fits well within the SnapLogic application. We translated our API schemas into JSON schemas so that we could build custom endpoints that more accurately modeled the business requirements of the new serverless application, while abstracting away the underlying XML structures (a rough illustration follows below). Next, we leveraged this custom Snap to build application-specific Ultra Tasks that provided the required functionality without updating our underlying APIs. This allowed for the flexibility required for greenfield development.
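Tyler's XML-to-JSON abstraction lives inside a proprietary custom Snap; purely as an illustration of the underlying idea, the following sketch round-trips a SOAP-style XML payload through JSON using the xmltodict package. The element names are invented and do not reflect Tyler's actual API schemas:

```python
"""Illustration of abstracting an XML/SOAP payload behind JSON.

Uses the xmltodict package; the element names are invented for the
example and do not reflect Tyler's actual API schemas.
"""
import json
import xmltodict

soap_body = """
<WarrantRequest>
  <CaseNumber>2022-CR-00123</CaseNumber>
  <Subject>
    <FirstName>Jane</FirstName>
    <LastName>Doe</LastName>
  </Subject>
</WarrantRequest>
"""

# Parse the XML into nested dicts, then emit JSON for a REST consumer.
as_dict = xmltodict.parse(soap_body)
print(json.dumps(as_dict, indent=2))

# And the reverse: a JSON document from the serverless app back to XML.
print(xmltodict.unparse(as_dict, pretty=True))
```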
With Enterprise Justice platform integration abstracted, we branched out to other areas of the SnapLogic platform. We implemented interfaces that handle SNS subscriptions, interact with a legacy IBM MQ solution, and connect to various SFTP sites. Additionally, the REST Snaps allowed interaction with the REST/JSON services provided by the new serverless application. With the integration up and running, we were able to leverage our local Snaplex environment to provide logging through S3 to Datadog, enabling dashboard and monitoring support for an application that had lacked a unified logging strategy.

Leveraged SnapLogic functionality: Ultra Tasks, custom Snaps and Accounts, REST Snaps, file access Snaps, Script Snaps (with side-loaded Apache libraries), and JMS Snaps.

Attached documents:
- Network Architecture.png: displays our Snaplex environment and connected systems
- Buffer Queue Interaction.png: displays one example of our SnapLogic integration patterns
- Project Config.expr: expression library to configure log settings
- Log Message Handler.slp: logging pipeline that writes to local disk (mounted file gateway) for the Datadog agent to consume
- Error Message Handler.slp: error handling that remaps system and business errors as necessary and logs business errors to the Enterprise Justice application for user resolution; also writes to local disk (mounted file gateway) for the Datadog agent to consume. The "log message" Snap is our proprietary custom Snap.

Who was and how were they involved in building out the solution? (Please include the # of FTEs, any partners or SnapLogic professional services who were involved on the implementation)
The core development team consisted of 3 FTEs, with another 10 developers rotating in as necessary to build specific functionality for additional integrations and for logging and monitoring solutions. We attended Snap development training and SnapLogic training, and worked with our account manager and SnapLogic technical support as we encountered challenges.

What were the business results after implementing these application integrations?
By abstracting our SOAP/XML API messages, we were able to focus on the requirements of integrating new and existing applications through configuration rather than custom development. This allowed for rapid development and the flexibility to deliver interfaces through conversion activities, changing requirements, and newly developed functionality and applications. Using existing SnapLogic logs and our custom logging solution, we provided integration dashboards and monitors that give our stakeholders a real-time view of integration health, with alerts when issues arise.

On an average day we handle 20k SNS webhook messages through our Ultra Tasks. Additionally, we process thousands of triggered synchronous API requests, tens of thousands of JMS messages, and 100k+ log writes. From performance testing we know our environment can handle 3-5 times an average day's volume without issue, which gives our customers confidence that we can handle their operations during peak volumes.

Anything else you would like to add?
SnapLogic provides an effective abstraction of our existing Integration Toolkit, allowing flexibility in developing new integration solutions through configuration. This allows customizable API endpoints that are application-specific while allowing development to modernize our existing platform.
⋮IWConnect: Helping NTT Accelerate Delivery of IT Services to its Customers
What were the underlying reasons or business implications for the need to innovate and transform the business?
As a global leader in IT infrastructure and services and part of the $20 billion IT services provider NTT Ltd., Tokyo-based NTT is widely acclaimed for delivering a secure and connected future that empowers people, clients and communities, transforms businesses and technologies, and drives innovative outcomes. Against a backdrop of high customer expectations, impressive organic growth and an industry-leading position, NTT was looking to accelerate its delivery of IT services to its customers, improve business performance and optimize employee efficiency by leveraging data and technology.

Paramount among NTT's concerns for maintaining customer satisfaction and accelerating IT services delivery were to:
- automate and improve the process of data exchange between various external applications and NTT's ServiceNow IT Service Management software,
- replace the outdated legacy integration tool with a new integration platform that would contain existing customer data and transition seamlessly, allowing continued operations without any downtime, and
- ensure adherence to the company's high standards for service quality by improving the documentation of company projects and processes.

Built on top of the SnapLogic platform, NTT used the ⋮IWC legacy-integration-tool-to-SnapLogic Accelerator – an accelerator that requires no coding and automates code migration – to address specific issues and challenges, including:
- manual migration of existing code to a new integration solution (SnapLogic), affecting employee productivity and complete code migration,
- smooth transitioning to SnapLogic without downtime that might affect customer service, and
- the absence of thorough documentation of past projects and processes, which made the transition even more challenging.

Which customer did you implement this on?
NTT

Describe your strategy and execution to solve the business challenge.
At a macro level, SnapLogic was used to replace the outdated legacy integration platform and act as a bridge between external applications and NTT's ServiceNow ITSM software. Leveraging SnapLogic's capabilities, NTT achieved optimal business performance and improved process and project documentation, accelerating time to value and improving customer satisfaction.

At a micro level, the ⋮IWC accelerator connected the dots between the new SnapLogic integration platform and the outdated legacy platform. The innovation is particularly evident in the development of an accelerator that can migrate code and integrations from an outdated integration platform to SnapLogic quickly and easily. The accelerator automatically gathers existing code and integrations from the legacy integration tool and transfers them to SnapLogic by reusing the legacy tool's configuration and components. It shortens the time needed to migrate existing integrations to SnapLogic, eliminates a significant percentage of the associated manual work, and ensures there is no impact on the end systems or end clients. The accelerator enabled the new integration platform to function seamlessly, improving business performance and accelerating IT services delivery.
The planning phase consisted of:
1. Researching and analyzing how to transition from the outdated integration platform to SnapLogic quickly, easily, and without loss of existing data
2. Researching and analyzing the legacy integration tool as an integration platform
3. Defining the approach
4. Designing the solution

The ⋮IWC accelerator was built following SnapLogic best practices and reverse engineering. The design included an apples-to-apples mapping of the original functionality of the legacy integration tool to respective code in SnapLogic that reuses the legacy tool's XSLT configurations and staging capabilities, creating code that essentially replicates the legacy tool's behavior on a different platform. All of the mechanisms – HTTP calls, XML and XSLT transformations, queueing, and keeping the lifecycle of a ticket, among others – are preserved (see the sketch at the end of this post). After the code is exported from the legacy integration tool, it is imported into SnapLogic, and with just configuration and some unit testing, the code is ready to go live.

Who was and how were they involved in building out the solution?
- 1 FTE from ⋮IWConnect for building the accelerator
- A team of ~25 people – SnapLogic developers, legacy integration tool developers, quality assurance engineers, and project managers – for migrating all of the existing clients quickly and efficiently with the best quality

What were the business results after executing the strategy?
As a result of the innovative SnapLogic migration accelerator, NTT experienced a wide range of benefits that positively impacted its business performance, customer relations and employee efficiency, including:
- An exponential improvement in business performance as a result of the fast and easy replacement of an outdated integration platform with a new one
- Complete, automated and secure migration of existing code to SnapLogic
- Smooth transitioning to a new integration platform while keeping the existing code intact and maintaining operational efficiency
- Increased employee productivity through the elimination of manual code migration and writing code from scratch
- Ensured service quality by improving control and traceability of the company's projects and processes
- Enabled NTT to provide a unified client experience through NTT's Digital Services Platform

What was the ROI the customer gained from executing the strategy?
The ⋮IWC accelerator has delivered a wide range of benefits for NTT and its employees. Among them:
- Existing code is migrated from the legacy integration tool to SnapLogic automatically, saving employees time and delivering value to customers as quickly as possible.
- The accelerator facilitates a secure and smooth transition from one integration platform to another without affecting NTT's clients and their systems.
- A time-effective integration migration project has resulted in a 50% increase in delivery speed.
- Standardized project documentation makes processes easy to follow and decreases the chance of errors.
- Costs for delivering IT services to clients have been reduced while delivering higher-quality services on a more consistent basis.
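The accelerator itself is proprietary, but the core move it relies on — reusing an exported XSLT stylesheet unchanged on a new platform — can be illustrated generically. A minimal sketch using lxml, where the file names are hypothetical stand-ins for the legacy tool's exported XSLT configuration and an inbound ticket message:

```python
"""Generic illustration of reusing an existing XSLT stylesheet.

Uses lxml; the file names are hypothetical and stand in for the legacy
tool's exported XSLT configuration and an inbound ticket message.
"""
from lxml import etree

# Load the stylesheet exported from the legacy integration tool.
stylesheet = etree.XSLT(etree.parse("legacy_ticket_transform.xslt"))

# Apply it to an inbound XML message, preserving the original mapping logic.
source_doc = etree.parse("inbound_ticket.xml")
transformed = stylesheet(source_doc)

print(etree.tostring(transformed, pretty_print=True).decode())
```

Because the stylesheet is applied as-is, the mapping logic never has to be rewritten — only the plumbing around it changes.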
2021 Enterprise Automation Award Nomination: The Worcester Polytechnic Institute: Move to the Cloud

What were the underlying reasons or business implications for the need to automate business processes? Describe the business and technical challenges the company had.
In 2016, Worcester Polytechnic Institute (WPI) started the transition from their legacy ERP system, Banner, to Workday. The primary driver was to increase efficiency for departments, students, and faculty.

Business challenges: Departments like the Registrar, Academic Affairs, Student Affairs, Athletics, and the President's Office were impacted by the slowness of obtaining the information they needed for reports, analytics, and real-time Covid information from the applications they interacted with, which negatively impacted the business and the student experience. The greatest impact to the university was obtaining student data and historical information for analytical purposes alongside day-to-day operations.

Technical challenges: Because the departments were impacted by the slowness of data updates and integrations, the IT department determined that they needed a better solution for acquiring student and HCM data so that their systems could operate efficiently. The school needed a more robust solution that would improve business continuity. The legacy system provided limited capabilities and did not receive feature updates that would address the university's concerns. In addition, Banner had limited support for cross-functional exchange of information between Salesforce, Tableau, and Amazon Redshift, which further limited the delivery of additional integrations to the departments. This prompted the team to identify Workday as their new student and HCM system, along with the need for an integration platform.

When considering an integration platform, the IT department needed one that was compatible with Workday, since Workday was going to be their single source of truth. They needed an integration tool with capabilities that would help them in the future, such as broad connectivity and compatibility with other applications and systems (including database connections), the ability to consume data over REST, and ease of use. At the beginning of the transition, there were hundreds of integrations that needed to be migrated from Banner to Workday, so having a product that was easy to use reduced development wait time and increased efficiency in moving integrations from build to production.

Describe your automation use case(s) and business process(es).
Since adopting SnapLogic, WPI has achieved many integration use cases:
- Move to the cloud: A phased approach, first migrating HCM and Finance from Banner to Workday, then moving Student to Workday.
- Analytics and reporting: Bring data from various data sources (eProjects, Salesforce, Workday, etc.) into Amazon Redshift, their enterprise cloud data warehouse, and build analytics and dashboards in Tableau.
For example:
- They bring in data from WPI eProjects so that the research department can view trends and information on projects
- The Registrar has access to analytics and reports around student applications, acceptance, and enrollment
- Covid-19 dashboards keep the President, staff, faculty and students up to date on Covid-19 cases, vaccination rates, the number of people recovered from Covid-19, etc.
- They also bring data from different sources into Workday Prism, Workday's analytics application, so that the Registrar can build reports directly in Workday

SnapLogic is also used to run process workflows for departments such as Finance, HR, University Advancement, Enrollment, Undergraduate Enrollment, Library, Athletics, Police, Parking, Benefits, Payroll, and Cherwell. All of the departments rely on SnapLogic to keep their operational processes running.

Describe how you implemented your automations and processes. Include details such as what SnapLogic features and Snaps were used to build out the automations and processes.
The IT team took a phased approach for the Banner-Workday transition project, first migrating HCM and Finance and then migrating Student as a second phase.

Snaps used: Workday, Workday Prism Analytics, Tableau, SQL Server, and Salesforce, along with core Snaps such as Joiner, Router, and Mapper.

What were the business results after executing the strategy? Include any measurable metrics (ie. productivity improvement, speed improvement, % reduced manual inefficiencies, etc.)
The IT team is self-sufficient: they do not have to depend on third-party consultants to build and maintain integrations, or rely on point-to-point, out-of-the-box integrations. Everyone on the IT team can build and deliver integrations using SnapLogic. This provides redundancy and team collaboration, which helps create a simplified data architecture, allowing the department to pursue a data management strategy rather than focusing on short-term quick fixes (i.e., out-of-the-box integrations).

Metrics include:
- The migration from Banner to Workday – Finance and HCM – took about two years, and the team migrated over 10 years' worth of data.
- The team currently moves millions of records each day into Amazon Redshift for data analytics (see the sketch at the end of this post).
- Real-time data is updated in Workday and Tableau for reporting.
- Over 250 integrations have been developed to run the campus's processes.

Who was and how were they involved in building out the solution? (Please include the # of FTEs, any partners or SnapLogic professional services who were involved on the implementation)
5 developers use SnapLogic today to build and deliver integrations for the entire campus.

Anything else you would like to add?
The WPI IT team uses SnapLogic as their enterprise integration platform and will continue to build up their SnapLogic skills so that integrations are scalable and don't rely on consultants to manage them.
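WPI's loads run through SnapLogic Snaps; purely as an illustration of the standard bulk-load pattern into Amazon Redshift referenced above — COPY from S3 rather than row-by-row inserts — here is a sketch using psycopg2. The cluster, table, bucket, and IAM role are placeholders, not WPI's actual configuration:

```python
"""Illustrative only: the standard bulk-load pattern for Amazon Redshift.

COPY from S3 rather than row-by-row inserts; the cluster, table, bucket,
and IAM role below are placeholders, not WPI's actual configuration.
"""
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="loader", password="***",
)

COPY_SQL = """
    COPY student_enrollment
    FROM 's3://example-bucket/enrollment/2021-10-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-loader'
    FORMAT AS CSV IGNOREHEADER 1;
"""

with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)  # Redshift ingests the staged files in parallel
```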
2021 Enterprise Automation Award Nomination: Illumina: Scaling up data share across the organization while safeguarding data integrity

What were the underlying reasons or business implications for the need to automate business processes? Describe the business and technical challenges the company had.
Before onboarding SnapLogic, we used the Informatica IDQ tool for our Data Quality (DQ) automation, executing DQ validation rules on our business data sets. Implementing a new DQ rule or modifying an existing one was a comparatively large effort, for the following reasons:
- Informatica IDQ skills were limited to specific team members.
- Creating a DQ rule in Informatica IDQ based on a SQL query is time consuming, as we cannot reuse the same SQL that we use to analyze our data sets; each DQ rule in Informatica IDQ has to be developed from scratch – effectively reverse engineering the SQL submitted by data analysts and data engineers.
- Maintaining and managing the Informatica IDQ tools incurs a separate cost.
- Onboarding a new data source system in Informatica requires additional configuration effort.

Describe your automation use case(s) and business process(es).
Because of the drawbacks we faced with Informatica IDQ, we wanted to explore how we could leverage the capabilities of SnapLogic to implement a data quality framework that is SQL friendly – i.e., everyone who knows SQL can implement DQ validation checks easily – thus removing the skill dependency on Informatica IDQ and saving a huge amount on its cost of maintenance. We implemented an automated data quality validation framework using SnapLogic and Snowflake (cloud database).

Benefits:
- Development time significantly reduced: we were able to deploy new DQ rules, or enhance existing ones for our ever-growing data sets, roughly 50 times faster compared to Informatica IDQ.
- No dependency on an additional skill set (the Informatica IDQ tool). Everyone who can write SQL statements can create, modify and deploy DQ rules easily. This gives data analysts and data engineers the flexibility to quickly deploy DQ rules for their data sets, removing any dependency on a separate person or team specializing in Informatica IDQ.
- Onboarding new data sources is comparatively easy in SnapLogic: structured or unstructured data can be consumed using various prebuilt Snaps.
- Flexibility to leverage the best features of both SnapLogic and Snowflake. We gained additional capabilities to trigger alerts on DQ failures or DQ events using the Email Snap Pack in SnapLogic, and there is much more to explore for customizing our framework using the various available Snap Packs.

Describe how you implemented your automations and processes. Include details such as what SnapLogic features and Snaps were used to build out the automations and processes.
Our automation framework is built on two main components.

A. SnapLogic: The complete DQ framework is created using the various capabilities and features of SnapLogic. The framework is built as a template that is scalable and can be modified for future requests with minimal effort. The following tasks were accomplished using SnapLogic:
- Reading data from multiple sources such as HANA, Snowflake, and Excel files using the various prebuilt data-reading Snap Packs.
- SQL statements created by data analysts and data engineers act as parameters for our framework; the parameter values change for each DQ scenario, and the framework generates results based on the incoming values.
- The Task Scheduler feature of SnapLogic schedules our master job, which triggers all DQ checks defined in our config table maintained in Snowflake (a sketch of this config-driven pattern follows below).
- The Email Snap Pack triggers alerts to users in case of (i) DQ rule failure, (ii) a desired event occurring in a data set, or (iii) a summarized DQ execution report over email; (iv) the REST and Twilio Snap Packs can be used to generate text message alerts.
- Examples of the Snap Packs used in this framework: Flow (Pipeline Execute, Router, Join, Sort, Union), JDBC (Execute, Select), Transform (Mapper), and Email (Email Sender).

B. Snowflake: The database that stores our results and provides analytical capabilities. It:
- maintains the config data for our framework in SnapLogic,
- stores the results of DQ rule executions, and
- provides further analytics by making our automated data quality execution results available on visualization dashboards (such as Tableau and Power BI) that are easily accessible to all our users. Users can get both summarized and detailed views of data quality results and, if required, drill down to inspect the invalid records for a particular rule. We can also generate historical trend analysis for the DQ rules deployed in our prod environment.
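A minimal sketch of the config-driven pattern described above, assuming a hypothetical DQ_RULES table holding a rule name and a SQL statement that returns violating rows. Connection details are placeholders, and the real framework is built from SnapLogic pipelines rather than standalone Python:

```python
"""Minimal sketch of a config-driven DQ check executor.

Assumes a hypothetical DQ_RULES table (RULE_NAME, RULE_SQL) where each
RULE_SQL returns the rows that violate the rule; connection parameters
are placeholders. The real framework is built as SnapLogic pipelines.
"""
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="dq_user", password="***",
    warehouse="DQ_WH", database="DQ_DB", schema="CONFIG",
)

def run_all_rules() -> None:
    cur = conn.cursor()
    cur.execute("SELECT RULE_NAME, RULE_SQL FROM DQ_RULES WHERE ACTIVE = TRUE")
    for rule_name, rule_sql in cur.fetchall():
        violations = conn.cursor().execute(rule_sql).fetchall()
        status = "PASS" if not violations else f"FAIL ({len(violations)} rows)"
        # Record the outcome so dashboards and trend analysis can use it.
        conn.cursor().execute(
            "INSERT INTO DQ_RESULTS (RULE_NAME, STATUS, RUN_TS) "
            "VALUES (%s, %s, CURRENT_TIMESTAMP())",
            (rule_name, status),
        )

run_all_rules()
```

Adding a new rule is then just an INSERT into DQ_RULES — which is exactly why the SQL-friendly approach removed the per-rule development effort IDQ required.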
What were the business results after executing the strategy? Include any measurable metrics (ie. productivity improvement, speed improvement, % reduced manual inefficiencies, etc.)
Implementing the data quality framework on SnapLogic gave us great benefits and opened up many possibilities to explore with the various Snap Packs:
1. Productivity improvement: we can use the same SQL for implementing DQ checks that we use for data analysis, saving significant time by avoiding the setup effort in Informatica IDQ.
2. Speed improvement: we are no longer dependent on the Informatica IDQ skill set; we can deploy DQ rules almost 10 times faster using our SQL knowledge.
3. Reduced manual inefficiencies: the DQ framework is a templatized model – a one-time effort was needed to set up the pipelines and framework in SnapLogic, after which it is reused to execute multiple DQ rules (SQL queries). Unlike Informatica IDQ, where each rule requires manual mapping creation and configuration, we no longer spend manual effort on repetitive tasks with the new framework.
4. Cost effective: we no longer spend extra money maintaining Informatica IDQ servers and IDQ-specific resources to implement our DQ checks; we achieve all data quality requirements using SnapLogic and Snowflake.
5. Easy maintenance: modifying any existing DQ rule is as easy as updating a SQL statement in a config table.
6. Scalable and easily customizable: having the DQ framework on SnapLogic and Snowflake gives us great flexibility to customize (using various Snap Packs) and scale for different business requirements, leveraging the features of both SnapLogic and Snowflake.
7. Analytical capabilities through visualization tools: we can do trend analysis using stored historical executions, and create customized dashboards for daily monitoring and executive summary reports.

Who was and how were they involved in building out the solution? (Please include the # of FTEs, any partners or SnapLogic professional services who were involved on the implementation)
This solution was designed and implemented by myself (Ruchik Thakkar) with support and guidance from my manager, Jim Ferris. Both of us are FTEs at Illumina.

Anything else you would like to add?
Onboarding SnapLogic was a real game changer.
It not only helped with our usual data onboarding processes; we were also able to use it successfully to develop a data quality framework. SnapLogic let us templatize this solution, so it is easy for other teams to reuse, easy to customize to business requirements, and scalable as needed.


Q and A with Kalyan Venkat: Timely Metrics from Varied Applications While Meeting Audit Requirements
What were the underlying reasons for the need of data?
We needed accurate and timely operational and strategic key business metrics from varied applications – which had been challenging with multiple ETL tools, data refresh schedules, and data integration points.

Describe your data strategy and how you executed the strategy.
The design focuses on building patterns where 80% of data movement between source (application) objects and target objects in the Azure cloud data lake is tied to executing patterns in SnapLogic. The pattern approach eliminates the significant development and testing effort otherwise required for new and updated data from source systems. Using patterns for data movement allows for a well-thought-out design that triggers data movement only on pattern-certified objects. The data fetch process within a pattern is also designed to be codeless, meaning that like-to-like attributes between source and target are mapped dynamically in the pipeline (see the sketch at the end of this post) – this eliminates the need to manage and maintain the code otherwise required for querying data from the source and updating the target system.

Example: one of our enterprise applications has 350+ tables that need to be updated in the Azure cloud data lake. Instead of developing 350+ one-to-one Snap pipelines, the pattern framework significantly expedited the development and delivery schedule – in total there were just 45 pipelines handling both delta and full-load data refreshes to the data lake. Compare this with the legacy ETL solution, which required a one-to-one workflow for each source and target table refresh.

Who was and how were they involved in building out the solution?
We leveraged two offshore SnapLogic Snap development resources and one internal FTE to deliver the Azure big data solution.

What were the business results after achieving this data strategy?
Using SnapLogic truly expedited the design, build and delivery of data solutions and met business needs in a timely manner, with the ability to integrate data from 13+ systems (enterprise apps, cloud apps, third-party SaaS data, and the list goes on). Most importantly, key data metrics are captured during pipeline execution, data orchestration, data ingestion and post-process certification – a key requirement from Internal Audit to ensure that data integrity is not compromised.

Anything else you would like to add?
So far we have deployed over 3,000 Snap pipelines in production, and at most we allocate half an hour per week to reviewing exceptions in production pipeline processing. This clearly demonstrates the strength of SnapLogic as a product for moving large volumes of data between source and target; applying a well-thought-out design up front greatly benefits the building of robust, flexible Snap pipelines.
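As a generic illustration of the like-to-like mapping idea described above (not the actual SnapLogic pattern, which is codeless): given a source row and the target object's column list, copy only the attributes whose names match, so schema additions flow through without code changes. Table and column names here are hypothetical:

```python
"""Generic illustration of like-to-like attribute mapping.

Not the actual SnapLogic pattern (which is codeless); column names are
hypothetical. The idea: map only the attributes that exist on both
sides, so schema additions need no code change.
"""

def map_like_to_like(source_row: dict, target_columns: list[str]) -> dict:
    """Keep only the source attributes that have a matching target column."""
    return {col: source_row[col] for col in target_columns if col in source_row}

source_row = {"ORDER_ID": 42, "STATUS": "OPEN", "LEGACY_FLAG": "Y"}
target_columns = ["ORDER_ID", "STATUS", "LOAD_TS"]  # LOAD_TS filled elsewhere

print(map_like_to_like(source_row, target_columns))
# {'ORDER_ID': 42, 'STATUS': 'OPEN'}
```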
WD-40: Moving Digital Transformation when there are Delays

What were the underlying reasons or business implications for the need of application integrations?
WD-40 Company's underlying reason for application integration was to keep our digital transformation program moving while our new ERP implementation was delayed. Our team was challenged by our VP of Technology, Doug Cyphers, to connect our legacy ERP – which has no exposed API suite – to our new Salesforce CRM application, and SnapLogic played a key role in accomplishing this. The Salesforce application delivered ROI returns that paid for themselves in less than one year. The team delivered needed functionality, and traditionally back-office order-processing visibility, to our sales, merchandiser and marketing teams.

The outdated legacy ERP application with no exposed API suite made it a challenge to connect 12 critical business functions across 3 releases in 3 different Salesforce Clouds: Consumer Goods Cloud, Sales Cloud and Service Cloud. The team worked fearlessly to connect data types that included:
- National account and contact management
- Lead journey campaign management
- Requesting, viewing and amending sales orders
- Inter-department workflow
- Sales forecasting
- Grocery market planning and broker activities
- Sales merchandiser activities
- Enhanced mobile and Outlook integration capabilities

The business and technical challenge faced by WD-40 Company was the need for a more modern and scalable CRM application that met the needs of a growing business and could quickly scale our teams' capabilities. Application integrations helped by connecting the Salesforce CRM to the other major strategic applications within the enterprise application portfolio and the complete technology ecosystem, surgically removing the legacy CRM and replacing it with a modern SaaS solution in Salesforce.

Describe your application innovation strategy and how you executed the strategy.
Our guiding principles and innovation strategy had several focus areas:
- Provide a consistent user experience
- Reduce license costs – if it is not a user's primary application, we don't want to pay license costs for secondary application use; when needed, an integration would be leveraged to provide visibility into key data points
- If one application has the functionality out of the box, that is considered our first option for the source system
- Look to consolidate tools and applications – WD-40 Company has a small but mighty team, so supportability and sustainability require a reduced application footprint
- Easy-to-use, low-code/no-code options are our first choice – we don't have the luxury of an in-house development team
- Think globally, and avoid bi-directional synchronization whenever possible

SnapLogic was key in fulfilling these tenets by providing a best-of-class software-as-a-service middleware application and easily scaling to our changing needs and requirements. The Salesforce connectors have been heavily utilized in our use cases, and we are very happy with the scheduled, near-real-time, and real-time integrations across 3 major strategic applications: the ERP, the CRM, and our advanced supply/demand planning tool, John Galt Atlas.

Who was and how were they involved in building out the solution?
- Shawna Barczi, WD-40 Salesforce Administrator – provided WD-40 Salesforce subject matter expertise and assisted in requirements gathering and testing
- Carrie Craig, WD-40 Director of Enterprise Business Applications – provided program vision and overall architectural strategy guidance
- Craig Douglas, WD-40 EDI & eCommerce Manager – SnapLogic administrator; provided legacy ERP integration assistance
- Rhea Jurf, WD-40 Manager, IT Project and Portfolio Management – provided project management for all Salesforce and integration projects
- Daylene Maughan, WD-40 Enterprise Applications Administrator – provided legacy ERP integration assistance
- Ali Sharifian, SnapLogic Professional Services Architect – designed and developed the SnapLogic integrations
- Ryan Shaw, Slalom Principal Consultant – provided Salesforce integration consulting assistance
- Claudia Sittman, Slalom Salesforce Consultant – provided Salesforce integration consulting assistance

What were the business results after implementing these application integrations?
We are now able to integrate our applications with Salesforce via 15 integrations in SnapLogic, spanning over 30 pipelines. Over the past year, more than half a billion documents have been processed through the WD-40 SnapLogic production environment. This has yielded benefits including:
- Within Salesforce Consumer Goods Cloud (CGC), the merchandiser team generated $60K in additional orders in the first 2 months based on information and notifications compiled from Salesforce CGC. The team would not have been able to target a "host buy" to the stores in need without that information.
- Our sales team went into their merchandised locations and noted a "needs order" when an item was in stock according to the system, all from their mobile phones while in the store. This allowed 2 host buys to be initiated, one for ~$20K and one for ~$40K, mitigating the uncovered issues and getting product into those locations. The result for the rest of FY22 was approximately a 50% reduction in out-of-stocks and a net $155K savings.
- There was a significant forward-looking opportunity to improve merchandising and sales compliance. This proved true, with an additional $112K in sales based on better data access within the first month of launch.
- Our merchandiser team had previously achieved only 80% compliance in stores they visited every month. With the additional percentage of sales generated from these placements, and a more trustworthy real-time measurement of compliance, the team uses this information to have more data-led conversations with our Merchandising Execution Team partners and our merchants. WD-40 Company has had multiple conversations that resulted in conversion to a display in Garden with 100% compliance within the first quarter of the Salesforce Consumer Goods Cloud rollout. This translated to 1,000 more units sold every week over the first 3 weeks after launch.

Anything else you would like to add?
We are very proud of what we accomplished, especially because the legacy ERP application had no exposed API suite or the modern capabilities found in most software technologies today. We were challenged to think outside the box, and I just kept telling the team to try it – it should work!


GlobalLogic: Managing and Transforming Data in Real Time with Multiple Integrated Applications
What were the underlying reasons or business implications for the need to automate business processes?
The following challenges implied the need to automate our business processes:
a) More than 24 hours to sync data from Oracle ERP to various reporting tools.
b) A lot of manual intervention was required to operate project management activity; to address this, FinancialForce project Order to Invoice was implemented.
c) Manual addition and removal of each GlobalLogic employee in their respective JIRA/GPS (Global People Service) groups.
d) Post-Covid, the company runs on a hybrid model with no need to assign designated seats to employees, so we integrated IBM TRIRIGA as our integrated workplace management system and fed all employee, department and project data into it.
e) Different logic/platforms/environments gathering the same set of data, producing erroneous results.
f) Delays in time to completion.

Describe your automation use case(s) and business process(es).
GlobalLogic wanted a modern integration tool capable of moving data in real-time, batch, scheduled and cron-job fashion, so that data transport (and transformation where required) could happen across our various systems, replacing our existing legacy integration tool, which could not sync data in real time or on cron schedules. GL's main use case is integrating various on-premises and cloud applications. To achieve this objective, GL selected the SnapLogic integration platform after reviewing multiple other ETL tools available in the industry. Using SnapLogic, GL integrated multiple applications in parallel within 6-9 months of obtaining the initial product license.

The major use cases were:
a) Integration of Oracle ERP data into Salesforce (scheduled and real-time jobs).
b) Extracting employee data from Oracle ERP and assigning employees to their respective Atlassian JIRA groups using the Atlassian JIRA APIs (a sketch of this kind of call follows below).
c) GL has its own GLO page where recruiters can create vacancies and generate offer letters; this needs real-time data from Oracle ERP based on recruiter inputs.
d) Feeding various base data from Oracle ERP to Anaplan for forecasting.
e) Real-time data sync between IBM TRIRIGA and Google G Suite Calendar, keeping both systems in sync on all conference room reservations.
f) Taking GlobalLogic ex-employees from Oracle ERP and assigning them to a designated project in GL GPS (Global People Service) so they can follow up with the HR team on any query or clarification after leaving GL, and revoking that access again based on their full-and-final (FNF) settlement.

Describe how you implemented your automations and processes.
All of our use cases move data from one application to another – mainly from Oracle ERP and the data warehouse to Salesforce, Atlassian JIRA, the GLO application, IBM TRIRIGA, HireEZ, Eightfold, SQL Server, Google G Suite, Anaplan, Coupa, Signy and some external vendor applications. We achieved this with core and premium Snap Packs such as REST API, Salesforce, MySQL, SQL Server, Anaplan, Oracle, Email and SOAP, FeedMaster features, and the basic core Snaps.
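As a rough illustration of use case (b) — the kind of call the Atlassian JIRA REST API exposes for group membership. The endpoint shown follows the Jira Server v2 style; the base URL, group name, and credentials are placeholders, and the exact endpoint should be checked against the Jira deployment in use:

```python
"""Rough illustration of adding a user to a JIRA group via REST.

Endpoint shown follows the Jira Server v2 style
(POST /rest/api/2/group/user); the base URL, group name, and
credentials are placeholders.
"""
import requests

JIRA_BASE = "https://jira.example.com"
AUTH = ("svc-integration", "***")  # placeholder service account

def add_user_to_group(username: str, group: str) -> None:
    resp = requests.post(
        f"{JIRA_BASE}/rest/api/2/group/user",
        params={"groupname": group},
        json={"name": username},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()

# e.g. driven by an employee feed extracted from Oracle ERP
add_user_to_group("jdoe", "project-alpha-developers")
```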
What were the business results after executing the strategy?
With SnapLogic, overall data transport from the ERP to Salesforce improved by more than 90%: millions of records from the ERP now arrive in less than 15 minutes, where previously 3-4 employees manually extracted ERP data into Excel and generated reports with Excel formulas. Thanks to these integrations, refreshed data is available in Salesforce for reporting. The flexibility, speed and ease of use of the SnapLogic platform allowed us to remove the time-based bottleneck in getting data and refreshed reports, increasing productivity by more than 75%. Before this implementation, managers and business owners had to wait 24-48 hours for the various entity reports – and the data was 2 days behind; now business owners get near-real-time data across entities in one click.

Who was and how were they involved in building out the solution?
Initially this was handled by me alone, responsible for the initial implementation: developing and setting up the SnapLogic org administration configuration – Groundplexes, FeedMasters, user groups, etc. – to establish a structured SnapLogic org. When we joined GlobalLogic, we took charge of SnapLogic as technical architects, with the primary role of configuring the SnapLogic application for business use in line with SnapLogic design principles. We were also involved in purchasing additional SnapLogic org, node and FeedMaster licenses as requirements grew, working with the SnapLogic sales team and coordinating with Harmeet Grewal and our CSM, Nidhi Sohal.

Later we hired another SnapLogic-experienced employee (Abhishek.raj2@globallogic.com), and together we took charge of all existing and new integrations, creating various automated, reusable SnapLogic pipelines that save significant effort on repetitive work driven by different input values. We are currently integrating with multiple applications including Salesforce, Atlassian, Coupa, Signy, Anaplan, LearnUpon, Oracle ERP, the data warehouse, Google G Suite, IBM TRIRIGA, HireEZ, Eightfold, iMocha and many more to come.

SnapLogic Technical Architect: Subhash Chandra
Integration Lead Consultant: Abhishek Raj

Anything else you would like to add?
GlobalLogic's investment in the SnapLogic integration platform – a true enterprise platform in the cloud – has brought tremendous benefits, speeding up the business objectives tied to all of our integrations and data availability, resulting in increased growth and productivity.

[Image: Integration scope roadmap of SnapLogic in GlobalLogic]


Caterpillar: Using SnapLogic for Multiple Use Cases
What were the underlying reasons or business implications for the need to automate business processes?
Many.

Describe your automation use case(s) and business process(es).

Outbound IDoc traffic: Handling outbound IDoc traffic from multiple SAP environments presents a challenge in many ways. Our first generation: the Outbound IDoc Distribution integration automates IDoc data handling, using an IDoc Document Listener Snap to capture all outbound IDoc types for a particular SAP connection and route them to an IDoc-type-specific queue in IBM MQ. SnapLogic integrations that process IDoc-type-specific data read it from the queues and deliver it to various target systems. A second-generation version is coming soon that will run on our new Groundplex and use NAS storage in place of MQ.

Error handling: I built an error pipeline that has been widely adopted across 200+ project folders in our org stack. The pipeline offers ServiceNow incident (INC) ticket creation and email functionality. Because pipelines that reference an error handler do not fail on errors, our external scheduling tool (Tidal) is removed from the support model.

Usage tracking/customer charges: I built the first generation of our SnapLogic usage integration, which read execution log data from the SnapLogic public runtime API. As we grew, a rebuild changed the data pull from a full pull every hour to a once-a-day read that scans recursively across all project folders using pagination in the REST Snap (see the sketch below). This latest version has proven sturdy in handling our rapidly growing execution load.

Tidal automation: I built the first generation of our SnapLogic Tidal automation model. This model uses a Web Service Adapter in Tidal to make API calls to SnapLogic via DataPower with OAuth2 authentication. It is a self-contained, reliable model, but it has a maximum run time of 1 hour due to the 1-hour cloud firewall timeout. A second-generation model is about to start development; it will use a Tidal agent instead of a Web Service Adapter and will include a Perl script. The script will handle OAuth2 and loop through execution status checks against the SnapLogic public runtime API.

Initialization errors: I built an integration that generates tickets for pipelines that fail even though they reference an error pipeline. These failures fall into the category of infrastructure or initialization issues. This pipeline closes a glaring hole in the support model, since node-level events or pipeline initialization errors cause the error pipelines themselves to fail as well, meaning integrations would otherwise fail without generating tickets or emails.
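A minimal sketch of the paginated runtime-API read described under usage tracking, assuming the public endpoint form GET /api/1/rest/public/runtime/{org} with limit/offset parameters. The org name and token are placeholders, and the exact parameter and response field names should be verified against the current SnapLogic API documentation:

```python
"""Minimal sketch of paging through SnapLogic pipeline execution logs.

Assumes the public runtime endpoint GET /api/1/rest/public/runtime/{org}
with limit/offset parameters; the org name and token are placeholders,
and the exact parameter/field names should be verified against the
current SnapLogic API documentation.
"""
import requests

BASE = "https://elastic.snaplogic.com/api/1/rest/public/runtime/MyOrg"
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential
PAGE_SIZE = 100

def fetch_executions(last_hours: int = 24) -> list[dict]:
    """Pull one day's executions page by page instead of hourly full pulls."""
    entries, offset = [], 0
    while True:
        resp = requests.get(
            BASE,
            params={"last_hours": last_hours, "limit": PAGE_SIZE, "offset": offset},
            headers=HEADERS,
            timeout=60,
        )
        resp.raise_for_status()
        page = resp.json().get("response_map", {}).get("entries", [])
        entries.extend(page)
        if len(page) < PAGE_SIZE:  # a short page means we've reached the end
            return entries
        offset += PAGE_SIZE

print(f"{len(fetch_executions())} executions in the last 24 hours")
```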