Bridging Legacy OPC Classic Servers (DA, AE, HDA) to SnapLogic via OPC UA Wrapper
Despite significant advances in industrial automation, many critical devices still rely on legacy OPC Classic servers (DA, AE, HDA). Integrating these aging systems with modern platforms presents challenges such as protocol incompatibility and the absence of native OPC UA support, while modern integration and analytics platforms increasingly depend on OPC UA for secure, scalable connectivity. This post addresses these challenges by demonstrating how the OPC UA Wrapper can seamlessly bridge OPC Classic servers to SnapLogic. Through a practical use case, detecting missing-reset anomalies in sawtooth wave signals from an OPC Simulation DA Server, you'll discover how to enable real-time monitoring and alerting without costly infrastructure upgrades.
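To make the anomaly concrete, here is a minimal Python sketch of the detection idea (not the post's actual pipeline): a healthy sawtooth climbs to a known peak and resets to its base value, so a sample that keeps climbing past the expected peak suggests a missed reset. The peak and tolerance values below are invented for illustration.

# Minimal sketch of the missing-reset check; EXPECTED_PEAK and TOLERANCE
# are illustrative values, not taken from the original post.
EXPECTED_PEAK = 100.0
TOLERANCE = 5.0

def detect_missed_resets(samples):
    """Yield (index, value) wherever the signal overshoots the reset point."""
    for i, value in enumerate(samples):
        if value > EXPECTED_PEAK + TOLERANCE:
            yield i, value

# A sawtooth that resets once (99 -> 2) and then fails to reset.
signal = [10, 40, 70, 99, 2, 35, 68, 101, 134, 167]
for idx, val in detect_missed_resets(signal):
    print(f"Missing reset suspected at sample {idx}: value={val}")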
Scalable Analytics Platform: A Data Engineering Journey

Explore SnapLogic's innovative Medallion Architecture approach for handling massive data volumes and improving analytics with S3, Trino, and Amazon Neptune. Learn about cost reduction, scalability, data governance, and enhanced insights.

Industrial IoT – Turbine Lubrication Oil Level Monitoring & Alert Mechanism via OPC UA and SnapLogic
In the energy sector, turbine lubrication oil is mission-critical. A drop in oil level or pressure can silently escalate into major failures, unplanned shutdowns, and expensive maintenance windows. In this blog, we showcase a real-world implementation using SnapLogic and OPC UA, designed to:

🔧 Continuously monitor turbine lubrication oil levels
📥 Ingest real-time sensor data from industrial systems
📊 Store telemetry in data lakes for analytics and compliance
📣 Send real-time Slack alerts to engineers before failures strike

This IIoT-driven solution empowers energy providers to adopt predictive maintenance practices and reduce operational risk. A rough Python illustration of the alerting step follows below.
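This sketch is not the blog's actual pipeline; it reduces the pattern to comparing each telemetry reading against a threshold and notifying a Slack channel via an incoming webhook. The threshold and webhook URL are placeholders.

import json
import urllib.request

# Placeholder values -- substitute your own threshold and Slack webhook URL.
MIN_OIL_LEVEL = 40.0  # percent
SLACK_WEBHOOK = "https://hooks.slack.com/services/<your-webhook-path>"

def check_reading(turbine_id: str, oil_level: float) -> None:
    """Alert engineers when the lubrication oil level drops below the threshold."""
    if oil_level >= MIN_OIL_LEVEL:
        return
    message = {"text": f"Turbine {turbine_id}: oil level at {oil_level:.1f}% "
                       f"(threshold {MIN_OIL_LEVEL}%)"}
    req = urllib.request.Request(
        SLACK_WEBHOOK,
        data=json.dumps(message).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

check_reading("T-101", 31.5)  # would post an alert

Industrial IoT – OPC UA Real-Time Motor Overheat Detection and Auto-Shutdown Using SnapLogic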
Industrial motors are critical assets in manufacturing and process industries, where overheating can result in costly downtime or catastrophic failure. In this blog, we demonstrate how SnapLogic and OPC UA were used to build a real-time, event-driven pipeline that detects motor overheating, initiates an automated shutdown, logs events for auditing, and notifies the maintenance and engineering team.

Revolutionizing Software Testing: How LLMs are Powering Automated Test Case and Data Generation
Tired of writing endless test cases and crafting complex test data manually? Discover how Large Language Models (LLMs) are transforming the QA landscape by automating test case and test data generation with remarkable accuracy and speed. In this article, we explore how LLMs, when paired with tools like SnapLogic Agent Creator, can accelerate testing cycles, boost coverage, and reduce QA effort by up to 90%. Step into the future of intelligent, AI-driven software testing.

Unlocking the Power of LLMs with OpenAPI Tool Integration
Large Language Models (LLMs) are revolutionizing the way we interact with digital systems, from conversational agents to intelligent automation. But to truly harness their capabilities, especially in enterprise and developer ecosystems, it is essential to bridge the gap between LLMs and external systems through tools, specifically APIs. This is where OpenAPI plays a pivotal role.

What is OpenAPI?

OpenAPI (formerly Swagger) is an open-source specification that defines a standard, machine-readable format for describing RESTful APIs. It enables developers and automated systems to understand an API's structure, including endpoints, request parameters, authentication methods, and response types, without relying on traditional documentation or access to source code. Its adoption spans industries such as technology, finance, and healthcare, thanks to its interoperability with a wide array of tools and frameworks.

Why OpenAPI Matters for LLMs

Integrating OpenAPI with LLMs enhances their ability to interact with real-world systems. Here's how:

Universal Interface: OpenAPI acts as a universal bridge to RESTful APIs, making it possible for LLMs to interact with services ranging from cloud infrastructure to productivity apps.
Standardized Format: The standardized schema helps LLMs accurately interpret API functionality, including expected inputs and outputs, without ambiguity.
Accelerated Tool Creation: Developers can efficiently build LLM-compatible tools by parsing OpenAPI definitions directly.
Seamless Integration: With broad support from API tooling ecosystems, OpenAPI enables quick embedding of LLM agents into existing workflows.
Supports Tool Calling: Tool calling allows LLMs to autonomously select and invoke relevant APIs based on user prompts, a key feature unlocked by structured OpenAPI descriptions.

Enabling LLM Tool Calling with SnapLogic

To connect LLMs with OpenAPI-defined tools, the OpenAPI Function Generator Snap plays a crucial role. This component converts any OpenAPI spec into a tool object that LLMs can use through the Tool Calling pipeline in SnapLogic.

Input Options for the Generator Snap

The generator supports multiple input methods:

URL: Directly fetch the OpenAPI spec from a provided URL.
Text Editor: Paste the raw spec into a built-in editor.
Input Document: Pass the OpenAPI string as part of an input document via an expression.
File Upload: Select a spec file stored in the SLDB.

Output Structure

The generated tool output includes:

sl_tool_metadata: Metadata such as security parameters, headers, and base URLs.
json_schema: A schema of the input parameters.

These tools can be passed into the Tool Calling Snap, which then resolves runtime variables like headers and endpoint URLs dynamically. Developers can chain this with an HTTP Client Snap to perform real API calls based on LLM outputs.

Passing Through the Tool Calling Snap

When the tool is passed through the Tool Calling Snap, it dynamically processes and resolves several key components using the metadata and user input:

Resolved URL: The base URL and path parameters from the OpenAPI spec are combined with user-supplied values to generate the final API endpoint.
Headers: Custom and content-type headers are filled in based on the OpenAPI security definitions or context provided by the LLM.

This resolved output makes it simple for downstream Snaps (like the HTTP Client) to directly execute the API call.
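To illustrate what such a spec-to-tool conversion does conceptually, here is a Python sketch. It is a simplification, not SnapLogic's implementation: it walks an OpenAPI v3 spec and emits one tool definition per operation in the widely used JSON-schema "function" format. The file name in the usage line is hypothetical.

import json

def openapi_to_tools(spec: dict) -> list:
    """Turn each OpenAPI (v3) operation into an LLM 'function' tool definition."""
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            if method not in ("get", "post", "put", "patch", "delete"):
                continue  # skip path-level keys such as "parameters"
            props, required = {}, []
            for p in op.get("parameters", []):
                props[p["name"]] = {
                    "type": p.get("schema", {}).get("type", "string"),
                    "description": p.get("description", ""),
                }
                if p.get("required"):
                    required.append(p["name"])
            tools.append({
                "type": "function",
                "function": {
                    "name": op.get("operationId") or f"{method}_{path}",
                    "description": op.get("summary", ""),
                    "parameters": {"type": "object",
                                   "properties": props,
                                   "required": required},
                },
            })
    return tools

# Usage (hypothetical file name):
with open("fakestore-openapi.json") as fh:
    tools = openapi_to_tools(json.load(fh))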
Action Tools with the HTTP Client Snap

Once the Tool Calling Snap generates the resolved tool data, its output can be piped directly into an HTTP Client Snap for execution. This setup effectively turns a static OpenAPI definition into a fully dynamic and executable workflow, allowing LLMs to autonomously interact with real services.

Real-World Use Cases

With the right configuration, LLMs can interact with virtually any OpenAPI-compliant service. This opens up a wide range of practical applications across productivity tools, developer APIs, data services, and more.

Example Use Case: Load Products from FakeStore API and Save as CSV in GitHub Gist

This example shows how an LLM can orchestrate a two-step integration using OpenAPI specs and tool calling via SnapLogic:

Fetch Data: Retrieve product data from the FakeStore API.
Transform & Upload: Format the data as CSV and post it as a public GitHub Gist using GitHub's Gist API.

Main Pipeline (download)
Loop Pipeline (download, GitHub OpenAPI file, Fake Store OpenAPI file)

Prompt to LLM: "Load all products from FakeStore API and upload them as a CSV file to GitHub Gist."

Pipeline Flow Breakdown

Step 1: FakeStore API Tool Call
OpenAPI Tool: FakeStore API spec (loaded via URL or file).
LLM Task: Recognize the available /products endpoint and trigger a GET request to retrieve the full list of products.
Tool Calling Snap Output: Resolved URL https://fakestoreapi.com/products, method GET, no authentication needed.

Step 2: GitHub Gist API Tool Call
OpenAPI Tool: GitHub Gist API spec, with token-based authentication defined in sl_tool_metadata.
LLM Task: Use the POST /gists endpoint and construct the request body with:
description: e.g., "FakeStore Products Export"
public: true
files: A JSON object with one file (e.g., "products.csv": { content: "<csv data>" })

Step 3: Summarize the Result
LLM Task: Extract and present key details from the final Gist API response, such as:
Total number of products exported
Link to the created Gist (e.g., html_url)
Confirmation message for the user

Final Result:
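To see what the two resolved tool calls amount to outside SnapLogic, here is an equivalent standalone Python sketch. It is illustrative only and assumes a GITHUB_TOKEN environment variable holding a token with the gist scope.

import csv, io, json, os
import urllib.request

# Step 1: fetch the product list from the public FakeStore API.
products = json.load(urllib.request.urlopen("https://fakestoreapi.com/products"))

# Step 2: format a subset of the fields as CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "title", "price", "category"],
                        extrasaction="ignore")
writer.writeheader()
writer.writerows(products)

# Step 3: post the CSV as a public GitHub Gist.
payload = json.dumps({
    "description": "FakeStore Products Export",
    "public": True,
    "files": {"products.csv": {"content": buf.getvalue()}},
}).encode()
req = urllib.request.Request(
    "https://api.github.com/gists",
    data=payload,
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
    },
)
gist = json.load(urllib.request.urlopen(req))
print(f"Exported {len(products)} products -> {gist['html_url']}")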
Using SAP S/4HANA Snaps to extract Operational Data Provider (ODP) data

SAP's Operational Data Provisioning (ODP) provides a technical infrastructure that supports two different application scenarios. The first is Operational Analytics for decision-making in operative business processes. The other is data extraction and replication. ODP has been around for quite some time and has replaced all other methods for doing full as well as delta extraction into BW/4HANA. SAP has down-ported the framework to older versions of their ERP Suite such as SAP ECC, CRM, SRM, and SCM to make it widely available, and that led many third-party vendors to tap into the Remote Function Calls built for ODP to leverage the framework and its capabilities.

In February 2024, SAP released Note 3255746 stating that, going forward, SAP will not permit the use of RFCs for ODP data extraction. SAP clearly states that the RFC modules are only intended for SAP-internal applications and can be modified by SAP at any time, without notice. It also reserves the right to put in place technical measures that will restrict and audit the unpermitted use of RFC modules of the ODP Data Replication API. It further states that any issues experienced or caused by third-party or customer applications using the RFC modules are entirely at the risk of the customer, and SAP is not responsible for resolving such issues, nor will it provide any support.

As a solution, SAP states that the Operational Data Provisioning (ODP) framework for data extraction and replication is exposed by an official, externally available OData API, and that all customer and third-party applications should be built using this API, as it provides a stable interface (link to documentation). This blog aims to show you how you can extract data from the ODP framework via OData using our SnapLogic S/4HANA Snap Pack.

What is ODP and what can I do with it

I mentioned already that SAP made ODP the standard extraction method for BW/4HANA. The Operational Data Provisioning (ODP) framework for data distribution provides a consolidated technology for data provisioning and consumption. ODP supports data extraction and replication for multiple targets and supports capturing changes at the source. This means that you are able to extract a wide variety of sources through the framework. Below is a drawing often used by SAP to depict the capabilities and visualize the various options. All of these ODP scenarios (often referred to as contexts) can be consumed by SnapLogic.

ODP - SAP Extractors (ODP_SAP): This context exposes BW DataSources as Operational Data Providers. A BW DataSource defines an extraction structure that is populated by extraction code, which implements the logic to retrieve relevant data from the ABAP system. The ODP framework, without additional configuration, supports DataSources released by the application owner. Most of these DataSources are part of the SAP Business Suite.

ODP - ABAP CDS Views (ODP_CDS): ABAP Core Data Services (CDS) enhance ABAP SQL to enable the definition and consumption of semantically rich data models natively. This enhancement boosts productivity, consumability, performance, and interoperability. The key concept for expressing the semantics of ABAP CDS models is annotations, which are attached to elements of the model. By applying annotations with analytical semantics, ABAP CDS views can be annotated so that the resulting metadata allows the generation of transient Operational Data Providers.
ODP - BW (ODP_BW): SAP BW/4HANA exposes its data as Operational Data Providers for extraction purposes. All relevant SAP BW/4HANA InfoProviders are supported as sources.

ODP - SLT Queue (ODP_SLT): Utilizing the ODP infrastructure and the trigger-based replication of the SAP Landscape Transformation Replication Server (SLT), data can be transferred in real time from multiple source systems to one or more target systems. The SLT Replication Server serves as a provider for the ODP infrastructure, making tables from SAP sources available as delta queues.

How do I expose an ODP Data Source via OData

To capture deltas, the source data is written to an Operational Delta Queue (ODQ) through an update process. The process supports package-based full extraction for very large volumes using the operational delta queue. After the initial full extraction, it can also capture and extract deltas for changes using the operational delta queue; however, the ODP data source must support deltas for this functionality. SnapLogic can then retrieve data from this delta queue, which also aids in monitoring the data extraction and replication process. The data is exposed using SAP Gateway Foundation by generating an SAP OData service with the SAP Gateway Service Builder for the ODP Data Source. This service (URL) can be accessed from an external OData client application via OData/HTTP.

Generating the OData Service in the Gateway Service Builder

The process is entirely no-code; everything required in SAP will be generated, including the OData service in the Gateway Service Builder. Simply navigate to transaction SEGW in SAP. Start by creating a new project using the first icon in the menu bar, marked in red in the screenshot below. In the popup, give your project a name and description, choose Service with SAP Annotations, and select the package under which you want the generated code to be stored. If you do not need to transport the generated code to other SAP systems, you can choose $TMP as the package. Once the project is created, select Data Model and choose Redefine -> ODP Extraction from the context menu. A wizard will now guide you through the steps in which you can select from all possible ODP data sources. Go through the wizard by choosing the Next button until it finishes by generating the model. When the model is generated, continue to generate the OData service by choosing the Generate Runtime Objects button. Choose the defaults for all settings and take note of the Technical Service Name. This is the name your OData service will have once it is generated.

Activate the OData Service

To activate the newly created service, head to transaction /IWFND/MAINT_SERVICE in SAP. Choose the Add Service button, enter the Technical Service Name, and hit Enter. Then select the Technical Service Name from the list at the bottom and click the Add Selected Services button. Once activated, you can find the service in the Service Catalog by filtering on the Technical Service Name.

How do I consume the OData Service in SnapLogic

On the SnapLogic side, you use the SAP S/4HANA Read Snap to read data from the exposed OData service. The Snap uses an account that points to the OData Version 2 catalog containing all services, as shown in the screenshot below. On the Settings screen of the SAP S/4HANA Read Snap, search for the OData service in the API Object field or select the service from the dropdown list.
The service creation from the extractor gives you two entities to use:

One with a name starting with EntityOf<objectName>, FactsOf<objectName>, or AttrOf<objectName>, depending on the type of extractor or view, which represents the data source model.
One that starts with DeltaLinksOf<objectName>, which exposes information about current and past delta tokens.

By default, if you send a request to the first entity, you will retrieve a full dataset, just as you would with any other OData service. However, if you add the special request header Prefer: odata.track-changes, as shown in the screenshot below, the OData service will subscribe you to the Operational Delta Queue (ODQ) and return __delta links that allow you to request the delta that has accumulated since. Be sure to remove the default entry in the Snap's Output Entry Limit field, which represents the $top=X URL parameter in OData. During my tests, if I kept the default entry, our S/4HANA system would not register the request in the ODQ at all. Lastly, make sure that you check the Raw response checkbox so the Snap adds the __delta link to the output for downstream Snaps.

Retrieving ODP data in packages

Should you want the ODP data to be retrieved in packages, you can specify an additional value in the Prefer header called odata.maxpagesize=<N>, where <N> is the maximum number of records per package, for example, odata.maxpagesize=50. This starts a background job in SAP for paging, and the data is cached in the ODQ. The end of each response contains a relative link in the attribute __next, which can be used to retrieve the next package. This link can be fed into the Has Next and Next URL fields to make the S/4HANA Read Snap page through the packages stored in the ODQ by the background job.

How can I make use of the Deltatoken in a SnapLogic Pipeline

If the subscription to the ODQ succeeded, the output of the S/4HANA Read Snap will contain the __delta link. This link contains the query parameter !deltatoken. The URL can be used in subsequent calls to retrieve the delta update from the point in time when the token was issued to your subscription. You can either store the URL (or only the token) and use it in subsequent runs of the pipeline, or alternatively use the DeltaLinksOf<objectName> entity described above to retrieve all tokens issued for your subscription and then use the last token in the list to read the delta. A minimal sketch of this flow follows below.

Understanding the information received with a delta update

With every row that you receive from a delta, you also get two additional fields named ODQ_ENTITYCNTR and ODQ_CHANGEMODE. These two fields need to be evaluated in the target to determine whether records are changed or new.

Tracking and Terminating a Subscription

Every OData service generated for ODP also contains two additional entities: SubscribedToFactsOf<objectName>, to determine whether a delta subscription exists for the user, and TerminateDeltasForFactsOf<objectName>, to terminate delta extraction for the user. In our S/4HANA system the entities were not exposed through the ODP catalog but could be called without issues. The attached pipeline makes use of the SubscribedToFacts entity to check whether we subscribed to the ODQ in a previous run.
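Expressed outside SnapLogic, the subscription flow described above looks roughly like this Python sketch. Host, service, and entity names are placeholders for your generated OData service, and authentication is omitted for brevity.

import json
import urllib.request

BASE = "https://<sap-gateway-host>/sap/opu/odata/sap/Z_MY_ODP_SRV"

def fetch(url: str, track_changes: bool = False) -> dict:
    headers = {"Accept": "application/json"}
    if track_changes:
        headers["Prefer"] = "odata.track-changes"  # subscribe to the ODQ
    req = urllib.request.Request(url, headers=headers)
    return json.load(urllib.request.urlopen(req))

# Initial run: full load plus a __delta link containing the !deltatoken.
body = fetch(BASE + "/EntityOfMyDataSource?$format=json", track_changes=True)
rows = body["d"]["results"]
# Persist this between pipeline runs (prepend the service root if your
# gateway returns it as a relative link).
delta_link = body["d"]["__delta"]

# Subsequent runs: call the stored __delta link to get only the changes,
# each row carrying ODQ_CHANGEMODE and ODQ_ENTITYCNTR.
changes = fetch(delta_link)["d"]["results"]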
How to Read SAP Tables the RFC Way

Reading directly from tables in your SAP ERP system has, in the past, always been impossible, not from a technical point of view, but due to licensing restrictions. Today, I see more SAP customers that do have a license in place to access the database of their S/4HANA system. Still, one of the reasons SAP did not allow this in the past remains: at the database level there are simply no datatypes available to you, as they reside in what SAP calls the Data Dictionary, or DDIC for short. If you go through the application layer to read the table, you get datatypes through the ABAP runtime, and one of your options is therefore to use SAP Remote Function Call (RFC) modules. That said, this blog is not about finding the right RFC-enabled function module to read tables, but more about the vast potential of SnapLogic expressions, which you can use in almost all input fields of any Snap, including on the output of SAP RFCs. So let's dive right in and see what they can do for you.

Prerequisites

The SAP Execute Snap can only run on SnapLogic Groundplexes; it requires SAP's JCo and native libraries. The Linux installation of our Groundplex comes with all libraries, so no additional configuration is needed, while the Windows version does not. For RFC to work on a Windows-based Groundplex, you have to download JCo from SAP, install the Microsoft Visual C++ 2013 Redistributable Package, and add the folder that contains the SAP JCo native library to the Windows PATH variable, as described in our documentation. Configuring an account for the SAP Snaps is straightforward and can be done directly against an Application Server or via the Central Instance.

Calling the RFC Function Module

Reading an SAP table the RFC way requires an RFC-enabled function module on the SAP side and the SAP Execute Snap in SnapLogic. I am sure other function modules exist that can directly read a specific table, but I would like to focus on RFC_READ_TABLE in this blog. This RFC-enabled function module allows you to specify the table you want to read from the outside, including a WHERE clause if you choose to use it. The function module has been around for many decades and is used in many projects at SAP customers. While it has restrictions that we will look at a bit further down, I think it is an option you should consider before writing a custom function module yourself.

Using the SAP Execute Snap

The configuration of the SAP Execute Snap is pretty straightforward, as shown in the screenshot below; all you need to do is specify RFC_READ_TABLE as the function module to use, and you are done. Ideally, you should place a Mapper Snap in front of and behind the SAP Execute Snap to understand the function module's behavior. Once you validate this pipeline, you can use the Mapper Snap in front of the SAP Execute Snap to specify the table you want to read by mapping the table name as a string to the QUERY_TABLE input parameter. Optionally, the OPTIONS input array allows you to specify a WHERE clause, as shown in the screenshot below. Validating the pipeline a second time will return the data read from the specified table in the field DATA, with all fields packed into a single string. The restriction of the function module becomes apparent when you look at the definition of this field inside the SAP GUI in transaction SE37: the field is of type TAB512, with a length of 512 characters. This means that tables whose combined field length exceeds 512 characters will be returned only partially or not at all.
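For comparison outside SnapLogic, the same call can be made with SAP's open-source Python RFC connector, pyrfc. This is an illustrative sketch; the SAP Execute Snap does all of this for you, and the connection parameters below are placeholders.

from pyrfc import Connection

# Placeholder connection parameters -- use your own system details.
conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="RFC_USER", passwd="********")

result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="MARA",                     # the table to read
    OPTIONS=[{"TEXT": "MTART EQ 'FERT'"}],  # optional WHERE clause
    ROWCOUNT=100,                           # cap the result while testing
)

# DATA holds each row packed into one 512-character string (field WA);
# FIELDS describes name, offset, length, and type for slicing it apart.
for row in result["DATA"][:3]:
    print(row["WA"])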
Transforming these strings into usable data with the correct types can then be achieved by using the output table named FIELDS. This table contains the type, length, and offset information needed to split the string into individual fields by leveraging the .map, .toObject, match, parseInt/parseFloat/Date.parse, and substr functions in SnapLogic expressions. match is used to look up the type in the FIELDS output table, substr returns the part of the DATA string that contains the field's data, parseInt/parseFloat/Date.parse convert the string into the respective SnapLogic type, .toObject creates a new object from the field names and the parsed content, and .map adds each row to an array in the SnapLogic document.

$DATA.map(d =>
  $FIELDS.toObject(
    f => f.FIELDNAME,
    f => match f.TYPE {
      'I' => parseInt(d.WA.substr(f.OFFSET, f.LENGTH)),
      'N' => parseFloat(d.WA.substr(f.OFFSET, f.LENGTH)),
      'P' => parseFloat(d.WA.substr(f.OFFSET, f.LENGTH)),
      'D' => Date.parse(d.WA.substr(f.OFFSET, f.LENGTH), 'yyyyMMdd'),
      _ => d.WA.substr(f.OFFSET, f.LENGTH).trim()
    }
  )
)

Validating the pipeline a third time will now present you with a perfectly structured output.
Oracle to Redshift Migration Using Generative Integration

It is a quick and easy task to move data at scale without writing complex SQL statements or code. Select the read and write Snaps you need, give them the right parameters, tune as much as you need to, and in minutes your task is done.

Using Mustache Templating with the Prompt Generator Snap in SnapLogic
In the world of AI-driven data integration, the ability to dynamically generate prompts is crucial for creating adaptable and responsive workflows. The Prompt Generator Snap in SnapLogic leverages Mustache templating to let users craft dynamic text outputs based on input data. This whitepaper educates users on the fundamentals of Mustache templating and how to use it effectively within the Prompt Generator Snap.
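As a taste of what Mustache templating looks like in general (a Python sketch using the chevron library, not the Prompt Generator Snap itself; the template and input document are invented for illustration):

import chevron  # pip install chevron -- one of several Python Mustache libraries

template = (
    "Summarize the following tickets for {{customer}}:\n"
    "{{#tickets}}- {{title}} (priority: {{priority}})\n{{/tickets}}"
)
doc = {
    "customer": "Acme Corp",
    "tickets": [
        {"title": "Login fails with SSO", "priority": "high"},
        {"title": "Dashboard loads slowly", "priority": "medium"},
    ],
}
# {{name}} substitutes a value; {{#section}}...{{/section}} repeats per list item.
print(chevron.render(template, doc))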