Using SAP S/4HANA Snaps to extract Operational Data Provisioning (ODP) data
SAP's Operational Data Provisioning (ODP) provides a technical infrastructure that supports two different application scenarios: Operational Analytics for decision making in operative business processes, and data extraction and replication. The framework has been around for quite some time and has replaced all other methods for full as well as delta data extraction into BW/4HANA. SAP has down-ported the framework to older versions of its Business Suite, such as SAP ECC, CRM, SRM, and SCM, to make it widely available, and this led many third-party vendors to tap into the Remote Function Call (RFC) modules built for ODP to leverage the framework and its capabilities.

In February 2024, SAP released Note 3255746 stating that, going forward, SAP will not permit the use of RFCs for ODP data extraction. SAP clearly states that the RFC modules are intended only for SAP-internal applications and can be modified by SAP at any time, without notice. SAP also reserves the right to put technical measures in place that restrict and audit the unpermitted use of the RFC modules of the ODP Data Replication API. The note further states that any issues experienced or caused by third-party or customer applications using the RFC modules are entirely at the risk of the customer, and that SAP is neither responsible for resolving such issues nor will it provide any support.

As a solution, SAP states that the Operational Data Provisioning (ODP) framework for data extraction and replication is exposed by an official, externally available OData API and that all customer and third-party applications should be built using this API, as it provides a stable interface (link to documentation). This blog aims to show you how you can extract data from the ODP framework via OData and our SnapLogic S/4HANA Snap Pack.

What is ODP, and what can I do with it?

As mentioned, SAP made ODP the standard extraction method for BW/4HANA. The Operational Data Provisioning (ODP) framework provides a consolidated technology for data provisioning and consumption. ODP supports data extraction and replication for multiple targets and supports capturing changes at the source. This means that you can extract a wide variety of sources through the framework. Below is a drawing often used by SAP to depict the capabilities and visualize the various options. All of these ODP scenarios (often referred to as contexts) can be consumed by SnapLogic.

ODP - SAP Extractors (ODP_SAP): This context exposes BW DataSources as Operational Data Providers. A BW DataSource defines an extraction structure that is populated by extraction code, which implements the logic to retrieve the relevant data from the ABAP system. The ODP framework, without additional configuration, supports DataSources released by the application owner. Most of these DataSources are part of the SAP Business Suite.

ODP - ABAP CDS Views (ODP_CDS): ABAP Core Data Services (CDS) enhance ABAP SQL to enable the definition and consumption of semantically rich data models natively. This enhancement boosts productivity, consumability, performance, and interoperability. The key concept for expressing the semantics of ABAP CDS models is annotations, which are attached to elements of the model. By applying annotations with analytical semantics, ABAP CDS views can be annotated so that the resulting metadata allows the generation of transient Operational Data Providers.
ODP - BW (ODP_BW): SAP BW/4HANA exposes its data as Operational Data Providers for extraction purposes. All relevant SAP BW/4HANA InfoProviders are supported as sources.

ODP - SLT Queue (ODP_SLT): Utilizing the ODP infrastructure and the trigger-based replication of the SAP Landscape Transformation Replication Server (SLT), data can be transferred in real time from multiple source systems to one or more target systems. The SLT Replication Server serves as a provider for the ODP infrastructure, making tables from SAP sources available as delta queues.

How do I expose an ODP Data Source via OData?

To capture deltas, the source data is written to an Operational Delta Queue (ODQ) through an update process. The process supports package-based full extraction for very large volumes using the operational delta queue. After the initial full extraction, it can also capture and extract deltas for changes using the operational delta queue; however, the ODP data source must support deltas for this to work. SnapLogic can then retrieve data from this delta queue, which also aids in monitoring the data extraction and replication process. The data is exposed using SAP Gateway Foundation by generating an SAP OData service for the ODP Data Source with the SAP Gateway Service Builder. This service (URL) can then be accessed from an external OData client application via OData/HTTP.

Generating the OData Service in the Gateway Service Builder

The process is entirely no-code; everything that is required on the SAP side will be generated, including the OData service in the Gateway Service Builder. Simply navigate to transaction SEGW in SAP. Start by creating a new project using the first icon in the menu bar, marked in red in the screenshot below. In the popup, give your project a name and description, choose Service with SAP Annotations, and select the package under which you want the generated code to be stored. If you do not need to transport the generated code to other SAP systems, you can choose $TMP as the package.

Once the project is created, select Data Model and choose Redefine -> ODP Extraction from the context menu. A wizard will now guide you through the steps in which you can select from all available ODP data sources. Go through the wizard by choosing the Next button until the wizard finishes by generating the model. Once the model is generated, continue to generate the OData service by choosing the Generate Runtime Objects button. Accept the defaults for all settings and take note of the Technical Service Name. This is the name your OData service will have once it is generated.

Activate the OData Service

To activate the newly created service, head to transaction /IWFND/MAINT_SERVICE in SAP. Choose the Add Service button, enter the Technical Service Name, and hit Enter. Then select the Technical Service Name from the list at the bottom and click the Add Selected Services button. Once activated, you can find the service in the Service Catalog by filtering on the Technical Service Name.

How do I consume the OData Service in SnapLogic?

On the SnapLogic side, you use the SAP S/4HANA Read Snap to read data from the exposed OData service. The Snap uses an account that points to the OData Version 2 catalog containing all services, as shown in the screenshot below. On the Settings screen of the SAP S/4HANA Read Snap, search for the OData service in the API Object field or select the service from the dropdown list.
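Before configuring the Snap, it can help to see what a raw request against the generated service looks like. Below is a minimal sketch in Python using the requests library; the host, credentials, service name ZODP_DEMO_SRV, and entity set name are hypothetical placeholders, so substitute the Technical Service Name you noted in SEGW and the entity set your own service exposes.

import requests

# Hypothetical Gateway host and generated service name; replace with your own.
BASE = "https://s4hana.example.com/sap/opu/odata/sap/ZODP_DEMO_SRV"
AUTH = ("USER", "PASSWORD")

# A plain GET against the entity set returns a full dataset,
# exactly like any other OData V2 service would.
resp = requests.get(
    f"{BASE}/EntityOfZODP_DEMO",  # entity set name depends on your data source
    params={"$format": "json"},
    auth=AUTH,
)
resp.raise_for_status()
rows = resp.json()["d"]["results"]
print(f"Retrieved {len(rows)} rows")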
The service creation from the extractor gives you two entities to use: one with a name starting with EntityOf<objectName>, FactsOf<objectName>, or AttrOf<objectName>, depending on the type of extractor or view, that represents the data source model; and one that starts with DeltaLinksOf<objectName>, which exposes information about current and past delta tokens.

By default, if you send a request to the first entity, you will retrieve a full dataset, just as you would with any other OData service. However, if you add the special request header Prefer: odata.track-changes, as shown in the screenshot below, the OData service will subscribe you to the Operational Delta Queue (ODQ) and return __delta links that allow you to request the delta that has accumulated since. Be sure to remove the default entry in the Snap's Output Entry Limit field, which represents the $top=<N> URL parameter in OData. During my tests, if I kept the default entry, our S/4HANA system would not register the request in the ODQ at all. Lastly, make sure that you check the Raw response checkbox so that the Snap adds the __delta link to the output for downstream Snaps.

Retrieving ODP data in packages

Should you want the ODP data to be retrieved in packages, you can specify an additional value in the Prefer header called odata.maxpagesize=<N>, where <N> is the maximum number of records prepared per package, for example, odata.maxpagesize=50. This starts a background job in SAP for paging, and the data is cached in ODQ. The end of each response contains a relative link in the attribute __next, which can be used to retrieve the next package. This link can be fed into the Has Next and Next URL fields to make the S/4HANA Read Snap page through the packages stored in ODQ by the background job.

How can I make use of the delta token in a SnapLogic pipeline?

If the subscription to ODQ was successful, the output of the S/4HANA Read Snap will contain the __delta link. This link contains the query parameter !deltatoken. The URL can be used in subsequent calls to retrieve the delta update from the point in time when the token was issued for your subscription. You can either store the URL (or only the token) and use it in subsequent runs of the pipeline, or alternatively use the DeltaLinksOf<objectName> entity described above to retrieve all tokens that have been issued for your subscription and then use the last token in the list to read the delta.

Understanding the information received with a delta update

Every row that you receive from a delta comes with two additional fields named ODQ_ENTITYCNTR and ODQ_CHANGEMODE. These two fields need to be evaluated in the target to determine whether a row represents a new or a changed record.

Tracking and Terminating a Subscription

Every OData service generated for ODP also contains two additional entities: SubscribedToFactsOf<objectName>, to determine whether a delta subscription exists for the user, and TerminateDeltasForFactsOf<objectName>, to terminate delta extraction for the user. In our S/4HANA system, these entities were not exposed through the ODP catalog but could be called without issues. The attached pipeline uses the SubscribedToFacts entity to check whether a previous run already subscribed to ODQ.
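Putting the pieces together, here is a hedged end-to-end sketch of what the Snap configuration described above amounts to at the HTTP level: subscribing with Prefer: odata.track-changes, paging through the packages via __next, and fetching the accumulated changes later through the stored __delta link. Host, service, and entity names are hypothetical placeholders, and the JSON field names follow OData V2 conventions.

from urllib.parse import urljoin
import requests

BASE = "https://s4hana.example.com/sap/opu/odata/sap/ZODP_DEMO_SRV/"
AUTH = ("USER", "PASSWORD")

# Subscribe to the Operational Delta Queue (ODQ) and request packages
# of at most 50 records each; both preferences share one Prefer header.
headers = {"Prefer": "odata.track-changes, odata.maxpagesize=50"}

url = urljoin(BASE, "EntityOfZODP_DEMO?$format=json")
rows, delta_link = [], None
while url:
    d = requests.get(url, headers=headers, auth=AUTH).json()["d"]
    rows.extend(d["results"])
    # __next pages through the packages the background job cached in ODQ;
    # __delta appears on the last page and carries the !deltatoken.
    nxt = d.get("__next")
    url = urljoin(BASE, nxt) if nxt else None
    delta_link = d.get("__delta", delta_link)

print(f"Initial load: {len(rows)} rows")

# A later pipeline run uses the stored delta link to retrieve only the
# changes accumulated since the token was issued.
if delta_link:
    d = requests.get(urljoin(BASE, delta_link), auth=AUTH,
                     params={"$format": "json"}).json()["d"]
    for row in d["results"]:
        print(row.get("ODQ_CHANGEMODE"), row.get("ODQ_ENTITYCNTR"))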
How to Read SAP Tables the RFC Way

Reading directly from the tables of your SAP ERP system has, in the past, always been impossible, not from a technical point of view, but due to licensing restrictions. Today, I see more SAP customers that do have a license in place to access the database of their S/4HANA system. Still, one of the reasons SAP did not allow this in the past remains: at the database level, there are simply no data types available to you, as they reside in what SAP calls the Data Dictionary, or DDIC for short. If you go through the application layer to read a table, you get the data types through the ABAP runtime, and one of your options for doing so is SAP Remote Function Call (RFC) modules. That said, this blog is not about finding a suitable RFC-enabled function module to read tables, but rather about the vast potential of SnapLogic expressions, which you can use in almost all input fields of any Snap, including on the output of SAP RFCs. So, let's dive right in and see what they can do for you.

Prerequisites

The SAP Execute Snap can only run on SnapLogic Groundplexes, as it requires SAP's JCo and native libraries. The Linux installation of our Groundplex comes with all required libraries and needs no additional configuration, while the Windows version does not. For RFC to work on a Windows-based Groundplex, you have to download JCo from SAP, install the Microsoft Visual C++ 2013 Redistributable Package, and add the folder that contains the SAP JCo native library to the Windows PATH variable, as described in our documentation. Configuring an account for the SAP Snaps is straightforward; it can be done directly against an Application Server or via the Central Instance.

Calling the RFC Function Module

Reading an SAP table the RFC way requires an RFC-enabled function module on the SAP side and the SAP Execute Snap in SnapLogic. I am sure there are other function modules that can read a specific table directly, but I would like to focus on RFC_READ_TABLE in this blog. This RFC-enabled function module allows you to specify from the outside the table you want to read, including a WHERE clause if you choose to use one. The function module has been around for many decades and is used in many projects at SAP customers. While it has restrictions that we will look at a bit further down, I think it is an option you should consider before writing a custom function module yourself.

Using the SAP Execute Snap

The configuration of the SAP Execute Snap is pretty straightforward, as shown in the screenshot below; all you need to do is specify RFC_READ_TABLE as the function module to use, and you are done. Ideally, you should place a Mapper Snap in front of and behind the SAP Execute Snap to understand the function module's behavior. Once you validate this pipeline, you can use the Mapper Snap before the SAP Execute Snap to specify the table you want to read by mapping the table name as a string to the QUERY_TABLE input parameter. Optionally, the OPTIONS input array allows you to specify a WHERE clause, as shown in the screenshot below.

Validating the pipeline a second time will return the data read from the specified table in the field DATA, with all fields packed into a single string. The restriction of the function module becomes apparent when you look at the definition of this field inside the SAP GUI in transaction SE37: the field is of type TAB512, and its length is 512 characters. This means that tables with a combined field length greater than 512 will return information only partially or not at all.
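If you want to verify the function module's behavior outside SnapLogic, a quick test with SAP's PyRFC library looks roughly like the sketch below. The connection parameters are placeholders, and T001 merely serves as a small demo table.

from pyrfc import Connection

# Hypothetical connection parameters; replace with your system's values.
conn = Connection(
    ashost="s4hana.example.com", sysnr="00",
    client="100", user="USER", passwd="PASSWORD",
)

# RFC_READ_TABLE takes the table name in QUERY_TABLE and an optional
# WHERE clause, split into lines of at most 72 characters, in OPTIONS.
result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="T001",                     # company codes, a small demo table
    OPTIONS=[{"TEXT": "BUKRS LIKE '1%'"}],  # optional WHERE clause
)

# Each DATA row is one packed string (type TAB512, component WA);
# FIELDS carries the metadata needed to slice it apart.
for row in result["DATA"][:5]:
    print(row["WA"])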
Transforming these strings into usable data with the correct types can then be achieved by using the output table named FIELDS. This table contains the type, length, and offset information needed to split the string into individual fields by leveraging the .map(), .toObject(), match, parseInt(), parseFloat(), Date.parse(), and substr() functions in SnapLogic expressions: match looks up the type in the FIELDS output table, substr returns the part of the DATA string that contains the field's data, parseInt, parseFloat, and Date.parse convert that string into the respective SnapLogic type, .toObject creates a new object with the field names and the parsed content, and .map adds each row to an array in the SnapLogic document.

$DATA.map(d => $FIELDS.toObject(
    f => f.FIELDNAME,
    f => match f.TYPE {
        'I' => parseInt(d.WA.substr(f.OFFSET, f.LENGTH)),
        'N' => parseFloat(d.WA.substr(f.OFFSET, f.LENGTH)),
        'P' => parseFloat(d.WA.substr(f.OFFSET, f.LENGTH)),
        'D' => Date.parse(d.WA.substr(f.OFFSET, f.LENGTH), 'yyyyMMdd'),
        _ => d.WA.substr(f.OFFSET, f.LENGTH).trim()
    }
))

Validating the pipeline a third time will now present you with a perfectly structured output.
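For completeness, here is a sketch of the same transformation in plain Python, mirroring the expression above. It assumes the result dictionary of an RFC_READ_TABLE call, for example from the PyRFC sketch earlier; the handling of packed (P) values is simplified for illustration.

from datetime import datetime

def parse_rfc_read_table(result: dict) -> list[dict]:
    # Split each packed WA string into typed fields using the FIELDS metadata.
    rows = []
    for data_row in result["DATA"]:
        wa, record = data_row["WA"], {}
        for f in result["FIELDS"]:
            offset, length = int(f["OFFSET"]), int(f["LENGTH"])
            text = wa[offset:offset + length].strip()
            if f["TYPE"] == "I":
                record[f["FIELDNAME"]] = int(text) if text else None
            elif f["TYPE"] in ("N", "P"):
                record[f["FIELDNAME"]] = float(text) if text else None
            elif f["TYPE"] == "D" and text.strip("0"):
                # ABAP dates arrive as yyyyMMdd strings; "00000000" means empty.
                record[f["FIELDNAME"]] = datetime.strptime(text, "%Y%m%d").date()
            else:
                record[f["FIELDNAME"]] = text
        rows.append(record)
    return rows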
Integrating SAP using the SAP IDoc Snaps (Part 4/4)

I spent more than two decades working for SAP. When Matthew Bowen approached me about helping him understand our SAP Snaps better, we started digging into them immediately. After a while, we both felt it would be a great idea to not only share the knowledge inside SnapLogic but also start a blog series on integrating with SAP to help shed some light not only on the Snaps themselves but also on the SAP side. This article kicks everything off by looking at our SAP IDoc Snaps. The series starts with a general overview of the Snaps and the required prerequisites. We then look at creating IDocs from a SnapLogic pipeline and take a peek at the processing inside SAP, before continuing to show you how to send a status change back to SAP for a given IDoc you received in a pipeline. Finally, we close the series by looking at securing the communication between the SnapLogic Groundplex and SAP with Secure Network Communication (SNC).
Integrating SAP using the SAP IDoc Snaps (Part 3/4)

I spent more than two decades working for SAP. When Matthew Bowen approached me about helping him understand our SAP Snaps better, we started digging into them immediately. After a while, we both felt it would be a great idea to not only share the knowledge inside SnapLogic but also start a blog series on integrating with SAP to help shed some light not only on the Snaps themselves but also on the SAP side. This article kicks everything off by looking at our SAP IDoc Snaps. The series starts with a general overview of the Snaps and the required prerequisites. We then look at creating IDocs from a SnapLogic pipeline and take a peek at the processing inside SAP, before continuing to show you how to send a status change back to SAP for a given IDoc you received in a pipeline. Finally, we close the series by looking at securing the communication between the SnapLogic Groundplex and SAP with Secure Network Communication (SNC).
Integrating SAP using the SAP IDoc Snaps (Part 2/4)

I spent more than two decades working for SAP. When Matthew Bowen approached me about helping him understand our SAP Snaps better, we started digging into them immediately. After a while, we both felt it would be a great idea to not only share the knowledge inside SnapLogic but also start a blog series on integrating with SAP to help shed some light not only on the Snaps themselves but also on the SAP side. This article kicks everything off by looking at our SAP IDoc Snaps. The series starts with a general overview of the Snaps and the required prerequisites. We then look at creating IDocs from a SnapLogic pipeline and take a peek at the processing inside SAP, before continuing to show you how to send a status change back to SAP for a given IDoc you received in a pipeline. Finally, we close the series by looking at securing the communication between the SnapLogic Groundplex and SAP with Secure Network Communication (SNC).
Integrating SAP using the SAP IDoc Snaps (Part 1/4)

I spent more than two decades working for SAP. When Matthew Bowen approached me about helping him understand our SAP Snaps better, we started digging into them immediately. After a while, we both felt it would be a great idea to not only share the knowledge inside SnapLogic but also start a blog series on integrating with SAP to help shed some light not only on the Snaps themselves but also on the SAP side. This article kicks everything off by looking at our SAP IDoc Snaps. The series starts with a general overview of the Snaps and the required prerequisites. We then look at creating IDocs from a SnapLogic pipeline and take a peek at the processing inside SAP, before continuing to show you how to send a status change back to SAP for a given IDoc you received in a pipeline. Finally, we close the series by looking at securing the communication between the SnapLogic Groundplex and SAP with Secure Network Communication (SNC).