SAP Data & Analytics: Data Extraction via Operational Data Provisioning Framework

[Image: Operational Data Provisioning (ODP) architecture]

Introduction

In my last blog post, I covered how to extract business data from S/4HANA using CDS views.

To continue the data extraction theme, those CDS extraction views run within SAP's Operational Data Provisioning (ODP) framework, which provides the mechanics for how your data is extracted along with Change Data Capture (CDC) functionality. As part of planning your analytics architecture landscape, it's important to understand the basics of ODP because it plays a critical role in how your data leaves S/4HANA.
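To make those mechanics concrete, here is a minimal sketch of how a consumer might pull an initial load and a later delta from a CDS extraction view exposed through an ODP-based OData service. The host, service name (ZEXTRACT_SRV), entity set and credentials are hypothetical placeholders, and the delta-link pattern follows the OData delta-query convention; treat this as an illustration rather than a reference implementation.

```python
# Minimal sketch: initial load plus delta read from a hypothetical
# ODP-exposed OData service. All names below are placeholders.
import requests

BASE = "https://s4hana.example.com/sap/opu/odata/sap/ZEXTRACT_SRV"  # hypothetical
ENTITY = "ZSalesOrderItems"  # hypothetical entity set for a CDS extraction view

session = requests.Session()
session.auth = ("EXTRACT_USER", "secret")  # use a proper secret store in practice

# 1) Initial (full) load. The Prefer header asks the server to track changes,
#    so the response carries a delta link we can call later.
resp = session.get(
    f"{BASE}/{ENTITY}",
    headers={"Prefer": "odata.track-changes", "Accept": "application/json"},
)
resp.raise_for_status()
payload = resp.json()["d"]
rows = payload["results"]            # the full data set
delta_link = payload.get("__delta")  # persist this for the next delta request

# 2) Later: fetch only the changes recorded since the initial load.
if delta_link:
    delta_resp = session.get(delta_link, headers={"Accept": "application/json"})
    delta_resp.raise_for_status()
    changed_rows = delta_resp.json()["d"]["results"]
```

The key point is that the consumer never computes the delta itself: the ODP framework records changes in the operational delta queue on the S/4HANA side, and the consumer simply requests whatever has accumulated since its last successful read.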

Key Resources on Operational Data Provisioning

There is already excellent documentation on Operational Data Provisioning, so I won't rehash all the concepts here.

I recommend starting with this post to get oriented:

Operational Data Provisioning – Simplified | SAP Blogs

This post is a quick way to pick up the key concepts of the framework.

If you want to dive deeper, I recommend SAP's very detailed wiki on the framework:

Introduction to Operational Delta Queues - SAP NetWeaver Business Warehouse - Support Wiki

This wiki provides a deep guide to all aspects of the Operational Data Provisioning framework, from the ODQMON transaction to critical SAP Notes and security. I recommend reading it in detail if you're getting set up or have questions about your specific use case (S/4HANA, BW, SLT, etc.).

Considerations when using Operational Data Provisioning

One challenge that I've seen with Operational Data Provisioning, especially when integrating SAP Data Services with Azure Data Lake, is the recovery of delta loads.

Working with Bad Source Data

For example, let's say a field in the source S/4HANA system contains an invalid character. Data ingestion takes place and that bad data ends up in your target Azure Data Lake storage as a corrupted file. A day or two passes before it's detected.
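One light mitigation is to screen extracted records for invalid characters before they land in the lake, so the problem surfaces at ingestion time rather than days later. A minimal sketch, assuming a hypothetical list of text fields to check:

```python
# Minimal sketch: screening extracted rows for non-printable characters
# before writing them to the lake. Field names are hypothetical examples.
import unicodedata

def has_invalid_chars(value: str) -> bool:
    """Flag control, format and other 'C'-category Unicode code points."""
    return any(unicodedata.category(ch).startswith("C") for ch in value)

def screen(rows, text_fields=("MaterialDescription", "CustomerName")):
    """Split rows into clean ones and ones that need review at the source."""
    clean, rejected = [], []
    for row in rows:
        bad = any(has_invalid_chars(str(row.get(f, ""))) for f in text_fields)
        (rejected if bad else clean).append(row)
    return clean, rejected
```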

For a full load scenario, you simply identify and fix the issue at the source and re-run the full load to repair the entire data set.

However, in a delta scenario you have to be more careful. If you want to replay a delta request, the default retention period is 24 hours. If you miss that window, you will need to run an entire full load again to recover, and if your data set is enormous that may not be a viable option.

For more details on how to extend this setting, review SAP Note 2854627 - How to change the Retention periods for Recovery of 'Reorganize delta queues'.
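As a practical safeguard, it also helps to persist the delta pointer used for each load next to the load itself, so a corrupted file can be traced back to the exact delta it came from while the retention window is still open. A minimal sketch, assuming OData-style delta links and a simple local state file (note that re-requesting a delta link returns everything changed since that point, not a byte-identical replay):

```python
# Minimal sketch: recording the delta link behind each load so a bad load
# can be re-requested within the retention window. Paths are placeholders.
import json
from datetime import datetime, timezone
from pathlib import Path

STATE_FILE = Path("delta_state.json")  # hypothetical local state store

def record_load(delta_link: str, target_file: str) -> None:
    """Append the delta link used for a load, plus when it ran."""
    history = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else []
    history.append({
        "delta_link": delta_link,
        "target_file": target_file,
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    })
    STATE_FILE.write_text(json.dumps(history, indent=2))

def link_for_replay(corrupted_file: str) -> str:
    """Look up the delta link that produced a corrupted file so the same
    range of changes can be requested again -- only possible while the
    delta queue still retains the data (24 hours by default)."""
    history = json.loads(STATE_FILE.read_text())
    for entry in history:
        if entry["target_file"] == corrupted_file:
            return entry["delta_link"]
    raise LookupError(f"No recorded delta link for {corrupted_file}")
```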

Additional Reading and References

2232584 - Release of SAP extractors for ODP replication (ODP SAPI)

πŸ™‹πŸ»β€β™‚οΈ I might need additional help!

If your organization needs a helping hand on any SAP technology topics, please feel free to reach out to me at ben@ben-kwong.com
