Guest bjarkowskiMSFT Posted April 10, 2023

Thanks for joining me for the third episode of the blog series on the SAP CDC Connector and SLT. In the previous post, I covered the initial configuration steps required to set up the SLT engine to work with Azure. Today, I'll take a deeper dive into the extraction process and monitoring. These steps are essential to ensure smooth and efficient replication of data from your SAP source system to your target system. By the end of this post, you'll have a better understanding of how to manage and troubleshoot the replication process.

DATA EXTRACTION AND MONITORING

Once the Azure and SLT configuration is complete, the only thing left to do is click the Trigger Now button. It starts a complex chain of actions to copy data from the SAP system to the desired target storage. Let's take a closer look at what exactly happens under the hood.

The SAP CDC connector doesn't communicate directly with SAP SLT. Instead, it requests data from the Operational Data Provisioning (ODP) framework, which acts as a proxy and manages the extraction process. SLT starts by creating the required logging tables and function modules. This phase is usually quick and takes just a couple of seconds. Then the calculation job kicks off. Its main function is to split the source dataset into manageable portions. This can take longer, depending heavily on the size of the table and the chosen calculation mode. Finally, the initial extraction starts, followed by replication if you chose delta tracking.

You can monitor the extraction process from three angles:

1. SLT Cockpit (LTRC)

As soon as you start the replication process in Azure Data Factory, the selected table name appears in the Participating Objects tab of the SLT Cockpit. The Current Phase field shows the selected Run Mode (one-off extraction vs replication), while the Details field shows the action that is currently running. Once the data transfer starts, you can switch to the Statistics tab, which provides more insight into the replication performance. Here you can check the total number of records processed, the number of calculated portions, and the read and write times. It is also worth checking the Application Logs, especially if you notice issues with the replication and the status of the SLT configuration changes from green to yellow or red.

2. Background Jobs (SM37)

You can use background jobs to check the details of the replication process. Each extraction step is managed by a separate job. Some jobs, like the access plan calculation or the data load, trigger equivalent jobs in the source system:

Master Controller Job (/1LT/IUC_REP_CNTR_<MTID>) – creates database triggers and logging tables
Migration Object Definition Job (/1LT/IUC_DEF_COBJ_<MTID>) – creates table structures and generates function modules
Access Plan Calculation Job (/1LT/IUC_CALC_ACP_<MTID>) – chunks the source data into pieces
Data Load Job (/1LT/IUC_LOAD_MT_<MTID>) – loads and replicates data

In the above example, you can see that the ODQR job kicked off the extraction process. It was immediately followed by the Master Controller and Migration Object Definition jobs. The access plan calculation was split into two processes, and once it finished, three load jobs ran the initial data transfer of all records from the source table. The last job remains active, as it continuously replicates changes.
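If you want to watch these jobs without opening the SAP GUI, the same information can be read over RFC. Below is a minimal sketch using SAP's PyRFC library and the generic RFC_READ_TABLE function module to list SLT jobs from the job overview table TBTCO; the connection parameters are placeholders, and this is meant as an illustration rather than a production monitoring tool.

```python
# Minimal sketch: list SLT background jobs (the same data SM37 shows)
# over RFC. Connection parameters below are placeholders.
from pyrfc import Connection

# One-letter job states as used in SM37 / TBTCO.
STATUS_TEXT = {"P": "scheduled", "S": "released", "Y": "ready",
               "R": "active", "F": "finished", "A": "cancelled"}

conn = Connection(ashost="slt-host", sysnr="00", client="100",
                  user="MONITORING", passwd="********")

# TBTCO is the job overview table behind SM37; SLT job names start with /1LT/IUC.
result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="TBTCO",
    DELIMITER="|",
    OPTIONS=[{"TEXT": "JOBNAME LIKE '/1LT/IUC%'"}],
    FIELDS=[{"FIELDNAME": "JOBNAME"}, {"FIELDNAME": "STATUS"},
            {"FIELDNAME": "STRTDATE"}, {"FIELDNAME": "STRTTIME"}],
    ROWCOUNT=50,
)

for row in result["DATA"]:
    jobname, status, date, time = (v.strip() for v in row["WA"].split("|"))
    print(f"{jobname:<40} {STATUS_TEXT.get(status, status):<10} {date} {time}")
```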
Checking the job status is especially useful if you notice the extraction is stuck on a particular step, for example the access plan calculation. Sometimes a lack of free background processes can also negatively influence the performance of the extraction. Always check that jobs start on time and without delay.

3. Delta Queue Monitoring (ODQMON)

You can also monitor the replication from Delta Queue Monitoring. The initial screen shows all delta queues available in the system, and you can use the filters to narrow the list. By default, the list includes only entries where delta tracking is enabled. To also include requests for full extraction (without delta initialization), change the Request Select field to "All" or "Full". Please note that the SAP CDC connector uses "OData access for ODP" as the subscriber type. Despite the name, the communication between the SAP system and the connector happens over RFC; the OData channel is not used at any stage of the replication.

Double-click the Queue Name to display subscriptions. Click Calculate Data to display additional information, such as the total size of the extracted data and the number of processed rows. You can drill down further to individual requests by double-clicking the subscription. This view provides more details on the current operation. Azure Data Factory creates requests when you start the extraction. The below screenshot shows a running full data extraction with delta initialization. When you start a subsequent extraction to get changes, the system creates a new request.

When looking at the extraction from the delta queue perspective, the process has two steps. First, the system replicates data from the source system to the delta queues through SLT. Second, the system exposes the data to the consumer. Look at the Composite Request: Status column to check the active job. Notice the icon difference between the above and below screenshots. Finally, as the extraction completes, the request status changes to Confirmed. All three monitoring views confirm that the extraction was successful, and the extraction job in Azure Data Factory is also completed.

Let's go back to Delta Queue Monitoring for a moment. I've already shown how to display and monitor subscriptions and requests, but you can drill down even further to see individual units of data. Each unit represents a portion of data defined by the calculation job, which is then processed during the data load. When you click on one of the units, you can see the actual processed data. This view is very useful when you face any data inconsistencies. The third column identifies the type of record operation – as this is an initial extraction, all records are set to C (Create).

Let's start a delta extraction!

DELTA EXTRACTION

As we used the "Full on the first run, then incremental" run mode in Azure Data Factory, we specified that we want SLT to track delta changes. The database triggers created by SLT capture changes and write them to the logging tables, as described above. After the initial extraction, we're left with an active Data Load job that transfers changes to the delta queues in real time, from where they are retrieved by the SAP CDC connector.

Let's change a couple of sales orders in the system to see how the replication behaves. In the SLT Cockpit, you can see the replication status has changed to "Waiting for changes". Switch to the Statistics tab and choose Replication Statistics to see the number of processed changes.
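The tracked changes now sit in the delta queue, waiting for the next pipeline run to pick them up. In practice you would schedule that run, but for testing you can also trigger and watch it programmatically. Here is a minimal sketch using the azure-mgmt-datafactory SDK; the subscription ID, resource group, factory, and pipeline names are placeholders for your own resources.

```python
# Minimal sketch: trigger the SAP CDC pipeline and poll the run status -
# the programmatic equivalent of clicking Trigger Now and watching the
# Monitor tab. All resource names below are placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-sap-cdc"       # placeholder
FACTORY_NAME = "adf-sap-cdc"        # placeholder
PIPELINE_NAME = "pl-sap-slt-copy"   # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Start a new pipeline run (the delta extraction).
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME)
print(f"Started pipeline run {run.run_id}")

# Poll until the run leaves the Queued/InProgress states.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    print(f"Run status: {status}")
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
```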
If no delta changes are extracted by the SAP CDC connector, check whether there are any Unprocessed Logging Table Records.

After starting the extraction from Azure Data Factory, you can see a new request created in the same subscription. Let's drill down to display the data preview. ODP keeps each change in a separate unit. All three changes are recorded, even though we changed one document twice. Note that the Change Mode of the records is now U (Update) or D (Delete). Azure Data Factory uses this flag to apply the extracted changes to the target store that already contains the initial extraction. Once the delta extraction finishes, you can see all the data in the chosen target storage.
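To make the change-mode handling concrete, here is a small, self-contained sketch of the merge pattern a CDC consumer applies: upsert rows flagged C or U, delete rows flagged D. The sales order data is made up and the sketch only illustrates the general pattern, not Azure Data Factory's actual implementation; I'm assuming the ODP change-mode column name ODQ_CHANGEMODE.

```python
# Illustrative sketch of the CDC merge pattern: apply Create/Update/Delete
# records from a delta request to a keyed target. Sample data is made up;
# this is not ADF's actual implementation.

# Target after the initial (full) extraction, keyed by sales order number.
target = {
    "0000001001": {"VBELN": "0000001001", "NETWR": 150.00},
    "0000001002": {"VBELN": "0000001002", "NETWR": 320.00},
}

# Delta records; ODQ_CHANGEMODE carries the operation type (C / U / D).
delta = [
    {"ODQ_CHANGEMODE": "U", "VBELN": "0000001001", "NETWR": 175.00},
    {"ODQ_CHANGEMODE": "D", "VBELN": "0000001002", "NETWR": 320.00},
    {"ODQ_CHANGEMODE": "C", "VBELN": "0000001003", "NETWR": 99.50},
]

for record in delta:
    mode = record.pop("ODQ_CHANGEMODE")
    key = record["VBELN"]
    if mode in ("C", "U"):    # create or update: upsert the full row
        target[key] = record
    elif mode == "D":         # delete: drop the row if it exists
        target.pop(key, None)

print(target)
# -> order 1001 updated, 1002 removed, 1003 inserted
```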
ENDING THE SUBSCRIPTION

To stop tracking changes and replicating data, you can end the subscription in Delta Queue Monitoring. Simply select the relevant subscription and click the bin icon in the menu. Note that this action can only be performed directly in Delta Queue Monitoring; there is no functionality in Azure Data Factory or Synapse to end the subscription. It's important to end subscriptions that are no longer needed to avoid adding unnecessary entries to the delta tables, which could become problematic in the long term. We'll cover this topic in more detail later.

Once you delete the subscription, the corresponding table disappears from the SLT Cockpit. You don't have to take any additional steps, but there may be a slight delay between ending the subscription and the table disappearing. This process is managed by the Master Controller job, which communicates between ODP and SLT, so the change takes effect after the next job run.

For full one-off extractions, there is no active subscription to remove in ODQMON. To delete the corresponding entries from the SLT Cockpit, go to Expert Functions and choose "Delete Loaded Tables from the Configuration for the ODP Framework". This runs a report that removes completed full extractions from the Cockpit. When you go back to the Participating Objects tab, fully loaded tables without further replication enabled should no longer appear on the list.

That's it for today's episode on the SAP CDC Connector and SLT! I hope you found this post helpful in understanding the extraction process, monitoring, and some of the advanced troubleshooting options of the solution. In the next episode, I'll continue our exploration of the SAP CDC Connector by delving into advanced options and discussing how you can optimize replication performance.