
SAP CDC Connector and SLT - Part 3 - Let's get that data!


Thanks for joining me for the third episode of the blog series on the SAP CDC connector and SLT. In the previous post, I covered the initial configuration steps required to set up the SLT engine to work with Azure. Today, I'll take a deeper dive into the extraction process and its monitoring. These steps are essential to ensure smooth and efficient replication of data from your SAP source system to your target. By the end of this post, you'll have a better understanding of how to manage and troubleshoot the replication process.

DATA EXTRACTION AND MONITORING

Once the Azure and SLT configuration is complete, the only thing left to do is to click the Trigger Now button. This starts a complex chain of actions that copies data from the SAP system to the desired target storage. Let's take a closer look at what exactly happens under the hood.
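
As a side note, you can also start the same run programmatically instead of clicking the button. Below is a minimal sketch using the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, and pipeline names are placeholders for your own environment:

```python
# pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers - replace with your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-sap-cdc"
FACTORY_NAME = "adf-sap-cdc"
PIPELINE_NAME = "pl_extract_sap_table"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Equivalent of clicking "Trigger Now" in the Azure Data Factory UI.
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME)
print(f"Started pipeline run: {run.run_id}")
```
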
The SAP CDC connector doesn't communicate directly with SLT. Instead, it requests data from the Operational Data Provisioning (ODP) framework, which acts as a proxy and manages the extraction process.

SLT starts by creating the required logging tables and function modules. This phase is usually quick and takes just a couple of seconds. Then the calculation job kicks off; its main function is to split the source dataset into manageable portions. This step can take longer, depending heavily on the size of the table and the chosen calculation mode. Finally, the initial extraction starts, followed by continuous replication if you chose delta tracking.
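
To build some intuition for what the calculation job produces, here is a toy illustration (not SLT code): it splits a table of a given size into fixed-size portions, which is conceptually what the access plan calculation does. The row count and portion size below are made-up numbers.

```python
import math

# Made-up numbers for illustration only - the real portion size is
# determined by the access plan calculation settings in SLT.
table_rows = 12_500_000
portion_size = 500_000

# Each portion becomes a unit of work for the data load jobs.
portions = math.ceil(table_rows / portion_size)
print(f"{table_rows:,} rows split into {portions} portions "
      f"of up to {portion_size:,} rows each")  # -> 25 portions
```
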
You can monitor the extraction process from three angles:

  • SLT Cockpit (LTRC)
    As soon as you start the replication process in Azure Data Factory, the selected table name appears in the Participating Objects tab of the SLT Cockpit. The Current Phase field shows the selected run mode (one-off extraction vs. replication), and the Details field shows the action that is currently running.

    [Screenshot: Participating Objects tab in the SLT Cockpit]

    Once the data transfer starts, you can switch to the Statistics tab, which provides more insight into the replication performance. Here you can check the total number of records processed, the number of calculated portions, and the read and write times.

    [Screenshot: Statistics tab]

    It is also worth checking the Application Logs, especially if you notice issues with the replication and the status of the SLT configuration changes from green to yellow or red.

    [Screenshot: Application Logs]

  • Background Jobs (SM37)
    You can use background jobs to check the details of the replication process. Each extraction step is managed by a separate job, and some jobs, like the access plan calculation or the data load, trigger equivalent jobs in the source system:

    • Master Controller Job (/1LT/IUC_REP_CNTR_<MTID>) – creates database triggers and logging tables
    • Migration Object Definition Job (/1LT/IUC_DEF_COBJ_<MTID>) – creates table structures and generates function modules
    • Access Plan Calculation Job (/1LT/IUC_CALC_ACP_<MTID>) – chunks the source data into portions
    • Data Load Job (/1LT/IUC_LOAD_MT_<MTID>) – loads and replicates data

    [Screenshot: replication jobs in SM37]

    In the above example, you can see that the ODQR job kicked off the extraction process. It was immediately followed by the Master Controller and Migration Object Definition jobs. The access plan calculation was split into two processes, and once it finished, three load jobs ran the initial data transfer of all records from the source table. The last job remains active, as it continuously replicates changes.

    Checking the job status is especially useful if you notice the extraction is stuck on a particular step, for example calculating the access plan. A lack of free background work processes can also negatively influence the performance of the extraction, so always check whether each job starts on time and without delay.

  • Delta Queue Monitoring (ODQMON)
    You can also monitor the replication from the Delta Queue Monitoring. The initial screen shows all delta queues available in the system, and you can use the filters to narrow the list. By default, the list includes only entries where delta tracking is enabled. To also include requests for full extraction (without delta initialization), change the Request Select field to “All” or “Full”.

    [Screenshot: delta queues in ODQMON]

    Please note that the SAP CDC connector uses “OData access for ODP” as the subscriber type. Despite that name, the communication between the SAP system and the connector happens over RFC; the OData channel is not used at any stage of the replication.

    Double-click the Queue Name to display the subscriptions, then click Calculate Data to display additional information such as the total size of the extracted data and the number of processed rows.

    [Screenshot: subscriptions with calculated data]

    You can drill down further to individual requests by double-clicking the subscription. This view provides more details on the current operation. Azure Data Factory creates a request each time you start an extraction. The screenshot below shows a running full data extraction with delta initialization. When you start a subsequent extraction to fetch the changes, the system creates a new request.

    [Screenshot: requests within the subscription]

    When looking at the extraction from the delta queue perspective, the process has two steps. First, the system replicates data from the source system to the delta queues through SLT. Second, the system exposes the data to the consumer. Look at the Composite Request: Status column to check the active step, and notice the icon difference between the screenshots above and below.

    [Screenshot: composite request status during the SLT replication step]

    Finally, as the extraction completes, the request status changes to Confirmed.

    [Screenshot: confirmed request]

All three monitoring views confirm that the extraction was successful. The extraction job in Azure Data Factory is also completed:

[Screenshot: successful pipeline run in Azure Data Factory]
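
You don't have to watch the run in the portal. Assuming you triggered the pipeline with the SDK snippet shown earlier, a minimal sketch for polling the run status could look like this (reusing client, RESOURCE_GROUP, FACTORY_NAME, and run from that snippet):

```python
import time

# Poll the pipeline run until it reaches a terminal state.
while True:
    pipeline_run = client.pipeline_runs.get(
        RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    print(f"Status: {pipeline_run.status}")
    if pipeline_run.status in ("Succeeded", "Failed", "Cancelled"):
        break
    time.sleep(30)  # check every 30 seconds
```
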
Let's go back to the Delta Queue Monitoring for a moment. I've already shown how to display and monitor subscriptions and requests, but you can drill down even further to see individual units of data. Each unit represents a portion of data defined by the calculation job, which is then processed during the data load.

[Screenshot: units within a request]

Now, when you click one of the units, you can see the actual processed data. This view is very useful when you face any data inconsistencies. The third column identifies the type of record operation; as this is an initial extraction, all records are set to C (Create).

[Screenshot: data preview showing records with change mode C]

Let's start a delta extraction!

DELTA EXTRACTION

As we used the "Full on the first run, then incremental" run mode in Azure Data Factory, we specified that we want SLT to track delta changes. The database triggers created by SLT capture changes and write them to the logging tables, as described above. After the initial extraction, we're left with an active Data Load job that transfers changes to the delta queues in real time, from where they are retrieved by the SAP CDC connector. Let's change a couple of sales orders in the system to see how the replication behaves.

In the SLT Cockpit, you can see that the replication status has changed to "Waiting for Changes".

[Screenshot: replication status in the SLT Cockpit]

Switch to the Statistics tab and choose Replication Statistics. Here you can see the number of processed changes. If you face a situation where no delta changes are extracted by the SAP CDC connector, check whether there are any Unprocessed Logging Table Records.

[Screenshot: Replication Statistics]
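
If you want to check the logging table from outside the SAP GUI, one option is a small pyrfc sketch that reads it through the standard RFC_READ_TABLE function module. The connection details and the logging table name are placeholders; SLT generates the table name per configuration, so look up the real one (for example in LTRC) first:

```python
from pyrfc import Connection

# Placeholder connection details for the SAP source system.
conn = Connection(ashost="sapecc.example.com", sysnr="00",
                  client="100", user="MONITOR", passwd="***")

# The SLT logging table name is system-generated; /1CADMC/00000001
# is a placeholder - look up the real name in LTRC first.
result = conn.call("RFC_READ_TABLE",
                   QUERY_TABLE="/1CADMC/00000001",
                   ROWCOUNT=100)  # fetch at most 100 rows

# A non-empty result means changes are still waiting to be replicated.
print(f"Logging table records found: {len(result['DATA'])}")
```
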
After starting the extraction from Azure Data Factory, you can see a new request created in the same subscription.

[Screenshot: new request in the subscription]

Let's drill down to display the data preview. ODP keeps each change in a separate unit. All three changes are recorded, even though we changed one document twice. Note that the Change Mode for the records is now U (Update) or D (Delete); Azure Data Factory uses it to apply the extracted changes to a target store that already contains the initial extraction.

[Screenshot: data preview showing records with change modes U and D]

Once the delta extraction finishes, you can see all the data in the chosen target storage.
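
As a quick sanity check, you could also read the row counts reported by the pipeline activities with the same SDK, again reusing client, RESOURCE_GROUP, FACTORY_NAME, and run from the earlier snippets. The 30-minute query window is an assumption, and note that a copy activity reports rowsRead/rowsCopied while a mapping data flow activity reports its metrics in a different output structure:

```python
from datetime import datetime, timedelta, timezone

from azure.mgmt.datafactory.models import RunFilterParameters

# Query the activity runs that belong to the delta pipeline run.
now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(minutes=30),
    last_updated_before=now)

activity_runs = client.activity_runs.query_by_pipeline_run(
    RESOURCE_GROUP, FACTORY_NAME, run.run_id, filters)

for activity in activity_runs.value:
    print(f"{activity.activity_name} ({activity.activity_type}): "
          f"{activity.status}")
    # Copy activities expose rowsRead/rowsCopied in their output payload.
    if activity.activity_type == "Copy" and activity.output:
        print(f"  rows read: {activity.output.get('rowsRead')}, "
              f"rows copied: {activity.output.get('rowsCopied')}")
```
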
ENDING THE SUBSCRIPTION

To stop tracking changes and replicating data, you can end the subscription in the Delta Queue Monitoring. Simply select the relevant subscription and click the bin icon in the menu. Note that this action can only be performed directly in the Delta Queue Monitoring; there is no functionality in Azure Data Factory or Synapse to end a subscription. It's important to end subscriptions that are no longer needed to avoid accumulating unnecessary entries in the delta tables, which can become problematic in the long term. We'll cover this topic in more detail later.

[Screenshot: deleting a subscription in ODQMON]

Once you delete the subscription, the corresponding table disappears from the SLT Cockpit. You don't have to take any additional steps, but there may be a slight delay between ending the subscription and the table disappearing. This cleanup is managed by the Master Controller job, which communicates between ODP and SLT, so the change takes effect after the next job run.

For full one-off extractions, there is no active subscription to remove in ODQMON. To delete the corresponding entries from the SLT Cockpit, go to Expert Functions and choose "Delete Loaded Tables from the Configuration for the ODP Framework".

[Screenshot: Expert Functions in the SLT Cockpit]

This runs a report that deletes completed full extractions from the SLT Cockpit.

[Screenshot: report removing completed full extractions]

When you go back to the Participating Objects tab, fully loaded tables without further replication enabled should no longer appear on the list.

That's it for today's episode on the SAP CDC connector and SLT! I hope you found this post helpful in understanding the extraction process, monitoring, and some of the advanced troubleshooting options of the solution. In the next episode, I'll continue our exploration of the SAP CDC connector by delving into advanced options and discussing how you can optimize replication performance.