Guest Ramanath_Nayak
Posted August 17, 2022

Introduction

This article highlights the steps involved in moving mainframe file data to Azure Blob Storage using Java on the mainframe. Mainframe files are encoded in the Extended Binary Coded Decimal Interchange Code (EBCDIC) code page. Making these files readable on distributed platforms involves two tasks: transmitting the data out of the mainframe, and then converting it from EBCDIC to the American Standard Code for Information Interchange (ASCII) format. Host Integration Server (HIS) can help with the latter task of conversion. For the first task of transmitting the data, you can follow the steps outlined in this blog.

Steps Involved

The high-level steps in the process are:

1. Compile the Java code. If you have facilities to compile the code on the mainframe, FTP the project to the mainframe and compile it there; otherwise, compile it on your development machine and FTP the jar to a folder on your mainframe.
2. Set up Azure Blob Storage and get an access key for it.
3. Set up/update the mainframe files and environment.
4. Submit a job to transmit the file to Azure Blob Storage.

Note: Contact our team (datasqlninja@microsoft.com) to get the code artifacts.

Fig 1.1: Overview of Mainframe to Azure Storage migration using JCL.

Detailed Steps

Artifacts: Contact our team (datasqlninja@microsoft.com) to get the source code. It contains one Java solution and a few mainframe components that help transmit the file from the mainframe to Azure Blob Storage. Move the mainframe components to the mainframe using any FTP software. Once the components are moved, replace the "<userid>" string with your desired user ID in both the JCL and the PROC. Provide the name of the file to be transmitted to Azure Blob under the DD name "MIGRATE".

Compile the Java code: Compile the provided Maven Java project using any Java IDE. The jar is generated in the target folder with the name "mainframetoblob-0.0.1-SNAPSHOT-jar-with-dependencies.jar".
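As background on the code-page difference described in the introduction, the sketch below shows how EBCDIC bytes can be decoded in Java using the JDK's built-in IBM-1047 charset. This is illustrative only and not part of the provided solution; the byte values and class name are chosen for the example.

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EbcdicDemo {
    public static void main(String[] args) {
        // EBCDIC (code page IBM-1047) bytes for the text "HELLO"
        byte[] ebcdic = {(byte) 0xC8, (byte) 0xC5, (byte) 0xD3, (byte) 0xD3, (byte) 0xD6};

        // Decode with the JDK's Cp1047 charset...
        String text = new String(ebcdic, Charset.forName("Cp1047"));

        // ...then re-encode as ASCII for distributed platforms
        byte[] ascii = text.getBytes(StandardCharsets.US_ASCII);

        System.out.println(text); // prints HELLO
        System.out.println(ascii.length + " ASCII bytes");
    }
}
```

This is the kind of conversion HIS performs at scale; the transmission steps below move the raw bytes unchanged.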
Move this jar into an appropriate folder in the Unix System Services (USS) region. This folder path will have to be updated in the PROC later on. Make sure the Java version used to compile on the VM matches the Java version installed on USS.

Azure Blob setup:

1. If you do not have a storage account, create one as guided in this link.
2. Create a container named "Mainframe1" under this storage account. If you wish to change the name of the container, you will have to modify the Java file accordingly.
3. Once you have created the storage account, get a shared access signature (SAS) token for it as guided in this link.

Keep both the blob URL and the SAS token handy, as they will be placed into mainframe files.

Mainframe file setup:

1. Update the user ID in all the JCLs and PROCs.
2. Point the PROC at the correct location of the jar.
3. Create a file of LRECL 72 that will contain the blob URL.
4. Create a file of LRECL 250 that will contain the SAS key. Paste the SAS key generated in the previous step into this file.

Submit the JCL: After you modify the file name in the "MIGRATE" DD name and submit the job, it begins transmitting the file from the mainframe to Azure Blob Storage. You should then see the file in Azure Blob Storage under the container named "Mainframe1". You can perform both binary and ASCII transmission with a slight tweak to the Java file.

Feedback and Suggestions: If you have feedback or suggestions for improving this asset, please contact the Data Platform Engineering team (datasqlninja@microsoft.com). Thanks for your support!

Note: For additional information about migrating various source databases to Azure, see the Azure Database Migration Guide.
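To illustrate how the two mainframe files relate at runtime, the upload target is simply the blob URL (the LRECL-72 file) with the SAS token (the LRECL-250 file) appended as a query string. The sketch below shows that composition; the class name, helper method, account name, and token are hypothetical, and the actual provided Java solution reads these values from the DD names in the PROC.

```java
public class BlobUrlBuilder {
    // Combine the container URL (LRECL-72 file) with the SAS token (LRECL-250 file).
    static String buildUploadUrl(String blobUrl, String sasToken, String fileName) {
        // Trim a trailing '/' and a leading '?' so the pieces join cleanly
        String base = blobUrl.endsWith("/") ? blobUrl.substring(0, blobUrl.length() - 1) : blobUrl;
        String token = sasToken.startsWith("?") ? sasToken.substring(1) : sasToken;
        return base + "/" + fileName + "?" + token;
    }

    public static void main(String[] args) {
        String url = buildUploadUrl(
            "https://mystorageacct.blob.core.windows.net/mainframe1/", // hypothetical account
            "?sv=2021-06-08&ss=b&sig=REDACTED",                        // placeholder SAS token
            "CUSTOMER.DATA");                                          // dataset chosen for the example
        System.out.println(url);
    }
}
```

Because the SAS token carries the authorization, no storage account key ever needs to reside on the mainframe; rotating the token only requires updating the LRECL-250 file.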