Omar_Abu_Arisheh
You are collecting batched or block data from a source and using a for-each loop that concatenates a string variable to build the final payload, which you then save to storage. This pattern consumes significant resources, and as your data grows, performance degrades. Instead, you can use the Append Blob API to append data to a blob file directly, without draining your App Service Plan's CPU and memory: iterate through your data and append it to the file block by block. Finally, you can seal the deal by calling the Seal Blob API to switch the file to read-only mode.
This solution has been tested with Logic Apps (Standard) but should also work with Logic Apps (Consumption).
We are using the built-in (inApp) version of the Blob Connector.
-----------------------------------------------------------------------------------------------------------
To start, let's describe the general structure of the workflow and the initial preparations:
- Initial preparation:
- Enable System-Assigned Managed Identity for the Logic App.
- Manually upload an empty file to the storage container, making sure that the blob type is "Append blob".
- Add Role Assignment to give permission to the Managed Identity on the storage.
- For the workflow:
- Use a trigger of your choice.
- Create a for-each loop; to make sure that the data is written sequentially, change its Degree of Parallelism to 1.
- Inside the loop, add an action to grab the data you want to append from a source of your choice. In our scenario we get the data from files in a container using the "List blobs in container" action and then loop through these files.
- Create an HTTP action that calls the Append Block API to push the data to the empty file we uploaded earlier.
- Once the loop finishes, seal the file by calling the Seal Blob API.
Enable System-Assigned Managed Identity for the Logic App.
Upload an empty file to the storage container, with the blob type set to "Append blob".
You can also create this file by calling the "Put Blob" API using an HTTP Action.
Put Blob (REST API) - Azure Storage | Microsoft Learn
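If you prefer to create the empty append blob with an HTTP action instead of uploading it by hand, the Put Blob call only needs the x-ms-blob-type: AppendBlob header and an empty body. A minimal sketch of that request, with placeholder account, container, and blob names (authentication is handled by the HTTP action's managed-identity setting, so no auth header is built here):

```python
# Sketch of the "Put Blob" request that creates an empty append blob.
# Account/container/blob names below are placeholders, not real resources.
from email.utils import formatdate

def put_append_blob_request(account: str, container: str, blob: str):
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    headers = {
        "x-ms-date": formatdate(usegmt=True),  # RFC 1123 date, same as utcnow('R')
        "x-ms-version": "2022-11-02",
        "x-ms-blob-type": "AppendBlob",        # marks the new blob as an append blob
        "Content-Length": "0",                 # body must be empty on creation
    }
    return "PUT", url, headers

method, url, headers = put_append_blob_request("mystorage", "output", "combined.csv")
```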
Grant the Logic App's system-assigned managed identity permission to write to storage:
Reference: Access Storage Accounts behind Firewalls from Logic Apps within the same region - Microsoft Community Hub
Browse to your storage account, open "Access Control (IAM)", select "Add role assignment", choose "Storage Blob Data Contributor", then under Members search for your Logic App by its name (not the workflow name) and add the role assignment.
Create the workflow:
This is what the final workflow will look like; below you will find more information about each action.
We are getting a list of files from a container to use as our source for data.
Then we loop over each of these files, get their content with a "Read blob content" action, and push that content to the empty append blob we created earlier by calling the "Append Block" API from an HTTP action.
Finally, we call the "Seal blob" API using the HTTP action outside the loop.
Get the blobs in a container to use as a source of content to append:
Create a for-each loop and iterate through the blobs we received from the "List Blobs" action:
Note: Change the "Degree of Parallelism" to 1 to force the for-each to run sequentially.
Inside the for-each loop, add an action to read the blob content:
Inside the for-each add an HTTP action to append data to the blob:
For the URI:
(replace with your storage name, or replace the main URI with your private endpoint):
https://YOUR_STORAGE_NAME.blob.core.windows.net/YOUR_CONTAINER_NAME/YOUR_FILE_NAME_WITH_EXTENSION?comp=appendblock
For the headers:
x-ms-date: utcnow('R')
x-ms-version: 2022-11-02
Content-Length: body('Read_blob_content_for_Partial_file')?['properties']?['length']
For the Body: body('Read_blob_content_for_Partial_file')?['content']
For Authentication: we are using "System-assigned managed identity", with the Audience set to the storage URI.
You can find the "Audience" and "URI" values for the above under your storage account's "Endpoints" blade; use the private endpoint if your storage is behind a firewall.
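Putting the append step together, each iteration of the loop issues a PUT against the blob with ?comp=appendblock, a Content-Length matching the chunk, and the chunk as the body. The sketch below builds those requests with placeholder names; it also splits oversized payloads, since append blocks have a maximum size per call (4 MiB is used here as a conservative limit that applies to older service versions):

```python
# Sketch of the "Append Block" requests made inside the for-each loop.
# Account/container/blob names are placeholders; auth headers are
# supplied by the workflow's HTTP action, so they are omitted here.
MAX_BLOCK = 4 * 1024 * 1024  # conservative per-block limit (assumption)

def append_block_requests(account: str, container: str, blob: str, data: bytes):
    url = (f"https://{account}.blob.core.windows.net/"
           f"{container}/{blob}?comp=appendblock")
    requests = []
    # Split the payload into blocks no larger than MAX_BLOCK each.
    for start in range(0, len(data), MAX_BLOCK):
        chunk = data[start:start + MAX_BLOCK]
        headers = {
            "x-ms-version": "2022-11-02",
            # Same role as the workflow's ?['properties']?['length'] expression:
            "Content-Length": str(len(chunk)),
        }
        requests.append(("PUT", url, headers, chunk))
    return requests

reqs = append_block_requests("mystorage", "output", "combined.csv", b"part-1\n")
```

Running the loop with Degree of Parallelism set to 1 is what guarantees these blocks land in the file in source order.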
After the for-each, add an HTTP Action to call the "Seal blob" API:
For the URI:
(replace with your storage name, or replace the main URI with your private endpoint):
https://YOUR_STORAGE_NAME.blob.core.windows.net/YOUR_CONTAINER_NAME/YOUR_FILE_NAME_WITH_EXTENSION?comp=seal
For the headers:
x-ms-date: utcnow('R')
x-ms-version: 2022-11-02
For Authentication: we are using "System-assigned managed identity", with the Audience set to the storage URI.
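The seal call is the simplest of the three: a single PUT against the blob with ?comp=seal, made once after the loop. A sketch with the same placeholder names (note that sealing requires service version 2019-12-12 or later, which 2022-11-02 satisfies):

```python
# Sketch of the "Seal Blob" request made after the for-each loop to
# switch the append blob to read-only. Names are placeholders; auth
# headers come from the HTTP action's managed-identity setting.
def seal_blob_request(account: str, container: str, blob: str):
    url = (f"https://{account}.blob.core.windows.net/"
           f"{container}/{blob}?comp=seal")
    headers = {"x-ms-version": "2022-11-02"}  # sealing needs 2019-12-12+
    return "PUT", url, headers

method, url, headers = seal_blob_request("mystorage", "output", "combined.csv")
```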
Save your workflow.
-----------------------------------------------------------------------------------------------------------
Run your workflow and verify the results:
The file size should roughly equal the combined size of the source data:
For our example, you can notice the area where the files were combined:
-----------------------------------------------------------------------------------------------------------
References:
Append Block (REST API) - Azure Storage | Microsoft Learn
Append Blob Seal (REST API) - Azure Storage | Microsoft Learn
Put Blob (REST API) - Azure Storage | Microsoft Learn
Authorize requests to Azure Storage (REST API) | Microsoft Learn
Authorize access to blobs using Active Directory - Azure Storage | Microsoft Learn
Azure Blob Storage - Connectors | Microsoft Learn
Access Storage Accounts behind Firewalls from Logic Apps within the same region - Microsoft Community Hub