Guest sakshigupta
Posted April 11, 2023

I have come across a requirement from a customer to omit the bind values captured as part of the Azure SQL Auditing "Batch Completed" event. By bind values, I am referring to the parameters passed as part of a prepared statement from the application. By default, the Batch Completed event captures the complete statement executed by the application, including every parameter value.

The custom solution here includes the following services:

- Azure SQL Database
- Event Hub – first destination, receiving the raw audit events.
- Blob Storage – final destination, receiving the filtered data.
- Stream Analytics job – used to filter the data.

As a first step, an Azure SQL Database is provisioned and SQL Auditing is enabled, with Event Hub selected as the target for the auditing data. As you may be aware, default auditing captures Successful Login, Failed Login, and Batch Completed events.

Event Hub is the first target for the auditing events. It stores the Batch Completed events (including the prepared statements) sent by the application in the format shown below. The aim here is to exclude the values passed as parameters. This could be a requirement from a security/compliance perspective, where a company wants to capture the audit events but does not want to store the parameter values.

Once this is done, I provisioned a Stream Analytics job and configured its source and target. As you can see in the screenshot above, the source and destination are configured, and the query below is used to filter the data. My sample application executes prepared statements as sp_executesql, so I have set up the filter in the query accordingly.

I executed the application several times to ensure some data is captured in Event Hub and can later be read by the Stream Analytics job. The screenshot below shows a couple of requests arriving in Event Hub.

Once that was validated, I used the Stream Analytics job to read the data and send it to blob storage. Below is the query I am using. You can further filter the data based on the classifications defined on specific columns if needed. I am using two columns here: statement, capturing the data as-is, and statementnobind, the filtered data with the bind values omitted.

SELECT
    RecordsArray.ArrayValue.time,
    RecordsArray.ArrayValue.resourceId,
    RecordsArray.ArrayValue.category,
    RecordsArray.ArrayValue.properties.action_name,
    RecordsArray.ArrayValue.properties.event_time,
    RecordsArray.ArrayValue.properties.data_sensitivity_information,
    RecordsArray.ArrayValue.properties.statement,
    case
        when left(RecordsArray.ArrayValue.properties.statement, 18) = 'exec sp_executesql'
             -- and REGEXMATCH(RecordsArray.ArrayValue.properties.data_sensitivity_information, 'information_type="Health"') = 0
            then substring(RecordsArray.ArrayValue.properties.statement, 0, PATINDEX('%,N''%', RecordsArray.ArrayValue.properties.statement))
        else RecordsArray.ArrayValue.properties.statement
    end as statementnobind
INTO [blobstoragedest]
FROM [eventhubsrc]
CROSS APPLY GetArrayElements(records) AS RecordsArray
WHERE RecordsArray.ArrayValue.properties.action_name in ('BATCH COMPLETED', 'RPC COMPLETED')

As you can see in the screenshot below, the statementnobind column does not capture any of the values passed as parameters. This validation is done from the UI (test query) for now, and the results are as expected. Now I will start the Stream Analytics job and check whether the data is stored in blob storage. As you can see, my job is running, and the usage and event counts are increasing.
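For anyone who wants to reproduce the setup, a minimal stand-in for the sample application mentioned above could look like the following. This is not from the original post, and the table, column, and parameter names are hypothetical; any parameterized call routed through sp_executesql will produce the BATCH COMPLETED / RPC COMPLETED audit events that flow into Event Hub, with the parameter value embedded in the captured statement text.

-- Hypothetical test workload (illustrative table, column, and parameter names).
-- The audited statement text for this call embeds the parameter value,
-- which is exactly what the Stream Analytics query strips out.
-- Lowercase 'exec' is used so the captured text matches the query's
-- left(statement, 18) = 'exec sp_executesql' check as written.
exec sp_executesql
    N'SELECT FirstName, LastName FROM dbo.Patients WHERE NationalId = @NationalId',
    N'@NationalId nvarchar(20)',
    @NationalId = N'123-45-6789';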
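It can also help to sanity-check the bind-stripping expression outside of Stream Analytics. The following T-SQL sketch is not from the original post and uses an illustrative captured statement; it applies the same left/PATINDEX/substring logic used in the query above to a sample statement text.

-- Minimal T-SQL sketch of the truncation applied by the Stream Analytics query.
-- @captured is an illustrative example of a statement captured by auditing.
DECLARE @captured nvarchar(max) =
    N'exec sp_executesql N''SELECT FirstName, LastName FROM dbo.Patients WHERE NationalId = @NationalId'',N''@NationalId nvarchar(20)'',@NationalId=N''123-45-6789''';

SELECT
    @captured AS [statement],
    CASE
        WHEN LEFT(@captured, 18) = 'exec sp_executesql'
            -- PATINDEX finds the first ",N'" (the start of the parameter list),
            -- and SUBSTRING keeps everything before it, dropping the bind values.
            THEN SUBSTRING(@captured, 0, PATINDEX('%,N''%', @captured))
        ELSE @captured
    END AS statementnobind;

-- statementnobind returns:
-- exec sp_executesql N'SELECT FirstName, LastName FROM dbo.Patients WHERE NationalId = @NationalId'

Note that the truncation keeps the statement text itself, including the parameter names, and drops only the parameter declaration and value list, which is what the statementnobind column shows in blob storage.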
As a next step, I could see the file is present inside blob storage, and the captured data is showing as expected: in the statementnobind column, no parameter values are captured.

That's all. :) Enjoy Learning!!