Guest PaulHarrison Posted January 20, 2023

Q: How do I get my Translator containers to work the same way as Azure Translator?

A: Integrate them with the text-analytics container.

This is part 3 of a series on cognitive services, specifically on how to leverage Translator in Azure and in containers, so if you would like to catch up please check out the previous posts:

Part 1: Q: How do I use Azure Cognitive Services like Translator?
Part 2: Q: How do I use connected containerized Azure Cognitive Services?

We have come a long way, but the issue we're going to address today is getting our Translator containers to work without specifying 'from', just like the good ol' Azure Translator resource does. First, though, you might ask why it's worth the effort just to avoid passing one parameter. In this case it really matters, for two reasons:

- We might not know which language we're trying to translate, so we can't specify what we don't know.
- We have code that was written to work with the Translator v3 API against Translator resources in Azure, and we don't want to rewrite all of that code to work with containers.

The steps will be similar to deploying the Translator container: we'll create the Azure resource, then use the information from that resource to deploy the text-analytics container. Text analytics can do far more than detect languages (sentiment analysis, key phrase extraction, text summarization, and much more), but for our purposes we're just going to have it tell us the language of the text we give it. Once text-analytics is running, we can set up Translator to call it automatically on our behalf, so that everything works just like we're used to in Azure.

Disclaimer

The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

1. Create a Language service in Azure.

Log into the Azure portal, then search for and click on the "Language" service. Select "Create", then "Continue to create your resource". To avoid the issue of the free tier not working with containers, select a paid tier. Click "Next" a few times, then "Create", and you now have your Azure Language resource.

2. Download the text-analytics container.

From the same VM we used in Part 2, run this command to pull the textanalytics container:

sudo docker pull mcr.microsoft.com/azure-cognitive-services/textanalytics/language:latest

3. Run the container just to make sure it runs.

sudo docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 mcr.microsoft.com/azure-cognitive-services/textanalytics/language Eula=accept Billing=https://paulanalyticsforblog.cognitiveservices.azure.com/ ApiKey=<put_your_own_key_here>
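Before moving on, it's worth confirming the container actually answers. Cognitive Services containers expose /status and /ready endpoints on the same port as the API, so a quick check from PowerShell might look like the sketch below. This assumes MyVMIPAddress stands in for your VM's address (the same placeholder used in the validation step later) and that the container from the previous command is still running on port 5000.

#Optional sanity check (sketch): confirm the language container is up and licensed.
#MyVMIPAddress is a placeholder for your VM's IP, reachable from where you run PowerShell.
$containerBase = 'http://MyVMIPAddress:5000'
#Status validates the billing endpoint and key without counting as a query.
Invoke-WebRequest -Uri "$containerBase/status" -Method Get
#Ready returns 200 once the container is able to accept requests.
Invoke-WebRequest -Uri "$containerBase/ready" -Method Get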
4. Install Docker Compose.

sudo apt-get update
sudo apt-get install docker-compose-plugin

5. Create a compose YAML file.

I created a file at /home/george/TranslatorComposer.yaml with the following contents. It specifies how I want the different containers to be deployed. The important piece is the LADURL environment variable on the text-translation service: it points the Translator container at the language-detection container, which is what lets Translator figure out the 'from' language on its own.

version: "3.3"
services:
  text-translation:
    image: "mcr.microsoft.com/azure-cognitive-services/translator/text-translation:latest"
    volumes:
      - /mnt/d/TranslatorContainer:/usr/local/models
    ports:
      - 5000:5000
    environment:
      - EULA=accept
      - Languages=en,fr,es
      - ApiKey=I_AM_NOT_SHARING_MY_KEY_BUT_PUT_YOURS_HERE
      - Billing=https://paultestforblog.cognitiveservices.azure.com/
      - LADURL=http://language-detection:5000
  language-detection:
    image: "mcr.microsoft.com/azure-cognitive-services/textanalytics/language"
    ports:
      - 5100:5000
    environment:
      - EULA=accept
      - ApiKey=I_AM_NOT_SHARING_MY_KEY_BUT_PUT_YOURS_HERE
      - Billing=https://paulanalyticsforblog.cognitiveservices.azure.com/

6. Deploy the composed containers.

sudo docker compose -f /home/george/TranslatorComposer.yaml up

7. Validate that translation works without specifying 'from'.

We'll use the same code as in the previous post, but this time we won't specify the 'from' parameter, and it still works!

#Variable section
$baseURI = 'http://MyVMIPAddress:5000/'
$Key = '<put_your_own_key_here>'
$Region = 'eastus2'
$TextToTranslate = 'I like bananas and accordions.'

#Test Translation:
$to = 'es'
$URI = $baseURI + 'translate?api-version=3.0&to=' + $to
$resp = Invoke-WebRequest -Uri $URI -Method Post -Headers $([hashtable]@{"Ocp-Apim-Subscription-Key"=$Key;"Ocp-Apim-Subscription-Region"=$Region}) -Body "[ {'Text':'$TextToTranslate'} ]" -ContentType "application/json; charset=utf-8" -ErrorAction Inquire
$trans = ($resp.Content | ConvertFrom-Json).translations.text
$trans

Me gustan los plátanos y los acordeones.

Hurrah, now we have Azure Cognitive Services running in containers anywhere we want, and they behave just like the resources we know and love in Azure. Have fun scripting!
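One last optional check: when 'from' is omitted, the Translator v3 API normally returns a detectedLanguage block alongside the translation, and since the whole point of wiring up LADURL is to mirror the Azure behavior, you should see the same thing from the container. Here is a small sketch that reuses $resp from the validation step above; whether the container's response shape matches Azure exactly is an assumption on my part, so treat it as illustrative.

#Optional: peek at what the language-detection container decided (sketch).
#Reuses $resp from the validation step; detectedLanguage is part of the
#Translator v3 response whenever 'from' is not supplied.
$result = $resp.Content | ConvertFrom-Json
$result.detectedLanguage   #detected language code and confidence score
$result.translations.text  #the translated text, same as $trans above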