Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow in Azure AI Studio

Minseok_Song

This blog series has several versions, each covering different aspects and techniques. Check out the following resources:

- Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide: Detailed instructions for fine-tuning and integrating custom Phi-3 models with Prompt flow using a code-first approach.
- Code-first approach: End-to-end (E2E) sample on Phi-3CookBook: An end-to-end (E2E) sample on Phi-3CookBook, developed based on the "Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide" for a code-first approach.
- Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow in Azure AI Studio: Detailed instructions for fine-tuning and integrating custom Phi-3 models with Prompt Flow in Azure AI / ML Studio using a low-code approach.


Introduction​


Phi-3 is a family of small language models (SLMs) developed by Microsoft that delivers exceptional performance and cost-effectiveness. In this tutorial, you will learn how to fine-tune the Phi-3 model and integrate the custom Phi-3 model with Prompt Flow in Azure AI Studio. By leveraging Azure AI / ML Studio, you will establish a workflow for deploying and utilizing custom AI models. This tutorial is divided into three series:



Series 1: Set up Azure resources and Prepare for fine-tuning​




  1. Create Azure Machine Learning Workspace: You start by setting up an Azure Machine Learning workspace, which serves as the hub for managing machine learning experiments and models.
  2. Request GPU Quotas: Since Phi-3 model fine-tuning typically benefits from GPU acceleration, you request GPU quotas in your Azure subscription.
  3. Add Role Assignment: You set up a User Assigned Managed Identity (UAI) and assign it necessary permissions (Contributor, Storage Blob Data Reader, AcrPull) to access resources like storage accounts and container registries.
  4. Set up the Project: You create a local environment, set up a virtual environment, install required packages, and create a script (download_dataset.py) to download the dataset (ULTRACHAT_200k) required for fine-tuning.

Series 2: Fine-tune and Deploy the Phi-3 model in Azure ML Studio​




  1. Create Compute Cluster: In Azure ML Studio, you create a dedicated GPU compute cluster (Standard_NC24ads_A100_v4) for fine-tuning the Phi-3 model.
  2. Fine-tune the Phi-3 Model: Using the Azure ML Studio interface, you fine-tune the Phi-3 model by specifying training and validation datasets, and configuring parameters like learning rate.
  3. Deploy the Fine-tuned Model: Once fine-tuning is complete, you register the model, create an online endpoint, and deploy the model to make it accessible for real-time inference.

Series 3: Integrate the custom Phi-3 model with Prompt Flow in Azure AI Studio

  1. Create Azure AI Studio Hub and Project: You create a Hub (similar to a resource group) and a Project within Azure AI Studio to manage your AI-related work.
  2. Add a Custom Connection: To integrate the fine-tuned Phi-3 model with Prompt Flow, you create a custom connection in Azure AI Studio, specifying the endpoint and authentication key generated during model deployment in Azure ML Studio.
  3. Create Prompt Flow: You create a new Prompt flow within the Azure AI Studio Project, configure it to use the custom connection, and design the flow to interact with the Phi-3 model for tasks like chat completion.
Note



Unlike the previous tutorial, Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide, which involved running code locally, this tutorial focuses entirely on fine-tuning and integrating your model within the Azure AI / ML Studio environment.

Here is an overview of this tutorial.








Note

For more detailed information and to explore additional resources about Phi-3, please visit the Phi-3CookBook.

Prerequisites​

Table of Contents​


Series 1: Set Up Azure resources and Prepare for fine-tuning

  1. Create Azure Machine Learning workspace
  2. Request GPU quotas in Azure subscription
  3. Add role assignment
  4. Set up the project
    1. Prepare dataset for fine-tuning

Series 2: Fine-tune and Deploy the Phi-3 model in Azure ML Studio

  1. Fine-tune the Phi-3 model
  2. Deploy the fine-tuned Phi-3 model

Series 3: Integrate the custom Phi-3 model with Prompt flow in Azure AI Studio

  1. Integrate the custom Phi-3 model with Prompt flow
  2. Chat with your custom Phi-3 model
  3. Congratulations!



Series 1: Set up Azure resources and Prepare for fine-tuning




Create Azure Machine Learning workspace​




In this exercise, you will:

  • Create an Azure Machine Learning Workspace.

Create an Azure Machine Learning Workspace​


  1. Type azure machine learning in the search bar at the top of the portal page and select Azure Machine Learning from the options that appear.




  2. Select + Create from the navigation menu.


  3. Select New workspace from the navigation menu.




  4. Perform the following tasks:
    • Select your Azure Subscription.
    • Select the Resource group to use (create a new one if needed).
    • Enter Workspace Name. It must be a unique value.
    • Select the Region you'd like to use.
    • Select the Storage account to use (create a new one if needed).
    • Select the Key vault to use (create a new one if needed).
    • Select the Application insights to use (create a new one if needed).
    • Select the Container registry to None.




  5. Select Review + Create.


  6. Select Create.

Request GPU Quotas in Azure Subscription​


In this tutorial, you will learn how to fine-tune and deploy a Phi-3 model using GPUs. For fine-tuning, you will use the Standard_NC24ads_A100_v4 GPU, which requires a quota request. For deployment, you will use the Standard_NC6s_v3 GPU, which also requires a quota request.



Note



Only Pay-As-You-Go subscriptions (the standard subscription type) are eligible for GPU allocation; benefit subscriptions are not currently supported.

For those using benefit subscriptions (such as Visual Studio Enterprise Subscription) or those looking to quickly test the fine-tuning and deployment process, this tutorial also provides guidance for fine-tuning with a minimal dataset using a CPU. However, it is important to note that fine-tuning results are significantly better when using a GPU with larger datasets.

In this exercise, you will:

  • Request GPU Quotas in your Azure Subscription

Request GPU Quotas in Azure Subscription​


  1. Visit Azure ML Studio.


  2. Perform the following tasks to request Standard NCADSA100v4 Family quota:

    • Select Quota from the left side tab.


    • Select the Virtual machine family to use. For example, select Standard NCADSA100v4 Family Cluster Dedicated vCPUs, which includes the Standard_NC24ads_A100_v4 GPU.


    • Select the Request quota from the navigation menu.






    • Inside the Request quota page, enter the New cores limit you'd like to use. For example, 24.


    • Inside the Request quota page, select Submit to request the GPU quota.

  3. Perform the following tasks to request Standard NCSv3 Family quota:
    • Select Quota from the left side tab.
    • Select the Virtual machine family to use. For example, select Standard NCSv3 Family Cluster Dedicated vCPUs, which includes the Standard_NC6s_v3 GPU.
    • Select the Request quota from the navigation menu.
    • Inside the Request quota page, enter the New cores limit you'd like to use. For example, 24.
    • Inside the Request quota page, select Submit to request the GPU quota.





Add role assignment​


To fine-tune and deploy your models, you must first create a User Assigned Managed Identity (UAI) and assign it the appropriate permissions. This UAI will be used for authentication during deployment, so it is critical to grant it access to the storage accounts, container registry, and resource group.

In this exercise, you will:

  • Create a User Assigned Managed Identity (UAI).
  • Add Contributor role assignment to Managed Identity.
  • Add Storage Blob Data Reader role assignment to Managed Identity.
  • Add AcrPull role assignment to Managed Identity.

Create User Assigned Managed Identity (UAI)​


  1. Type managed identities in the search bar at the top of the portal page and select Managed Identities from the options that appear.









  2. Select + Create.









  3. Perform the following tasks:
    • Select your Azure Subscription.
    • Select the Resource group to use (create a new one if needed).
    • Select the Region you'd like to use.
    • Enter the Name. It must be a unique value.









  4. Select Review + create.


  5. Select Create.

Add Contributor role assignment to Managed Identity​


  1. Navigate to the Managed Identity resource that you created.


  2. Select Azure role assignments from the left side tab.


  3. Select +Add role assignment from the navigation menu.


  4. Inside the Add role assignment page, perform the following tasks:
    • Select the Scope to Resource group.
    • Select your Azure Subscription.
    • Select the Resource group to use.
    • Select the Role to Contributor.










  5. Select Save.

Add Storage Blob Data Reader role assignment to Managed Identity​


  1. Type azure storage accounts in the search bar at the top of the portal page and select Storage accounts from the options that appear.









  2. Select the storage account that is associated with the Azure Machine Learning workspace. For example, finetunephistorage.


  3. Perform the following tasks to navigate to Add role assignment page:
    • Navigate to the Azure Storage account that you created.
    • Select Access Control (IAM) from the left side tab.
    • Select + Add from the navigation menu.
    • Select Add role assignment from the navigation menu.




  4. Inside the Add role assignment page, perform the following tasks:

    • Inside the Role page, type Storage Blob Data Reader in the search bar and select Storage Blob Data Reader from the options that appear.




    • Inside the Role page, select Next.


    • Inside the Members page, select Assign access to Managed identity.


    • Inside the Members page, select + Select members.


    • Inside Select managed identities page, select your Azure Subscription.


    • Inside Select managed identities page, set Managed identity to User-assigned managed identity.


    • Inside Select managed identities page, select the Managed Identity that you created. For example, finetunephi-managedidentity.


    • Inside Select managed identities page, select Select.




    • Select Review + assign.

Add AcrPull role assignment to Managed Identity​


  1. Type container registries in the search bar at the top of the portal page and select Container registries from the options that appear.








  2. Select the container registry that is associated with the Azure Machine Learning workspace. For example, finetunephicontainerregistries.


  3. Perform the following tasks to navigate to Add role assignment page:
    • Select Access Control (IAM) from the left side tab.
    • Select + Add from the navigation menu.
    • Select Add role assignment from the navigation menu.

  4. Inside the Add role assignment page, perform the following tasks:
    • Inside the Role page, type AcrPull in the search bar and select AcrPull from the options that appear.
    • Inside the Role page, select Next.
    • Inside the Members page, select Assign access to Managed identity.
    • Inside the Members page, select + Select members.
    • Inside Select managed identities page, select your Azure Subscription.
    • Inside Select managed identities page, set Managed identity to User-assigned managed identity.
    • Inside Select managed identities page, select the Managed Identity that you created. For example, finetunephi-managedidentity.
    • Inside Select managed identities page, select Select.
    • Select Review + assign.



Set up the project​


To download the datasets needed for fine-tuning, you will set up a local environment.

In this exercise, you will:

  • Create a folder to work in.
  • Create a virtual environment.
  • Install the required packages.
  • Create a download_dataset.py file to download the dataset.

Create a folder to work in​


  1. Open a terminal window and type the following command to create a folder named finetune-phi in the default path.

    Code:
    mkdir finetune-phi

  2. Type the following command inside your terminal to navigate to the finetune-phi folder you created.

    Code:
    cd finetune-phi

Create a virtual environment​


  1. Type the following command inside your terminal to create a virtual environment named .venv.

    Code:
    python -m venv .venv

  2. Type the following command inside your terminal to activate the virtual environment. On macOS or Linux, run source .venv/bin/activate instead.

    Code:
    .venv\Scripts\activate.bat
Note



If it worked, you should see (.venv) before the command prompt.

Install the required packages​


  1. Type the following command inside your terminal to install the required package.

    Code:
    pip install datasets==2.19.1

Create download_dataset.py

Note



Complete folder structure:

Code:
└── YourUserName
    └── finetune-phi
        └── download_dataset.py

  1. Open Visual Studio Code.


  2. Select File from the menu bar.


  3. Select Open Folder.


  4. Select the finetune-phi folder that you created, which is located at C:\Users\yourUserName\finetune-phi.







  5. In the left pane of Visual Studio Code, right-click and select New File to create a new file named download_dataset.py.










Prepare dataset for fine-tuning​


In this exercise, you will run the download_dataset.py file to download the ultrachat_200k dataset to your local environment. You will then use this dataset to fine-tune the Phi-3 model in Azure Machine Learning.

In this exercise, you will:

  • Add code to the download_dataset.py file to download the datasets.
  • Run the download_dataset.py file to download datasets to your local environment.

Download your dataset using download_dataset.py


  1. Open the download_dataset.py file in Visual Studio Code.


  2. Add the following code into download_dataset.py.


    Code:
    import json
    import os
    from datasets import load_dataset
    
    def load_and_split_dataset(dataset_name, config_name, split_ratio):
        """
        Load and split a dataset.
        """
        # Load the dataset with the specified name, configuration, and split ratio
        dataset = load_dataset(dataset_name, config_name, split=split_ratio)
        print(f"Original dataset size: {len(dataset)}")
        
        # Split the dataset into train and test sets (80% train, 20% test)
        split_dataset = dataset.train_test_split(test_size=0.2)
        print(f"Train dataset size: {len(split_dataset['train'])}")
        print(f"Test dataset size: {len(split_dataset['test'])}")
        
        return split_dataset
    
    def save_dataset_to_jsonl(dataset, filepath):
        """
        Save a dataset to a JSONL file.
        """
        # Create the directory if it does not exist
        os.makedirs(os.path.dirname(filepath), exist_ok=True)
        
        # Open the file in write mode
        with open(filepath, 'w', encoding='utf-8') as f:
            # Iterate over each record in the dataset
            for record in dataset:
                # Dump the record as a JSON object and write it to the file
                json.dump(record, f)
                # Write a newline character to separate records
                f.write('\n')
        
        print(f"Dataset saved to {filepath}")
    
    def main():
        """
        Main function to load, split, and save the dataset.
        """
        # Load and split the ULTRACHAT_200k dataset with a specific configuration and split ratio
        dataset = load_and_split_dataset("HuggingFaceH4/ultrachat_200k", 'default', 'train_sft[:1%]')
        
        # Extract the train and test datasets from the split
        train_dataset = dataset['train']
        test_dataset = dataset['test']
    
        # Save the train dataset to a JSONL file
        save_dataset_to_jsonl(train_dataset, "data/train_data.jsonl")
        
        # Save the test dataset to a separate JSONL file
        save_dataset_to_jsonl(test_dataset, "data/test_data.jsonl")
    
    if __name__ == "__main__":
        main()

  3. Type the following command inside your terminal to run the script and download the dataset to your local environment.

    Code:
    python download_dataset.py

  4. Verify that the datasets were saved successfully to your local finetune-phi/data directory.
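    As an optional programmatic check, the short script below confirms that each file exists and parses cleanly as JSONL. This is a minimal sketch: the count_jsonl_records helper is hypothetical (not part of the tutorial scripts), and the paths assume the data/ directory created by download_dataset.py.

    ```python
    import json
    from pathlib import Path

    def count_jsonl_records(filepath: str) -> int:
        """Count the records in a JSONL file, raising ValueError on a malformed line."""
        count = 0
        with open(filepath, encoding="utf-8") as f:
            for line in f:
                if line.strip():
                    json.loads(line)  # raises ValueError if the record is not valid JSON
                    count += 1
        return count

    # Report the record count for each expected file
    for name in ("data/train_data.jsonl", "data/test_data.jsonl"):
        if Path(name).exists():
            print(f"{name}: {count_jsonl_records(name)} records")
        else:
            print(f"{name}: not found")
    ```

    If either file reports "not found" or raises a parse error, re-run download_dataset.py before continuing.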
Note


Note on dataset size and fine-tuning time​


In this tutorial, you use only 1% of the dataset (split='train_sft[:1%]'). This significantly reduces the amount of data, speeding up both the upload and fine-tuning processes and making them more manageable for a tutorial. You can adjust the percentage to find the right balance between training time and model performance.
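The percentage lives in the Hugging Face split-slicing string passed to load_dataset. The helper below is a hypothetical convenience (not part of the tutorial scripts) showing how that string is built if you want to parameterize the fraction:

```python
def make_split(split_name: str, percent: int) -> str:
    """Build a Hugging Face datasets slicing string, e.g. 'train_sft[:1%]'."""
    if not 0 < percent <= 100:
        raise ValueError("percent must be in (0, 100]")
    return f"{split_name}[:{percent}%]"

print(make_split("train_sft", 1))   # train_sft[:1%]
print(make_split("train_sft", 10))  # train_sft[:10%]
```

You could pass the result as the split_ratio argument of load_and_split_dataset in download_dataset.py to experiment with larger fractions.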



Series 2: Fine-tune and Deploy the Phi-3 model in Azure ML Studio




Fine-tune the Phi-3 model​


In this exercise, you will fine-tune the Phi-3 model in Azure Machine Learning Studio.

In this exercise, you will:

  • Create a compute cluster for fine-tuning.
  • Fine-tune the Phi-3 model in Azure Machine Learning Studio.

Create compute cluster for fine-tuning​


  1. Visit Azure ML Studio.


  2. Select Compute from the left side tab.


  3. Select Compute clusters from the navigation menu.


  4. Select + New.







  5. Perform the following tasks:

    • Select the Region you'd like to use.


    • Select the Virtual machine tier to Dedicated.


    • Select the Virtual machine type to GPU.


    • Select the Virtual machine size filter to Select from all options.


    • Select the Virtual machine size to Standard_NC24ads_A100_v4.






  6. Select Next.


  7. Perform the following tasks:

    • Enter Compute name. It must be a unique value.


    • Select the Minimum number of nodes to 0.


    • Select the Maximum number of nodes to 1.


    • Select the Idle seconds before scale down to 120.






  8. Select Create.

Fine-tune the Phi-3 model​


  1. Visit Azure ML Studio.


  2. Select the Azure Machine Learning workspace that you created.







  3. Perform the following tasks:
    • Select Model catalog from the left side tab.
    • Type phi-3-mini-4k in the search bar and select Phi-3-mini-4k-instruct from the options that appear.







  4. Select Fine-tune from the navigation menu.







  5. Perform the following tasks:

    • Select Select task type to Chat completion.


    • Select + Select data to upload Training data.


    • Select the Validation data upload type to Provide different validation data.


    • Select + Select data to upload Validation data.








    Tip

    You can select Advanced settings to customize configurations such as learning_rate and lr_scheduler_type to optimize the fine-tuning process according to your specific needs.




  6. Select Finish.


  7. In this exercise, you successfully fine-tuned the Phi-3 model using Azure Machine Learning. Please note that the fine-tuning process can take a considerable amount of time. After running the fine-tuning job, you need to wait for it to complete. You can monitor the status of the fine-tuning job by navigating to the Jobs tab on the left side of your Azure Machine Learning Workspace. In the next series, you will deploy the fine-tuned model and integrate it with Prompt flow.




Deploy the fine-tuned model​


To integrate the fine-tuned Phi-3 model with Prompt flow, you need to deploy the model to make it accessible for real-time inference. This process involves registering the model, creating an online endpoint, and deploying the model.

In this exercise, you will:

  • Register the fine-tuned model in the Azure Machine Learning workspace.
  • Create an online endpoint.
  • Deploy the registered fine-tuned Phi-3 model.

Register the fine-tuned model​


  1. Visit Azure ML Studio.


  2. Select the Azure Machine Learning workspace that you created.







  3. Select Models from the left side tab.


  4. Select + Register.


  5. Select From a job output.







  6. Select the job that you created.







  7. Select Next.


  8. Select Model type to MLflow.


  9. Ensure that Job output is selected; it should be automatically selected.







  10. Select Next.


  11. Select Register.







  12. You can view your registered model by navigating to the Models menu from the left side tab.





Deploy the fine-tuned model​


  1. Navigate to the Azure Machine Learning workspace that you created.


  2. Select Endpoints from the left side tab.


  3. Select Real-time endpoints from the navigation menu.







  4. Select Create.


  5. Select the registered model that you created.







  6. Select Select.


  7. Perform the following tasks:

    • Select Virtual machine to Standard_NC6s_v3.


    • Select the Instance count you'd like to use. For example, 1.


    • Select the Endpoint to New to create an endpoint.


    • Enter Endpoint name. It must be a unique value.


    • Enter Deployment name. It must be a unique value.






  8. Select Deploy.
Warning



To avoid additional charges to your account, make sure to delete the created endpoint in the Azure Machine Learning workspace.

Check deployment status in Azure Machine Learning Workspace​


  1. Navigate to Azure Machine Learning workspace that you created.


  2. Select Endpoints from the left side tab.


  3. Select the endpoint that you created.










  4. On this page, you can manage the endpoints during the deployment process.
Note



Once the deployment is complete, ensure that Live traffic is set to 100%. If it is not, select Update traffic to adjust the traffic settings. Note that you cannot test the model if the traffic is set to 0%.







Series 3: Integrate the custom Phi-3 model with Prompt flow in Azure AI Studio




Integrate the custom Phi-3 model with Prompt flow​


After successfully deploying your fine-tuned model, you can now integrate it with Prompt Flow to use your model in real-time applications, enabling a variety of interactive tasks with your custom Phi-3 model.

In this exercise, you will:

  • Create Azure AI Studio Hub.
  • Create Azure AI Studio Project.
  • Create Prompt flow.
  • Add a custom connection for the fine-tuned Phi-3 model.
  • Set up Prompt flow to chat with your custom Phi-3 model.
Note



You can also perform this integration with Prompt flow in Azure ML Studio; the same process applies there.

Create Azure AI Studio Hub​


You need to create a Hub before creating the Project. A Hub acts like a Resource Group, allowing you to organize and manage multiple Projects within Azure AI Studio.


  1. Visit Azure AI Studio.


  2. Select All hubs from the left side tab.


  3. Select + New hub from the navigation menu.







  4. Perform the following tasks:
    • Enter Hub name. It must be a unique value.
    • Select your Azure Subscription.
    • Select the Resource group to use (create a new one if needed).
    • Select the Location you'd like to use.
    • Select the Connect Azure AI Services to use (create a new one if needed).
    • Select Connect Azure AI Search to Skip connecting.







  5. Select Next.

Create Azure AI Studio Project​


  1. In the Hub that you created, select All projects from the left side tab.


  2. Select + New project from the navigation menu.







  3. Enter Project name. It must be a unique value.







  4. Select Create a project.

Add a custom connection for the fine-tuned Phi-3 model​


To integrate your custom Phi-3 model with Prompt flow, you need to save the model's endpoint and key in a custom connection. This setup ensures access to your custom Phi-3 model in Prompt flow.

Set the API key and endpoint URI of the fine-tuned Phi-3 model


  1. Visit Azure ML Studio.


  2. Navigate to the Azure Machine Learning workspace that you created.


  3. Select Endpoints from the left side tab.







  4. Select the endpoint that you created.







  5. Select Consume from the navigation menu.


  6. Copy your REST endpoint and Primary key.
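    As an optional sanity check before creating the connection, you can assemble the request the deployed endpoint expects. The helper below is a hedged sketch: build_phi3_request is hypothetical (not part of the tutorial scripts), the key value is a placeholder, and the body mirrors the schema used later in integrate_with_promptflow.py.

    ```python
    def build_phi3_request(api_key: str, message: str):
        """Build the headers and JSON body for the deployed Phi-3 online endpoint."""
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # Primary key copied from the Consume tab
        }
        body = {
            "input_data": {
                "input_string": [
                    {"role": "user", "content": message}
                ],
                "parameters": {"temperature": 0.7, "max_new_tokens": 128},
            }
        }
        return headers, body

    headers, body = build_phi3_request("<your-primary-key>", "Who founded Microsoft?")
    print(body["input_data"]["input_string"][0]["content"])  # Who founded Microsoft?
    ```

    Once the placeholder is replaced with your real Primary key, you can send the request with requests.post(rest_endpoint, json=body, headers=headers) to confirm the endpoint responds before wiring it into Prompt flow.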





Add the Custom Connection


  1. Visit Azure AI Studio.


  2. Navigate to the Azure AI Studio project that you created.


  3. In the Project that you created, select Settings from the left side tab.


  4. Select + New connection.







  5. Select Custom keys from the navigation menu.







  6. Perform the following tasks:
    • Select + Add key value pairs.
    • For the key name, enter endpoint and paste the endpoint you copied from Azure ML Studio into the value field.
    • Select + Add key value pairs again.
    • For the key name, enter key and paste the key you copied from Azure ML Studio into the value field.
    • After adding the keys, select is secret to prevent the key from being exposed.





  7. Select Add connection.



Create Prompt flow​


You have added a custom connection in Azure AI Studio. Now, let's create a Prompt flow using the following steps. Then, you will connect this Prompt flow to the custom connection so that you can use the fine-tuned model within the Prompt flow.



  1. Navigate to the Azure AI Studio project that you created.


  2. Select Prompt flow from the left side tab.


  3. Select + Create from the navigation menu.







  4. Select Chat flow from the navigation menu.







  5. Enter Folder name to use.







  6. Select Create.

Set up Prompt flow to chat with your custom Phi-3 model​

You need to integrate the fine-tuned Phi-3 model into a Prompt flow. However, the existing Prompt flow provided is not designed for this purpose. Therefore, you must redesign the Prompt flow to enable the integration of the custom model.


  1. In the Prompt flow, perform the following tasks to rebuild the existing flow:

    • Select Raw file mode.


    • Delete all existing code in the flow.dag.yaml file.


    • Add the following code to the flow.dag.yaml file.

      Code:
      inputs:
        input_data:
          type: string
          default: "Who founded Microsoft?"
      
      outputs:
        answer:
          type: string
          reference: ${integrate_with_promptflow.output}
      
      nodes:
      - name: integrate_with_promptflow
        type: python
        source:
          type: code
          path: integrate_with_promptflow.py
        inputs:
          input_data: ${inputs.input_data}

    • Select Save.












  2. Add the following code to the integrate_with_promptflow.py file to use the custom Phi-3 model in Prompt flow.


    Code:
    import logging
    import requests
    from promptflow import tool
    from promptflow.connections import CustomConnection
    
    # Logging setup
    logging.basicConfig(
        format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S",
        level=logging.DEBUG
    )
    logger = logging.getLogger(__name__)
    
    def query_phi3_model(input_data: str, connection: CustomConnection) -> str:
        """
        Send a request to the Phi-3 model endpoint with the given input data using Custom Connection.
        """
    
        # "connection" is the name of the Custom Connection, "endpoint", "key" are the keys in the Custom Connection
        endpoint_url = connection.endpoint
        api_key = connection.key
    
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}"
        }
        data = {
            "input_data": {
                "input_string": [
                    {"role": "user", "content": input_data}
                ],
                "parameters": {
                    "temperature": 0.7,
                    "max_new_tokens": 128
                }
            }
        }
        try:
            response = requests.post(endpoint_url, json=data, headers=headers)
            response.raise_for_status()
            
            # Log the full JSON response
            logger.debug(f"Full JSON response: {response.json()}")
    
            result = response.json()["output"]
            logger.info("Successfully received response from Azure ML Endpoint.")
            return result
        except requests.exceptions.RequestException as e:
            logger.error(f"Error querying Azure ML Endpoint: {e}")
            raise
    
    @tool
    def my_python_tool(input_data: str, connection: CustomConnection) -> str:
        """
        Tool function to process input data and query the Phi-3 model.
        """
        return query_phi3_model(input_data, connection)





Note



For more detailed information on using Prompt flow in Azure AI Studio, you can refer to Prompt flow in Azure AI Studio.

  1. Select Chat input and Chat output to enable chat with your model.







  2. Now you are ready to chat with your custom Phi-3 model. In the next exercise, you will learn how to start Prompt flow and use it to chat with your fine-tuned Phi-3 model.
Note



The rebuilt flow should look like the image below:







Chat with your custom Phi-3 model​


Now that you have fine-tuned and integrated your custom Phi-3 model with Prompt flow, you are ready to start interacting with it. This exercise will guide you through the process of setting up and initiating a chat with your model using Prompt flow. By following these steps, you will be able to fully utilize the capabilities of your fine-tuned Phi-3 model for various tasks and conversations.

  • Chat with your custom Phi-3 model using Prompt flow.

Start Prompt flow​


  1. Select Start compute session to start Prompt flow.







  2. Select Validate and parse input to update the parameters.







  3. Set the connection Value to the custom connection you created. For example, connection.





Chat with your custom Phi-3 model​


  1. Select Chat.










  2. Now you can chat with your custom Phi-3 model. It is recommended to ask questions based on the data used for fine-tuning. Here's an example of the results:





Congratulations!​

You've completed this tutorial​


Congratulations! You have successfully completed the tutorial on fine-tuning and integrating custom Phi-3 models with Prompt flow in Azure AI Studio. This tutorial introduced the process of fine-tuning, deploying, and integrating the custom Phi-3 model with Prompt flow using Azure ML Studio and Azure AI Studio.






Clean Up Azure Resources​


Clean up your Azure resources to avoid additional charges to your account. Go to the Azure portal and delete the following resources:

  • The Azure Machine Learning resource.
  • The Azure Machine Learning model endpoint.
  • The Azure AI Studio Project resource.
  • The Azure AI Studio Prompt flow resource.

Next Steps​




Documentation​

Training Content​

Reference​

