bethanyjep
How to build a social media assistant with Prompty and PromptFlow
Large Language Models are trained on data from across the internet and have billions of parameters. To generate an output from an LLM, you need a prompt. For day-to-day tasks, for example generating a LinkedIn post from a blog post, you may need additional instructions for your LLM, such as the word count, the tone of the message, the format, and the call to action at the end. Using Prompty, you can standardize your prompt and its execution in a single asset.
Why Prompty?
Prompty brings the focus of building LLM apps to prompts, giving you visibility and allowing you to easily integrate with other orchestrators and tools. It goes beyond just calling the API, adding configuration in markdown format to create and manage your prompts. Prompty comprises the specification, tooling, and runtime:
- Specification: Prompty is an asset class not tied to any language, as it uses markdown format with YAML to specify your metadata. It unifies the prompt and its execution in a single package so you can get started quickly.
- Tooling: using the Prompty extension, you can quickly get started in your coding environment and set up your configuration. It lets you work with prompts at a high level, with additional features such as metadata autocompletion, syntax highlighting, quick run, and validation.
- Runtime: Prompty also generates code for you in different frameworks, e.g. LangChain, PromptFlow, and Semantic Kernel. The asset class easily converts to code, simplifying your workflow, as it is language agnostic.
Prerequisites and Authentication
To get started, you need to create a new Azure OpenAI resource in the Azure portal.
Once you create the resource, head over to Azure OpenAI Studio and create a gpt-35-turbo model deployment.
You can authenticate your Prompty using a .env file, or using managed identity, where you sign in to your account with az login and follow the instructions to set your subscription as the active subscription if you have multiple tenants/subscriptions.
The first, and easiest, way to get started is by using environment variables. In your local environment, create a .env file where you store your keys and endpoints. Your .env file should look something like this:
Code:
AZURE_OPENAI_DEPLOYMENT_NAME = ""
AZURE_OPENAI_API_KEY = ""
AZURE_OPENAI_API_VERSION = ""
AZURE_OPENAI_ENDPOINT = ""
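If you are not using a library such as python-dotenv, a .env file like the one above can be loaded with a few lines of standard-library Python. This is a simplified sketch (the function name is illustrative, and real .env parsers also handle comments, escapes, and multi-line values):

```python
import os

def load_env_file(path: str) -> None:
    """Minimal .env loader: reads KEY = "value" lines into os.environ.

    Simplified sketch -- skips blank lines, comments, and lines
    without an equals sign; does not handle escapes or multi-line values.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip whitespace and surrounding quotes from the value.
            os.environ[key.strip()] = value.strip().strip('"').strip("'")
```

With the variables in os.environ, the ${env:...} references in your Prompty metadata can resolve against them at run time.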
Once done, you can then add your environment variables to your Prompty metadata as follows:
Code:
configuration:
type: azure_openai
azure_deployment: ${env:AZURE_OPENAI_DEPLOYMENT_NAME}
api_version: ${env:AZURE_OPENAI_API_VERSION}
azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
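Under the hood, the runtime resolves each ${env:NAME} reference against the environment before calling the model. A rough sketch of that substitution step (resolve_env_refs is a hypothetical helper for illustration, not part of the Prompty API):

```python
import os
import re

def resolve_env_refs(text: str) -> str:
    """Replace ${env:NAME} placeholders with values from os.environ.

    Hypothetical helper showing how references such as
    ${env:AZURE_OPENAI_ENDPOINT} get resolved; missing variables
    become empty strings here, though a real runtime would raise.
    """
    return re.sub(
        r"\$\{env:([A-Za-z0-9_]+)\}",
        lambda m: os.environ.get(m.group(1), ""),
        text,
    )
```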
You can also log in to your Azure account using azd auth login --use-device-code. This allows you to define your model configurations and update your settings.json file, letting you access your configurations without using environment variables.
Getting Started with Prompty
- Install the Prompty extension
- Right-click on the VSCode explorer and select "New Prompty." This will create a new Prompty file in markdown format.
- Configure your application metadata; in this case we will update it as follows:
Code:
name: SociallyPrompt
description: An AI assistant designed to help you create engaging Twitter and LinkedIn posts
authors:
  - <your name>
model:
  configuration:
    type: azure_openai
    azure_deployment: gpt-35-turbo
  parameters:
    max_tokens: 3000
    temperature: 0.9
sample:
  blog_title: LLM based development tools - PromptFlow vs LangChain vs Semantic Kernel
  blog_link: "LLM based development tools: PromptFlow vs LangChain vs Semantic Kernel"
  call_to_action: GitHub Sample Code at GitHub - BethanyJep/Swahili-Tutor
  post_type: Twitter
- Update your application instructions, use examples where necessary, for example, below is a sample instruction for our social media assistant:
Code:
system:
You are a Social Media AI assistant who helps people create engaging content for Twitter. As the assistant,
you keep the tweets concise - remember that 280-character limit! Inject enthusiasm into your posts!
Use relevant hashtags to boost visibility for the posts. And have a call to action or even add some personal flair with appropriate emojis.
# Context
I am an AI assistant designed to help you create engaging Twitter and LinkedIn posts. I can help you create posts for a variety of topics, including technology, education, and more. I can also help you create posts for blog posts, articles, and other content you want to share on social media.
user:
Write a {{post_type}} post for the blog post titled "{{blog_title}}" found at {{blog_link}} with the call to action "{{call_to_action}}".
- Once done, click the run button to run your Prompty in the terminal
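When the Prompty runs, the sample values from the metadata are substituted into the {{...}} placeholders before the prompt is sent to the model. A simplified stand-in for that templating step (render_prompt is illustrative; the real runtime uses a full Jinja2 engine with filters and control flow):

```python
import re

def render_prompt(template: str, inputs: dict) -> str:
    """Substitute {{name}} placeholders with input values.

    Simplified stand-in for Jinja2 templating: unknown placeholders
    are left untouched rather than raising an error.
    """
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(inputs.get(m.group(1), m.group(0))),
        template,
    )
```

For example, rendering the user message above with the sample inputs turns {{post_type}} into Twitter and {{blog_title}} into the blog's title before the model ever sees the prompt.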
Extending functionality with PromptFlow
Adding PromptFlow or LangChain into your code is just one click away. In VSCode, right-click on your Prompty file and select "add Prompt flow code". This generates PromptFlow code that you can use to load and run your Prompty.
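The generated code boils down to loading the .prompty file and calling it with your inputs. A rough sketch of that pattern, assuming the promptflow package is installed and your Azure OpenAI environment variables are set (the function name, file name, and inputs below are illustrative, not the exact generated code):

```python
def run_social_post(prompty_path: str, **inputs) -> str:
    """Load a .prompty file and execute it with the given inputs.

    Requires the promptflow package and a configured Azure OpenAI
    resource; the import is deferred so this sketch can be defined
    without either being present.
    """
    from promptflow.core import Prompty  # pip install promptflow

    flow = Prompty.load(source=prompty_path)
    return flow(**inputs)

# Illustrative call -- file name and input values are examples:
# result = run_social_post(
#     "socially_prompt.prompty",
#     blog_title="LLM based development tools",
#     blog_link="https://example.com/post",
#     call_to_action="Check out the sample code on GitHub",
#     post_type="Twitter",
# )
```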
Tracing your Flow
Tracing allows you to understand your LLM's behaviour. To trace our application, we import start_trace from the PromptFlow tracing library using from promptflow.tracing import start_trace, then add a start_trace() call to the PromptFlow code. The output is a URL to a trace UI. From the trace we can see the time taken for our Prompty to run as well as the number of tokens used, and understand our LLM's performance better.
Resources
Continue learning and building with Prompty using the following resources:
- Our Social Media Assistant Prompty: BethanyJep/social-media-assistant-prompty (github.com)
- Microsoft Build: Practical End-to-End AI Development using Prompty and AI Studio (microsoft.com)
- Prompty samples: Collections | Microsoft Learn
- Prompty documentation: prompty.ai