Kyle_Raymond
The Semantic Kernel is a great tool that helps developers quickly integrate their codebases with OpenAI or Azure OpenAI models. The SK team has provided wonderful samples in C# and Python using functions, web apps, and notebooks.
Semantic Kernel Overview
Semantic Kernel Repo
After working through the samples, I thought it might be fun to create my own endpoint using the minimal API in C#. The nice thing about working with the minimal API is that a developer can quickly stand up an endpoint for a POC or MVP.
Let's get started!
Prerequisites
- An OpenAI account
- An API key from the OpenAI portal
- An Azure subscription
- Azure OpenAI subscription. Access to the Azure OpenAI portal is by application only. Apply for access with this form.
- Deployed model to reference in Azure OpenAI Studio
Usage
Code for this post can be found here.
When using the minimal API, endpoints can be defined directly in the Program.cs file. In this code example, there is just one endpoint, to which a user can POST information.
app.MapPost("plugins/{pluginName}/invoke/{functionName}", async (HttpContext context, Query query, string pluginName, string functionName) =>
The path specifies a "pluginName" and a "functionName" to execute.
When calling this endpoint, the path would end with "/plugins/FunPlugin/invoke/Joke". In the repo, FunPlugin is the directory that contains the Joke plugin.
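The Query parameter is bound from the JSON request body. The DTO definitions aren't shown in this post, so here is a minimal sketch of how they might look — the property names are assumptions, inferred from how query.Value and response.Value are used in the endpoint code below:

```csharp
// Hypothetical DTO shapes (not shown in the excerpt above) — the single
// "Value" property matches how query.Value and response.Value are used
// in the endpoint code later in this post.
public class Query
{
    public string Value { get; set; } = string.Empty;
}

public class SKResponse
{
    public string Value { get; set; } = string.Empty;
}
```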
So... what are plugins?
Plugins are interoperable, OpenAI-standards-based, encapsulated AI capabilities.
Clear?
How about this: a plugin is a simple implementation of an AI task that can be shared with other developers or, in the future, with different copilots.
Within the Semantic Kernel repo are several examples of plugins (note: plugins used to be called skills). These plugins are also known as semantic functions.
Here is an example from the Joke plugin (skprompt.txt):
WRITE EXACTLY ONE JOKE or HUMOROUS STORY ABOUT THE TOPIC BELOW
JOKE MUST BE:
- G RATED
- WORKPLACE/FAMILY SAFE
NO SEXISM, RACISM OR OTHER BIAS/BIGOTRY
BE CREATIVE AND FUNNY. I WANT TO LAUGH.
+++++
{{$input}}
+++++
The skprompt.txt file is a simple text file defining the natural language prompt that will be sent to the AI service. The other file in the Joke folder is config.json, which provides configuration information and descriptions that can be used by the planner.
{
"schema": 1,
"description": "Generate a funny joke",
"type": "completion",
"completion": {
"max_tokens": 1000,
"temperature": 0.9,
"top_p": 0.0,
"presence_penalty": 0.0,
"frequency_penalty": 0.0
},
"input": {
"parameters": [
{
"name": "input",
"description": "Joke subject",
"defaultValue": ""
}
]
}
}
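Putting the two files together, the plugin folder layout the endpoint expects might look like this — FunPlugin and Joke come from the example route above, and any other plugin/function pair would follow the same pattern:

```
Plugins/
└── FunPlugin/
    └── Joke/
        ├── skprompt.txt   (the natural language prompt)
        └── config.json    (completion settings and parameter descriptions)
```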
Getting back to the code, the next step in the endpoint is to read the header values passed in the request. These carry the Azure OpenAI or OpenAI details needed to execute the query. Below is an example with Azure OpenAI.
var headers = context.Request.Headers;
var model = headers["x-sk-web-app-model"];
var endpoint = headers["x-sk-web-app-endpoint"];
var key = headers["x-sk-web-app-key"];
Then we instantiate the kernel using the AzureTextCompletionService.
var kernel = new KernelBuilder()
.WithAzureTextCompletionService(model!, endpoint!, key!)
.Build();
Next, set the plugin directory and import the specified plugin to execute.
var pluginDirectory = "Plugins";
var plugInFunctions = kernel!.ImportSemanticSkillFromDirectory(pluginDirectory, pluginName);
And finally, invoke the Semantic Kernel function and return the result.
var result = await plugInFunctions[functionName].InvokeAsync(query.Value);
SKResponse response = new SKResponse();
response.Value = result.Result.Trim();
return Results.Json(response);
To test your endpoint, start the application (Start from your IDE or "dotnet run" from the terminal) and the service should be ready.
If you are using Postman, ask the service for a joke.
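For example, a raw request to a locally running instance might look like the following sketch. The localhost port and the topic in the body are illustrative, and the three x-sk-web-app-* headers match the ones read in the endpoint code above:

```
POST https://localhost:5001/plugins/FunPlugin/invoke/Joke
Content-Type: application/json
x-sk-web-app-model: <your-deployment-name>
x-sk-web-app-endpoint: https://<your-resource>.openai.azure.com/
x-sk-web-app-key: <your-api-key>

{ "value": "a bear driving for Uber" }
```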
And the result...
{
"value":"A bear was looking for a job and decided to become an Uber driver. He was a great driver and always got five-star reviews from his passengers. One day, a passenger asked him why he was so good at his job. The bear replied, \"It's simple, I just follow the bear necessities of life!\""
}