Semantic Kernel
In this tutorial, we’ll walk through the steps to create a Chainlit application integrated with Microsoft’s Semantic Kernel. The integration automatically visualizes Semantic Kernel function calls (like plugins or tools) as Steps in the Chainlit UI.
Prerequisites
Before getting started, make sure you have the following:
- A working installation of Chainlit
- The `semantic-kernel` package installed
- An LLM API key (e.g., OpenAI, Azure OpenAI) configured for Semantic Kernel
- Basic understanding of Python programming and Semantic Kernel concepts (Kernel, Plugins, Functions)
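If you still need to install the packages, both are available from PyPI (pin versions as appropriate for your project):

```shell
pip install chainlit semantic-kernel
```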
Step 1: Create a Python file
Create a new Python file named `app.py` in your project directory. This file will contain the main logic for your LLM application using Semantic Kernel.
Step 2: Write the Application Logic
In `app.py`, import the necessary packages, set up your Semantic Kernel `Kernel`, add the `SemanticKernelFilter` for Chainlit integration, and define functions to handle chat sessions and incoming messages.
Here’s an example demonstrating how to set up the kernel and use the filter:
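A minimal sketch of `app.py` is shown below. The `gpt-4o-mini` model id and the `WeatherPlugin` are illustrative, and exact Semantic Kernel class and method names may vary by package version:

```python
import chainlit as cl
import semantic_kernel as sk
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function


class WeatherPlugin:
    """A toy plugin; the LLM can invoke get_weather as a tool."""

    @kernel_function(name="get_weather", description="Gets the weather for a city.")
    def get_weather(self, city: str) -> str:
        # Hardcoded for demonstration; a real plugin would call a weather API.
        return f"It is sunny and 22°C in {city}."


@cl.on_chat_start
async def on_chat_start():
    kernel = sk.Kernel()
    ai_service = OpenAIChatCompletion(ai_model_id="gpt-4o-mini")  # model id is illustrative
    kernel.add_service(ai_service)

    # The filter hooks Semantic Kernel function invocations into Chainlit Steps.
    sk_filter = cl.SemanticKernelFilter(kernel=kernel)

    # Registered as "Weather", so calls surface as a "Weather-get_weather" step in the UI.
    kernel.add_plugin(WeatherPlugin(), plugin_name="Weather")

    cl.user_session.set("kernel", kernel)
    cl.user_session.set("ai_service", ai_service)
    cl.user_session.set("chat_history", ChatHistory())


@cl.on_message
async def on_message(message: cl.Message):
    kernel = cl.user_session.get("kernel")
    ai_service = cl.user_session.get("ai_service")
    chat_history = cl.user_session.get("chat_history")

    chat_history.add_user_message(message.content)
    answer = cl.Message(content="")

    # Let the model decide when to call the registered functions (tools).
    settings = OpenAIChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()
    )

    async for chunk in ai_service.get_streaming_chat_message_content(
        chat_history=chat_history, settings=settings, kernel=kernel
    ):
        if chunk and chunk.content:
            await answer.stream_token(chunk.content)

    chat_history.add_assistant_message(answer.content)
    await answer.send()
```

Storing the kernel and chat history in `cl.user_session` keeps each chat session's state isolated, which matters once multiple users connect to the same app.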
Step 3: Run the Application
To start your app, open a terminal, navigate to the directory containing `app.py`, and run the following command:
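For example:

```shell
chainlit run app.py -w
```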
The `-w` flag enables auto-reloading, so you don’t need to restart the server every time you change your application. Your chatbot UI should now be accessible at http://localhost:8000. Interact with the bot; if you ask for the weather (and the LLM invokes the tool), you should see a “Weather-get_weather” step appear in the UI.