Prerequisites
Before getting started, make sure you have the following:

- A working installation of Chainlit
- The `semantic-kernel` package installed
- An LLM API key (e.g., OpenAI, Azure OpenAI) configured for Semantic Kernel
- A basic understanding of Python programming and Semantic Kernel concepts (Kernel, Plugins, Functions)
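If either package is missing, both are available from PyPI:

```shell
pip install chainlit semantic-kernel
```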
Step 1: Create a Python file
Create a new Python file named `app.py` in your project directory. This file will contain the main logic for your LLM application using Semantic Kernel.
Step 2: Write the Application Logic
In `app.py`, import the necessary packages, set up your Semantic Kernel `Kernel`, add the `SemanticKernelFilter` for Chainlit integration, and define functions to handle chat sessions and incoming messages.
Here’s an example demonstrating how to set up the kernel and use the filter:
app.py
Step 3: Run the Application
To start your app, open a terminal, navigate to the directory containing `app.py`, and run the following command:
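Using the Chainlit CLI:

```shell
chainlit run app.py -w
```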
The `-w` flag tells Chainlit to enable auto-reloading, so you don't need to restart the server every time you make changes to your application. Your chatbot UI should now be accessible at http://localhost:8000. Interact with the bot, and if you ask for the weather (and the LLM uses the tool), you should see a "Weather-get_weather" step appear in the UI.