Make sure you have the `semantic-kernel` package installed. Then create a file named `app.py` in your project directory. This file will contain the main logic for your LLM application using Semantic Kernel.
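If either package is missing, both can be installed from PyPI (package names assumed to be the standard `chainlit` and `semantic-kernel` distributions):

```shell
pip install chainlit semantic-kernel
```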
In `app.py`, import the necessary packages, set up your Semantic Kernel `Kernel`, add the `SemanticKernelFilter` for Chainlit integration, and define functions to handle chat sessions and incoming messages.
Here’s an example demonstrating how to set up the kernel and use the filter:
Open a terminal in the directory containing `app.py`. Then run the following command:
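With a standard Chainlit installation, the app is started through the Chainlit CLI:

```shell
chainlit run app.py -w
```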
The `-w` flag tells Chainlit to enable auto-reloading, so you don’t need to restart the server every time you make changes to your application. Your chatbot UI should now be accessible at http://localhost:8000. Interact with the bot; if you ask for the weather (and the LLM decides to use the tool), you should see a “Weather-get_weather” step appear in the UI.