In this tutorial, we will guide you through the steps to create a Chainlit application integrated with LiteLLM Proxy. The benefits of using LiteLLM Proxy with Chainlit are:
- You can call 100+ LLMs in the OpenAI API format
- Use Virtual Keys to set budget limits and track usage
- See LLM API calls as steps in the UI, and explore them in the prompt playground
Prerequisites
Before getting started, make sure you have the following:
- A working installation of Chainlit
- The OpenAI package installed
- LiteLLM Proxy Running
- A LiteLLM Proxy API Key
- Basic understanding of Python programming
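If you still need the first two prerequisites, both Chainlit and the OpenAI client library can be installed with pip (assuming a working Python environment):

```shell
pip install chainlit openai
```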
Step 1: Create a Python file
Create a new Python file named `app.py` in your project directory. This file will contain the main logic for your LLM application.
Step 2: Write the Application Logic
In `app.py`, import the necessary packages and define a function to handle incoming messages from the UI.
Step 3: Run the Application
To start your app, open a terminal and navigate to the directory containing `app.py`. Then run the following command:
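```shell
chainlit run app.py -w
```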
The `-w` flag tells Chainlit to enable auto-reloading, so you don't need to restart the server every time you make changes to your application. Your chatbot UI should now be accessible at http://localhost:8000.