Example: Reply to a user message
Let's create a simple assistant that replies to a user message with a greeting. To learn more, see the Message API documentation.
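A minimal sketch of such an assistant, assuming a standard Chainlit app started with chainlit run app.py (the handler only runs inside a Chainlit server, so it is shown here without output):

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # Reply to every incoming user message with a greeting
    # that echoes back what the user said.
    await cl.Message(
        content=f"Hello! You said: {message.content}"
    ).send()
```

The @cl.on_message decorator registers the function to run once per incoming user message, and cl.Message(...).send() streams the reply back to the UI.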
Chat Context
Since LLMs are stateless, you often have to accumulate the messages of the current conversation in a list and provide the full context to the LLM with each query. You could do that manually with the user_session. However, Chainlit provides a built-in way to do this: cl.chat_context.
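To make the manual approach concrete, here is a framework-free sketch of what that bookkeeping looks like: a plain list of messages in the OpenAI chat format, appended to on every turn. This is exactly the work cl.chat_context does for you.

```python
# Keep the whole conversation in a plain list of
# OpenAI-format message dicts ({"role": ..., "content": ...}).
conversation = []

def add_user_message(text):
    """Append a user turn in OpenAI chat format."""
    conversation.append({"role": "user", "content": text})

def add_assistant_message(text):
    """Append an assistant turn in OpenAI chat format."""
    conversation.append({"role": "assistant", "content": text})

# Simulate a short exchange.
add_user_message("Hello!")
add_assistant_message("Hi! How can I help?")
add_user_message("What is Chainlit?")

# On each query, the full list is what you would send to the LLM,
# so it sees the entire conversation, not just the last message.
print(len(conversation))  # → 3
```

With Chainlit, you would have to store this list in the user_session and update it in every handler; cl.chat_context removes that boilerplate.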
You can then use cl.chat_context.to_openai() to get the conversation in the OpenAI format and feed it to the LLM.
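Putting it together, a handler might look like the following sketch. It assumes the official openai package and an OPENAI_API_KEY in the environment; the model name is an illustrative choice, not something mandated by Chainlit:

```python
import chainlit as cl
from openai import AsyncOpenAI  # assumes the official openai package is installed

client = AsyncOpenAI()

@cl.on_message
async def main(message: cl.Message):
    # cl.chat_context.to_openai() returns the accumulated conversation
    # as a list of {"role": ..., "content": ...} dicts, ready to pass
    # straight to the OpenAI chat completions API.
    response = await client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=cl.chat_context.to_openai(),
    )
    await cl.Message(content=response.choices[0].message.content).send()
```

Because the full context is sent on every call, the model can answer follow-up questions that refer back to earlier turns in the conversation.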