A Message is a piece of information sent from the user to the assistant and vice versa. Coupled with life cycle hooks, messages are the building blocks of a chat.

A message has content and a timestamp, and cannot be nested.

Example: Reply to a user message

Let's create a simple assistant that replies to a user message with a greeting.

import chainlit as cl

@cl.on_message
async def on_message(message: cl.Message):
    # Build a greeting that echoes the user's message back
    response = f"Hello, you just sent: {message.content}!"
    await cl.Message(content=response).send()

Message API

Learn more about the Message API.
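
Beyond content, a Message accepts other parameters, such as author, which sets the name displayed for the message in the UI. A small illustration (see the API reference for the full parameter list):

import chainlit as cl

@cl.on_chat_start
async def on_chat_start():
    # Greet the user under a custom author name
    await cl.Message(content="Welcome! Ask me anything.", author="Greeter").send()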

Chat Context

Since LLMs are stateless, you will often have to accumulate the messages of the current conversation in a list to provide the full context to the LLM with each query.

You could do this manually with the user_session, as sketched below.
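
A rough sketch of that manual approach, assuming you store the history as a plain list of OpenAI-style dicts under a key of your choosing (here "history"):

import chainlit as cl

@cl.on_chat_start
async def on_chat_start():
    # Start each conversation with an empty history
    cl.user_session.set("history", [])

@cl.on_message
async def on_message(message: cl.Message):
    # Append the incoming user message to the history
    history = cl.user_session.get("history")
    history.append({"role": "user", "content": message.content})

    # ... build the reply (e.g. by calling an LLM with `history`) ...
    response = f"Hello, you just sent: {message.content}!"

    # Record the assistant reply and send it
    history.append({"role": "assistant", "content": response})
    await cl.Message(content=response).send()

This works, but Chainlit provides a built-in way to do this: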

chat_context
import chainlit as cl

@cl.on_message
async def on_message(message: cl.Message):
    # Get all the messages in the conversation in the OpenAI format
    print(cl.chat_context.to_openai())

    # Send the response
    response = f"Hello, you just sent: {message.content}!"
    await cl.Message(content=response).send()

Every message sent or received will be automatically accumulated in cl.chat_context. You can then use cl.chat_context.to_openai() to get the conversation in the OpenAI format and feed it to the LLM.
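
For example, here is a sketch of feeding that context to a chat model using the openai package (the package, client setup, and model name below are illustrative assumptions, not part of Chainlit):

import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI()  # assumes OPENAI_API_KEY is set in the environment

@cl.on_message
async def on_message(message: cl.Message):
    # chat_context already includes the incoming user message
    completion = await client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=cl.chat_context.to_openai(),
    )

    # Sending the reply also adds it to the chat context automatically
    await cl.Message(content=completion.choices[0].message.content).send()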