The Prompt Playground will automatically be available through a Step if a generation object is passed to that step.

Chainlit integrations will take care of this for you if the framework you are using supports it.

For this example, we are going to create a completion with the OpenAI chat API and build a ChatGeneration reflecting the API call made to OpenAI.

Chainlit supports other LLM providers and lets you implement custom ones. Learn more here.

from openai import AsyncOpenAI

from chainlit.playground.providers import ChatOpenAI
import chainlit as cl

client = AsyncOpenAI()

template = "Hello, {name}!"
variables = {"name": "John"}

settings = {
    "model": "gpt-3.5-turbo",
    "temperature": 0,
    # ... more settings
}


@cl.step(type="llm")
async def call_llm():
    generation = cl.ChatGeneration(
        provider=ChatOpenAI.id,
        variables=variables,
        settings=settings,
        messages=[
            {
                "content": template.format(**variables),
                "role": "user",
            },
        ],
    )

    # Make the call to OpenAI
    response = await client.chat.completions.create(
        messages=generation.messages, **settings
    )

    generation.message_completion = {
        "content": response.choices[0].message.content,
        "role": "assistant",
    }

    # Add the generation to the current step
    cl.context.current_step.generation = generation

    return generation.message_completion["content"]


@cl.on_chat_start
async def start():
    await call_llm()
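The prompt sent to the model is produced by plain str.format substitution of the variables into the template, exactly as in the generation's message above. A minimal standalone illustration of that rendering step:

```python
template = "Hello, {name}!"
variables = {"name": "John"}

# Unpack the variables dict as keyword arguments to str.format,
# matching how the example builds the message content
prompt = template.format(**variables)
print(prompt)  # Hello, John!
```

Because the template and variables are stored separately on the ChatGeneration, the Prompt Playground can let you edit either one and re-render the prompt.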


Once the step is sent, a new button will appear next to it. Clicking that button opens the Prompt Playground in the context of that step.

Open the Prompt Playground