LlamaIndex Callback Handler
A callback handler that enables Chainlit to display LlamaIndex's intermediate steps (retrieval, LLM calls, and so on) in the UI.
Usage
Code Example
```python
from llama_index.core.callbacks import CallbackManager
from llama_index.core.service_context import ServiceContext

import chainlit as cl


@cl.on_chat_start
async def start():
    service_context = ServiceContext.from_defaults(
        callback_manager=CallbackManager([cl.LlamaIndexCallbackHandler()])
    )
    # use the service context to create the predictor
```
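To see why registering the handler with a `CallbackManager` is all that is needed, here is a minimal sketch of the callback pattern involved: a manager fans events out to every registered handler, which is how `cl.LlamaIndexCallbackHandler` receives each intermediate step and surfaces it in the Chainlit UI. All class and method names below are illustrative, not the real LlamaIndex API.

```python
class StepRecordingHandler:
    """Stands in for a UI handler; collects (event_type, payload) pairs."""

    def __init__(self):
        self.events = []

    def on_event(self, event_type, payload):
        self.events.append((event_type, payload))


class SimpleCallbackManager:
    """Dispatches every emitted event to all registered handlers."""

    def __init__(self, handlers):
        self.handlers = list(handlers)

    def emit(self, event_type, payload):
        for handler in self.handlers:
            handler.on_event(event_type, payload)


handler = StepRecordingHandler()
manager = SimpleCallbackManager([handler])

# A query would typically produce a retrieval step followed by an LLM call.
manager.emit("retrieve", {"nodes": 3})
manager.emit("llm", {"tokens": 42})

print(handler.events)
# → [('retrieve', {'nodes': 3}), ('llm', {'tokens': 42})]
```

Because the manager is the only component that knows about handlers, swapping in the Chainlit handler (or adding it alongside others) changes what is displayed without touching the query code.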