Literal AI - LLMOps
Enterprise
If your organization can’t use third-party cloud services for data hosting, we can provide your company with a self-hostable Literal AI Docker image (under a commercial license).
Please fill out this form for enterprise support.
Define your Literal AI Server
Once you are hosting your own Literal AI instance, you can point to the server for data persistence.
You will need to set the LITERAL_API_URL environment variable. Modify the .env file next to your Chainlit application:
.env
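A minimal sketch of that .env file, assuming your self-hosted instance is reachable at https://literalai.yourcompany.com (a placeholder URL):

```shell
# Point Chainlit's data persistence at your self-hosted Literal AI server
LITERAL_API_URL=https://literalai.yourcompany.com
```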
Alternatively, inlined:
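For example, the variable can be passed inline when starting the application (the app.py filename below is an assumption):

```shell
# Set the server URL only for this run of the Chainlit app
LITERAL_API_URL=https://literalai.yourcompany.com chainlit run app.py
```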
Activating Data Persistence
Even when using your own Literal AI instance, you still need to provide a valid API key to persist data, as described here.
Once activated, your chats and elements will be stored on your own server.
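As an illustration, a complete .env might combine the API key with the server URL; this assumes the standard LITERAL_API_KEY variable from the data persistence setup, and both values below are placeholders:

```shell
# API key generated from your self-hosted Literal AI instance
LITERAL_API_KEY=your-api-key
# URL of your self-hosted Literal AI server
LITERAL_API_URL=https://literalai.yourcompany.com
```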