Documentation Index
Fetch the complete documentation index at: https://docs.litellm-agent-platform.ai/llms.txt
Use this file to discover all available pages before exploring further.
What sandbox technology does this use?
Sandboxes are Kubernetes pods managed by the kubernetes-sigs/agent-sandbox CRD. Local dev runs on kind; production runs on EKS, GKE, AKS, or on-prem (anywhere you can run Kubernetes). The CRD also supports gVisor and Kata runtime classes via runtimeClass if you need stronger isolation than vanilla pod boundaries.
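As a rough sketch, a sandbox resource with a hardened runtime might look like the manifest below, expressed as a Python dict. The apiVersion, kind, and field names are assumptions for illustration, not the CRD's confirmed schema; only the runtimeClass concept comes from the answer above.

```python
# Hypothetical Sandbox custom resource (field names are assumptions,
# not the confirmed agent-sandbox CRD schema).
sandbox = {
    "apiVersion": "agents.x-k8s.io/v1alpha1",  # assumed group/version
    "kind": "Sandbox",
    "metadata": {"name": "demo-agent"},
    "spec": {
        # Opt into a stronger isolation boundary than a vanilla pod.
        "runtimeClass": "gvisor",  # or "kata"
        "podTemplate": {
            "spec": {
                "containers": [
                    {"name": "agent", "image": "example/agent:latest"}
                ]
            }
        },
    },
}

def runtime_class(cr: dict) -> str:
    """Return the requested runtime class, falling back to the default runtime."""
    return cr.get("spec", {}).get("runtimeClass", "runc")

print(runtime_class(sandbox))  # gvisor
```

If runtimeClass is omitted, the pod runs with the cluster's default (vanilla) container runtime.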
Can I use this with the LiteLLM AI Gateway?
Yes, and it's required. LAP routes every model call through a LiteLLM gateway. Set LITELLM_API_BASE and LITELLM_API_KEY when you deploy the platform; the vault proxy delivers the real key on each outbound request, so the agent process only ever sees a stub.
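A minimal sketch of what an agent-side call through the gateway might look like. The two environment variable names come from the answer above; the /v1/chat/completions path assumes LiteLLM's usual OpenAI-compatible surface, and the request is built but deliberately not sent so the sketch stays offline.

```python
import json
import os
import urllib.request

# LITELLM_API_BASE / LITELLM_API_KEY are the variables named in the docs.
base = os.environ.get("LITELLM_API_BASE", "http://litellm.internal:4000")
key = os.environ.get("LITELLM_API_KEY", "stub-key")  # the agent only ever sees a stub

req = urllib.request.Request(
    url=f"{base}/v1/chat/completions",  # assumed OpenAI-compatible path
    data=json.dumps({
        "model": "gpt-4o",  # any model the gateway is configured to route
        "messages": [{"role": "user", "content": "ping"}],
    }).encode(),
    headers={
        # The vault proxy swaps the stub for the real key in transit.
        "Authorization": f"Bearer {key}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would actually send it.
print(req.full_url)
```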
What agent harnesses are supported?
Four out of the box:
- Claude Code and Codex: interactive terminal (TUI) harnesses; attach with lap
- opencode and Claude Agent SDK: API harnesses; drive with POST /sessions/{id}/message
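For the API harnesses, driving a session might look like the sketch below. Only the POST /sessions/{id}/message path comes from the list above; the base URL, session id, and JSON body shape are illustrative assumptions, and the request is built without being sent.

```python
import json
import urllib.request

def build_message_request(base_url: str, session_id: str, text: str) -> urllib.request.Request:
    """Build (but don't send) a POST /sessions/{id}/message request.

    The {"message": ...} body shape is an assumption; only the path
    is taken from the docs.
    """
    return urllib.request.Request(
        url=f"{base_url}/sessions/{session_id}/message",
        data=json.dumps({"message": text}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_message_request("http://lap.internal", "abc123", "run the test suite")
print(req.method, req.full_url)  # POST http://lap.internal/sessions/abc123/message
```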
How long do sessions stay alive?
24 hours of idle time. Any message or terminal activity resets the countdown. After 24h with no traffic, the reconciler reaps the pod and the session flips to dead.
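The expiry rule above can be sketched as a single comparison against the most recent activity timestamp. This is an illustrative model of the reconciler's check, not its actual implementation; only the 24-hour limit comes from the docs.

```python
from datetime import datetime, timedelta, timezone

IDLE_LIMIT = timedelta(hours=24)  # from the docs: 24h of idle time

def is_expired(last_activity: datetime, now: datetime) -> bool:
    """Any message or terminal activity resets the countdown, so only the
    most recent activity timestamp matters."""
    return now - last_activity >= IDLE_LIMIT

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
print(is_expired(now - timedelta(hours=23), now))  # False: session stays alive
print(is_expired(now - timedelta(hours=25), now))  # True: pod reaped, session dead
```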
How do I detach from a TUI session without killing it?
Press Ctrl-D in lap. The session stays alive; reconnect by running lap <agent-name> again.