LiteLLM model connector for ADK agents
ADK Python Security Advisory: LiteLLM supply chain compromise

Unauthorized code was identified in LiteLLM versions 1.82.7 and 1.82.8 on PyPI on March 24, 2026. If you use ADK Python with the `eval` or `extensions` extras, update to the latest version of ADK Python immediately. If you installed or upgraded LiteLLM during this period, rotate all secrets and credentials. For details and required actions, refer to the ADK security advisory and LiteLLM's Security Update: Suspected Supply Chain Incident.
LiteLLM is a Python library that acts as a translation layer for models and model hosting services, providing a standardized, OpenAI-compatible interface to 100+ LLMs. ADK integrates with the LiteLLM library, allowing you to access a vast range of LLMs from providers such as OpenAI, Anthropic (non-Vertex AI), Cohere, and many others. You can also run open-source models locally or self-host them and integrate them through LiteLLM for operational control, cost savings, privacy, or offline use cases.
You can use the LiteLLM library to access remote or locally hosted AI models:
- Remote model host: Use the `LiteLlm` wrapper class and set it as the `model` parameter of `LlmAgent`.
- Local model host: Use the `LiteLlm` wrapper class configured to point to your local model server. For examples of local model hosting solutions, see the Ollama or vLLM documentation.
Windows Encoding with LiteLLM
When using ADK agents with LiteLLM on Windows, you might encounter a `UnicodeDecodeError`. This error occurs because LiteLLM may attempt to read cached files using the default Windows encoding (`cp1252`) instead of UTF-8. Prevent this error by setting the `PYTHONUTF8` environment variable to `1`, which forces Python to use UTF-8 for all file I/O.
Example (PowerShell):
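One way to set the variable, either for the current PowerShell session or persistently:

```powershell
# Current session only:
$env:PYTHONUTF8 = "1"

# Persist for future sessions:
setx PYTHONUTF8 1
```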
Setup
- Install LiteLLM:
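For example, from PyPI:

```shell
pip install litellm
```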
- Set Provider API Keys: Configure API keys as environment variables for the specific providers you intend to use.
- Example for OpenAI:
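In a POSIX shell (the key value is a placeholder for your own key):

```shell
export OPENAI_API_KEY="your-openai-api-key"
```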
- Example for Anthropic (non-Vertex AI):
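Likewise for Anthropic (placeholder key value):

```shell
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```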
- Consult the LiteLLM Providers Documentation for the correct environment variable names for other providers.
Example implementation
from google.adk.agents import LlmAgent
from google.adk.models.lite_llm import LiteLlm
# --- Example Agent using OpenAI's GPT-4o ---
# (Requires OPENAI_API_KEY)
agent_openai = LlmAgent(
model=LiteLlm(model="openai/gpt-4o"), # LiteLLM model string format
name="openai_agent",
instruction="You are a helpful assistant powered by GPT-4o.",
# ... other agent parameters
)
# --- Example Agent using Anthropic's Claude Haiku (non-Vertex) ---
# (Requires ANTHROPIC_API_KEY)
agent_claude_direct = LlmAgent(
model=LiteLlm(model="anthropic/claude-3-haiku-20240307"),
name="claude_direct_agent",
instruction="You are an assistant powered by Claude Haiku.",
# ... other agent parameters
)