Describe the bug
When using the LiteLlm wrapper, outgoing calls should always use the OpenAI format. However, if the model name matches `gemini-*`, the request is routed to the Gemini-native `:streamGenerateContent` endpoint, which does not exist on a LiteLLM proxy.
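For triage, LiteLLM's own provider inference can be checked directly. This is a hedged diagnostic sketch, not ADK source: `get_llm_provider` is a litellm utility whose return shape may vary across versions. A bare `gemini-*` name appears to be inferred as the Gemini provider rather than an OpenAI-compatible one, which would explain a Gemini-native URL being built on top of `api_base`:

```python
# Hedged diagnostic sketch: inspect which provider LiteLLM infers for a
# bare "gemini-*" model name (return shape may differ across litellm versions).
from litellm import get_llm_provider

model, provider, _api_key, _api_base = get_llm_provider(model="gemini-2.5-flash")
print(provider)  # observed/expected: "gemini", i.e. the Gemini-native route
```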
To Reproduce

```python
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

root_agent = Agent(
    name="root_agent",
    model=LiteLlm(
        api_base="<lite-llm-endpoint>",
        model="gemini-2.5-flash",
    ),
    instruction="You are a helpful AI assistant designed to provide accurate and useful information.",
)
```

Start the dev UI:

```
$ adk web
```

Then send a message using the web interface to see the error:

```
httpx.HTTPStatusError: Client error '404 Not Found' for url: https://<lite-llm-endpoint>/v1:streamGenerateContent?alt=sse
```
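A possible workaround, based on LiteLLM's provider-prefix convention and not verified against ADK 1.21.0: prefixing the model name with `openai/` should force the OpenAI-compatible route to the proxy:

```python
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

root_agent = Agent(
    name="root_agent",
    model=LiteLlm(
        api_base="<lite-llm-endpoint>",
        # The "openai/" prefix asks LiteLLM to speak the OpenAI-compatible
        # protocol to the proxy instead of inferring the Gemini provider
        # from the bare model name (assumption, not verified here).
        model="openai/gemini-2.5-flash",
    ),
    instruction="You are a helpful AI assistant designed to provide accurate and useful information.",
)
```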
Expected behavior
Should call the OpenAI-compatible chat completions endpoint (`/v1/chat/completions`) on the LiteLLM proxy.
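Roughly, the outgoing call should look like an OpenAI chat completion against the proxy. A minimal sketch with the same placeholder endpoint as above; the payload fields are illustrative:

```python
# Minimal sketch of the expected OpenAI-format request to the proxy
# (placeholder endpoint; run against a real LiteLLM deployment).
import httpx

response = httpx.post(
    "https://<lite-llm-endpoint>/v1/chat/completions",
    json={
        "model": "gemini-2.5-flash",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
    },
)
print(response.status_code)  # should be 200, not 404
```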
Desktop (please complete the following information):
- OS: Any
- Python version (`python -V`): 3.13
- ADK version (`pip show google-adk`): 1.21.0
Model Information:
- Are you using LiteLLM: Yes
- Which model is being used: gemini-2.5-flash