
LiteLLM when used with the model string "gemini-*" calls the wrong endpoint #4008

@ashubham

Description



Describe the bug
When using the LiteLlm wrapper, all outgoing calls should use the OpenAI format. But if the model name matches gemini-*, ADK routes the request to the Gemini-style :streamGenerateContent endpoint, which does not exist on a LiteLLM proxy.
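For reference, litellm appears to infer the provider from the bare model string. A minimal diagnostic sketch (my assumption is that litellm's get_llm_provider is the resolution step involved):

import litellm

# get_llm_provider() returns (model, provider, api_key, api_base).
# A bare "gemini-2.5-flash" resolves to a Google provider rather than
# "openai", which would explain the :streamGenerateContent URL.
model, provider, _key, _base = litellm.get_llm_provider(model="gemini-2.5-flash")
print(provider)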

To Reproduce

from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

root_agent = Agent(
    name="root_agent",
    model=LiteLlm(
        api_base="<lite-llm-endpoint>",
        model="gemini-2.5-flash",
    ),
    instruction="You are a helpful AI assistant designed to provide accurate and useful information.",
)

Then start the dev UI:

$ adk web

Then send a message using the web interface to see the error:

httpx.HTTPStatusError: Client error '404 Not Found' for url: https://<lite-llm-endpoint>/v1:streamGenerateContent?alt=sse
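For comparison, the proxy itself should answer plain OpenAI-format requests at the same base URL. A sanity-check sketch (the client key and the /v1 path are assumptions about the proxy setup):

from openai import OpenAI

# Call the LiteLLM proxy directly in OpenAI format. If this succeeds,
# the 404 above is specific to the Gemini-style URL that ADK constructs.
client = OpenAI(base_url="https://<lite-llm-endpoint>/v1", api_key="<proxy-key>")
resp = client.chat.completions.create(
    model="gemini-2.5-flash",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)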

Expected behavior

Should call the OpenAI-compatible /v1/chat/completions endpoint on the LiteLLM proxy.
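A possible workaround sketch (untested here): litellm treats a provider prefix on the model string as an explicit route, so an openai/ prefix should pin requests to the OpenAI-compatible path:

from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

root_agent = Agent(
    name="root_agent",
    model=LiteLlm(
        api_base="<lite-llm-endpoint>",
        # "openai/" forces litellm's OpenAI-compatible provider instead of
        # letting it infer a Google provider from the bare "gemini-*" name.
        model="openai/gemini-2.5-flash",
    ),
    instruction="You are a helpful AI assistant designed to provide accurate and useful information.",
)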

Desktop (please complete the following information):

  • OS: Any
  • Python version (python -V): 3.13
  • ADK version (pip show google-adk): 1.21.0

Model Information:

  • Are you using LiteLLM: Yes
  • Which model is being used: gemini-2.5-flash
