
[Bug]: OpenRouter support for embeddings #17773

@conext-noc

Description


What happened?

Support for OpenRouter embeddings in LiteLLM is important: at the moment, bugs or compatibility limitations prevent OpenRouter embedding models from being explicitly configured or called directly.


Relevant log output

When using the openrouter provider with a custom model name:

1) google/gemini-embedding-001: the error is: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=google/gemini-embedding-001 Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers

2) openrouter/google/gemini-embedding-001: the error is: Unmapped LLM provider for this endpoint. You passed model=google/gemini-embedding-001, custom_llm_provider=openrouter. Check supported provider and route: https://docs.litellm.ai/docs/providers
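A minimal reproduction sketch of the two failing calls via LiteLLM's Python SDK (`litellm.embedding`). The model name is taken from the report; an `OPENROUTER_API_KEY` in the environment is assumed, and the broad `except` is only there to surface the errors quoted above:

```python
import litellm

# Call 1: bare model name. Per the error above, LiteLLM cannot map a
# provider for "google/gemini-embedding-001" and raises
# "LLM Provider NOT provided...".
try:
    litellm.embedding(
        model="google/gemini-embedding-001",
        input=["hello world"],
    )
except Exception as e:
    print(e)

# Call 2: explicit openrouter/ prefix. The provider is recognized, but
# the embedding route is not mapped for openrouter, so it raises
# "Unmapped LLM provider for this endpoint...".
try:
    litellm.embedding(
        model="openrouter/google/gemini-embedding-001",
        input=["hello world"],
    )
except Exception as e:
    print(e)
```

If OpenRouter exposes an OpenAI-compatible `/embeddings` route for this model (an assumption here, not confirmed by the report), LiteLLM's generic `openai/` passthrough with a custom `api_base` might serve as an interim workaround until the `openrouter` embedding route is mapped:

```python
import os
import litellm

# Hypothetical workaround sketch: route the request through LiteLLM's
# generic OpenAI-compatible path instead of the openrouter provider.
# Whether OpenRouter actually serves /embeddings is an assumption.
response = litellm.embedding(
    model="openai/google/gemini-embedding-001",
    input=["hello world"],
    api_base="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)
print(response.data[0]["embedding"][:8])
```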

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.79.3-stable.gemini3

Twitter / LinkedIn details

No response
