Labels: bug (Something isn't working)
Description
What happened?
OpenRouter embedding models currently cannot be configured or used through LiteLLM: provider routing fails for both the bare model name and the `openrouter/`-prefixed form, so explicit support for OpenRouter embeddings is needed.
Relevant log output
When the openrouter provider is used with a custom embedding model name:

1) `google/gemini-embedding-001` fails with: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=google/gemini-embedding-001. Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)`. Learn more: https://docs.litellm.ai/docs/providers

2) `openrouter/google/gemini-embedding-001` fails with: Unmapped LLM provider for this endpoint. You passed model=google/gemini-embedding-001, custom_llm_provider=openrouter. Check supported provider and route: https://docs.litellm.ai/docs/providers

Are you a ML Ops Team?
No
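For context, a proxy configuration of the kind that currently triggers the errors above might look like the following sketch. The model alias and environment variable name are illustrative; the `model_list` / `litellm_params` shape follows LiteLLM's standard proxy config convention.

```yaml
model_list:
  # Alias exposed to clients; name is arbitrary / illustrative
  - model_name: gemini-embedding
    litellm_params:
      # openrouter/-prefixed model id, which currently fails with
      # "Unmapped LLM provider for this endpoint"
      model: openrouter/google/gemini-embedding-001
      api_key: os.environ/OPENROUTER_API_KEY
```

Calling the `/embeddings` endpoint against this alias reproduces error 2) above; dropping the `openrouter/` prefix instead reproduces error 1).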
What LiteLLM version are you on?
v1.79.3-stable.gemini3
Twitter / LinkedIn details
No response