Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: b0619178e2
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```typescript
protected _convertModelName(model: string): string {
  return ModelsLab.MODEL_IDS[model] ?? model;
}
```
Wire model alias conversion into request body
This provider defines short-name aliases in MODEL_IDS but only overrides _convertModelName. The OpenAI request path builds payloads from options.model in _convertArgs and never invokes _convertModelName, so configs like "model": "llama3.1-8b" are sent to ModelsLab unchanged and can fail model resolution, even though this file advertises alias support. Apply the conversion before sending requests (for example by overriding _convertArgs).
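The suggested fix could be sketched as follows. This is a simplified standalone illustration, not the actual Continue source: the alias map entries and the shape of the options object are assumptions, and the real fix would override the base class's `_convertArgs` method rather than use free functions.

```typescript
// Hypothetical alias map, mirroring the MODEL_IDS pattern from the review.
const MODEL_IDS: Record<string, string> = {
  "llama3.1-8b": "llama-3.1-8b-uncensored",
  "llama3.1-70b": "llama-3.1-70b-uncensored",
};

// Mirrors _convertModelName: fall back to the raw name when no alias matches.
function convertModelName(model: string): string {
  return MODEL_IDS[model] ?? model;
}

// Sketch of the missing wiring: apply the alias conversion before the
// request body is built, so short names resolve to full ModelsLab IDs.
function convertArgs(options: { model: string; [key: string]: unknown }) {
  return { ...options, model: convertModelName(options.model) };
}

console.log(convertArgs({ model: "llama3.1-8b" }).model);
// "llama-3.1-8b-uncensored"
```

With this in place, both the alias and the full model ID reach the API in resolved form, and unknown names pass through unchanged.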
feat: Add ModelsLab provider
Summary
Adds ModelsLab as a provider in Continue, giving users access to uncensored Llama 3.1 8B and 70B models with 128K context windows.
What is ModelsLab?
ModelsLab is an AI API platform providing uncensored language models via an OpenAI-compatible endpoint — ideal for code assistance, creative writing, and use cases where standard content filters are too restrictive.
Files
New files
- core/llm/llms/ModelsLab.ts — provider implementation
- docs/docs/reference/model-providers/modelslab.md — user documentation

Modified files (one-line patches)
- core/llm/llms/index.ts — add import and export

Implementation
Follows the same pattern as Together, Groq, Fireworks, and other OpenAI-compatible providers: extends the OpenAI base class and sets providerName and defaultOptions.apiBase. No new dependencies. No behavior overrides. All streaming, tool calling, and completion logic is inherited from the OpenAI base class.
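The pattern described above can be sketched as a minimal standalone example. The base-class shape, option fields, and the apiBase URL here are placeholders (note the .example domain), not the actual Continue API or the real ModelsLab endpoint:

```typescript
// Simplified stand-in for Continue's OpenAI base class.
class OpenAI {
  constructor(public options: { model: string; apiBase?: string }) {}
}

// A provider in this pattern only declares its name and defaults;
// all request/streaming logic is inherited from the base class.
class ModelsLab extends OpenAI {
  static providerName = "modelslab";
  static defaultOptions = {
    model: "llama-3.1-8b-uncensored",
    apiBase: "https://api.modelslab.example/v1/", // placeholder URL
  };
}

console.log(ModelsLab.providerName); // "modelslab"
```

Because the subclass adds only static configuration, any fix to the OpenAI base class (streaming, tool calling, retries) benefits this provider for free.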
Usage
```json
{
  "models": [
    {
      "title": "ModelsLab Llama 3.1 8B",
      "provider": "modelslab",
      "model": "llama-3.1-8b-uncensored",
      "apiKey": "YOUR_MODELSLAB_API_KEY"
    }
  ]
}
```

Models
- llama-3.1-8b-uncensored (default)
- llama-3.1-70b-uncensored

Checklist
- Extends the OpenAI base class (same pattern as Together, Groq, Fireworks, Novita, etc.)
- static providerName set to "modelslab"
- static defaultOptions with apiBase and model
- Model name aliasing (_convertModelName)
- Registered in index.ts (see patch above)
Summary by cubic
Adds ModelsLab as a provider to access uncensored Llama 3.1 8B and 70B models with a 128K context window via an OpenAI-compatible endpoint. No behavior changes or new dependencies.
Written for commit b061917.