**Is your feature request related to a problem? Please describe.**
Users who want to access multiple AI models (Claude, GPT, Gemini, DeepSeek) through a single API endpoint currently have limited gateway options in Roo Code. FuturMix provides a unified AI gateway with 22+ models, 99.99% SLA, and OpenAI-compatible API — similar to how OpenRouter and Requesty are already integrated.
**Describe the solution you'd like**
Add FuturMix as a native provider in Roo Code, following the same pattern as existing gateway providers (OpenRouter, Requesty, Vercel AI Gateway). The implementation would:
- Extend `base-openai-compatible-provider.ts`
- Add provider type definitions in `packages/types/src/providers/futurmix.ts`
- Add an API handler in `src/api/providers/futurmix.ts`
- Add a UI component in `webview-ui/src/components/settings/providers/FuturMix.tsx`
- Support dynamic model fetching via the `/v1/models` endpoint
- Default base URL: `https://futurmix.ai/v1`
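To make the type-definition step concrete, here is a minimal sketch of what `packages/types/src/providers/futurmix.ts` could contain, mirroring how other gateway providers declare a static model map and default model ID. The `FuturmixModelInfo` shape, model IDs, context windows, and prices below are placeholders for illustration, not FuturMix's actual catalog or Roo Code's real `ModelInfo` type:

```typescript
// Hypothetical sketch of packages/types/src/providers/futurmix.ts.
// All model entries and prices are placeholders; the real file would
// reuse Roo Code's shared ModelInfo type and FuturMix's published list.
export interface FuturmixModelInfo {
	maxTokens: number // maximum output tokens
	contextWindow: number // total context window in tokens
	supportsImages: boolean
	inputPrice: number // USD per million input tokens (placeholder)
	outputPrice: number // USD per million output tokens (placeholder)
}

export const futurmixDefaultModelId = "claude-4-opus" // placeholder ID

export const futurmixModels: Record<string, FuturmixModelInfo> = {
	"claude-4-opus": {
		maxTokens: 8192,
		contextWindow: 200_000,
		supportsImages: true,
		inputPrice: 15,
		outputPrice: 75,
	},
	"gpt-4o": {
		maxTokens: 16_384,
		contextWindow: 128_000,
		supportsImages: true,
		inputPrice: 2.5,
		outputPrice: 10,
	},
}
```

The static map serves as a fallback with correct context windows and pricing; the dynamic `/v1/models` fetch would refresh it at runtime.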
**Describe alternatives you've considered**
Users can use the "OpenAI Compatible" provider with a custom base URL, but a native integration provides:
- Curated model list with correct context windows and pricing
- Dynamic model discovery
- Proper provider branding in the UI
- First-class documentation
**Additional context**
- API: OpenAI-compatible (`/v1/chat/completions`); also supports the Anthropic Messages format (`/v1/messages`)
- Models: 22+ models including Claude 4 Opus, GPT-4o, Gemini 2.5 Pro, DeepSeek
- Features: Auto-failover, load balancing, 99.99% SLA
- Website: https://futurmix.ai
- Docs: https://futurmix.ai/docs
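Dynamic model discovery could work the same way it does for other OpenAI-compatible gateways: query `/v1/models` and map the response into the provider's model list. A minimal sketch, assuming the endpoint returns the standard OpenAI list shape `{ data: [{ id: string, ... }] }` (the auth header and base URL follow the defaults stated above; any field beyond `id` is an assumption):

```typescript
// Sketch of dynamic model discovery against an OpenAI-compatible
// /v1/models endpoint. Assumes the standard OpenAI list response:
// { data: [{ id: string, ... }] }.
interface ModelsResponse {
	data: Array<{ id: string }>
}

// Pure helper: extract sorted model IDs from a /v1/models payload.
export function parseModelIds(payload: ModelsResponse): string[] {
	return payload.data.map((m) => m.id).sort()
}

// Fetch the live model list using bearer-token auth.
export async function fetchFuturmixModels(
	apiKey: string,
	baseUrl = "https://futurmix.ai/v1",
): Promise<string[]> {
	const res = await fetch(`${baseUrl}/models`, {
		headers: { Authorization: `Bearer ${apiKey}` },
	})
	if (!res.ok) throw new Error(`Model fetch failed: ${res.status}`)
	return parseModelIds((await res.json()) as ModelsResponse)
}
```

Keeping the response parsing in a pure helper (`parseModelIds`) makes it easy to unit-test without hitting the network.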
I'm happy to implement this PR myself. The implementation would follow the Requesty/Unbound pattern (extend BaseProvider with OpenAI SDK).