
Continue

Continue.dev

Continue is an open-source VS Code and JetBrains agent with a strong local-first story (first-class support for Ollama, LM Studio, and llama.cpp) and a configurable per-task model router. It started as a code assistant and is layering agentic features on top, so its SWE-bench numbers trail the agent-native harnesses, but its customization surface is broader than most.

Type: IDE
License: Open source (Apache 2.0)
Model story: Multi-model, BYOK
Vendor: Continue.dev

Leaderboard Placements

Benchmark            Best base model     Score   Rank
SWE-bench Verified   Claude Sonnet 4.6   52.4    #15 / 15
Terminal-Bench
Aider Polyglot
SWE-Lancer

Distribution

Open-source extensions for VS Code and JetBrains. Apache 2.0 license.

Model Story

Multi-model with bring-your-own-key. First-class local model support via Ollama and LM Studio.
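
To make the BYOK and local-model story concrete, here is an illustrative sketch of what a Continue model configuration can look like. The shape follows Continue's documented config.json `models` list, but the exact field names, model IDs, and current schema version are assumptions; check the vendor docs before copying.

```json
{
  "models": [
    {
      "title": "Local Llama (Ollama)",
      "provider": "ollama",
      "model": "llama3.1:8b"
    },
    {
      "title": "Claude (BYOK)",
      "provider": "anthropic",
      "model": "claude-sonnet-4-6",
      "apiKey": "<YOUR_ANTHROPIC_KEY>"
    }
  ]
}
```

The first entry routes requests to a model served locally by Ollama (no API key, no data leaving the machine); the second points at a hosted provider with your own key, which is the "bring-your-own-key" part of the model story.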

Pricing

Free harness; you pay for whatever model provider you point it at.

Who It's For

Engineers who want a highly customizable agent that still behaves like a code assistant, with strong support for local models.

Notable Features

  • First-class local model support (Ollama, LM Studio, llama.cpp)
  • Per-task model routing config
  • Custom slash commands
  • Open-source under Apache 2.0
  • VS Code and JetBrains parity
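
As an example of the custom slash commands bullet above, Continue's config.json supports a `customCommands` list; the sketch below is illustrative, and the prompt text and command name are invented for this example:

```json
{
  "customCommands": [
    {
      "name": "review",
      "description": "Review the selected code for bugs and style issues",
      "prompt": "Review the following code. List concrete bugs first, then style issues: {{{ input }}}"
    }
  ]
}
```

With a config like this, typing /review in the chat panel runs the selected code through the prompt template, which is how per-team conventions get baked into the harness without forking it.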
Vendor site for Continue: https://continue.dev

Other Harnesses