AI Code Routing Gets Personal: Arch-Router Lets You Choose Your LLM

A new system called Arch-Router is changing how developers interact with AI coding assistants. Built on Arch Gateway, this preference-aware router lets users direct coding tasks to specific Large Language Models (LLMs) such as Grok, Mistral, Gemini, DeepSeek, GPT, and even local models served through Ollama. Rather than relying solely on generic benchmarks, developers can route requests according to their own subjective evaluations and project-specific needs. Arch-Router exposes these diverse LLMs through a single, unified command-line interface (CLI) agent. The Arch Gateway repository is available on GitHub ([https://github.com/katanemo/archgw](https://github.com/katanemo/archgw)), and details on Claude Code routing support can be found in the project's demos ([https://github.com/katanemo/archgw/tree/main/demos/use_cases/claude_code_router](https://github.com/katanemo/archgw/tree/main/demos/use_cases/claude_code_router)).
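
To make the flow concrete, here is a minimal sketch of how a client might send a coding request through a locally running Arch Gateway, which then applies Arch-Router's preference policy to pick a backend model. The base URL, port, and model placeholder are assumptions for illustration only; the real endpoint and routing behavior come from your own archgw configuration.

```python
# Minimal sketch: sending a coding request through a locally running
# Arch Gateway, which routes it to a backend LLM per its preference policy.
# The base_url, port, and model name below are illustrative assumptions;
# check your own archgw configuration for the actual values.
from openai import OpenAI

# Arch Gateway exposes an OpenAI-compatible endpoint; point the client at it.
client = OpenAI(
    base_url="http://localhost:12000/v1",  # assumed local gateway address
    api_key="unused",  # the gateway manages provider credentials itself
)

response = client.chat.completions.create(
    model="arch-router",  # placeholder; the router selects the actual backend
    messages=[
        {"role": "user", "content": "Refactor this function to use list comprehensions."}
    ],
)

print(response.choices[0].message.content)
```

Because the client only ever talks to the gateway, swapping or re-weighting the underlying models is a configuration change on the gateway side, with no edits to application code.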