AI/ML
-
Jan 16, 2026
Making (Very) Small LLMs Smarter
Run tiny LLMs locally and still get useful coding help. Use vector search to feed the model the right snippets, using Docker Model Runner, LangChainJS, and Nova.
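A minimal sketch of the retrieval idea this post describes: index a handful of code snippets in an in-memory vector store, pull back only the closest matches, and hand them to a small model served by Docker Model Runner's OpenAI-compatible API. The base URL, API key, and model names below are assumptions (placeholders for whatever endpoint and models your setup exposes), and the LangChainJS wiring is an illustration, not the article's exact code.

```typescript
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";

// Assumed Docker Model Runner endpoint; the API key is ignored locally.
const config = {
  apiKey: "not-needed-locally",
  configuration: { baseURL: "http://localhost:12434/engines/v1" },
};

// Placeholder model names: substitute whichever models you have pulled.
const embeddings = new OpenAIEmbeddings({ ...config, model: "ai/mxbai-embed-large" });
const llm = new ChatOpenAI({ ...config, model: "ai/smollm2", temperature: 0 });

async function answer(question: string, snippets: string[]) {
  // Embed the snippets once and keep them in memory.
  const store = await MemoryVectorStore.fromTexts(
    snippets,
    snippets.map((_, i) => ({ i })),
    embeddings,
  );
  // Retrieve only the most relevant snippets so the tiny model's context stays small.
  const hits = await store.similaritySearch(question, 3);
  const context = hits.map((d) => d.pageContent).join("\n---\n");
  const res = await llm.invoke([
    ["system", "Answer using only the provided code snippets."],
    ["user", `${context}\n\nQuestion: ${question}`],
  ]);
  return res.content;
}
```

The point of the retrieval step is that a small model cannot hold much context, so selecting three relevant snippets beats pasting an entire codebase into the prompt.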
-
Jan 15, 2026
OpenCode with Docker Model Runner for Private AI Coding
Configure OpenCode to use Docker Model Runner for a private, cost-aware coding assistant. Run models locally via an OpenAI-compatible API with full control.
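For context on what "OpenAI-compatible" buys you here: any client that lets you point at a custom base URL (OpenCode among them) can talk to Docker Model Runner using the standard chat-completions request shape. The endpoint and model name below are assumptions, and this is a rough sketch of the API call rather than the article's OpenCode configuration.

```typescript
// Assumed Model Runner endpoint; adjust to whatever your setup exposes.
const baseURL = "http://localhost:12434/engines/v1";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${baseURL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "ai/qwen2.5-coder", // placeholder: use whichever model you've pulled
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  // Standard OpenAI response shape: first choice, assistant message content.
  return data.choices[0].message.content;
}

chat("Write a function that reverses a string in TypeScript.").then(console.log);
```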
-
Docker Captain · Dec 16, 2025
Develop and deploy voice AI apps using Docker
Build real-time voice agents with Docker. Use EchoKit, Model Runner, and the MCP Toolkit to run ASR/LLM/TTS locally or in the cloud.
-
Dec 16, 2025
Docker Model Runner now included with the Universal Blue family
Docker Model Runner now ships with Universal Blue (Aurora, Bluefin), delivering an out-of-the-box, GPU-ready AI development environment.
-
Dec 11, 2025
Docker Model Runner now supports vLLM on Windows
Run vLLM with GPU acceleration on Windows using Docker Model Runner and WSL2. Fast AI inference is here.
-
Docker Captain · Dec 11, 2025
Breaking Free From AI Vendor Lock-in: Integrating GitHub Models with Docker cagent
See how Docker cagent integrates with GitHub Models to build and ship multi-agent apps without vendor lock-in.
-
Dec 5, 2025
Docker, JetBrains, and Zed: Building a Common Language for Agents and IDEs
As agents become capable enough to write and refactor code, they should work natively inside the environments developers work in: editors. That’s why JetBrains and Zed are co-developing ACP, the Agent Client Protocol. ACP gives agents and editors a shared language, so any agent can read context, take actions, and respond intelligently without bespoke wiring…
-
Dec 5, 2025
Announcing vLLM v0.12.0, Ministral 3 and DeepSeek-V3.2 for Docker Model Runner
Run Ministral 3 and DeepSeek-V3.2 on Docker Model Runner with vLLM 0.12. Test-drive the latest open-weights models as soon as they’re released.