Docker has updated Compose with new features that will make it easier for developers to build, ship, and run AI agents.
Developers can define open models, agents, and MCP-compatible tools in a compose.yaml file and then spin up an agentic stack with a single command: docker compose up.
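As a rough illustration, a compose.yaml for such a stack might look like the sketch below. The service names, the agent image, and the model reference are placeholders rather than required values, and the MCP gateway image is an assumption based on Docker's open-source MCP Gateway; the new top-level models element follows Docker's Compose announcement.

```yaml
# Illustrative agentic stack: an agent service, an MCP tool gateway, and an open model.
services:
  agent:
    build: .                   # your agent code (e.g. a LangGraph or CrewAI app)
    models:
      - llm                    # reference to the model declared below
    depends_on:
      - mcp-gateway

  mcp-gateway:
    image: docker/mcp-gateway  # assumed image name; exposes MCP-compatible tools

models:
  llm:
    model: ai/smollm2          # an open-weight model pulled from Docker Hub (placeholder)
```

In Docker's examples, Compose wires the declared model into any service that references it, so the agent can reach the model without hand-written connection settings.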
Compose integrates with several agentic frameworks, including LangGraph, Embabel, Vercel AI SDK, Spring AI, CrewAI, Google’s ADK, and Agno.
It also now integrates with Google Cloud Run and Microsoft Azure Container Apps, allowing agents to be deployed to serverless environments.
Additionally, the company announced Docker Offload, which allows developers to offload compute-intensive workloads to high-performance cloud environments. “Build, test, and scale your agentic applications just like you always have locally, while Docker handles the heavy lifting behind the scenes,” Mark Cavage, president and COO of Docker, and Tushar Jain, EVP of engineering and product at Docker, wrote in a blog post.
The company is currently offering 300 minutes of free Offload usage so that users can try it out.
Other useful Docker capabilities for building agents include the MCP Catalog for finding tools that can be connected to agents, and Model Runner, which lets developers pull open-weight LLMs from Docker Hub, run them locally, and interact with them using OpenAI-compatible endpoints.
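As a rough sketch of that Model Runner workflow, the snippet below points the standard OpenAI Python client at a locally served model; the base URL and model name are assumptions that depend on how Model Runner is configured on a given machine.

```python
# Minimal sketch: querying a model served locally by Docker Model Runner
# through its OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:12434/engines/v1",  # assumed host-side Model Runner endpoint
    api_key="not-needed",                          # local endpoint; no real key required
)

response = client.chat.completions.create(
    model="ai/smollm2",  # placeholder: an open-weight model pulled from Docker Hub
    messages=[{"role": "user", "content": "Summarize what an MCP server does."}],
)
print(response.choices[0].message.content)
```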
“The future of software is agentic, where every developer builds goal-driven, multi-LLM agents that reason, plan, and act across a rich ecosystem of tools and services. With Docker Compose, Docker Offload, Docker’s broader AI capabilities, and our partnerships with Google, Microsoft, and Agent SDKs, we’re making that future accessible to, and easy for, everyone,” Cavage and Jain wrote.