
Docker Compose integrates AI agents and aligns with DevOps workflows

Docker is modernizing its Docker Compose tool to address the growing demands of AI development. The company has added a dedicated “AI models” block to the open-source Compose specification, enabling developers to define and deploy AI agents directly within DevOps workflows.

With this update, developers can connect models to various tools through the Model Context Protocol (MCP) and run them on cloud infrastructure with just a few commands. The goal is to simplify the development of hybrid applications that combine traditional business logic with AI models, with the entire stack defined in a single YAML file.
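Under the new specification, a model can be declared alongside ordinary services in the same Compose file. A minimal sketch of what this looks like (the model reference, service name, and build details are illustrative assumptions, not taken from the article):

```yaml
# compose.yaml — hypothetical example of the new "models" block
models:
  llm:
    model: ai/smollm2        # illustrative model reference

services:
  app:
    build: .                 # the application's own container
    models:
      - llm                  # makes the model's endpoint available to this service
```

A single `docker compose up` would then bring up the application and provision the declared model together, which is the unified-deployment workflow the update targets.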

A strengthened partnership with Google Cloud

This update comes with deeper integration with Google Cloud Run, through a dedicated command (gcloud run compose up). Support for Microsoft Azure Container Apps will follow soon. Docker Compose is also becoming compatible with several AI frameworks, including CrewAI, Embabel, LangGraph, Spring AI, Google’s ADK, and Vercel’s AI SDK.


Easier access to GPU resources

Another new feature is Docker Offload, which provides access to cloud GPUs directly from Docker Desktop. This allows developers to design and test their AI agents locally without the need for powerful hardware.
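Compose already lets a service declare a GPU requirement through the standard `deploy.resources` syntax; with Offload, such a workload could be scheduled against cloud GPUs rather than local hardware. A hedged sketch (the service and image names are illustrative):

```yaml
# compose.yaml — a service requesting one GPU via standard Compose syntax
services:
  agent:
    image: my-ai-agent:latest      # illustrative image name
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]  # request a single GPU device
```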

An open-source gateway to secure exchanges

Finally, Docker is releasing its MCP Gateway as open source under the Apache 2.0 license. It secures interactions between AI agents and business tools, a component considered critical for enterprise deployments.
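One plausible way to wire the gateway into a Compose stack is to run it as its own service that agents reach over the internal network. The image name and port below are assumptions for illustration, not details from the article:

```yaml
# compose.yaml — hypothetical placement of the MCP Gateway in a stack
services:
  mcp-gateway:
    image: docker/mcp-gateway:latest   # assumed image name
    ports:
      - "8811:8811"                    # assumed listening port

  agent:
    build: .                           # the AI agent talking to tools via the gateway
    depends_on:
      - mcp-gateway
```

In this layout, the agent never contacts business tools directly; every MCP exchange passes through the gateway, which is where the security controls mentioned above would apply.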
