Celebrating Women in AI: 3 Questions with Cecilia Liu on Leading Docker’s MCP Strategy
Source: Docker Blog
To celebrate International Women’s Day, we sat down with Cecilia Liu, Senior Product Manager at Docker, for three questions about the vision and strategy behind Docker’s MCP solutions. From shaping product direction to driving AI innovation, Cecilia plays a key role in defining how Docker enables secure, scalable AI tooling.

Cecilia leads product management for Docker’s MCP Catalog and Toolkit, our solution for running MCP servers securely and at scale through containerization. She drives Docker’s AI strategy across both enterprise and developer ecosystems, helping organizations deploy MCP infrastructure with confidence while empowering individual developers to seamlessly discover, integrate, and use MCP in their workflows. With a technical background in AI frameworks and an MBA from NYU Stern, Cecilia bridges the worlds of AI infrastructure and developer tools, turning complex challenges into practical, developer‑first solutions.
What products are you responsible for?
I own Docker’s MCP solution. At its core, it’s about solving the problems that anyone working with MCP runs into: how do you find the right MCP servers, how do you actually use them without a steep learning curve, and how do you deploy and manage them reliably across a team or organization.
How does Docker’s MCP solution benefit developers and enterprise customers?
Dev productivity is where my heart is. I want to build something that meaningfully helps developers at every stage of the development cycle — and that’s exactly how I think about Docker’s MCP solution.
For end‑user developers and “vibe coders,” the goal is simple: you shouldn’t need to understand the underlying infrastructure to get value from MCP. If you’re working with AI, we make it easy to discover, configure, and start using MCP servers without the usual setup headaches. User feedback highlighted that people often couldn’t even tell whether their setup was working, which pushed us to ship in‑product setup instructions that walk you through configuration and verification. It sounds small, but it made a real difference.
For developers building MCP servers and integrating them into agents, I focus on providing the right creation and testing tools so they can ship faster and with more confidence.
For security and enterprise admins, we solve real deployment pain, making it faster and cheaper to roll out and manage MCP across an entire organization. Features include custom catalogs, role‑based access controls, audit logging, and policy enforcement. The goal is to give teams the visibility and control they need to adopt AI tooling confidently at scale.
Customers love us for all of the above, and there’s one more thing that ties it together: the security that comes built‑in with Docker. That trust doesn’t happen overnight, and it’s something we take seriously across everything we ship.
What are you excited about when it comes to the future of MCP?
What excites me most is honestly the pace of change itself. The AI landscape is shifting constantly, and with every new tool that makes AI more powerful, there’s a whole new set of developers who need a way to actually use it productively. That’s a massive opportunity.
MCP is where that’s happening right now, and the adoption we’re seeing tells me the need is real. What gets me out of bed is knowing the problems we’re solving—discoverability, usability, deployment—will matter just as much for whatever comes next. We’re not just building for today’s tools; we’re building the foundation that developers will reach for every time something new emerges.
Cecilia is speaking about scaling MCP for enterprises at the MCP Dev Summit in NYC on 3 April 2026. If you’re attending, be sure to stop by Docker’s booth (D/P9).
Learn more
- Explore Docker’s MCP Catalog and Toolkit on our website.
- Dive into our documentation to get started quickly.
- Ready to go hands‑on? Open Docker Desktop or the CLI and start using MCP to streamline and automate your development workflows.