Azure Weekly: Python Gets First-Class MCP Support, Custom Agents Hit Azure Boards

Published: March 6, 2026 at 10:55 PM EST
6 min read
Source: Dev.to

By [Hector Flores](https://dev.to/htekdev)

# Azure Goes All‑In on Python‑Native AI Development

This past week Azure made significant moves toward making AI‑agent development feel native in Python. The Azure MCP Server **landed on PyPI**, GitHub Copilot custom agents now **integrate directly into Azure Boards**, and the February Azure SDK release shipped critical infrastructure for ASP.NET Core apps. If you’ve been waiting for Azure to meet Python developers where they are, this is the week it happened.

---

## Azure MCP Server: Finally Native in Python

Until last week, using the **[Azure Model Context Protocol (MCP) Server](https://github.com/microsoft/mcp)** required Node.js or .NET. For Python developers building AI agents—the majority of the AI ecosystem—this was friction. Now you can install and run the Azure MCP Server directly from PyPI with `uvx` or `pip`:

```bash
# Run on‑demand with uvx (recommended)
uvx --from msmcp-azure azmcp server start

# Or install into your project
pip install msmcp-azure
```

### Why it matters

- MCP is becoming the standard protocol for AI agents to interact with external tools and services.
- The GitHub Copilot SDK, Claude Desktop, VS Code, Visual Studio, IntelliJ, and Eclipse all support it.
- Python developers now get first‑class access to 40+ Azure services through a single MCP server, with no JavaScript runtime required.

The package supports the same server modes as the Node.js and .NET versions:

- `namespace` (default)
- `all`
- `single` tool mode
- read‑only mode to prevent accidental writes

You can run it in Docker for CI/CD pipelines, or use it directly in your IDE.
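As a hedged sketch of wiring this into a CI pipeline, the helper below builds the launch command for each mode. The `--mode` and `--read-only` flag names are my assumptions based on the modes listed above, not confirmed CLI syntax; check `azmcp server start --help` for the real options.

```python
# Hypothetical helper that builds the `azmcp server start` argv for the
# server modes described above. Flag names are assumptions, not verified.

def build_azmcp_command(mode: str = "namespace", read_only: bool = False) -> list[str]:
    """Return an argv list that launches the Azure MCP Server via uvx."""
    if mode not in {"namespace", "all", "single"}:
        raise ValueError(f"unknown server mode: {mode!r}")
    cmd = ["uvx", "--from", "msmcp-azure", "azmcp", "server", "start"]
    if mode != "namespace":   # namespace is the default, so omit the flag
        cmd += ["--mode", mode]
    if read_only:             # guard against accidental writes in CI
        cmd.append("--read-only")
    return cmd

if __name__ == "__main__":
    print(build_azmcp_command(mode="all", read_only=True))
```

The same argv list drops into a Dockerfile `CMD` or a pipeline step unchanged, which is what makes the uvx distribution convenient for CI/CD.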


---

## GitHub Copilot Custom Agents + Azure Boards: The AI DevOps Bridge

Azure DevOps Sprint 269 shipped a quietly important feature: GitHub Copilot custom agents now integrate with Azure Boards.

### How it works

  1. Create a custom agent at the repository or organization level in GitHub.
  2. The agent automatically appears in Azure DevOps.
  3. When you create a pull request from a work item, select which agent should generate the code.
  4. The agent writes the code, opens the PR, and links it back to the work item.

This is the first real bridge between GitHub’s native AI capabilities and Azure DevOps’ work‑tracking. Enterprises that use Azure Boards for project management but host code on GitHub now get agentic code generation that respects their work‑item structure. The connection limit has also been raised to 2,000 repositories, so large organizations can use this at scale.

**Strategic angle:** Microsoft isn’t trying to compete with GitHub’s AI features; it’s integrating them into Azure DevOps. It’s a pragmatic acknowledgment that GitHub is where the AI action is, and Azure DevOps survives by being the enterprise workflow layer on top of it.


---

## New OpenAI Models: GPT‑Realtime‑1.5 and GPT‑Audio‑1.5

February introduced two new OpenAI models to Azure: GPT‑Realtime‑1.5 and GPT‑Audio‑1.5. Improvements over the previous generation focus on:

- Better instruction following
- Expanded multilingual support
- Enhanced tool‑calling capabilities

Both models maintain sub‑second latency, making them ideal for Copilot voice mode, customer‑service bots, or interactive voice assistants.

Access is through the existing Azure OpenAI chat‑completion APIs, so if you’re already using gpt‑4o or similar, swapping in these models is just a configuration change. The key question is whether the improved tool‑calling and instruction following justify any cost delta—but for voice‑first apps, latency alone may be enough.
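To illustrate the "just a configuration change" point, here is a minimal sketch using a plain config object: only the deployment name changes when swapping models. The endpoint and deployment names below are placeholders, not real resources or confirmed model identifiers.

```python
# Illustrative sketch: because access goes through the existing Azure OpenAI
# chat-completion APIs, swapping models is a configuration change. The
# endpoint and deployment names are placeholders for your own deployments.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ChatConfig:
    endpoint: str
    deployment: str            # which model deployment to call
    api_version: str = "2024-10-21"

base = ChatConfig(
    endpoint="https://my-resource.openai.azure.com",
    deployment="gpt-4o",
)

# Swap in the new audio model: only the deployment name changes;
# the rest of the client configuration is untouched.
audio = replace(base, deployment="gpt-audio-1.5")
```

Because the rest of the request path is identical, A/B-testing the new models against your current deployment is a matter of flipping one string in configuration.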


---

## Azure SDK February Release: Infrastructure Upgrades

The February 2026 Azure SDK release shipped several important plumbing upgrades.

### Azure.Core 1.51.0 for .NET

- Adds support for `Microsoft.Extensions.Configuration` and `Microsoft.Extensions.DependencyInjection`, allowing Azure SDK clients to integrate seamlessly with ASP.NET Core’s native DI container.
- Introduces client‑certificate rotation support in the transport layer, enabling dynamic token‑binding scenarios where certificates can be updated at runtime without tearing down the HTTP pipeline. This is essential for workloads with short‑lived certificates or compliance‑driven rotation.
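The rotation feature itself lives in the .NET transport layer, but the underlying pattern is language-agnostic. A minimal, purely illustrative Python sketch of "swap the certificate without rebuilding the pipeline" (all names here are hypothetical, not Azure.Core API):

```python
# Sketch of the rotation pattern: the pipeline holds a *provider* rather
# than a fixed certificate, so rotation swaps the certificate at runtime
# without tearing down the pipeline object. Names are illustrative only.

class CertProvider:
    def __init__(self, cert_pem: str):
        self._cert = cert_pem

    def rotate(self, new_cert_pem: str) -> None:
        self._cert = new_cert_pem      # hot-swap; no pipeline teardown

    def current(self) -> str:
        return self._cert

class HttpPipeline:
    def __init__(self, provider: CertProvider):
        self._provider = provider      # stored once, consulted per request

    def cert_for_request(self) -> str:
        return self._provider.current()

provider = CertProvider("cert-v1")
pipeline = HttpPipeline(provider)
before = pipeline.cert_for_request()
provider.rotate("cert-v2")             # compliance-driven rotation
after = pipeline.cert_for_request()    # same pipeline, new certificate
```

The indirection through a provider is what makes short-lived certificates workable: the long-lived pipeline never caches the credential itself.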

### corehttp 1.0.0b7 for Python

- Provides native OpenTelemetry tracing support.
- The new `OpenTelemetryTracer` class wraps the OTel tracer for creating spans, and SDK clients can now define tracing directly via this class.
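As a rough illustration of that wrapper pattern (a toy stand-in, not corehttp’s actual `OpenTelemetryTracer` API), a span-creating tracer the SDK client talks to instead of the OpenTelemetry API directly might look like:

```python
# Toy illustration of the wrapper pattern: the SDK client asks the wrapper
# for spans rather than using the tracing backend directly. This is a
# stand-in for demonstration, not the real corehttp class.

import contextlib

class RecordingTracer:
    """Minimal tracer stand-in that records (name, attributes) per span."""
    def __init__(self):
        self.finished = []

    @contextlib.contextmanager
    def start_span(self, name, **attributes):
        span = {"name": name, "attributes": attributes}
        try:
            yield span
        finally:
            self.finished.append(span)   # span ends when the block exits

tracer = RecordingTracer()
with tracer.start_span("BlobClient.download", server="mystorage"):
    pass  # the SDK operation would run here
```

The payoff of the indirection is testability and portability: the same client code can emit spans to OpenTelemetry in production or to an in-memory recorder like this one in tests.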

These upgrades tighten the developer experience across both .NET and Python, reducing boilerplate and improving observability and security.





## Azure Content Understanding 1.0.0b1

This is the initial beta of Azure's unified API for extracting insights from documents, audio, and video. If you're building [multi‑modal AI systems](https://htek.dev/articles/choosing-the-right-ai-sdk/), this is worth tracking—content understanding is one of the gaps between code‑first AI and real‑world business data.

---

## What This Week Signals

These updates cluster around a theme: **Azure is meeting developers where they are, not forcing them into Azure‑specific workflows**.

- Python support for MCP  
- GitHub Copilot agents in Azure Boards  
- OpenTelemetry support in the Python SDK  

These are all moves toward interoperability. Azure isn’t trying to own the entire stack anymore; it’s positioning itself as the enterprise‑grade infrastructure layer for AI development that happens primarily in Python, GitHub, and open‑source tooling.

The risk for Azure is that this strategy only works if they stay competitive on AI capabilities and pricing. If [Claude Opus 4.6 on Azure Foundry](https://azure.microsoft.com/en-us/blog/claude-opus-4-6-anthropics-powerful-model-for-coding-agents-and-enterprise-workflows-is-now-available-in-microsoft-foundry-on-azure/) is meaningfully slower or more expensive than running it elsewhere, no amount of MCP integration will matter.

But for now, Azure is making the right bets. The path to agentic DevOps [isn’t through proprietary tooling](https://htek.dev/articles/agentic-devops-next-evolution-of-shift-left/)—it’s through protocols, open standards, and letting developers use the tools they already know.

---

## The Bottom Line

Azure had a strong week.  

- **Python MCP support** removes friction for the largest AI developer community.  
- **GitHub Copilot custom agents in Azure Boards** bridge the gap between work tracking and AI code generation.  
- **SDK updates** ship the infrastructure pieces—DI support, certificate rotation, OpenTelemetry tracing—that production AI systems actually need.

Watch how Azure handles the next wave: if agentic workflows become the default, Azure’s success depends on being the best place to *run* those agents, not necessarily the best place to *build* them. This week suggests they understand that distinction.

---

### Instrumentation Note

Configure tracing with the `_instrumentation_config` class variable. This is critical for observability in production AI systems: you need to trace Azure SDK calls alongside your agent logic to diagnose latency or failures.