Copilot in Azure streamlines cloud operations with natural language commands.
Published: Apr 09, 2025
Key Takeaways:
Microsoft has made Copilot in Azure generally available for commercial users. The company has also added Meta’s Llama 4 models to Azure AI Foundry and Azure Databricks.
Microsoft launched Copilot in Azure in public preview in May 2024. The AI-powered tool allows customers to manage, optimize, and troubleshoot Azure services and infrastructure: they can ask questions, generate queries, and perform tasks through natural language commands. The Copilot AI assistant operates within Azure's security framework to ensure data privacy and compliance.
“Within Microsoft alone, we estimate that Copilot saves more than 30,000 developer hours every month. Some of the largest enterprises across multiple segments, including finance, healthcare, and consumer goods, use Copilot to simplify their cloud operations,” Microsoft explained.
Microsoft highlighted that this release brings several enhancements in performance, accessibility, and availability. The company says it has improved Copilot response times by over 30 percent through front-end enhancements such as response streaming, along with backend optimizations in the orchestrator and skills layers. Copilot in Azure is also getting a revamped UI to meet high accessibility standards.
According to Microsoft, Copilot in Azure is backed by a commitment to be available and operational 99.9 percent of the time. Moreover, Copilot in Azure is developed following Microsoft's Responsible AI principles, with rigorous testing conducted to identify and mitigate harmful behaviors. Microsoft has also added support for 19 languages, including English, French, German, and Chinese.
Microsoft highlighted that users can leverage Copilot to generate Terraform configurations for managing Azure infrastructure. Copilot also provides capabilities for diagnosing and troubleshooting Azure Kubernetes Service (AKS) clusters.
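For context, the kind of Terraform configuration Copilot can produce from a natural language prompt (for example, "create a resource group with a storage account") might look like the following sketch. The resource names and region here are hypothetical placeholders, not output from Copilot itself:

```hcl
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
    }
  }
}

provider "azurerm" {
  features {}
}

# Hypothetical resource group; name and location are placeholders.
resource "azurerm_resource_group" "example" {
  name     = "rg-copilot-demo"
  location = "eastus"
}

# Hypothetical storage account deployed into the resource group above.
resource "azurerm_storage_account" "example" {
  name                     = "stcopilotdemo"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```

Generated configurations like this can then be reviewed and applied with the standard `terraform plan` and `terraform apply` workflow.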
Microsoft has announced the general availability of Copilot on the Azure mobile app. The latest update brings several new features, such as enhanced entry points, real-time AI chat streaming, improved accessibility and localization, and more.
Last but not least, Microsoft is bringing Meta's Llama 4 Scout and Maverick models to Azure AI Foundry. These models are designed to handle and integrate multiple types of data, such as text and images, within a single model architecture. Llama 4 Scout targets tasks like summarization, personalization, and reasoning over long inputs, supporting a context window of up to 10 million tokens while running efficiently on a single H100 GPU.
Llama 4 Maverick is a robust model designed for multilingual and multimodal chat applications. This model leverages 17 billion active parameters and a Mixture of Experts (MoE) architecture to excel in reasoning, coding, and image understanding.
Microsoft notes that Llama 4 models undergo rigorous safety checks at every stage of development. Moreover, the Llama 4 family is well-suited for enterprises looking to build advanced AI solutions.