Microsoft Introduces Mu Model for On-Device AI Agent in Windows Settings

Tuesday, June 24, 2025, 4:26 AM ET · 1 min read


Microsoft Corp. (MSFT) has introduced a new lightweight language model named Mu, designed to power agentic AI Settings features on Copilot+ PCs. Mu operates entirely on-device using the Neural Processing Unit (NPU), delivering response speeds of over 100 tokens per second. This efficiency is achieved through an encoder-decoder architecture and hardware-friendly tweaks, making it suitable for fast, private AI interactions [1].

Mu was trained on Microsoft's Azure cloud platform using NVIDIA A100 GPUs, building on techniques from Microsoft's earlier small models such as the Phi family. The model uses a transformer-based encoder-decoder architecture, which separates input encoding from output generation, improving processing efficiency and reducing memory demands [2].
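To make the efficiency argument concrete, the toy sketch below (not Microsoft's implementation; all names and the "next-token rule" are invented for illustration) shows the encode-once/decode-stepwise split: the encoder processes the input prompt a single time, and the decoder then generates each output token against that fixed representation instead of reprocessing the prompt at every step.

```python
# Toy illustration of the encoder-decoder split described above.
# Real models use attention layers; these stand-ins only show the
# control flow: encode the prompt once, reuse it for every new token.

def encode(prompt_tokens):
    """Stand-in for the encoder: a single pass over the input."""
    return [hash(t) % 997 for t in prompt_tokens]

def decode(memory, max_new_tokens):
    """Stand-in for the decoder: emits tokens one at a time,
    reusing the same encoded memory at every step."""
    out = []
    state = sum(memory)
    for _ in range(max_new_tokens):
        state = (state * 31 + 7) % 50021  # invented next-token rule
        out.append(state)
    return out

prompt = ["turn", "on", "night", "light"]
memory = encode(prompt)      # input encoded exactly once
reply = decode(memory, 5)    # output generated step by step
print(len(memory), len(reply))  # → 4 5
```

A decoder-only model, by contrast, would attend over the full prompt on every generation step, which is one reason the encoder-decoder split can reduce per-token work and memory traffic on constrained NPU hardware.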

The AI agent in the Settings menu lets users ask natural-language questions and, with the user's permission, take actions across hundreds of system settings. Mu generates more than 100 tokens per second, delivering responses in under half a second during real-time use. On tuned systems such as the Surface Laptop 7, its output surpasses 200 tokens per second [1].

Mu is currently in preview for some Windows Insiders, with support for Snapdragon-powered Copilot+ PCs. Microsoft plans to extend support to AMD and Intel-based PCs at an unspecified date [2].

Microsoft's introduction of Mu aligns with the growing trend of Small Language Models (SLMs), which offer efficient and specialized AI capabilities. These models can run on consumer hardware, reducing computational requirements and energy consumption. SLMs are particularly advantageous for real-time applications, edge computing, and privacy-conscious organizations [3].

Microsoft stock has gained over 15% year-to-date and over 8% in the last 12 months, reflecting investor confidence in the company's AI initiatives [1].

References:
[1] https://stocktwits.com/news-articles/markets/equity/microsoft-compact-mu-language-model-powers-instant-ai-settings-agent-for-windows/chm5fLFRRdE
[2] https://www.techrepublic.com/article/news-microsoft-mu-small-language-model-ai/
[3] https://medium.com/@ssharshitha2701/small-language-models-the-future-of-efficient-ai-140755117333

