Built for Agentic Scale and Cloud-Native Apps


2025 was a pivotal year for Azure Storage, and we're heading into 2026 with a clear focus on helping customers turn AI into real impact. As outlined in last December's Azure Storage innovations: Unlocking the future of data, Azure Storage is evolving into a unified, intelligent platform that supports the full AI lifecycle at enterprise scale with the performance modern workloads demand.

Looking ahead to 2026, our investments span the full breadth of that lifecycle as AI becomes foundational across every industry. We're advancing storage performance for frontier model training, delivering purpose-built solutions for large-scale AI inferencing and emerging agentic applications, and empowering cloud-native applications to operate at agentic scale. In parallel, we're simplifying adoption for mission-critical workloads, lowering TCO, and deepening partnerships to co-engineer AI-optimized solutions with our customers.

We're grateful to our customers and partners for their trust and collaboration, and excited to shape the next chapter of Azure Storage together in the year ahead.

Extending from training to inference

AI workloads extend from large, centralized model training to inference at scale, where models are applied repeatedly across products, workflows, and real-world decision making. LLM training continues to run on Azure, and we're investing to stay ahead by expanding scale, improving throughput, and optimizing how model files, checkpoints, and training datasets flow through storage.

Innovations that helped OpenAI operate at unprecedented scale are now available to all enterprises. Blob scaled accounts allow storage to scale across hundreds of scale units within a region, handling the hundreds of thousands of objects required to enable enterprise data to be used as training and tuning datasets for applied AI. Our partnership with NVIDIA DGX on Azure shows how that scale translates into real-world inference. DGX Cloud was co-engineered to run on Azure, pairing accelerated compute with high-performance storage, Azure Managed Lustre (AMLFS), to support LLM research, automotive, and robotics applications. AMLFS offers the best price-performance for keeping GPU fleets continuously fed. We recently launched preview support for 25 PiB namespaces and up to 512 GBps of throughput, making AMLFS the best-in-class managed Lustre deployment in the cloud.
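To put those headline numbers in perspective, here is a back-of-envelope calculation (treating the advertised 512 GBps as a sustained 512 GB/s aggregate, an illustrative assumption) of how long it would take to stream an entire 25 PiB namespace:

```python
# Back-of-envelope: time to stream a full 25 PiB AMLFS namespace at an
# assumed sustained 512 GB/s. Illustrative only; real training jobs read
# working sets far smaller than the whole namespace.
PIB = 2**50                      # bytes in one pebibyte
namespace_bytes = 25 * PIB       # 25 PiB namespace (preview limit)
throughput_bps = 512 * 10**9     # 512 GB/s aggregate throughput

seconds = namespace_bytes / throughput_bps
hours = seconds / 3600
print(f"Full namespace scan: ~{hours:.1f} hours")  # ~15.3 hours
```

In other words, even a pass over the entire preview-scale namespace completes in well under a day at peak throughput, which is what "keeping GPU fleets continuously fed" looks like in practice.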

Looking ahead, we're deepening integration across popular first- and third-party AI frameworks such as Microsoft Foundry, Ray, Anyscale, and LangChain, enabling seamless connections to Azure Storage out of the box. Our native Azure Blob Storage integration within Foundry enables enterprise data consolidation into Foundry IQ, making blob storage the foundational layer for grounding enterprise knowledge, fine-tuning models, and serving low-latency context to inference, all under the tenant's security and governance controls.
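The grounding pattern described above boils down to retrieving the most relevant stored context for each query. A minimal sketch of that retrieval step, using a toy in-memory index with made-up embeddings in place of a real vector store backed by blob storage (Foundry IQ's actual APIs are not shown), looks like this:

```python
import math

# Toy sketch of RAG-style context retrieval: rank stored chunks by
# cosine similarity to a query embedding and return the top match.
# The chunks and 3-dimensional embeddings below are invented for
# illustration; a production system would use a real embedding model
# and a vector store persisted in blob storage.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

index = [
    ("Q3 revenue grew 12% year over year.", [0.9, 0.1, 0.0]),
    ("The SLA guarantees 99.99% availability.", [0.1, 0.9, 0.2]),
    ("Checkpoints are written every 30 minutes.", [0.0, 0.2, 0.9]),
]

def retrieve(query_embedding, k=1):
    ranked = sorted(index, key=lambda item: cosine(query_embedding, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([0.05, 0.95, 0.1]))  # the SLA chunk ranks first
```

Low-latency inference is sensitive to how fast this lookup runs, which is why the storage layer holding the index matters as much as the model itself.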

From training through full-scale inferencing, Azure Storage supports the entire agent lifecycle: distributing large model files efficiently, storing and retrieving long-lived context, and serving data from RAG vector stores. By optimizing for each pattern end-to-end, Azure Storage has performant solutions for every stage of AI inference.

Evolving cloud-native applications for agentic scale

As inference becomes the dominant AI workload, autonomous agents are reshaping how cloud-native applications interact with data. Unlike human-driven systems with predictable query patterns, agents operate continuously, issuing an order of magnitude more queries than traditional users ever did. This surge in concurrency stresses databases and storage layers, pushing enterprises to rethink how they architect new cloud-native applications.

Azure Storage is building with SaaS leaders like ServiceNow, Databricks, and Elastic to optimize for agentic scale, leveraging our block storage portfolio. Looking forward, Elastic SAN becomes a core building block for these cloud-native workloads, starting with transforming Microsoft's own database offerings. It provides fully managed block storage pools that let multiple workloads share provisioned resources, with guardrails for hosting multi-tenant data. We're pushing the boundaries on maximum scale units to enable denser packing and capabilities for SaaS providers to manage agentic traffic patterns.
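The core idea of a shared provisioned pool with guardrails can be sketched in a few lines: tenants carve volumes out of one provisioned pool, and a per-tenant cap stops any one tenant from starving the rest. The class and limits below are hypothetical illustrations of the concept, not the Elastic SAN API:

```python
# Hypothetical sketch of a shared provisioned pool with per-tenant
# guardrails. Pool capacity and cap values are invented for illustration.

class ProvisionedPool:
    def __init__(self, capacity_gib, per_tenant_cap_gib):
        self.capacity_gib = capacity_gib
        self.per_tenant_cap_gib = per_tenant_cap_gib
        self.allocations = {}  # tenant -> GiB allocated

    def create_volume(self, tenant, size_gib):
        used = sum(self.allocations.values())
        tenant_used = self.allocations.get(tenant, 0)
        if used + size_gib > self.capacity_gib:
            raise ValueError("pool exhausted")
        if tenant_used + size_gib > self.per_tenant_cap_gib:
            raise ValueError("tenant cap exceeded (guardrail)")
        self.allocations[tenant] = tenant_used + size_gib

pool = ProvisionedPool(capacity_gib=4096, per_tenant_cap_gib=1024)
pool.create_volume("tenant-a", 512)
pool.create_volume("tenant-a", 512)   # now exactly at the 1024 GiB cap
try:
    pool.create_volume("tenant-a", 1)
except ValueError as err:
    print(err)                        # tenant cap exceeded (guardrail)
```

Sharing one provisioned pool this way is what lets bursty agentic tenants pack densely without a noisy neighbor exhausting the pool.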

As cloud-native workloads adopt Kubernetes to scale rapidly, we're simplifying the development of stateful applications through our Kubernetes-native storage orchestrator, Azure Container Storage (ACStor), alongside CSI drivers. Our latest ACStor release signals two directional changes that will guide upcoming investments: adopting the Kubernetes operator model to perform more complex orchestration, and open-sourcing the code base to collaborate and innovate with the broader Kubernetes community.
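At its heart, the operator model means running a reconcile loop that repeatedly drives actual cluster state toward the declared desired state. A minimal, generic sketch of that loop (illustrative pseudostructure, not ACStor code) is:

```python
# Generic sketch of the Kubernetes operator reconcile pattern: diff the
# desired state against the actual state and emit converging actions.
# Resource names and specs below are invented for illustration.

def reconcile(desired, actual):
    """Return the actions needed to converge actual onto desired."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

desired = {"data-vol": {"size": "100Gi"}, "log-vol": {"size": "20Gi"}}
actual = {"data-vol": {"size": "50Gi"}}

for op, name, spec in reconcile(desired, actual):
    print(op, name, spec)
# update data-vol {'size': '100Gi'}
# create log-vol {'size': '20Gi'}
```

An operator runs this loop continuously against custom resources, which is what enables the "more complex orchestration" (resizing, rebalancing, failure recovery) that a plain CSI driver alone cannot express.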

Together, these investments establish a strong foundation for the next generation of cloud-native applications, where storage must scale seamlessly and deliver high efficiency to serve as the data platform for agentic-scale systems.

Breaking price-performance barriers for mission-critical workloads

Beyond evolving AI workloads, enterprises continue to grow their mission-critical workloads on Azure.

SAP and Microsoft are partnering to expand core SAP performance while introducing AI-driven agents like Joule that enrich Microsoft 365 Copilot with business context. Azure's latest M-series advancements add substantial scale-up headroom for SAP HANA, pushing disk storage performance to ~780K IOPS and 16 GB/s throughput. For shared storage, Azure NetApp Files (ANF) and Azure Premium Files deliver the high-throughput NFS/SMB foundations SAP landscapes rely on, while optimizing TCO with the ANF Flexible Service Level and Azure Files Provisioned v2. Coming soon, we'll introduce an Elastic ZRS storage service level in ANF, bringing zone-redundant high availability and consistent performance through synchronous replication across availability zones, leveraging Azure's ZRS architecture without added operational complexity.

Similarly, Ultra Disks have become foundational to platforms like BlackRock's Aladdin, which must react instantly to market shifts and sustain high performance under heavy load. With average latency well under 500 microseconds, support for 400K IOPS, and 10 GB/s throughput, Ultra Disks enable faster risk calculation, more agile portfolio management, and resilient performance on BlackRock's highest-volume trading days. When paired with Ebsv6 VMs, Ultra Disks can reach 800K IOPS and 14 GB/s for the most demanding mission-critical workloads. And with flexible provisioning, customers can tune performance precisely to their needs while optimizing TCO.
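As a rough illustration of what tuning provisioned performance to a workload looks like, the sketch below checks whether a single disk covers a target, or whether you would stripe across several. The per-disk limits are the 400K IOPS and 10 GB/s figures quoted above; the sizing logic itself is a hypothetical back-of-envelope helper, not an official calculator:

```python
import math

# Hypothetical sizing helper: how many disks (at the per-disk limits
# quoted in the text) are needed to hit a workload's IOPS and
# throughput targets? Illustrative only.
MAX_IOPS_PER_DISK = 400_000
MAX_MBPS_PER_DISK = 10_000      # 10 GB/s expressed in MB/s

def disks_needed(target_iops, target_mbps):
    by_iops = math.ceil(target_iops / MAX_IOPS_PER_DISK)
    by_mbps = math.ceil(target_mbps / MAX_MBPS_PER_DISK)
    return max(by_iops, by_mbps, 1)

print(disks_needed(300_000, 4_000))   # 1: a single disk covers both targets
print(disks_needed(650_000, 12_000))  # 2: the IOPS target forces a second disk
```

Because IOPS and throughput are provisioned independently, a workload pays only for the dimension it actually stresses, which is where the TCO benefit comes from.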

These combined investments give enterprises a more resilient, scalable, and cost-efficient platform for their most critical workloads.

Designing for the new realities of power and supply

The global AI surge is straining power grids and hardware supply chains. Rising energy costs, tight datacenter budgets, and industry-wide HDD/SSD shortages mean organizations can't scale infrastructure simply by adding more hardware. Storage must become more efficient and intelligent by design.

We're streamlining the entire stack to maximize hardware performance with minimal overhead. Combined with intelligent load balancing and cost-effective tiering, we're uniquely positioned to help customers scale storage sustainably even as power and hardware availability become strategic constraints. With continued innovations on Azure Boost Data Processing Units (DPUs), we expect step-function gains in storage speeds and feeds at even lower per-unit energy consumption.
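Cost-effective tiering ultimately reduces to a policy decision: place each object on the cheapest tier its access pattern tolerates. The sketch below illustrates that idea with tier names mirroring Azure Blob Storage's hot/cool/archive tiers, but the thresholds and decision rule are invented for illustration, not the platform's actual policy:

```python
# Illustrative access-frequency tiering policy. Tier names echo Azure
# Blob Storage (hot/cool/archive); the thresholds below are hypothetical.

def choose_tier(reads_per_month):
    if reads_per_month >= 10:
        return "hot"       # frequently read: optimize for access cost
    if reads_per_month >= 1:
        return "cool"      # occasional reads: cheaper at-rest storage
    return "archive"       # rarely touched: lowest at-rest cost

for reads in (30, 2, 0):
    print(reads, "->", choose_tier(reads))
# 30 -> hot
# 2 -> cool
# 0 -> archive
```

Automating this decision across billions of objects is what turns tiering from a manual cost exercise into an efficiency property of the platform itself.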

AI pipelines can span on-premises estates, neocloud GPU clusters, and the cloud, yet many of these environments are limited by power capacity or storage supply. When those limits become a bottleneck, we make it easy to shift workloads to Azure. We're investing in integrations that make external datasets first-class citizens in Azure, enabling seamless access to training, fine-tuning, and inference data wherever it lives. As cloud storage evolves into AI-ready datasets, Azure Storage is introducing curated, pipeline-optimized experiences to simplify how customers feed data into downstream AI services.

Accelerating innovation through the storage partner ecosystem

We can't do this alone. Azure Storage works closely with strategic partners to push inference performance to the next level. In addition to the self-publishing capabilities available in Azure Marketplace, we go a step further by dedicating resources and expertise to co-engineer solutions with partners, building highly optimized and deeply integrated services.

In 2026, you will see more co-engineered solutions like Commvault Cloud for Azure, Dell PowerScale, Azure Native Qumulo, Pure Storage Cloud, Rubrik Cloud Vault, and Veeam Data Cloud. We'll focus on hybrid solutions with partners like VAST Data and Komprise to enable data movement that unlocks the power of Azure AI services and infrastructure, fueling impactful customer AI agent and application initiatives.

To an exciting new year with Azure Storage

As we move into 2026, our vision remains simple: help every customer unlock more value from their data with storage that's faster, smarter, and built for the future. Whether powering AI, scaling cloud-native applications, or supporting mission-critical workloads, Azure Storage is here to help you innovate with confidence in the year ahead.