As AI transforms business operations across industries, significant challenges continue to surface around data storage: no matter how advanced the model, its performance hinges on the ability to access vast amounts of data quickly, securely, and reliably. Without the right data storage infrastructure, even the most powerful AI systems can be brought to a crawl by slow, fragmented, or inefficient data pipelines.
This issue took center stage on Day One of VB Transform, in a session focused on medical imaging AI innovations spearheaded by PEAK:AIO and Solidigm. Together, alongside the Medical Open Network for AI (MONAI) project, an open-source framework for developing and deploying medical imaging AI, they are redefining how data infrastructure supports real-time inference and training in hospitals, from enhancing diagnostics to powering advanced research and operational use cases.
>>See all our Transform 2025 coverage here<<

Innovating storage at the edge of clinical AI
Moderated by Michael Stewart, managing partner at M12 (Microsoft’s venture fund), the session featured insights from Roger Cummings, CEO of PEAK:AIO, and Greg Matson, head of products and marketing at Solidigm. The conversation explored how next-generation, high-capacity storage architectures are opening new doors for medical AI by delivering the speed, security, and scalability needed to handle massive datasets in clinical environments.
Crucially, both companies have been deeply involved with MONAI since its early days. Developed in collaboration with King’s College London and others, MONAI is purpose-built for developing and deploying AI models in medical imaging. The open-source framework’s toolset, tailored to the unique demands of healthcare, includes libraries and tools for DICOM support, 3D image processing, and model pre-training, enabling researchers and clinicians to build high-performance models for tasks like tumor segmentation and organ classification.
A critical design goal of MONAI was to support on-premises deployment, allowing hospitals to maintain full control over sensitive patient data while leveraging standard GPU servers for training and inference. This ties the framework’s performance closely to the data infrastructure beneath it, requiring fast, scalable storage systems to fully support the demands of real-time clinical AI. That is where Solidigm and PEAK:AIO come into play: Solidigm brings high-density flash storage to the table, while PEAK:AIO specializes in storage systems purpose-built for AI workloads.
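To ground what that toolset looks like in practice, here is a minimal, illustrative sketch (not code from the session): it uses MONAI’s standard loading and preprocessing transforms plus a 3D U-Net with sliding-window inference, the kind of segmentation workload that would run on an on-premises GPU server. The file path and the untrained network are placeholders.

```python
# Minimal MONAI 3D segmentation sketch (illustrative only; path and weights are placeholders).
import torch
from monai.transforms import Compose, LoadImage, EnsureChannelFirst, ScaleIntensity
from monai.networks.nets import UNet
from monai.inferers import sliding_window_inference

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Preprocessing: read a medical volume (NIfTI/DICOM), add a channel axis, normalize intensities.
preprocess = Compose([
    LoadImage(image_only=True),
    EnsureChannelFirst(),
    ScaleIntensity(),
])

# A standard 3D U-Net; in a real deployment this would carry trained weights.
model = UNet(
    spatial_dims=3,
    in_channels=1,
    out_channels=2,                       # e.g. background vs. tumor
    channels=(16, 32, 64, 128),
    strides=(2, 2, 2),
).to(device)
model.eval()

volume = preprocess("example_ct_volume.nii.gz").unsqueeze(0).to(device)  # placeholder path

with torch.no_grad():
    # Sliding-window inference keeps GPU memory bounded on large whole-body scans.
    segmentation = sliding_window_inference(
        volume, roi_size=(96, 96, 96), sw_batch_size=4, predictor=model
    )
print(segmentation.shape)
```

Every volume a pipeline like this touches has to come off local storage first, which is why the rest of the session centered on the layer underneath it.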
“We were very fortunate to be working early on with King’s College in London and Professor Sebastien Orslund to develop MONAI,” Cummings explained. “Working with Orslund, we developed the underlying infrastructure that allows researchers, doctors, and biologists in the life sciences to build on top of this framework very quickly.”
Meeting dual storage demands in healthcare AI
Matson noted that he is seeing a clear bifurcation in storage hardware, with different solutions optimized for specific stages of the AI data pipeline. For use cases like MONAI and comparable edge AI deployments, as well as scenarios involving feeding training clusters, ultra-high-capacity solid-state storage plays a critical role, as these environments are often space- and power-constrained yet require local access to massive datasets.
For instance, MONAI was able to store more than two million full-body CT scans on a single node within a hospital’s existing IT infrastructure. “Very space-constrained, power-constrained, and very high-capacity storage enabled some pretty remarkable results,” Matson said. This kind of efficiency is a game-changer for edge AI in healthcare, allowing institutions to run advanced AI models on-premises without compromising performance, scalability, or data security.
In contrast, workloads involving real-time inference and active model training place very different demands on the system. These tasks require storage solutions that can deliver exceptionally high input/output operations per second (IOPS) to keep up with the data throughput needed by high-bandwidth memory (HBM) and to ensure GPUs remain fully utilized. PEAK:AIO’s software-defined storage layer, combined with Solidigm’s high-performance solid-state drives (SSDs), addresses both ends of this spectrum, delivering the capacity, efficiency, and speed required across the entire AI pipeline.
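The GPU-utilization point is easiest to see in the data-loading path of a training job. Below is a hedged sketch of that pattern, using MONAI’s Dataset and DataLoader with hypothetical file paths on an assumed local NVMe mount: every batch is read and decoded from storage before the GPU can take a step, so sustained read throughput directly sets how busy the GPU stays.

```python
# Illustrative sketch of a storage-fed training loop (placeholder paths; not a benchmark).
import torch
from monai.data import Dataset, DataLoader
from monai.losses import DiceLoss
from monai.networks.nets import UNet
from monai.transforms import (Compose, EnsureChannelFirstd, LoadImaged,
                              RandSpatialCropd, ScaleIntensityd)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical file list; in practice these would point at volumes on fast local NVMe storage.
files = [{"image": f"/nvme/scans/case_{i}.nii.gz", "label": f"/nvme/labels/case_{i}.nii.gz"}
         for i in range(100)]

transforms = Compose([
    LoadImaged(keys=["image", "label"]),
    EnsureChannelFirstd(keys=["image", "label"]),
    ScaleIntensityd(keys=["image"]),
    RandSpatialCropd(keys=["image", "label"], roi_size=(96, 96, 96), random_size=False),
])

dataset = Dataset(data=files, transform=transforms)
# Each batch is read and decoded straight from storage by the worker processes; if the
# drives cannot sustain that read rate, the GPU in the loop below sits idle waiting for data.
loader = DataLoader(dataset, batch_size=2, shuffle=True, num_workers=8)

model = UNet(spatial_dims=3, in_channels=1, out_channels=1,
             channels=(16, 32, 64, 128), strides=(2, 2, 2)).to(device)
loss_fn = DiceLoss(sigmoid=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for batch in loader:
    images, labels = batch["image"].to(device), batch["label"].to(device)
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)   # GPU work; stalls if the loader starves it
    loss.backward()
    optimizer.step()
```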
A software-defined layer for clinical AI workloads at the edge
Cummings explained that PEAK:AIO’s software-defined AI storage technology, when paired with Solidigm’s high-performance SSDs, allows MONAI to read, write, and archive massive datasets at the speed clinical AI demands. The combination accelerates model training and improves accuracy in medical imaging, all while operating within an open-source framework tailored to healthcare environments.
“We provide a software-defined layer that can be deployed on any commodity server, transforming it into a high-performance system for AI or HPC workloads,” Cummings said. “In edge environments, we take that same capability and scale it down to a single node, bringing inference closer to where the data lives.”
A key capability is how PEAK:AIO helps eliminate traditional memory bottlenecks by integrating memory more directly into the AI infrastructure. “We treat memory as part of the infrastructure itself, something that’s often overlooked. Our solution scales not just storage, but also the memory workspace and the metadata associated with it,” Cummings said. This makes a significant difference for customers who cannot afford, in terms of either space or cost, to re-run large models repeatedly. By keeping memory-resident tokens alive and accessible, PEAK:AIO enables efficient, localized inference without the need for constant recomputation.
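PEAK:AIO has not published the implementation details behind that capability, so the sketch below is only a generic illustration of the underlying idea, not the company’s design: computed token state is persisted once to a fast local path (hypothetical here) and reloaded on later requests instead of being recomputed.

```python
# Generic illustration of keeping computed token state on fast local storage so repeated
# requests reuse it instead of recomputing (not PEAK:AIO's implementation; paths are hypothetical).
import hashlib
from pathlib import Path
import torch

CACHE_DIR = Path("/nvme/token_cache")            # assumed fast local NVMe mount
CACHE_DIR.mkdir(parents=True, exist_ok=True)

def cached_token_states(text: str, encode_fn) -> torch.Tensor:
    """Return per-token states for `text`, computing once and persisting to local flash.

    `encode_fn` is any function mapping text -> tensor of token-level states,
    e.g. the prefill pass of an inference model.
    """
    path = CACHE_DIR / (hashlib.sha256(text.encode()).hexdigest() + ".pt")
    if path.exists():
        return torch.load(path)                  # cache hit: no recomputation needed
    states = encode_fn(text)                     # one-time compute (the expensive step)
    torch.save(states, path)                     # keep it resident on local storage for reuse
    return states
```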
Bringing intelligence closer to the data
Cummings emphasized that enterprises will need to take a more strategic approach to managing AI workloads. “You can’t be just a destination. You have to understand the workloads. We do some incredible technology with Solidigm and their infrastructure to be smarter about how that data is processed, starting with how to get performance out of a single node,” Cummings explained. “So with inference being such a large push, we’re seeing generalists becoming more specialized. And we’re now taking the work we’ve done on a single node and pushing it closer to the data to be more efficient. We want more intelligent data, right? The only way to do that is to get closer to that data.”
Some clear trends are emerging from large-scale AI deployments, particularly in newly built greenfield data centers. These facilities are designed with highly specialized hardware architectures that bring data as close as possible to the GPUs. To achieve this, they rely heavily on all solid-state storage, particularly ultra-high-capacity SSDs, designed to deliver petabyte-scale storage with the speed and accessibility needed to keep GPUs continuously fed with data at high throughput.
“Now that same technology is basically happening at a microcosm, at the edge, in the enterprise,” Cummings explained. “So it’s becoming critical for buyers of AI systems to learn how to select your hardware and system vendor, even to make sure that if you want to get the most performance out of your system, you’re running on all solid-state. This lets you bring massive amounts of data, like the MONAI example, which was 15,000,000-plus images, into a single system. That enables incredible processing power, right there in a small system at the edge.”