Generative AI has been front and center in the current technology landscape, with much of the attention on the compute and GPU side of its workflows. Yet data is a constant across every phase of the GenAI lifecycle: training, customization and fine-tuning, and inference. None of these operations can exist without data, from the existing data used to generate intelligent business insights and outcomes to the new data created as a result, all of which the underlying storage system must support.
Storage is therefore a critical component of the architecture and must support the varied workloads applied throughout AI workflows. A capable storage system must handle both streaming and random reads and writes, and it must scale linearly alongside compute and GPU capacity. The storage software must also sustain hundreds to thousands of concurrent connections as each GPU draws data from the storage system.
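To make these access patterns concrete, the following is a minimal, hypothetical Python sketch (not part of any Dell or NVIDIA toolkit) that simulates a pool of GPU data-loader workers, each holding its own connection to a shared file and mixing streaming (sequential) reads with random-offset reads. A real training cluster would issue this kind of I/O against a scale-out file system rather than a local temporary file.

```python
import os
import random
import tempfile
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1 << 20  # 1 MiB per read request


def make_dataset(path, chunks=64):
    # Write a synthetic 64 MiB dataset shard.
    with open(path, "wb") as f:
        for _ in range(chunks):
            f.write(os.urandom(CHUNK))


def streaming_read(path):
    # Sequential scan, e.g. one epoch over a shard.
    total = 0
    with open(path, "rb") as f:
        while (buf := f.read(CHUNK)):
            total += len(buf)
    return total


def random_read(path, n_reads=32):
    # Random-offset reads, e.g. shuffled sample access.
    size = os.path.getsize(path)
    total = 0
    with open(path, "rb") as f:
        for _ in range(n_reads):
            f.seek(random.randrange(0, size - CHUNK))
            total += len(f.read(CHUNK))
    return total


def simulate(workers=8):
    # Each "GPU worker" opens its own connection to the shared shard,
    # so the storage layer sees many concurrent streaming and random readers.
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "shard.bin")
        make_dataset(path)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            stream = [pool.submit(streaming_read, path) for _ in range(workers)]
            rand = [pool.submit(random_read, path) for _ in range(workers)]
            return sum(f.result() for f in stream + rand)


if __name__ == "__main__":
    print(simulate() // (1 << 20), "MiB read")
```

Even at this toy scale, the mix of large sequential scans and small random seeks shows why a storage system serving many GPUs must perform well on both patterns simultaneously, not just on streaming throughput.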
Fortunately, Dell Technologies and NVIDIA have teamed up to bring together the industry's leading AI platform, leading scale-out file platform, and award-winning server platforms.