Generative AI models can be sizable, with billions of parameters and large intermediate outputs, so they require significant amounts of storage. During training, it is common to keep training data and intermediate outputs on distributed storage such as Hadoop's HDFS, often processed with engines like Spark. For inference, a smaller model may fit on a local disk, but larger models typically require network-attached storage or cloud-based storage solutions. Scalable, high-capacity, low-latency storage, spanning both object stores and file stores, is essential in AI systems.
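To make the scale concrete, a rough back-of-the-envelope calculation relates parameter count and numeric precision to checkpoint size. The function below is an illustrative sketch; the parameter count and dtype sizes are assumptions, and it covers raw weights only (optimizer state and activations during training add substantially more).

```python
# Approximate storage footprint of a model checkpoint: one entry per
# parameter, sized by its numeric precision. Illustrative values only.

DTYPE_BYTES = {"fp32": 4, "fp16": 2, "bf16": 2, "int8": 1}

def checkpoint_size_gib(num_params: int, dtype: str = "fp16") -> float:
    """Return approximate checkpoint size in GiB for raw weights only."""
    return num_params * DTYPE_BYTES[dtype] / (1024 ** 3)

# A hypothetical 7-billion-parameter model stored in half precision:
print(f"{checkpoint_size_gib(7_000_000_000, 'fp16'):.1f} GiB")  # ~13.0 GiB
```

A model of this size cannot be casually copied between nodes, which is why high-throughput shared storage matters for both training checkpoints and inference serving.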