For information about the cluster requirements, see:
Object storage is required for SAP Data Intelligence to enable the checkpoint store (SAP Vora database streaming tables require the checkpoint store) and to support SAP Data Intelligence backup and restore. Object storage is also required for running AI and ML scenarios. In addition, you must configure the SDL Data Lake connection with object storage before you can use the Machine Learning Scenario Manager, Modeler, and other SAP Data Intelligence components.
OpenShift Container Storage 4.6 is required for the SAP Data Intelligence 3.1 deployment on the Ready Stack to provide both the object storage and the persistent volumes. You can deploy OpenShift Container Storage 4.6 running on storage nodes on OpenShift Container Platform for this purpose.
Note: The checkpoint store object storage solutions must be validated and verified with SAP. For the list of supported platforms for the checkpoint store with SAP Data Intelligence, see SAP Note 2693555.
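When OpenShift Container Storage provides the object store, you can request an S3-compatible bucket for the checkpoint store through an ObjectBucketClaim. The following sketch assumes the default OpenShift Container Storage 4.6 NooBaa object storage class (`openshift-storage.noobaa.io`); the claim name `sdi-checkpoint-store` and namespace `sdi` are illustrative placeholders, not values mandated by SAP or Red Hat.

```shell
# Request an S3-compatible bucket from OpenShift Container Storage (NooBaa)
# for the SAP Data Intelligence checkpoint store. Claim name and namespace
# below are placeholders; adapt them to your environment.
oc apply -f - <<EOF
apiVersion: objectbucket.io/v1alpha1
kind: ObjectBucketClaim
metadata:
  name: sdi-checkpoint-store
  namespace: sdi
spec:
  generateBucketName: sdi-checkpoint-store
  storageClassName: openshift-storage.noobaa.io
EOF

# The bucket name and S3 credentials are published in a ConfigMap and a
# Secret that carry the same name as the claim:
oc -n sdi get configmap sdi-checkpoint-store -o jsonpath='{.data.BUCKET_NAME}'
oc -n sdi get secret sdi-checkpoint-store -o jsonpath='{.data.AWS_ACCESS_KEY_ID}' | base64 -d
```

The resulting endpoint, bucket, and credentials are what you supply to the SAP Data Intelligence installer when it prompts for the checkpoint store connection.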
The following tables show the minimum requirements and number of instances for each node type for the latest validated SAP Data Intelligence and OpenShift Container Platform 4.X releases.
The following table shows the minimum nonproduction requirements for a proof of concept (PoC):
Table 2. Minimum hardware requirements for dedicated nodes

| Node type | Count | Operating system | Minimum CPU/vCPU cores | RAM | Storage |
|---|---|---|---|---|---|
| CSAH | 1 | Red Hat Enterprise Linux 7.6+ | 4 / 8 | 32 GB | 200 GB |
| Bootstrap VM | 1 | Red Hat Enterprise Linux CoreOS (RHCOS) | 4 / 8 | 16 GB | 120 GB |
| Controller | 3 | RHCOS | 4 / 8 | 16 GB | 120 GB |
| Compute | 3+ | RHCOS or Red Hat Enterprise Linux 7.8 or 7.9 | 4 / 8 | 32 GB | 120 GB |
| Storage (OpenShift Container Storage 4.6) | 3 | RHCOS | 5 / 10 | 32 GB | 120 GB + 2 TB |
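As a quick sanity check, the dedicated-node PoC sizing above adds up to the following minimum cluster footprint (the bootstrap VM is transient and its resources can be reclaimed after the installation completes):

```shell
# Sum the Table 2 rows as count * cores and count * RAM per node type,
# in order: CSAH, bootstrap VM, controller, compute, storage.
cores=$(( 1*4 + 1*4 + 3*4 + 3*4 + 3*5 ))
ram=$((   1*32 + 1*16 + 3*16 + 3*32 + 3*32 ))
echo "PoC minimum: ${cores} CPU cores / $(( cores * 2 )) vCPUs, ${ram} GB RAM"
# → PoC minimum: 47 CPU cores / 94 vCPUs, 288 GB RAM
```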
The following table shows the minimum size of a cluster with an SAP Data Intelligence workload running on the control-plane nodes:
Table 3. Minimum requirements for shared compute and controller nodes

| Node type | Count | Operating system | Minimum CPU/vCPU cores | RAM | Storage |
|---|---|---|---|---|---|
| CSAH | 1 | Red Hat Enterprise Linux 7.6+ | 4 / 8 | 32 GB | 200 GB |
| Bootstrap VM | 1 | RHCOS | 4 / 8 | 16 GB | 120 GB |
| Controller/Compute | 3 | RHCOS | 5 / 10 | 40 GB | 120 GB |
| Storage (OpenShift Container Storage 4.6) | 3 | RHCOS | 5 / 10 | 24 GB | 120 GB + 2 TB |
The following table shows the minimum requirements for production systems for the latest validated SAP Data Intelligence and OpenShift Container Platform 4 releases:
Table 4. Minimum production requirements for dedicated nodes

| Node type | Count | Operating system | Minimum CPU/vCPU cores | RAM | Storage |
|---|---|---|---|---|---|
| CSAH | 1 | Red Hat Enterprise Linux 7.6+ | 4 / 8 | 32 GB | 200 GB |
| Bootstrap VM | 1 | RHCOS | 4 / 8 | 16 GB | 120 GB |
| Controller | 3 | RHCOS | 4 / 8 | 16 GB | 120 GB |
| Compute | 3+ | RHCOS or Red Hat Enterprise Linux 7.8 or 7.9 | 8 / 16 | 64 GB | 120 GB |
| Storage (OpenShift Container Storage 4.6) | 3 | RHCOS | 10 / 20 | 49 GB | 120 GB + 6 x 2 TB |
The following table shows the minimum size of a production cluster with an SAP Data Intelligence workload running on the control-plane nodes:
Table 5. Minimum requirements for shared compute and controller nodes

| Node type | Count | Operating system | Minimum CPU/vCPU cores | RAM | Storage |
|---|---|---|---|---|---|
| CSAH | 1 | Red Hat Enterprise Linux 7.6+ | 4 / 8 | 32 GB | 200 GB |
| Bootstrap VM | 1 | RHCOS | 4 / 8 | 16 GB | 120 GB |
| Controller/Compute | 3 | RHCOS | 11 / 22 | 72 GB | 120 GB |
| Storage (OpenShift Container Storage 4.6) | 3 | RHCOS | 10 / 20 | 49 GB | 120 GB + 6 x 2 TB |