We performed this test with a single GPU-enabled R7525 server acting as the compute host. The compute host was equipped with six NVIDIA T4 GPUs for graphics acceleration. We configured 96 vGPU-enabled virtual machines on the host with the NVIDIA GRID vPC T4-1B profile and ran the NVIDIA nVector Knowledge Worker workload. The tests used VMware Horizon 7 linked-clone desktops with the VMware Horizon Blast Extreme display protocol and H.264 hardware encoding.
The following graph shows CPU usage, core utilization, and CPU readiness as percentages. CPU usage and core utilization increased during the login phase. During the steady-state phase, average CPU utilization was 72 percent, below the 85 percent threshold we set for CPU utilization. CPU readiness peaked at 8.6 percent, and CPU usage spiked to almost 100 percent near the end of the test period; this was a one-time spike and did not affect the performance of the system.
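The pass/fail logic applied to these CPU figures can be sketched as follows. This is a minimal illustration, not the nVector tooling; the sample values and the `summarize` helper are hypothetical, and only the 85 percent threshold comes from the test plan described above.

```python
# Illustrative check of sampled CPU usage against the test's 85% threshold.
# Sample values below are hypothetical, not the recorded test data.

CPU_USAGE_THRESHOLD = 85.0  # percent, threshold set for this test


def summarize(samples):
    """Return (average, peak) for a list of percentage samples."""
    return sum(samples) / len(samples), max(samples)


# Hypothetical steady-state CPU usage samples (percent)
steady_state_usage = [70.5, 73.2, 71.8, 72.4, 72.1]

avg, peak = summarize(steady_state_usage)
verdict = "PASS" if avg < CPU_USAGE_THRESHOLD else "FAIL"
print(f"avg {avg:.1f}%  peak {peak:.1f}%  {verdict}")
```

A real pipeline would feed this from esxtop or vCenter performance exports, with one series per metric (usage, core utilization, readiness).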
The following graph shows the performance of the six NVIDIA T4 GPUs during the testing. The average GPU utilization across the six GPUs was around 15 percent.
The following graph shows that the consumed and active memory usage remained steady during the testing. The average consumed memory and average active memory recorded were 842 GB and 769 GB, respectively. Memory was not a bottleneck during the testing, and the configured memory of 1,024 GB was sufficient for this workload.
The following graph shows the network usage recorded during the testing. The peak network usage was around 991 Mbps. Network bandwidth was not an issue. With two 25 GbE network interface controllers (NICs) configured as uplink in an active/active team, network bandwidth usage was well below the 85 percent threshold set for network throughput.
The image quality testing takes screenshots of the endpoint and the virtual desktop, and then compares them to show how the display protocol is performing. As shown in the following figure, the structural similarity index (SSIM) for image quality was 0.993. An SSIM value close to 1 indicates that image quality was excellent and was not degraded by the remoting protocol.
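To make the SSIM score concrete, the sketch below computes a simplified, global SSIM between two grayscale pixel sequences using the standard constants (C1 = (0.01L)², C2 = (0.03L)² with L = 255). The standard metric is computed over local windows, and the nVector tool's exact implementation is not public; the pixel data here is purely illustrative.

```python
# Simplified global SSIM between two equal-length 8-bit grayscale pixel lists.
# The production metric uses local sliding windows; this global form only
# illustrates the formula's ingredients (means, variances, covariance).

def ssim(x, y, L=255):
    n = len(x)
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2  # standard stability constants
    mx, my = sum(x) / n, sum(y) / n            # means
    vx = sum((p - mx) ** 2 for p in x) / n     # variances
    vy = sum((q - my) ** 2 for q in y) / n
    cov = sum((p - mx) * (q - my) for p, q in zip(x, y)) / n
    return ((2 * mx * my + C1) * (2 * cov + C2)) / (
        (mx ** 2 + my ** 2 + C1) * (vx + vy + C2)
    )


reference = [10, 50, 90, 130, 170, 210]  # virtual-desktop frame (illustrative)
captured = [12, 49, 91, 128, 171, 209]   # endpoint screenshot (illustrative)
print(f"SSIM = {ssim(reference, captured):.3f}")  # near 1.0: little degradation
```

Identical images yield an SSIM of exactly 1.0, which is why the 0.993 recorded in this test indicates near-lossless delivery.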
Frame rate, measured in frames per second (FPS), indicates the smoothness of the virtual desktop session as experienced at the endpoint: the rate at which frames are delivered to the screen of the endpoint device. An average of 20 FPS was recorded during this testing.
The end-user latency metric indicates how responsive the VDI session is at the user’s endpoint. The average end-user latency measured by the nVector Lite tool during the test was 104 ms. This low latency figure indicates that the remote session was very responsive.