We performed this test on an R7525 compute host with 96 virtual machines. We ran the NVIDIA nVector Knowledge Worker workload and used VMware Horizon 7 linked clones to provision the virtual desktops. We used the Blast Extreme remote display protocol with H.264 hardware encoding.
The following graph shows CPU usage, core utilization, and CPU readiness percentages. CPU usage and core utilization increased during login and were significantly higher than in the same 96-user test carried out with GPUs; CPU usage reached 100 percent on several occasions during this testing. During the steady state phase, we recorded an average CPU usage of 92 percent, which was above the 85 percent threshold that we set for CPU utilization.
The better CPU performance in the GPU test compared to this non-GPU test indicates that the GPUs improve system performance while running the nVector Knowledge Worker workload. The GPUs offload some of the tasks otherwise carried out by the CPUs, improving the overall performance of the system.
The following graph shows memory usage during the test. Memory usage remained constant throughout; the average consumed memory was 831 GB and the average active memory was 116 GB.
The following graph shows the network usage recorded during the testing. Peak network usage was approximately 409 Mbps. With two 25 GbE NICs configured as an uplink in an active/active team, network bandwidth usage remained well under the 85 percent threshold that we set for network throughput, so network bandwidth was not a constraint.
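As a rough sanity check, the headroom implied by these figures can be sketched in a few lines. The NIC count, link speed, and 85 percent threshold come from the test setup described above; the script itself is illustrative and is not part of the test tooling.

```python
# Back-of-envelope headroom check for the network results above.
# Assumed figures from the test setup: two 25 GbE uplinks in an
# active/active team, an 85 percent utilization threshold, and an
# observed peak of roughly 409 Mbps.
NIC_COUNT = 2
NIC_SPEED_GBPS = 25.0
THRESHOLD = 0.85
PEAK_MBPS = 409.0

aggregate_mbps = NIC_COUNT * NIC_SPEED_GBPS * 1000  # 50,000 Mbps total
budget_mbps = aggregate_mbps * THRESHOLD            # 42,500 Mbps allowed
utilization = PEAK_MBPS / aggregate_mbps            # fraction of capacity

print(f"Peak used {utilization:.1%} of the {budget_mbps:,.0f} Mbps budget")
```

The observed peak works out to well under 1 percent of the aggregate uplink capacity, which is why network bandwidth was never a concern in this test.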
Image quality testing captures screenshots on both the virtual desktop and the endpoint, then compares them to measure how well the display protocol preserves the image. As shown in the following figure, the image quality SSIM value was 0.998, indicating excellent image quality with no perceptible degradation from the remoting protocol. Image quality in the GPU and non-GPU tests was nearly identical.
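For readers unfamiliar with the metric, SSIM (structural similarity) compares two images using their luminance, contrast, and covariance statistics; a value of 1.0 means the images are identical. The following minimal NumPy sketch computes a single global SSIM value over whole grayscale images. Production tools such as the nVector suite compute SSIM over local windows and average the results, so this illustrates the formula rather than the tool's actual implementation.

```python
import numpy as np

def global_ssim(img_a, img_b, data_range=255.0):
    """Single-window SSIM between two grayscale images.

    Simplified sketch: real SSIM implementations slide a local
    window over the images and average per-window scores.
    """
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    # Standard SSIM stabilizing constants
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov_ab = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov_ab + c2)) / (
        (mu_a**2 + mu_b**2 + c1) * (var_a + var_b + c2)
    )
```

Comparing an image with itself yields 1.0, and any protocol-induced degradation (compression artifacts, dropped detail) pulls the score below 1.0; the 0.998 recorded here is therefore very close to a pixel-perfect match.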
Frame rate, measured in frames per second (FPS), indicates the smoothness of the virtual desktop session as experienced at the endpoint by tracking the rate at which frames are delivered to the endpoint device's screen. An average frame rate of 16 FPS was recorded during this testing.
The end-user latency metric indicates how responsive the VDI session is at the user's endpoint. The average end-user latency measured by the nVector Lite tool during the test was 115 ms, indicating that the remote session was very responsive.