The solution comprises three network fabrics. The head node and all compute nodes are connected with a 1 Gigabit Ethernet fabric. This connection is used primarily by Bright Cluster Manager for deploying, maintaining, and monitoring the solution.
The second fabric connects the head node and all compute nodes through 100 Gb/s EDR InfiniBand. The EDR InfiniBand switch is the Mellanox SB7800, which has 36 ports. This fabric is used for inter-process communication (IPC) by the applications as well as to serve NFS from Isilon.
The third switch in the solution is the gateway switch, which connects the Isilon F800 to the head node and compute nodes. Isilon's external interfaces are 40 Gigabit Ethernet, so a switch that can serve as the gateway between the 40 GbE and InfiniBand networks is needed for connectivity to the head and compute nodes. The Mellanox SX6036 is used for this purpose. The gateway is connected to both the EDR InfiniBand switch and the Isilon.
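As a concrete illustration, once the gateway bridges the Ethernet and InfiniBand networks, a compute node would mount the Isilon NFS export over its IPoIB interface. This is a minimal sketch; the hostname `isilon.example.com`, the export path `/ifs/data`, and the mount options shown are assumptions, not values taken from this solution:

```shell
# Mount an Isilon NFS export over the InfiniBand fabric (reached via the
# SX6036 gateway). Hostname and export path below are hypothetical examples.
sudo mkdir -p /mnt/isilon

# NFSv3 with large read/write sizes is a common starting point for
# high-bandwidth streaming workloads such as deep learning training data.
sudo mount -t nfs -o vers=3,rsize=524288,wsize=524288 \
    isilon.example.com:/ifs/data /mnt/isilon
```

Mount options such as `rsize` and `wsize` would typically be tuned against the actual workload; the values above are illustrative defaults for large sequential reads.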