Scientific Computing and Imaging (SCI) Institute

The Scientific Computing and Imaging (SCI) Institute is one of eight research institutes at the University of Utah and includes 32 faculty members and over 200 other scientists, administrative support staff, and graduate and undergraduate students, as well as thousands of collaborative users. The SCI Institute has over 25,000 square feet of functional space allocated to its research and computing activities within the John and Marva Warnock Engineering Building on campus. Laboratory facilities for the researchers listed in this project include personal workspaces and access to common working areas.

Power Display Wall

The interactive Power Display Wall provides users with the ability to explore 2D/3D visualizations on 36 (4 x 9) tiled 27-inch screens at a 133-megapixel resolution, with an upgraded total of 120GB of graphics memory. The display is driven by five workstations and one controller, each equipped with a Xeon W-2295 18-core CPU, 128GB DDR4 2933 MT/s ECC RAM, and 2 x Nvidia RTX A2000 12GB GPUs. The display can be controlled from a computer and/or tablet device, either on-site or by remote collaborators. Its infrastructure is designed to handle massive, terascale data sets from local or remote sources. Each node of the display wall drives 4-8 screens and can be configured by the controller to stream and process the data as it is displayed, aiding in detailed analysis. This makes the wall an ideal resource for local and remote collaborations, enabling users to examine fine details of large datasets while maintaining a global context.
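
To make the tiling arrangement concrete, the short Python sketch below (an illustration only, not the wall's actual driver software) assigns the 36 tiles to the five render nodes by contiguous columns and reports the pixel region each node covers. The 2560 x 1440 per-panel resolution is an assumption chosen so that 36 panels total roughly 133 megapixels, consistent with the figures above.

    # Minimal illustration (not SCI software): split the wall's 36 tiles
    # (4 rows x 9 columns) across the 5 render nodes by contiguous columns,
    # so each node drives between 4 and 8 screens as described above.
    TILE_W, TILE_H = 2560, 1440      # assumed per-panel resolution (27" QHD)
    ROWS, COLS, NODES = 4, 9, 5

    def columns_for_node(node: int) -> range:
        """Distribute the 9 columns as evenly as possible over the 5 nodes."""
        base, extra = divmod(COLS, NODES)            # -> (1, 4)
        start = node * base + min(node, extra)
        return range(start, start + base + (1 if node < extra else 0))

    def pixel_region(node: int) -> tuple[int, int, int, int]:
        """(x, y, width, height) of the wall region rendered by this node."""
        cols = columns_for_node(node)
        return (cols.start * TILE_W, 0, len(cols) * TILE_W, ROWS * TILE_H)

    print(f"wall: {COLS * TILE_W} x {ROWS * TILE_H} pixels "
          f"(~{ROWS * COLS * TILE_W * TILE_H / 1e6:.0f} MP)")
    for n in range(NODES):
        x, y, w, h = pixel_region(n)
        print(f"node {n}: {len(columns_for_node(n)) * ROWS} screens, "
              f"{w} x {h} region at ({x}, {y})")

Under these assumptions, four nodes drive 8 screens each and the fifth drives 4, matching the 4-8 screens per node noted above, and the full wall works out to 23040 x 5760 pixels (~133 MP).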

Office Space and Labs

The SCI Institute houses its faculty and staff in offices. Students have individual desk spaces equipped with workstations and are located in large, open common areas, referred to as labs, that facilitate student collaboration and communication. All workstations are connected to the SCI local area network via 1 to 10 Gbps Ethernet. Systems run Linux, macOS, or Windows, as dictated by the research and user preferences.

Shared Infrastructure

The SCI Institute computing facility, which has a dedicated data center in the Warnock Engineering Building, includes various CPU- and GPU-based servers and clusters, core infrastructure services, storage and backups, and high-speed networking:

Compute

CPU

  • 2 systems x 192 cores Intel Xeon Gold 6252 CPU at 2.10GHz, 1TB RAM
  • 2 systems x 192 cores Intel Xeon Platinum 8360H CPU at 3.00GHz, 3TB RAM
  • 256 cores multi-threaded Intel Xeon E7-4850 v4 CPU at 2.10GHz, 1TB RAM, HPE UV3000
  • 128 cores multi-threaded Intel Xeon X7560 CPU at 2.27GHz, 512GB RAM, HP DL980 G7
  • 48 cores Intel Xeon E7540 CPU at 2.00GHz, 128GB RAM
  • 3 systems x 160 cores multi-threaded Intel Xeon E7-4870 CPU at 2.40GHz, 800GB RAM, HP DL980 G7

GPU

  • 12 cores Intel Core i7-5930K CPU at 3.50GHz, 128GB RAM, 4 x Nvidia Titan X
  • Intel Xeon Silver 4110 CPU at 2.10GHz, 256GB RAM, 4 x Nvidia Titan RTX
  • Intel Xeon E5-2640 CPU at 2.50GHz, 32GB RAM, 3 x Nvidia Tesla K20c
  • 2 systems x Intel Xeon Gold 6334 CPU at 3.60GHz, 192GB RAM, 3 x Nvidia RTX A6000
  • 4 x AMD EPYC 7343 16-Core Processor, 256GB RAM, 4 x Nvidia A100 SXM
  • AMD EPYC 9554P 64-Core Processor, 792GB RAM, 4 x Nvidia A800
  • AMD EPYC 7542 32-Core Processor, 256GB RAM, 4 x Nvidia A100 SXM
  • 2 systems x AMD Ryzen Threadripper 3970X 32-Core Processor, 256GB RAM, Nvidia GeForce RTX 3090
  • 3 systems x 32 cores multi-threaded Intel Xeon Silver 4108 CPU at 1.80GHz, 132GB RAM, 4 x Nvidia Titan V
  • 160 cores multi-threaded Intel Xeon E7-4870 CPU at 2.40GHz, 800GB RAM, Nvidia GeForce GT 730, HP DL980 G7
  • 16 cores Intel Xeon E5630 CPU at 2.53GHz, 132GB RAM, 3 x Nvidia GeForce GTX 1080 Ti
  • 24 cores Intel Xeon X5650 CPU at 2.67GHz, 148GB RAM, 3 x Nvidia GeForce GTX 1080 Ti
  • 16 cores Intel Xeon X5570 CPU at 2.93GHz, 24GB RAM, Nvidia Tesla K40c
  • 12 cores Intel Xeon X5650 CPU at 2.67GHz, 12GB RAM, Nvidia Tesla K40c
  • 144 cores Intel Xeon E7-8890 v3 CPU at 2.50GHz, 1.5TB RAM, Nvidia GeForce GTX 750
  • Intel Xeon Gold 6208U CPU at 2.90GHz, 256GB RAM, 2 x Nvidia Titan RTX
  • Intel Xeon W-2295 CPU at 3.00GHz, 256GB RAM, 2 x Nvidia Titan RTX
  • 48 cores Intel Xeon Silver 4214R CPU at 2.40GHz, 392GB RAM, 8 x Nvidia GeForce ??
  • 32 cores Intel Xeon W-3335 CPU at 3.40GHz, 256GB RAM, 2 x Nvidia GeForce RTX 4090

Networking

  • Campus: Redundant 40 GbE links to the university network backbone.
  • Core: 2 x Arista 7280CR3-32D4-F switches providing 100/200 GbE to HPC servers and a 400 GbE core.
  • Edge: 20 x Cisco Catalyst 9300 switches providing 1/2.5/5/10 GbE to individual workstations and non-HPC servers.

Storage and Backups

  • VAST Data Platform with 2PB usable space and 16 x 100 GbE connectivity.
  • VAST Data Platform (disaster recovery) with 2PB usable space and 4 x 100 GbE connectivity, located off-site at the DDC.
  • Qumulo QC208 storage cluster comprising 4 storage nodes, for a total of 532TB usable space, with 4 x 40 GbE connectivity.
  • Synology archival storage with 420TB usable space and 2 x 25 GbE connectivity.
  • HPE MSL6480 tape library with 6 LTO-7 drives and 9PB capacity, serving as the primary backup system.
  • 3 dedicated IBM backup servers managing the backup SAN and its tape robots via Commvault.

Core Services

  • VMware ESXi/vSphere clusters providing the following services:
    • DNS
    • DHCP
    • Authentication
    • E-Mail
    • Websites
    • License servers
    • Databases
    • Monitoring (environmental and systems)
    • Starfish (data management and archiving)
    • Software distribution and data transfer nodes
    • Interactive shell access
    • Development research servers
    • Container services

UPS and Generator

  • 225 kVA of UPS power, providing 90 minutes of battery backup for critical SCI servers and services
  • 600 kVA generator providing emergency power