This table shows the systems and related resources at the NASA Advanced Supercomputing (NAS) Facility and the NASA Center for Climate Simulation (NCCS).
| | NAS | NCCS |
| --- | --- | --- |
| **Systems** | **Aitken**: SGI/HPE modular system<br>• 4 E-Cells (1,152 nodes) and 20 Apollo 9000 racks (2,560 nodes)<br>• Intel Xeon Cascade Lake processors (2.5 GHz)<br>• AMD Rome processors (2.25 GHz)<br>**Electra**: SGI modular system<br>• 24 racks (3,456 nodes)<br>• 124,416 cores<br>• 589 terabytes of memory<br>• Intel Xeon Gold 6148 Skylake processors (2.4 GHz) and Intel Xeon E5-2680v4 Broadwell processors (2.4 GHz)<br>**Pleiades**: SGI/HPE ICE cluster<br>• 152 racks (10,410 nodes)<br>• 228,572 cores<br>**Endeavour**: 2-node HPE Superdome Flex system<br>• 1,792 cores<br>• 12 terabytes of memory<br>• Intel Xeon Platinum 8280 Cascade Lake processors (2.7 GHz)<br>**Cabeus** (see the check after this table)<br>A100 & V100 nodes (manufacturer: HPE):<br>• 22 racks (187 nodes)<br>• 10,956 CPU cores + 2,428,928 double-precision GPU cores<br>• Theoretical double-precision peak performance: 7.56 petaflops (0.57 petaflops from CPUs + 6.99 petaflops from GPUs)<br>• Total memory: 75 terabytes (26 terabytes from CPU host memory + 49 terabytes from GPU memory)<br>GH200 nodes (manufacturer: Supermicro):<br>• 15 racks (350 nodes)<br>• 25,200 CPU cores + 2,956,800 double-precision GPU cores<br>• Theoretical double-precision peak performance: 13.11 petaflops (1.40 petaflops from CPUs + 11.71 petaflops from GPUs)<br>• Total memory: 206.1 terabytes (168 terabytes from CPU host memory + 33.6 terabytes from GPU memory) | **Discover**<br>• 1,772 nodes<br>• 138,240 cores<br>• 6.42 petaflops peak<br>• Scalable Compute Unit 14: Supermicro FatTwin Rack Scale System<br>• Scalable Compute Unit 16 CPU-only nodes: Aspen Systems and Supermicro TwinPro nodes, 576 total AMD EPYC Rome processor cores (2.8 GHz), 6,912 CUDA cores<br>• Scalable Compute Unit 17 CPU-only nodes: Aspen Systems and Supermicro TwinPro nodes |
| **Storage** | Online:<br>Archive Capacity: | Online:<br>Archive Capacity:<br>Centralized Storage System (CSS): |
| **Networking** | SGI NUMAlink<br>Voltaire InfiniBand<br>10-Gigabit Ethernet<br>1-Gigabit Ethernet | Mellanox Technologies InfiniBand<br>Intel Omni-Path<br>40-Gigabit Ethernet<br>10-Gigabit Ethernet<br>1-Gigabit Ethernet |
| **Visualization and Analysis** | Hyperwall-2 | **Data Visualization Theater**<br>• 15 Samsung UD55C 55-inch displays in a 5x3 configuration<br>• 16 Dell Precision WorkStation R5400s: 2 dual-core Intel Xeon Harpertown processors per node, 4 GB of memory per node, NVIDIA Quadro FX 1700 graphics, 1-Gigabit Ethernet network connectivity<br>• Control station: one Dell FX100 Thin Client<br>**Explore/ADAPT Science Cloud**<br>• Managed virtual machine environment<br>• 550+ hypervisors: Intel Xeon Westmere, Ivy Bridge, Sandy Bridge, and Broadwell processor cores and AMD Rome and Milan processor cores<br>• High-speed InfiniBand and 10-Gigabit Ethernet networks<br>• Linux and Windows virtual machines<br>• 7 petabytes of Panasas storage<br>**Explore/ADAPT: Prism GPU Cluster**<br>**DataPortal**<br>• HP ProLiant DL380p Gen8<br>• Dual-socket, 10-core Intel Xeon 2.5 GHz Ivy Bridge processors<br>• 128 gigabytes of RAM<br>• Mellanox ConnectX-3 MT27500 interconnect<br>• 2 x 500 GB SAS drives and 3 x 4 TB SAS drives<br>**JupyterHub**<br>• Available on ADAPT/Explore and Prism; coming soon to Discover |
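
The Cabeus peak-performance figures above are quoted as a combined total plus separate CPU and GPU contributions. As a quick tally of those numbers, here is a minimal Python sketch; the dictionary layout and variable names are our own illustration, and only the petaflop values are copied from the table.

```python
# Sum the CPU and GPU contributions to Cabeus's theoretical double-precision
# peak performance for each node type, using the figures quoted in the table.
# (Names and structure are illustrative; only the numbers come from the table.)

cabeus_peak_pflops = {
    "A100 & V100 nodes": {"cpu": 0.57, "gpu": 6.99},   # stated combined peak: 7.56 petaflops
    "GH200 nodes":       {"cpu": 1.40, "gpu": 11.71},  # stated combined peak: 13.11 petaflops
}

for node_type, parts in cabeus_peak_pflops.items():
    combined = parts["cpu"] + parts["gpu"]
    print(f"{node_type}: {combined:.2f} petaflops "
          f"({parts['cpu']:.2f} CPU + {parts['gpu']:.2f} GPU)")
```

Running the sketch reproduces the 7.56 and 13.11 petaflop totals listed for the two Cabeus node pools.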