National Aeronautics and Space Administration
High-End Computing Program



This table shows the systems and related resources at the NASA Advanced Supercomputing (NAS) Facility and the NASA Center for Climate Simulation (NCCS).

Information about HEC Systems and Related Resources


SGI/HPE modular system

4 E-Cells (1,152 nodes)

46,080 cores
3.69 petaflops peak
2.38 petaflops LINPACK rating (#169 on November 2020 TOP500 list)
45.47 Tflop/s HPCG rating (#58 on November 2020 HPCG list)

216 terabytes of memory

Intel Xeon Cascade Lake processors (2.5 GHz)
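As a sanity check on figures like the one above, theoretical peak performance can be estimated as cores × clock rate × floating-point operations per cycle. The 32 FLOPs/cycle value below is an assumption (AVX-512 with two FMA units, typical of Cascade Lake), not a figure stated in this table:

```python
# Back-of-the-envelope peak for the SGI/HPE modular system above.
cores = 46_080
clock_hz = 2.5e9        # 2.5 GHz Cascade Lake, per the table
flops_per_cycle = 32    # assumed: AVX-512, 2 FMA units, double precision

peak_flops = cores * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e15:.2f} petaflops")  # prints 3.69 petaflops
```

The result matches the 3.69 petaflops peak quoted above, which suggests the same per-core assumption underlies the published number.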


SGI modular system

24 racks (3,456 nodes)

124,416 cores
8.32 petaflops peak
5.44 petaflops LINPACK rating (#53 on November 2020 TOP500 list)
106.54 teraflops HPCG rating (#349 on November 2020 HPCG list)

589 terabytes of memory

Intel Xeon Gold 6148 Skylake processors (2.4 GHz) and Intel Xeon E5-2680v4 Broadwell processors (2.4 GHz)


SGI ICE cluster

158 racks (11,207 nodes)

241,324 cores
7.09 petaflops peak
5.95 petaflops LINPACK rating (#46 on November 2020 TOP500 list)
175 teraflops HPCG rating (#25 on November 2020 HPCG list)
927 terabytes of memory
Intel Xeon Westmere 5670 processors (2.93 GHz); Intel Xeon Westmere X5675 processors (3.06 GHz); Intel Xeon Sandy Bridge E5-2670 processors (2.6 GHz); Intel Xeon Ivy Bridge E5-2680v2 processors (2.8 GHz); Intel Xeon Haswell E5-2680v3 (2.5 GHz) processors; and Intel Xeon Broadwell E5-2680v4 processors (2.4 GHz)

935 terabytes of memory

GPU/Sandy Bridge and Westmere nodes:

3 racks (83 nodes; 1 GPU per node)

1,024 Intel Xeon Sandy Bridge cores and 684 Intel Xeon Skylake cores
614,400 GPU cores
646 teraflops, peak



2-node SGI UV 2000 system

1,536 cores
32 teraflops, peak

6 terabytes of memory

Intel Xeon E5-4650L Sandy Bridge processors (2.6 GHz)



56 racks (half-population; 1,792 nodes)

21,504 cores
252 teraflops, peak

86 terabytes of memory

Intel Xeon X5670 Westmere processors (2.93 GHz)

Aggregate System:

103 racks
129,056 cores

6.798 petaflops peak
600.576 terabytes of memory

Scalable Compute Units 10, 11, 12, and 13 = SGI Rackable System
81,954 cores
Intel Xeon Haswell (2.6 GHz)

Scalable Compute Unit 14 = Supermicro FatTwin Rack Scale System
20,800 cores
Intel Xeon Skylake (2.4 GHz)

Scalable Compute Unit 15 = Aspen Systems and Supermicro TwinPro Rack Scale System
25,600 cores
Intel Xeon Skylake (2.4 GHz)


29 petabytes of RAID disk capacity (combined total for all systems)

Archive Capacity:
1,040 petabytes (1 exabyte)

75 petabytes of RAID
12 petabytes in Centralized Storage System

Archive Capacity:
150 petabytes

Networking

SGI NUMAlink
Voltaire InfiniBand
10-Gigabit Ethernet
1-Gigabit Ethernet
Mellanox Technologies InfiniBand
40-Gigabit Ethernet
10-Gigabit Ethernet
1-Gigabit Ethernet
Visualization and Analysis

128-screen tiled LCD wall arranged in 8x16 configuration
Measures 23 ft. wide by 10 ft. high
128 graphics processing units (Nvidia GeForce GTX 780 Ti)
646 teraflops, peak processing power
2,560 Intel Xeon E5-2680v2 (Ivy Bridge) cores (10-core)
57 teraflops, peak processing power
393 gigabytes of GDDR5 graphics memory
1.5 petabytes of storage

Data Visualization Theater

15 Samsung UD55C 55-inch displays in 5x3 configuration
Measures 20 ft. wide by 6 ft. 10 in. high
DVI connection
1920 x 1080 (1080p) screen resolution
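The aggregate resolution of the theater wall follows directly from the figures above, assuming each Samsung UD55C panel runs at its listed 1920 x 1080 resolution:

```python
# Aggregate resolution of the 5x3 display wall described above.
cols, rows = 5, 3
panel_w, panel_h = 1920, 1080

wall_w = cols * panel_w      # 9600 pixels wide
wall_h = rows * panel_h      # 3240 pixels high
total_px = wall_w * wall_h

print(f"{wall_w} x {wall_h} = {total_px / 1e6:.1f} megapixels")
# prints 9600 x 3240 = 31.1 megapixels
```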

Hyperwall Cluster
16 Dell Precision WorkStation R5400s
2 dual-core Intel Xeon Harpertown processors per node
4 GB of memory per node
NVIDIA Quadro FX 1700 graphics
1 Gigabit Ethernet network connectivity
Control Station
One Dell FX100 Thin Client

ADAPT—Advanced Data Analytics Platform

Managed Virtual Machine Environment

550+ Hypervisors – Intel Xeon Westmere, Ivy Bridge, Sandy Bridge, and Broadwell processor cores

High-speed InfiniBand and 10 Gigabit Ethernet networks

Linux and Windows Virtual Machines

10+ petabytes of raw storage under Gluster file system management

ADAPT GPU Cluster (Aspen Systems)
880 Intel Xeon Gold 6248 Cascade Lake cores (2.5 GHz)
88 NVIDIA V100 GPUs with 32 gigabytes of VRAM each
16.896 terabytes of RAM
83.6 terabytes of local NVMe storage
Dual 100-gigabit HDR100 InfiniBand
Dual 25-gigabit Ethernet, bonded for high availability
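Two aggregates follow directly from the ADAPT GPU cluster figures above; the per-node layout (GPUs per node) is not stated, so only cluster-wide totals and the CPU-to-GPU ratio are computed here:

```python
# Aggregates derived from the ADAPT GPU cluster figures above.
gpus = 88
vram_per_gpu_gb = 32
cpu_cores = 880

total_vram_gb = gpus * vram_per_gpu_gb    # 2,816 GB of aggregate GPU memory
cores_per_gpu = cpu_cores // gpus         # 10 CPU cores paired with each GPU

print(f"{total_vram_gb} GB total VRAM, {cores_per_gpu} CPU cores per GPU")
# prints 2816 GB total VRAM, 10 CPU cores per GPU
```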


HP ProLiant DL380p Gen8

Dual-socket, 10-core Intel Xeon 2.5 GHz Ivy Bridge processors

128 gigabytes of RAM

Mellanox ConnectX-3 MT27500 Interconnect

2 x 500GB SAS drives and 3 x 4TB SAS drives

Remote Visualization

HP DL380 G8

20 Intel Xeon E5-2670 v2 cores (2.5 GHz)

128 gigabytes of RAM

NVIDIA K5000 GPU card

