IITAC

Vendor: IBM
Available to: Irish Researchers
Processor Type: Opteron
Architecture: 64-bit
Number of Nodes: 356
RAM per Node: 4GB
Total RAM: 1.4TB
Clock Speed: 2.40GHz
Interconnect: Voltaire InfiniBand SDR
Theoretical Peak Performance: 3.4TF
Total Number of Cores: 712
Number of Sockets per Node: 2
Number of Cores per Socket: 1
Status: Decommissioned
Linpack Score: 2.724TF

The IITAC project has funded one large, 64-bit compute cluster (called iitac) for large-scale parallel jobs and one medium, 32-bit compute cluster (called moloch) for serial jobs.

Queue Status

Graphs showing details of queue usage on IITAC are available here.

Overview

The components of the IITAC cluster were purchased from IBM over the summer of 2005 and assembled by technical staff at the Trinity Centre for High Performance Computing. It is a large-scale facility consisting of 712 AMD Opteron 2.4GHz processors with a theoretical peak performance of 3.4 TFlops.
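
That peak figure can be reproduced from the specifications above: assuming each Opteron core completes 2 double-precision floating-point operations per clock cycle, 712 cores x 2.4 GHz x 2 flops/cycle = roughly 3.42 TFlops, consistent with the quoted 3.4 TFlops. The measured Linpack score of 2.724 TFlops corresponds to roughly 80% of that theoretical peak.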

Operating system

The nodes all run Scientific Linux (a Red Hat Enterprise Linux clone).

Login

There are two e326 login nodes in the cluster, iitac01.tchpc.tcd.ie and iitac02.tchpc.tcd.ie. These nodes are also used for compiling and job submission.
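
As an illustration of the compile-and-run workflow, a trivial MPI program built on a login node might look like the following. This is a minimal sketch, not cluster-specific documentation: the mpicc wrapper name is an assumption, since this page does not name the installed MPI stack or batch system.

    /* hello_mpi.c - a minimal MPI sanity check, compiled on a login
     * node with, for example: mpicc hello_mpi.c -o hello_mpi
     * (the compiler wrapper depends on the MPI stack installed). */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, len;
        char name[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_processor_name(name, &len);

        /* each rank reports which node it landed on */
        printf("rank %d of %d running on %s\n", rank, size, name);

        MPI_Finalize();
        return 0;
    }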

Compute

There are 346 e326 compute nodes, each containing two 2.4GHz AMD 64-bit processors, 4GB of RAM and an 80GB SATA scratch disk. Each node has dual onboard gigabit Ethernet and a PCI-X InfiniBand card.

Interconnect

The nodes are connected to two Voltaire 288-port InfiniBand (IB) switches, providing a dedicated low-latency, high-bandwidth interconnect for parallel jobs. The Voltaire technology allows for inter-node communication speeds of up to 10Gb/s. The IB network is also used by our parallel file system. Each node is also connected to a Force10 gigabit switch, providing a high-speed dedicated management network.

Photo: one of our Voltaire ISR 9288 InfiniBand switches.
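
To put a number on the interconnect's latency, a common approach is a two-rank MPI ping-pong test such as the generic sketch below. It is not a TCHPC-supplied benchmark, just an illustration of how such figures are typically measured.

    /* pingpong.c - crude two-rank round-trip timing over the
     * interconnect; run with exactly two ranks, placed on
     * different nodes. */
    #include <mpi.h>
    #include <stdio.h>

    #define REPS 1000

    int main(int argc, char **argv)
    {
        int rank, i;
        char byte = 0;
        double t0, t1;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Barrier(MPI_COMM_WORLD);
        t0 = MPI_Wtime();
        for (i = 0; i < REPS; i++) {
            if (rank == 0) {        /* send, then wait for the echo */
                MPI_Send(&byte, 1, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(&byte, 1, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) { /* echo everything back */
                MPI_Recv(&byte, 1, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(&byte, 1, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }
        t1 = MPI_Wtime();

        if (rank == 0)  /* one-way latency is half the round trip */
            printf("avg one-way latency: %.2f us\n",
                   (t1 - t0) / REPS / 2.0 * 1e6);

        MPI_Finalize();
        return 0;
    }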

Disk storage

IBM's General Parallel File System (GPFS) runs across all nodes of the cluster. The cluster is connected to an S2A9550 storage system from DataDirect Networks (DDN): twelve e326 nodes are connected to the DDN S2A9550 via two Fibre Channel switches and operate as GPFS I/O servers. All other nodes connect to these twelve as GPFS clients via the two Voltaire IB switches. The system provides 70TB of high-performance storage, in a RAID 6 configuration, for home directories.
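
Because GPFS presents a single shared file system to every node, parallel jobs can write disjoint regions of one shared file, for example with MPI-IO. The sketch below is generic, and the file path in it is hypothetical, as the actual GPFS mount points are not listed on this page.

    /* gpfs_demo.c - each rank writes a fixed-size record to one
     * shared file via MPI-IO; the path below is hypothetical. */
    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        int rank;
        char buf[32];
        MPI_File fh;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* fixed-width record so every rank writes the same length */
        snprintf(buf, sizeof buf, "rank %6d was here\n", rank);

        MPI_File_open(MPI_COMM_WORLD, "/gpfs/home/demo.out",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY,
                      MPI_INFO_NULL, &fh);
        /* disjoint offsets, so no two ranks' records overlap */
        MPI_File_write_at(fh, (MPI_Offset)rank * (MPI_Offset)strlen(buf),
                          buf, (int)strlen(buf), MPI_CHAR,
                          MPI_STATUS_IGNORE);
        MPI_File_close(&fh);

        MPI_Finalize();
        return 0;
    }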

Node Specifications

Processors: 2
Vendor: AMD
Model: Opteron 250
CPU Speed: 2.4GHz
Cache Size: L1 64KB instruction, L1 64KB data, L2 1024KB
RAM: 4GB DDR PC3200
Disk: 80GB SATA
Ethernet: 2 x Broadcom BCM5704 Gigabit Ethernet

Funding and Acknowledgement

This infrastructure was funded by the HEA under the Programme for Research in Third-Level Institutions (PRTLI).

Users must acknowledge the support and infrastructure provided by the Trinity Centre for High Performance Computing and the IITAC project, funded by the HEA under the Programme for Research in Third-Level Institutions (PRTLI) and co-funded by the Irish Government and the European Union.


Last updated 12 Jul 2011.