Compute Clusters

UCD Research IT provides a wide range of services to the UCD research community in relation to the provisioning and use of High Performance Computing (HPC) clusters. The HPC environment is ideal for researchers who need access to substantial computing resources, allowing them to greatly reduce the time required to perform large-scale calculations. In collaboration with ICHEC, we currently offer three options for the use of HPC clusters:

To apply for an account on any of these clusters, please fill out this application form and a member of Research IT or ICHEC will be in contact with you.

NB: You will need to be logged into your UCD Connect account before applying for an HPC account.

For help, advice or additional software requirements, please contact us.

Applying for an HPC Account


Please note: If you cannot see a login box for this form, you must first log into UCD Connect mail in a separate browser tab. Go to UCD Connect and click on the Mail icon. Once logged in, return to and refresh this page and you will see the application form.

Tier 1 - ICHEC/Kay National Cluster

Tier 1 is a national service and is allocated to user projects that range in size from small "Discovery" (Class C) projects to large multi-year "High Impact" (Class A) projects. Applicants for this service may need to hold PI-level status. Further information on this service is available here.

Tier 2 - UCD Condominium on ICHEC

Tier 2 is provided in collaboration with ICHEC (the Irish Centre for High-End Computing). Research IT offers an HPC condominium service which allows users to run large parallel processing jobs via fionn.ichec.ie. Further information on this service is available here.
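The workloads this tier targets are large, tightly coupled parallel jobs, in which many processes cooperate on a single problem and exchange data as they run, typically via MPI. The Python sketch below is purely illustrative and assumes the mpi4py package is available in the cluster's software stack; check the relevant software list before relying on it.

    # Minimal sketch of a tightly coupled MPI job (illustrative only; assumes mpi4py).
    from mpi4py import MPI

    comm = MPI.COMM_WORLD        # communicator spanning all ranks in the job
    rank = comm.Get_rank()       # this process's rank
    size = comm.Get_size()       # total number of ranks

    # Each rank computes a partial sum; the pieces are combined with a reduction.
    local_sum = sum(range(rank, 1_000_000, size))
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)

    if rank == 0:
        print(f"Sum computed across {size} ranks: {total}")

A script like this would typically be launched across many cores with mpirun (or the scheduler's equivalent), for example mpirun -n 64 python partial_sum.py; the exact submission procedure is cluster-specific and described in the relevant user guide.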

Tier 3 - UCD Local Clusters

Tier 3 offers use of the Sonic cluster, which is suitable for small parallel, large memory, serial or loosely coupled jobs (see the sketch below). This cluster is offered under both the Community and Shared Compute models. The Community model gives customers who can contribute funding to the cluster a higher priority than customers operating with more restricted budgets. The cluster is administered and managed by Research IT at no cost to the customers using it. Further information is available here.
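By way of illustration, a loosely coupled (or "embarrassingly parallel") workload consists of many independent tasks that do not communicate while they run, for example processing each input file or parameter set separately. The Python sketch below is a generic example of that pattern and is not Sonic-specific; the task function, task count and worker count are placeholders.

    # Generic sketch of a loosely coupled workload: independent tasks with no
    # communication between them (placeholder task and counts, illustrative only).
    from multiprocessing import Pool

    def analyse(sample_id: int) -> float:
        """Placeholder for an independent per-sample computation."""
        return sum(i * i for i in range(sample_id * 1000)) % 97

    if __name__ == "__main__":
        with Pool(processes=8) as pool:      # one worker per core allocated to the job
            results = pool.map(analyse, range(100))
        print(f"Processed {len(results)} independent tasks")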

Acknowledgements

How should I acknowledge use of the Research IT HPC cluster?

For use in academic publications, we suggest the following line:

This work has been carried out using the ResearchIT Sonic cluster which was funded by UCD IT Services and the Research Office.

Research IT Tier 3 Clusters

Cluster Name: Sonic
Current Usage: Real Time Information
Number of Nodes: 45
Total Number of Cores: 1360 (hyper-threaded)
Total Processing Speed: 12.1 teraflops
Processor Speed: 2.2 GHz Intel Ivy Bridge (24 nodes); 2.0 GHz Intel Ivy Bridge (16 nodes)
Memory per Node: 128 GB (32 nodes); 256 GB (8 nodes)
Interconnect: 1 Gb/s Ethernet; 40 Gb/s InfiniBand
Home Directory Quota: 100 GB
Scratch Space: 28 TB
Additional Node Types: MEM1 - High Memory - 765 GB RAM, 4 x 2.2 GHz (6 cores); MEM2 - High Memory - 765 GB RAM, 4 x 2.4 GHz (6 cores)
Computational Usage: Suitable for serial or loosely coupled jobs
User Guide: Sonic User Guide
Software List: Sonic Software List
Full Hardware Specification: Sonic Hardware Spec
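As an aside, a headline figure like the 12.1 teraflops above is normally a theoretical peak, estimated as physical cores x clock speed x floating-point operations per cycle. The sketch below shows that arithmetic using the table's figures; the 8 FLOPs-per-cycle value (double-precision AVX on Ivy Bridge) and the averaged clock speed are assumptions for illustration, not part of the published specification.

    # Back-of-the-envelope peak-performance estimate (illustrative assumptions noted below).
    physical_cores = 1360 // 2    # 1360 hyper-threaded cores -> ~680 physical cores
    avg_clock_ghz = 2.1           # rough average of the 2.2 GHz and 2.0 GHz nodes (assumed)
    flops_per_cycle = 8           # double-precision AVX on Ivy Bridge (assumed)

    peak_tflops = physical_cores * avg_clock_ghz * flops_per_cycle / 1000
    print(f"Estimated peak: {peak_tflops:.1f} teraflops")  # ~11.4, broadly in line with the quoted 12.1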