UK HPC Facilities

These are HPC facilities in the UK that have defined access processes for external users:

ARCHER2, the new UK national supercomputing service, offers a capability resource for running very large parallel jobs. Based around an HPE Cray EX supercomputing system with an estimated peak performance of 28 PFLOP/s, the machine will have 5,848 compute nodes, each with dual AMD EPYC Zen2 (Rome) 64-core CPUs at 2.2 GHz, giving 748,544 cores in total. The initial service, which will be live from early 2021, is based on a subset of 4 cabinets of the total 23-cabinet system; the remaining 19 cabinets will be added later in 2021. The service includes a service desk staffed by HPC experts from EPCC with support from HPE Cray. Access is free at the point of use for academic researchers working in the EPSRC and NERC domains. Users will also be able to purchase access at a variety of rates. Industry access will be available.
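ARCHER2 is aimed at distributed-memory MPI codes that span many nodes. As a purely illustrative sketch of that style of job (not taken from ARCHER2 documentation, and assuming only that an MPI library and the mpi4py Python package are available), the following minimal program prints each MPI rank and the node it runs on:

```python
# Minimal MPI "hello" sketch using mpi4py (illustrative only).
# A scheduler-specific launcher (e.g. srun or mpiexec) would start many
# copies of this script, one per MPI rank, across the allocated nodes.
from mpi4py import MPI

comm = MPI.COMM_WORLD              # communicator covering all launched ranks
rank = comm.Get_rank()             # this process's rank: 0 .. size-1
size = comm.Get_size()             # total number of MPI ranks in the job

# Each rank reports where it is running; on a capability system this would
# typically span many 128-core nodes.
print(f"Rank {rank} of {size} on node {MPI.Get_processor_name()}")

comm.Barrier()                     # simple synchronisation point
if rank == 0:
    print("All ranks reached the barrier.")
```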

Cirrus at EPCC is one of the EPSRC Tier-2 HPC facilities. The main resource is a 10,080-core SGI/HPE ICE XA system. Cirrus Phase II saw the addition of 36 HPE Plainfield blades, each with two Intel Xeon processors and four NVIDIA V100 GPUs. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.

Baskerville, an EPSRC Tier-2 HPC facility, is a collaboration between the University of Birmingham, The Rosalind Franklin Institute, The Alan Turing Institute and Diamond Light Source, the UK’s national synchrotron. Baskerville comprises 52 Lenovo® Neptune™ liquid-cooled servers, each featuring twin Intel Ice Lake CPUs, 512 GB of system RAM and 980 GB of local NVMe storage, and built to support four NVIDIA A100 Tensor Core GPUs attached to each system via a high-performance PCIe Gen4 connection. There are 46 nodes with 4x A100 40GB and 6 nodes with 4x A100 80GB. Access to Baskerville is available through the EPSRC Access to HPC calls or via the Baskerville Consortium.

Isambard at GW4 is one of the EPSRC Tier-2 HPC facilities. Isambard provides multiple advanced architectures within the same system in order to enable evaluation and comparison across a diverse range of hardware platforms. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.

Cambridge Service for Data Driven Discovery (CSD3) is one of the EPSRC Tier-2 HPC facilities. CSD3 is a multi-institution service underpinned by an innovative, petascale, data-centric HPC platform, designed specifically to drive data-intensive simulation and high-performance data analysis. Free access is available to academic researchers working in the EPSRC domain via the Access to HPC calls; academic users from other domains and institutions can purchase access. Industry access is available.

Sulis at HPC Midlands+ is a Tier-2 HPC platform for ensemble computing workflows, realised through replicating workstation-scale calculations over many inputs or models. Sulis delivers substantial HPC capacity targeted at data-intensive, high-throughput workloads, based on modern software deployment and containerisation technologies to enable scale up from workstation to Tier-2 with minimal code modification. The platform is further supported by a developing Research Software Engineering training effort which recognises non-traditional routes to large-scale scientific research. Access to Sulis is available through the EPSRC Access to HPC calls or via the HPC Midlands+ Consortium.

JADE (Joint Academic Data science Endeavour) is one of the EPSRC Tier-2 HPC facilities. The system design exploits the capabilities of NVIDIA's DGX-1 Deep Learning System, which has eight of its newest Tesla P100 GPUs tightly coupled by its high-speed NVLink interconnect. The DGX-1 runs optimised versions of many standard machine learning software packages such as Caffe, TensorFlow, Theano and Torch. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
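For illustration of the single-node, multi-GPU data-parallel training that a DGX-1-style node supports, here is a minimal TensorFlow sketch. It assumes only a TensorFlow installation with some GPUs visible; the model and data are placeholders and none of this is taken from JADE documentation:

```python
# Illustrative multi-GPU training sketch with TensorFlow (placeholder model and data).
import numpy as np
import tensorflow as tf

# Report how many GPUs TensorFlow can see (eight on a DGX-1 class node).
gpus = tf.config.list_physical_devices("GPU")
print(f"Visible GPUs: {len(gpus)}")

# MirroredStrategy replicates the model across all GPUs on the node and
# splits each batch between them; it falls back to CPU if none are found.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Random placeholder data, just to show the replicated model training.
x = np.random.rand(1024, 784).astype("float32")
y = np.random.randint(0, 10, size=(1024,))
model.fit(x, y, batch_size=256, epochs=1)
```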

MMM Hub (Materials and Molecular Modelling Hub) is a Tier-2 supercomputing facility designed specifically for the materials and molecular modelling community, one of the most thriving and vibrant areas of modern scientific research, and it is available to HPC users all over the UK. The MMM Hub was established in 2016 with a £4m EPSRC grant awarded to collaborators the Thomas Young Centre (TYC) and the Science and Engineering South Consortium (SES). The MMM Hub is led by University College London on behalf of the eight collaborative partners who sit within the TYC and SES: Imperial, King’s, QMUL, Oxford, Southampton, Kent, Belfast and Cambridge.

DiRAC is the STFC HPC facility for particle physics and astronomy researchers. It is currently made up of five different systems with different architectures, including an extreme scaling IBM BG/Q system, a large SGI/HPE UV SMP system and a number of Intel Xeon multicore HPC systems. Free access is available to academic researchers working in the STFC domain; academic researchers from other domains can purchase access. Industry access is also available.

The NI-HPC Centre is a UK Tier-2 National High Performance Computing (HPC) facility funded by EPSRC and jointly managed by Queen's University Belfast (QUB) and Ulster University. The focus is on introducing new aspects of HPC modelling for neurotechnology and computational neuroscience, advanced chemistry, innovative drug delivery, precision medicine, metabolomics and hydrogen safety. The cluster is named Kelvin2 and comprises 60 128-core AMD nodes, 4 high-memory nodes with 2 TB of RAM each, and 32 NVIDIA V100 GPUs. A fast-track allocation process is available for researchers wishing to try the facility.

Bede at N8 CIR (formerly NICE) is one of the EPSRC Tier-2 HPC facilities. Bede comprises 32 IBM POWER9 dual-CPU nodes, each with 4 NVIDIA V100 GPUs and a high-performance NVLink interconnect. There are 4 additional nodes with NVIDIA T4 Tensor Core GPU accelerators to improve AI inference. Access to Bede is available through the EPSRC Access to HPC calls, and for members of the N8 Research Partnership universities.

Worldwide HPC Facilities

Some facilities around the world may also be accessible to UK users. The list below includes facilities or organisations that can provide access to users from the UK.

The US Department of Energy makes access to its Leadership Computing Facilities available to users worldwide through the DOE INCITE programme.


Note: all the above facilities are non-commercial and free at the point of use for academic research.