The part of the infrastructure used by the Particle Physics division constitutes a Tier3-level resource of the Worldwide LHC Computing Grid (WLCG) – an international infrastructure that provides computing and storage for the Large Hadron Collider experiments at CERN. A Tier3 resource is primarily a local analysis and modelling facility, which also stores the data sets needed for analyses. Another such Tier3 exists in Stockholm. The gateway between the facility’s storage pools and the WLCG is operated by experts at the National Supercomputing Centre in Linköping.
The infrastructure provides computing and storage capacity for five groups working in subatomic physics at two departments of Lund University: the CERN experiments ALICE and ATLAS in the Particle Physics division and CERN-ISOLDE and FAIR-NUSTAR in the Nuclear Physics division, both at the Department of Physics, as well as the Theoretical High Energy Physics group at the Department of Astronomy and Theoretical Physics. It started with the Iridium cluster (192 computing cores in 12 nodes and 30 terabytes of attached storage) and was recently extended with a larger partition within the Aurora facility at LUNARC (an additional 480 cores in 24 nodes and 50 terabytes of storage).
The combined facility has about 35 permanent users and a large number of temporary accounts set up for diploma students and for students attending various training courses given by our divisions. A variety of analysis and simulation tasks are executed on the facility, ranging from quick student projects to massive simulations occupying hundreds of cores. Apart from enabling a wide range of research projects in subatomic physics, the facility also serves as an educational resource where students can learn aspects of scientific computing. It has been used for courses of the Science Faculty’s COMPUTE Research School and for the Faculty’s MNXB01 course. Moreover, the Iridium cluster offers a Grid interface, allowing it to be included in distributed computing infrastructures such as the Worldwide LHC Computing Grid (WLCG), and it also serves as a testbed for e-Science research projects.
The infrastructure is closely related to the WLCG Tier2 Aurora partition, which is provided by LUNARC to the WLCG via SNIC. Although both support similar research activities, the Tier2 partition is offered mainly to external users as a national contribution to the WLCG, while the Iridium/Aurora infrastructure described here is a local resource serving several research groups.
Iridium cluster:
• 192 computing cores in 12 nodes, AMD Opteron 6220 processors, 64 GB RAM per node
• 30 terabytes of disk storage, Dell PowerEdge R515
• Operating system: Scientific Linux 6.9 (Carbon)
Aurora partition:
• 480 computing cores in 24 nodes, Intel Xeon E5-2650 v3 processors (Haswell), 128 GB RAM per node
• 50 terabytes of disk storage, Dell PowerEdge R730xd
• Operating system: CentOS 7.2
The infrastructure holds a temporary copy of a subset of the derived data collected by the ATLAS experiment at CERN, for rapid access by local researchers. The data consist of records of proton-proton collision events, as well as the corresponding simulated data. The infrastructure does not provide archival storage: the stored data sets change over time, with old copies being replaced by new ones.
The infrastructure primarily provides computing and some temporary storage services to researchers in the Particle Physics and Nuclear Physics divisions at the Department of Physics and in the Theoretical High Energy Physics group at the Department of Astronomy and Theoretical Physics. The systems expert (0.25 FTE) also provides related IT support to the researchers, such as deployment of the necessary software packages and training.
The infrastructure consists of servers owned by the division of Particle Physics and hosted at LUNARC. LUNARC is governed by a Board. The servers belonging to Subatomic Physics are coordinated by a working group consisting of the Principal Investigator of the infrastructure (O. Smirnova), the LUNARC director (J. Lindemann), the Head of the division of Particle Physics (T. Åkesson), the Principal Investigator of the EU DARKJETS project (C. Doglioni, for the storage), and the systems experts (F. Paganelli, M. Acebes).