Computing and Storage Facility for Subatomic Physics


Research areas and keywords

UKÄ subject classification

  • Natural Sciences

Type of infrastructure

  • Equipment
  • Services

Degree of recognition

  • Local

Name of national/international infrastructure this infrastructure belongs to

The part of the infrastructure used by the Particle Physics division constitutes a Tier3-level resource of the Worldwide LHC Computing Grid (WLCG) – an international infrastructure that provides computing and storage for the Large Hadron Collider at CERN. A Tier3 resource is primarily a local analysis and modelling facility that also stores the data sets needed for those analyses. Another such Tier3 exists in Stockholm. The gateway between the facility's storage pools and the WLCG is handled by experts at the National Supercomputing Centre in Linköping.


The infrastructure provides computing and storage capacity for five groups at two departments at Lund University working in the area of subatomic physics: the CERN experiments ALICE and ATLAS in the Particle Physics division and CERN-ISOLDE and FAIR-NUSTAR in the Nuclear Physics division – both at the Department of Physics – and the Theoretical High Energy Physics group of the Department of Astronomy and Theoretical Physics. It started with the Iridium cluster (192 computing cores in 12 nodes and 30 Terabytes of attached storage) and was recently extended with a larger partition within the Aurora facility at LUNARC (an additional 480 cores in 24 nodes and 50 Terabytes of storage). The combined facility has about 35 permanent users and a large number of temporary accounts set up for diploma students and for students attending various training courses given by our divisions.
A wide variety of analysis and simulation tasks is executed on the facility, ranging from quick student projects to massive simulations occupying hundreds of cores. Apart from enabling a large variety of research projects in subatomic physics, the facility also serves as an educational resource, where students can learn aspects of scientific computing. It has been used for the courses of the Science Faculty's COMPUTE Research School, and for the Faculty's MNXB01 course. Moreover, the Iridium cluster offers a Grid interface, allowing it to be included in distributed computing infrastructures such as the Worldwide LHC Computing Grid (WLCG), and also serves as a testbed for e-Science research projects.
The infrastructure is closely related to the WLCG Tier2 Aurora partition, which LUNARC provides to WLCG via SNIC. Although both support similar research activities, the Tier2 is mainly offered to external users as a national contribution to WLCG, while this Iridium/Aurora infrastructure is a local resource serving several research groups.

Equipment and resources

Iridium cluster:
• 192 computing cores in 12 nodes, AMD Opteron 6220 processors, 64 GB RAM per node
• 30 Terabytes of disk storage, Dell PowerEdge R515
• Operating system: Scientific Linux 6.9 (Carbon)
Aurora partition:
• 480 computing cores in 24 nodes, Intel Xeon E5-2650 v3 processors (Haswell), 128 GB RAM per node
• 50 Terabytes of disk storage, Dell PowerEdge R730xd
• Operating system: CentOS 7.2

Digital and physical collections

The infrastructure holds a temporary copy of a subset of derived data collected by the ATLAS experiment at CERN, for rapid access by local researchers. The data consist of records of proton-proton collision events, as well as the corresponding simulated events. The infrastructure does not provide archival storage: data sets are rotated over time, with old copies being replaced by new ones.

Services provided

The infrastructure primarily provides computing and some temporary storage services to researchers in the Particle Physics and Nuclear Physics divisions at the Department of Physics and in the Theoretical High Energy Physics group of the Department of Astronomy and Theoretical Physics. A systems expert (0.25 FTE) also provides related IT support to the researchers, such as deployment of the necessary software packages and training.
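Computing on such clusters is normally accessed through a batch scheduler. As a sketch, assuming the Aurora partition is scheduled with SLURM (the scheduler in use at LUNARC), a minimal submission script could look as follows; the partition name "hep", the project account "lu-subatomic" and the payload script are hypothetical placeholders:

```shell
#!/bin/bash
# Minimal SLURM batch script (sketch, not an official template).
# The partition "hep" and account "lu-subatomic" are hypothetical.
#SBATCH --job-name=atlas-analysis
#SBATCH --partition=hep
#SBATCH --account=lu-subatomic
#SBATCH --nodes=1
#SBATCH --ntasks=20           # one task per core on a 20-core Aurora node
#SBATCH --time=04:00:00
#SBATCH --output=analysis_%j.out

# Load the required software environment before running the payload
# (module names depend on the local software stack).
srun ./run_analysis.sh
```

Such a script would be submitted with `sbatch analysis.sh`, and `squeue -u $USER` would show its queue status.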

Management of the infrastructure

The infrastructure consists of servers owned by the division of Particle Physics and hosted at LUNARC. LUNARC is governed by a Board. The servers that belong to Subatomic Physics are coordinated by a working group consisting of the Principal Investigator of the infrastructure (O. Smirnova), the LUNARC director (J. Lindemann), the Head of the division of Particle Physics (T. Åkesson), the Principal Investigator of the EU DARKJETS project (C. Doglioni, for the storage), and the systems experts (F. Paganelli, M. Acebes).

Available for loan

Available for loan - internal only

Terms of access:

Access to the services is granted upon request to members of the listed divisions, and temporary limited access is offered to LU students who take part in training courses. Access to Iridium is fully controlled by the systems manager, while access to Aurora is handled in a centralized manner following Swedish National Infrastructure for Computing (SNIC) policies and procedures. The only external users that may access the infrastructure are those who do so via Grid interfaces, in accordance with WLCG policies.
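For illustration of the Grid access path, and assuming the Iridium Grid interface is NorduGrid ARC (common at Nordic WLCG sites, though not specified here), an external user holding a valid Grid certificate would submit a job roughly as follows; the compute-element hostname is a hypothetical placeholder:

```shell
# Sketch of a Grid job submission via the ARC command-line client.
# The endpoint "example-ce.lu.se" is a hypothetical placeholder.
cat > job.xrsl <<'EOF'
&(executable="run.sh")
 (jobName="grid-test")
 (stdout="stdout.txt")
 (stderr="stderr.txt")
 (cpuTime="60")
EOF

arcproxy                              # create a proxy from the user's Grid certificate
arcsub -c example-ce.lu.se job.xrsl   # submit the job description to the compute element
arcstat -a                            # check the status of all submitted jobs
```

Authorization on the WLCG side is handled through the user's virtual organisation membership, so no local account is needed for this access route.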

Recent research outputs

Ch Lorenz, L. G. Sarmiento, D. Rudolph, P. Golubev, T. Eronen, D. A. Nesterenko, A. Kankainen, L. Canete, D. M. Cox, A. Fernandez, U. Forsberg, A. Jungclaus, I. Kojouharov, N. Kurz, N. Lalović, J. Partanen, M. Reponen, S. Rinta-Antila, A. De Roubin, A. Såmark-Roth & 2 others, V. Vaquero & M. Vilén, 2019 Apr 19, In : Physical Review C. 99, 4, 21 p., 044310.

Research output: Contribution to journal › Article

A. Såmark-Roth, L. G. Sarmiento, D. Rudolph, J. Ljungberg, B. G. Carlsson, C. Fahlander, U. Forsberg, P. Golubev, I. Ragnarsson, D. Ackermann, L. L. Andersson, M. Block, H. Brand, D. M. Cox, A. Di Nitto, Ch E. Düllmann, K. Eberhardt, J. Even, J. M. Gates, J. Gerl & 28 others, K. E. Gregorich, C. J. Gross, R. D. Herzberg, F. P. Heßberger, E. Jäger, J. Khuyagbaatar, B. Kindler, I. Kojouharov, J. V. Kratz, J. Krier, N. Kurz, B. Lommel, A. Mistry, C. Mokry, J. P. Omtvedt, P. Papadakis, J. Runke, K. Rykaczewski, M. Schädel, H. Schaffner, B. Schausten, P. Thörle-Pospiech, N. Trautmann, T. Torres, A. Türler, A. Ward, N. Wiehl & A. Yakushev, 2018 Oct 9, In : Physical Review C. 98, 4, 11 p., 044307.

Research output: Contribution to journal › Article

Nataša Lalović, Dirk Rudolph, Zsolt Podolyák, Luis Sarmiento Pico, E.C. Simpson, T. Alexander, M.L. Cortés, J. Gerl, Pavel Golubev, F. Ameil, T. Arici, Ch. Bauer, D. Bazzacco, M. A. Bentley, P. Boutachkov, M. Bowry, Claes Fahlander, A. Gadea, Jnaneswari Gellanki, A. Givechev & 31 others, N. Goel, M. Gorska, A. Gottardo, E. Gregor, G. Guastalla, T. Habermann, M. Hackstein, A. Jungclaus, I. Kojouharov, R. Kumar, N. Kurz, M. Lettmann, C. Lizarazo, C. Louchart, E. Merchán, C. Michelagnoli, Th. Moeller, K. Moschner, Z. Patel, N. Pietralla, S. Pietri, D. Ralet, M. Reese, P.H. Regan, P. Reiter, H. Schaffner, P. Singh, C. Stahl, R. Stegmann, O. Stezowski & AGATA collaboration AGATA collaboration, 2018 Feb 12, In : Journal of Physics G: Nuclear and Particle Physics. 45, 3, 27 p., 035205.

Research output: Contribution to journal › Article

View All (4)