Multifunctional Information and Computing Complex of JINR

Additional data

Submitted: 26.01.2026; Accepted: 25.03.2026; Published: 15.04.2026

How to Cite

A. I. Balandin, N. A. Balashov, O. Yu. Derenovskaya, A. G. Dolbilov, A. P. Gavrish, A. O. Golunov, N. I. Gromova, A. V. Evlanov, I. A. Kashunin, V. V. Korenkov, N. A. Kutovskiy, V. V. Mitsyn, A. N. Moibenko, I. S. Pelevanyuk, D. V. Podgainy, O. I. Streltsova, S. V. Shmatov, T. A. Strizh, V. V. Trofimov, A. S. Vorontsov, N. N. Voytishin, M. I. Zuev. "Multifunctional Information and Computing Complex of JINR" Natural Sci. Rev. 3 200701 (2026)
https://doi.org/10.54546/NaturalSciRev.200701
A. I. Balandin1, N. A. Balashov1, O. Yu. Derenovskaya1,a, A. G. Dolbilov1, A. P. Gavrish1, A. O. Golunov1, N. I. Gromova1, A. V. Evlanov1, I. A. Kashunin1, V. V. Korenkov1, N. A. Kutovskiy1, V. V. Mitsyn1, A. N. Moibenko1, I. S. Pelevanyuk1, D. V. Podgainy1, O. I. Streltsova1, S. V. Shmatov1, T. A. Strizh1,b, V. V. Trofimov1, A. S. Vorontsov1, N. N. Voytishin1, M. I. Zuev1
  • 1 Joint Institute for Nuclear Research, Dubna, Russia
  • a odenisova@jinr.ru
  • b strizh@jinr.int
DOI: 10.54546/NaturalSciRev.200701
Keywords: grid technologies, cloud technologies, Govorun supercomputer, distributed data storage, LHC, NICA, Baikal-GVD
Topics: Physics, High Energy Physics (Experiment), Mathematical and Computer Sciences, Information Technology, 70th anniversary of JINR

Abstract

The Multifunctional Information and Computing Complex (MICC) of the JINR Meshcheryakov Laboratory of Information Technologies (MLIT) is a key element of the JINR network and information and computing infrastructures. The MICC is regarded as a unique basic facility of JINR and plays a decisive role in scientific research that requires advanced computing power and storage systems. Its uniqueness stems from the consolidation of state-of-the-art information technologies for data processing and storage, united by a network infrastructure with a bandwidth of up to 4 × 100 Gbps. It comprises distributed data processing and storage systems based on both grid and cloud technologies, as well as a hyperconverged computing infrastructure with liquid cooling. Multifunctionality, high reliability and availability in 24 × 7 × 365 mode, scalability and high performance, information security, and an advanced software environment are the main requirements that the MICC meets. Reliability and availability are ensured by the enhanced high-speed telecommunication system and the modern local network infrastructure, as well as by a reliable engineering infrastructure that provides guaranteed power supply and cooling for the server hardware. This infrastructure is the mainstay of computing for the experiments at the NICA accelerator complex: the BM@N, MPD, and SPD experiments intensively use all of its computational components and storage systems. As part of the Worldwide LHC Computing Grid, the MICC serves as the Tier-1 grid site for the CMS experiment at the LHC and as a Tier-2 grid site that supports the LHC experiments and other large-scale high-energy physics experiments worldwide. The integrated cloud environment of the JINR Member States focuses on supporting users and experiments in Russia, China, the USA, and other countries (e.g., NICA, NOvA, Baikal-GVD, JUNO). The HybriLIT platform, which comprises the Govorun supercomputer, provides capabilities for developing mathematical models and algorithms and for performing resource-intensive computations, including on graphics accelerators, which underpin an ecosystem for machine and deep learning tasks, Big Data analysis, and quantum computing on simulators.

Acknowledgements

The authors are grateful to all MLIT staff for their contribution to the development and continuous operation of the JINR Multifunctional Information and Computing Complex. We also thank the JINR Directorate for its support and constant attention.

References

[1] JINR Long-Term Development Strategy up to 2030 and beyond: Science & Technology. Dubna: JINR, 2020, 235 p.

[2] V. Korenkov, The JINR Multifunctional Information and Computing Complex, in: Proceedings of the IEEE Xplore:2020 International Scientific and Technical Conference Modern Computer Network Technologies (MoNeTeC), Moscow, Russia, 2020, pp. 1–4. https://doi.org/10.1109/MoNeTeC49726.2020.9258311.

[3] A. Baginyan, A. Balandin, N. Balashov, A. Dolbilov, A. Gavrish, A. Golunov, N. Gromova, I. Kashunin, V. Korenkov, N. Kutovskiy, V. Mitsyn, I. Pelevanyuk, D. Podgainy, O. Streltsova, T. Strizh, V. Trofimov, A. Vorontsov, N. Voytishin, M. Zuev, Current status of the MICC: An overview, CEUR Workshop Proceedings of the 9th International Conference “Distributed Computing and Grid Technologies in Science and Education” 3041 (2021) 1–8. https://ceur-ws.org/Vol-3041/1-8-paper-1.pdf.

[4] Seven-Year Plan for the Development of JINR for 2024–2030. Dubna: JINR, 2023, 76 p.

[5] A. Baginyan, A. Balandin, A. Dolbilov, A. Golunov, N. Gromova, I. Kashunin, V. Korenkov, V. Mitsyn, I. Pelevanyuk, S. Shmatov, T. Strizh, V. Trofimov, A. Vorontsov, N. Voytishin, JINR grid infrastructure: Status and plans, Physics of Particles and Nuclei 55 (3) (2024) 355–359. https://doi.org/10.1134/S1063779624030079.

[6] Worldwide LHC Computing Grid (WLCG), https://wlcg.web.cern.ch/, accessed 2025-12-01.

[7] Nuclotron-based Ion Collider fAcility (NICA), https://nica.jinr.ru/ru/, accessed 2025-12-01.

[8] A. V. Baranov, N. A. Balashov, N. A. Kutovskiy, R. N. Semenov, JINR cloud infrastructure evolution, Physics of Particles and Nuclei Letters 13 (2016) 672–675. https://doi.org/10.1134/S1547477116050071.

[9] OpenNebula: Enterprise Cloud and Virtualization Platform, https://opennebula.io/, accessed 2025-12-01.

[10] E. I. Alexandrov, D. V. Belyakov, M. A. Matveyev, D. V. Podgainy, O. I. Streltsova, Sh. G. Torosyan, E. V. Zemlyanaya, P. V. Zrelov, M. I. Zuev, Research of acceleration calculations in solving scientific problems on the heterogeneous cluster HybriLIT, Bulletin of Peoples’ Friendship University of Russia. Series: Mathematics. Information Sciences. Physics 4 (2015) 30–37.

[11] I. Kashunin, V. Mitsyn, V. Trofimov, A. Dolbilov, Integration of the cluster monitoring system based on Icinga2 at JINR LIT MICC, Physics of Particles and Nuclei Letters 17 (3) (2020) 345–352. https://doi.org/10.1134/S1547477120030073.

[12] S. Chatrchyan et al. (CMS Collab.), The CMS experiment at the CERN LHC, Journal of Instrumentation 3 (2008) S08004. https://home.cern/science/experiments/cms, accessed 2025-12-01.

[13] O. S. Brüning et al. (Eds.), LHC Design Report Vol. 1: The LHC Main Ring, CERN-2004-003-V-1. Geneva: CERN, 2004, 548 p. https://home.cern/science/accelerators/large-hadron-collider, accessed 2025-12-01.

[14] V. B. Gavrilov, I. A. Golutvin, O. L. Kodolova, V. V. Korenkov, L. G. Levchuk, S. V. Shmatov, E. A. Tikhonenko, V. E. Zhiltsov, RDMS CMS computing: Current status and plans, Computer Research and Modeling 7 (3) (2015) 395–398.

[15] S. Campana, I. Bird, B. Panzer-Steindel, Overview of the WLCG strategy towards HL-LHC computing — April 2020, LHCC Review (2021). https://doi.org/10.5281/zenodo.5499655.

[16] J. Albrecht, A. A. Alves et al., HEP Software Foundation, A roadmap for HEP software and computing R&D for the 2020s, Computing and Software for Big Science 3 (2019) 7. https://doi.org/10.1007/s41781-018-0018-8.

[17] Evolution of Scientific Computing in the Next Decade: HEP and beyond, http://wlcg-docs.web.cern.ch/wlcg-docs/technical_documents/HEP-Computing-Evolution.pdf, accessed 2025-12-01.

[18] High-Luminosity Large Hadron Collider (HL-LHC), https://home.cern/science/accelerators/high-luminosity-lhc, accessed 2025-12-01.

[19] A Toroidal LHC ApparatuS (ATLAS), https://home.cern/science/experiments/atlas, accessed 2025-12-01.

[20] The Compact Muon Solenoid (CMS), https://home.cern/science/experiments/cms, accessed 2025-12-01.

[21] Jiangmen Underground Neutrino Observatory (JUNO), http://juno.ihep.cas.cn/, accessed 2025-12-01.

[22] The Baikal Deep Underwater Neutrino Telescope (Baikal-GVD), https://baikalgvd.jinr.ru/, accessed 2025-12-01.

[23] Square Kilometre Array (SKA), https://www.skao.int/en, accessed 2025-12-01.

[24] V. V. Korenkov, Trends and prospects of the development of distributed computing and Big Data analytics for support of megascience projects, Physics of Atomic Nuclei 83 (6) (2020) 965–968. https://doi.org/10.1134/S1063778820050154.

[25] S. Campana, A. Di Girolamo, P. Laycock, Z. Marshall, H. Schellman, G. A. Stewart, HEP computing collaborations for the challenges of the next decade, 2022. https://arxiv.org/abs/2203.07237.

[26] D. Guest, K. Cranmer, D. Whiteson, Deep learning and its application to LHC physics, Annual Review of Nuclear and Particle Science 68 (2018) 161–181. https://doi.org/10.1146/annurev-nucl-101917-021019.

[27] S. Farrell, D. Anderson, P. Calafiura, G. Cerati, L. Gray, J. Kowalkowski, M. Mudigonda, Prabhat, P. Spentzouris, M. Spiropoulou, A. Tsaris, J. Vlimant, S. Zheng, The HEP.TrkX Project: Deep neural networks for HL-LHC online and offline tracking, European Physical Journal Web of Conferences 150 (2017) 00003. https://doi.org/10.1051/epjconf/201715000003.

[28] A. Radovic, M. Williams, D. Rousseau, M. Kagan, D. Bonacorsi, A. Himmel, A. Aurisano, K. Terao, T. Wongjirad, Machine learning at the energy and intensity frontiers of particle physics, Nature 560 (2018) 41–48. https://doi.org/10.1038/s41586-018-0361-2.

[29] R. Carrasco-Davis, G. Cabrera-Vives, F. Förster, P. A. Estévez, P. Huijse, P. Protopapas, I. Reyes, J. Martínez-Palomera, C. Donoso, Deep learning for image sequence classification of astronomical events, Publications of the Astronomical Society of the Pacific 131 (1004) (2019) 108006. https://doi.org/10.1088/1538-3873/aaef12.

[30] Beijing Spectrometer (BESIII) Experiment, http://bes3.ihep.ac.cn/, accessed 2025-12-01.

[31] NOvA (NuMI Off-axis νe Appearance) Experiment, https://novaexperiment.fnal.gov/, accessed 2025-12-01.

[32] HybriLIT Heterogeneous Platform, http://hlit.jinr.ru/, accessed 2025-12-01.

[33] A. Baginyan, A. Balandin, S. Belov, A. Dolbilov, I. Kadochnikov, V. Korenkov, P. Zrelov, JINR Network Infrastructure for Megascience Projects, in: Proceedings of the IEEE Xplore:2020 International Scientific and Technical Conference Modern Computer Network Technologies (MoNeTeC), Moscow, Russia, 2020, pp. 1–5. https://doi.org/10.1109/MoNeTeC49726.2020.9258004.

[34] V. E. Velikhov, V. V. Korenkov, E. A. Ryabinkin, A. G. Dolbilov, Y. V. Gugel, and T. A. Strizh, The Russian Segment (RU-VRF) in WLCG Infrastructure: High Performance Computing Network, in: Proceedings of the IEEE Xplore:2022 International Conference Modern Computer Network Technologies (MoNeTec), Moscow, Russia, 2022, pp. 1–4. https://doi.org/10.1109/MoNeTec55448.2022.9960772.

[35] LHC Optical Private Network (LHCOPN), http://lhcopn.web.cern.ch/, accessed 2025-12-01.

[36] LHCONE, https://lhcone.web.cern.ch/, accessed 2025-12-01.

[37] GEANT, https://geant.org/, accessed 2025-12-01.

[38] National Research Computer Network of Russia (NIKS), https://niks.su/, accessed 2025-12-01.

[39] A Large Ion Collider Experiment (ALICE), https://home.cern/science/experiments/alice, accessed 2025-12-01.

[40] N. S. Astakhov, S. D. Belov, I. N. Gorbunov, P. V. Dmitrienko, A. G. Dolbilov, V. E. Zhiltsov, V. V. Korenkov, V. V. Mitsyn, T. A. Strizh, E. A. Tikhonenko, V. V. Trofimov, S. V. Shmatov, The Tier-1-level computing system of data processing for the CMS experiment at the Large Hadron Collider, Journal of Information Technologies and Computing Systems 4 (2013) 27–36.

[41] V. Korenkov, I. Pelevanyuk, A. Tsaregorodtsev, DIRAC at JINR as a general-purpose system for massive computations, Journal of Physics: Conference Series 2438 (2023) 012029. https://doi.org/10.1088/1742-6596/2438/1/012029.

[42] Slurm Workload Manager, https://slurm.schedmd.com/overview.html, accessed 2025-12-01.

[43] Advanced Resource Connector (ARC), https://www.nordugrid.org/arc/arc6/, accessed 2025-12-01.

[44] BM@N (Baryonic Matter at Nuclotron) Experiment, https://bmn.jinr.int/, accessed 2025-12-01.

[45] MPD (Multi-Purpose Detector), https://mpd.jinr.ru/, accessed 2025-12-01.

[46] SPD (Spin Physics Detector), https://spd.jinr.int/, accessed 2025-12-01.

[47] dCache — Distributed Storage for Scientific Data, https://www.dcache.org/, accessed 2025-12-01.

[48] A. Moibenko, J. Bakken, E. Berman, C. Huang, D. Petravick, M. Zalokar, The Status of Fermilab Enstore Data Storage System, in: Proceedings of the Computing in High Energy Physics and Nuclear Physics (CHEP 2004), Interlaken, Switzerland, 2004, p. 1210. https://doi.org/10.5170/CERN-2005-002.1210.

[49] M. Davis, J. Afonso, R. Bachmann, V. Bahyl, J. Camarero Vera, J. Leduc, P. Oliver Cortés, F. Rademakers, L. Wardenær, V. Yurchenko, The CERN Tape Archive beyond CERN: An open source data archival system for HEP, European Physical Journal Web of Conferences 295 (2024) 01048. https://doi.org/10.1051/epjconf/202429501048.

[50] EOS Open Storage, https://eos-web.web.cern.ch/eos-web/, accessed 2025-12-01.

[51] Rucio — Scientific Data Management, https://rucio.cern.ch/, accessed 2025-12-01.

[52] HTCondor Software Suite, https://htcondor.org/htcondor/overview/, accessed 2025-12-01.

[53] Project JupyterHub, https://jupyter.org/hub, accessed 2025-12-01.

[54] Ceph, https://ceph.io/en/, accessed 2025-12-01.

[55] iTop Platform, https://combodo.com/, accessed 2025-12-01.

[56] N. A. Balashov, I. S. Kuprikov, N. A. Kutovskiy, A. N. Makhalkin, Ye. Mazhitova, I. S. Pelevanyuk, R. N. Semenov, D. A. Shpotya, Changes and challenges at the JINR and its Member States cloud infrastructures, Physics of Particles and Nuclei 55 (3) (2024) 366–370. https://doi.org/10.1134/S1063779624030092.

[57] A. Anikina, D. Belyakov, T. Bezhanyan, M. Kirakosyan, A. Kokorev, M. Lyubimova, M. Matveev, D. Podgainy, A. Rahmonova, S. Shadmehri, O. Streltsova, S. Torosyan, M. Vala, M. Zuev, Structure and Features of the Software and Information Environment of the HybriLIT Heterogeneous Platform, in: Proceedings of the 27th International Conference “Distributed Computer and Communication Networks”, Lecture Notes in Computer Science 15460 (2024) 444–457. https://doi.org/10.1007/978-3-031-80853-1_33.

[58] RSC Group, https://rscgroup.ru/, accessed 2025-12-01.

[59] E. A. Druzhinin, A. B. Shmelev, A. A. Moskovsky, V. V. Mironov, A. Semin, Server level liquid cooling: Do higher system temperatures improve energy efficiency?, Supercomputing Frontiers and Innovations 3 (1) (2016) 67–73. https://doi.org/10.14529/jsfi160104.

[60] Lustre, http://www.lustre.org/, accessed 2025-12-01.

[61] D. V. Belyakov, A. A. Kokorev, D. V. Podgainy, The distributed parallel file system Lustre for processing and analyzing data of the NICA megascience project, Physics of Particles and Nuclei (2026) (accepted).

[62] S. Shadmehri, T. Bezhanyan, M. Bondarev, O. Streltsova, M. Zuev, A. Chigasova, A. Osipov, N. Vorobytea, A. N. Osipov, A deep learning model for automated quantification of DNA repair foci in somatic mammalian cells, Physics of Particles and Nuclei 56 (2025) 1623–1627. https://doi.org/10.1134/S1063779625700984.

[63] Yu. Palii, D. V. Belyakov, A. A. Bogolubskaya, M. I. Zuev, D. A. Yanovich, Simulation of the QAOA algorithm at the JINR quantum testbed, Physics of Particles and Nuclei 56 (2025) 989–993. https://doi.org/10.1134/S1063779625700066.

[64] V. Korenkov, I. Pelevanyuk, A. Tsaregorodtsev, Integration of the JINR Hybrid Computing Resources with the DIRAC Interware for Data Intensive Applications, in: Proceedings of the International Conference “Data Analytics and Management in Data Intensive Domains”, Springer International Publishing, 2020, pp. 31–46. http://dx.doi.org/10.1007/978-3-030-51913-1_3.

[65] N. Balashov et al., Cloud integration within the DIRAC interware, CEUR Workshop Proceedings 2507 (2019) 256–260.

[66] N. Kutovskiy et al., Integration of distributed heterogeneous computing resources for the MPD experiment with DIRAC interware, Physics of Particles and Nuclei 52 (4) (2021) 835–841. http://dx.doi.org/10.1134/S1063779621040419.

[67] K. V. Gertsenberger, I. S. Pelevanyuk, BM@N Run 8 data processing on a distributed infrastructure with DIRAC, Physics of Particles and Nuclei Letters 21 (4) (2024) 778–781. https://doi.org/10.1134/S1547477124701334.

[68] N. Kutovskiy, I. Pelevanyuk, D. Zaborov, Using distributed clouds for scientific computing, CEUR Workshop Proceedings of the 9th International Conference “Distributed Computing and Grid Technologies in Science and Education” (2021) 196–201. http://dx.doi.org/10.54546/MLIT.2021.78.51.001.

[69] A. Petrosyan, D. Oleynik, A. Zhemchugov et al., Production system of the SPD experiment, Physics of Particles and Nuclei 56 (2025) 1576–1580. https://doi.org/10.1134/S1063779625700893.

[70] V. Abazov et al. (SPD Collab.), Technical Design Report of the Spin Physics Detector at NICA, Natural Science Review 1 (1) (2024). https://doi.org/10.54546/NaturalSciRev.100101.