Helmholtz Programme Supercomputing and Big Data
Subprojects: DLCL Energy
Over the last decade, both high-performance computing (supercomputing) and the management and analysis of big data have become strategic key technologies for theoretical and experimental research as well as for industrial product and production optimisation. The continuously growing complexity of model systems in science and engineering is reflected in the growing demands placed on computational instruments and methods. A particular challenge today is the handling of the highly complex data streams generated by large-scale experiments and simulations, as well as the storage, management and analysis of big data.
The main goal of the programme “Supercomputing & Big Data” is to provide world-class instruments and infrastructures for high-performance computing and for the management and analysis of large-scale data for computational science and engineering in Germany and in Europe, within the context of national and European frameworks. To address these challenges, the programme focuses on the one hand on developing methods and tools for optimising HPC applications from various research fields, and on establishing and operating, in part Helmholtz-wide, simulation laboratories in which domain scientists and computing experts work together with users on solutions to community-specific HPC challenges. On the other hand, the programme aims to investigate and develop federated systems and infrastructures specifically designed to support the secure, efficient and sustainable management and analysis of big data. A third part comprises the procurement, installation and operation of leading supercomputers in coordination with national and European partners.
Hartmut Schmeck; Achim Streit, 2017-03-07, KIT, Fakultät für Wirtschaftswissenschaften