Paper: FACT-Tools – Processing High-Volume Telescope Data
Volume: 521, Astronomical Data Analysis Software and Systems XXVI
Page: 584
Authors: Buss, J.; Bockermann, C.; Adam, J.; Ahnen, M.; Baack, D.; Balbo, M.; Bergmann, M.; Biland, A.; Blank, M.; Bretz, T.; Bruegge, K.; Dmytriiev, A.; Dorner, D.; Egorov, A.; Einecke, S.; Hempfling, C.; Hildebrand, D.; Hughes, G.; Linhoff, L.; Mannheim, K.; Morik, K.; Mueller, S. A.; Neise, D.; Neronov, A.; Noethe, M.; Paravac, A.; Pauss, F.; Rhode, W.; Ruhe, T.; Shukla, A.; Temme, F.; Thaele, J.; Walter, R.
Abstract: The amount of data produced in astroparticle physics exceeds the capacities of traditional data processing systems. In the Big Data era, computer science has produced a plethora of tools and methods for scaling data processing to a large number of nodes. To resolve the tension between rapidly prototyping scientific algorithms and implementing scalable processing pipelines, we propose an abstraction for high-level modelling of such pipelines. This lets physicists concentrate on the domain-specific use case at hand while retaining the ability to deploy their algorithms on a scalable execution platform.
In this paper, we outline the collaborative work of physicists and computer scientists to develop a modelling framework that fits this setting. The framework is used within the FACT project, demonstrating its real-world practicability and ease of use.
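The core idea of such a pipeline abstraction can be illustrated with a short sketch. The names below (`Processor`, `Pipeline`, the example keys) are hypothetical and chosen for illustration; they are not the actual FACT-Tools API. The sketch only shows the general pattern the abstract describes: a physicist writes small, self-contained processing steps, and the framework chains them into a pipeline that can later be executed on a scalable platform.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PipelineSketch {

    // One processing step: consumes a data item (a key-value map)
    // and returns it, possibly enriched with new keys.
    interface Processor {
        Map<String, Object> process(Map<String, Object> item);
    }

    // A pipeline is simply an ordered list of processors
    // applied in sequence to each incoming item.
    static class Pipeline {
        private final List<Processor> steps = new ArrayList<>();

        Pipeline add(Processor p) {
            steps.add(p);
            return this;
        }

        Map<String, Object> run(Map<String, Object> item) {
            for (Processor p : steps) {
                item = p.process(item);
            }
            return item;
        }
    }

    public static void main(String[] args) {
        // Two illustrative steps, loosely modelled on telescope data
        // processing stages (calibration, feature extraction).
        Pipeline pipeline = new Pipeline()
            .add(item -> { item.put("calibrated", true); return item; })
            .add(item -> { item.put("numPhotons", 42); return item; });

        Map<String, Object> event = new HashMap<>();
        event.put("eventId", 1);

        Map<String, Object> result = pipeline.run(event);
        System.out.println(result.get("calibrated") + " " + result.get("numPhotons"));
        // prints "true 42"
    }
}
```

Because each `Processor` only sees one item at a time, the same steps can, in principle, be executed sequentially on a laptop during prototyping or distributed across many nodes by an execution engine, which is the separation of concerns the abstract argues for.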