ETN application submitted - January 2020

The SMARTHEP network has submitted an application to the MSCA ITN funding program, proposing to fund 12 students to work on academic and industrial real-time analysis applications at institutes across the European network.

REALTIME Advanced Study Group, Pufendorf Institute of Advanced Studies - 2019/2020

Lund researchers and entrepreneurs from the SMARTHEP network proposed an Advanced Study Group to the Pufendorf Institute of Advanced Studies; the proposal was funded, with activities beginning in September 2019. Read more about its activities below.

REALTIME Advanced Study Group at the Pufendorf Institute of Advanced Studies - 2019/2020

This Advanced Study Group brings together the faculties of Science (Physics, Mathematics, Astronomy), Engineering (EIT), Social Sciences (Psychology), and Law to discuss and investigate challenges in real-time data acquisition and data analysis. The traditional data-taking and data-analysis paradigm requires data to be stored before it is analysed, whereas the solutions investigated by this Advanced Study Group involve near-simultaneous (real-time) execution of data collection and analysis.

Institut Pascal "Learning To Discover" real-time analysis workshop - July 2019

Researchers from the SMARTHEP team organized the first track of the 2019 Institut Pascal HEP and Data Science event "Learning to Discover", titled "1st Real Time Analysis Workshop". The objective of this workshop was to bring together the community of high-energy physicists driving the development of real-time analysis in their domain with key real-time analysis specialists from industry. By coalescing around concrete problems and searching for common solutions, new collaborations have been started, bringing the community closer together. The workshop alternated formal presentations, brainstorming sessions, and hackathons with ample free time for unorganised interactions.

...
RAPID Workshop - October 2018

The Large Hadron Collider (LHC) generates terabytes of data per second, and handling this flood of data is a major challenge for physicists. In November 2018, one of the SMARTHEP institutes, the Technical University of Dortmund, in collaboration with two other institutes (CERN and LPNHE), hosted a RAMP challenge to tackle a particularly interesting problem in this domain: can machine learning assist high-energy physics in finding the vertices where particles are created? One of the keynote talks was given by A. Sopasakis of Ximantis, another SMARTHEP participant.
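To give a flavour of what such a vertexing task looks like in code, here is a minimal, hypothetical Python sketch (not the actual RAMP challenge data or method): it groups the origins of simulated tracks along the beam axis into candidate vertices using off-the-shelf density-based clustering. All numbers and resolutions are invented for illustration.

```python
# Hypothetical toy example (not the RAMP challenge setup): cluster the origins
# of simulated tracks along the beam axis (z) into candidate vertices.
# Real LHC vertexing uses far richer track and detector information.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# Simulate tracks coming from three collision vertices along z (in mm),
# each smeared by a toy tracking resolution of 0.5 mm.
true_vertices_z = [-25.0, 3.0, 41.0]
tracks_z = np.concatenate(
    [vz + rng.normal(0.0, 0.5, size=20) for vz in true_vertices_z]
)

# Density-based clustering in 1D: each cluster of track origins is a candidate vertex.
labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(tracks_z.reshape(-1, 1))

for label in sorted(set(labels) - {-1}):   # -1 marks unclustered tracks
    members = tracks_z[labels == label]
    print(f"candidate vertex at z = {members.mean():6.2f} mm from {members.size} tracks")
```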

Kickoff group photo
SMARTHEP kicks off at Lund - May 2017

Researchers at the Large Hadron Collider (LHC) deal with enormous datasets on a daily basis. The experiments which observe the LHC's proton collisions generate hundreds of exabytes of data per year, comparable to the entire current worldwide mobile phone traffic [1] and potentially consuming tens of millions of hard disks. It would not be physically possible to store all this data, and even if it could be stored, it could never be distributed to the thousands of researchers around the world who are waiting to analyze it. For this reason, most of the analysis carried out on this data is performed in real time. Custom processing hardware making microsecond decisions about which data to keep and which to discard is backed up by thousands of computers which analyze the remaining data and look for specific interesting physics processes within it. At the end of this process the data stream is reduced by a factor of 10,000, small enough to finally send to the physicists who will sift through it in search of new particles and forces governing our world.

This process of real-time analysis is called "triggering" by physicists, because it primarily consists of looking for specific interesting features in the data, for example a particle with exceptionally high energy, and selecting the subset of proton collisions with such features for future inspection. Triggering has been at the heart of the more than 1000 papers published by the LHC collaborations. It is nevertheless a primitive approach to real-time analysis, which assumes that only a small fraction of proton collisions produce interesting physics and that the trigger's job is to find these needles in the haystack of data. As the LHC experiments increase their data rates a hundredfold over the next decade, in order to probe nature with ever greater precision, this assumption will break down in a fundamental way: instead of looking for needles in haystacks, real-time analysis will have to categorize haystacks of needles. To do this, real-time analysis will have to become more sophisticated, no longer relying on simple, easily visible features but rather studying the entire event in detail, using the best-quality detector calibration in order to maximize its power.
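As a purely illustrative sketch of the trigger idea described above (not any experiment's actual trigger logic), the toy Python snippet below keeps only simulated collision events containing a particle above an energy threshold and reports the resulting reduction of the data stream. All thresholds, event counts, and distributions are invented for the example.

```python
# Purely illustrative sketch of trigger-style event selection (not an actual
# LHC trigger): all numbers and distributions here are invented.
import numpy as np

rng = np.random.default_rng(1)

N_EVENTS = 1_000_000        # simulated proton-collision events
MAX_PARTICLES = 8           # toy: up to 8 reconstructed particles per event
ENERGY_THRESHOLD = 95.0     # keep events with any particle above this (arbitrary units)

# Toy model: particle energies follow a steeply falling (exponential) spectrum.
energies = rng.exponential(scale=10.0, size=(N_EVENTS, MAX_PARTICLES))

# "Trigger" decision: keep an event only if its highest-energy particle
# exceeds the threshold, then measure how much the data stream shrinks.
keep = energies.max(axis=1) > ENERGY_THRESHOLD
n_kept = int(keep.sum())
print(f"kept {n_kept} of {N_EVENTS} events "
      f"(reduction factor ~{N_EVENTS / max(n_kept, 1):.0f})")
```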

Motivated by these problems, researchers who have been at the forefront of today's real-time analysis at the four big LHC experiments --- ATLAS, CMS, LHCb, and ALICE --- met together with industrial partners for a three-day workshop in Lund to launch the SMARTHEP network. Our objective? To build common infrastructures which will allow researchers to confidently perform their analyses in real time: continuously aligning and calibrating their detectors, selecting interesting signals and rejecting backgrounds, and persisting the resulting analysis output for future use. We are linking together because we are convinced that our problems share a common core, and because we believe in benefitting from the best of each other's ideas. As we develop our network we will keep you posted on its progress, so look out for further stories soon!

[1] Ericsson mobility report 2016