Publication . Other literature type . Report . 2017

Distributed LHC Event-Topology Classification

Presutti, Federico; Pierini, Maurizio
Open Access
  • Published: 02 Feb 2017
  • Publisher: Zenodo
Abstract
High data volumes and high data throughput are central features of the CMS detector experiment in the search for new physics. The aim of this project is to develop prototype systems capable of speeding up and improving the quasi-real-time analyses performed by the triggers during the data-acquisition stage of the experiment. This matters because the high-luminosity upgrade of the LHC is expected to increase the raw data throughput significantly. The options explored for improving trigger-farm performance are the use of GPUs to parallelize the razor-variable analysis and inference based on distributed machine-learning algorithms.
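For context (not taken from the report itself): the razor variables mentioned in the abstract are conventionally defined from a dijet system and the missing transverse energy as M_R = sqrt((|p_j1| + |p_j2|)^2 - (p_z^j1 + p_z^j2)^2) and R^2 = (M_T^R / M_R)^2. The sketch below, with illustrative variable names and NumPy standing in for an actual GPU array backend such as CuPy, shows the per-event, embarrassingly parallel structure of the calculation that makes it a candidate for GPU parallelization.

```python
import numpy as np

def razor_variables(j1, j2, met):
    """Compute the razor variables (M_R, R^2) for a batch of dijet events.

    j1, j2 : (N, 3) arrays of jet momenta (px, py, pz)
    met    : (N, 2) array of missing transverse energy components (MET_x, MET_y)

    All operations are element-wise over the event axis, so the same code
    maps onto a GPU by replacing NumPy with a GPU array library (assumption:
    this mirrors, but is not taken from, the report's actual implementation).
    """
    p1 = np.linalg.norm(j1, axis=1)              # |p| of leading jet
    p2 = np.linalg.norm(j2, axis=1)              # |p| of subleading jet
    pz_sum = j1[:, 2] + j2[:, 2]

    # Longitudinal-boost-invariant mass scale M_R
    m_r = np.sqrt((p1 + p2) ** 2 - pz_sum ** 2)

    pt1 = np.linalg.norm(j1[:, :2], axis=1)      # scalar p_T of each jet
    pt2 = np.linalg.norm(j2[:, :2], axis=1)
    met_mag = np.linalg.norm(met, axis=1)

    # Transverse razor mass M_T^R
    dot = np.sum(met * (j1[:, :2] + j2[:, :2]), axis=1)
    mt_r = np.sqrt(0.5 * (met_mag * (pt1 + pt2) - dot))

    r2 = (mt_r / m_r) ** 2
    return m_r, r2

# Example: evaluate on a batch of random events (purely illustrative values).
rng = np.random.default_rng(0)
m_r, r2 = razor_variables(rng.normal(size=(1000, 3)) * 100,
                          rng.normal(size=(1000, 3)) * 100,
                          rng.normal(size=(1000, 2)) * 50)
```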
Subjects
arXiv: Physics::Instrumentation and Detectors
Free-text keywords: CERN openlab summer student
Available versions: Zenodo (other literature type, 2017; provider: Datacite) and ZENODO (report, 2017; provider: ZENODO).