Report Open Access

Distributed LHC Event-Topology Classification

Presutti, Federico; Pierini, Maurizio

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Presutti, Federico</dc:creator>
  <dc:creator>Pierini, Maurizio</dc:creator>
  <dc:description>High data volumes and throughput are central features of the CMS detector experiment in the search for new physics. The aim of this project is to develop prototype systems capable of speeding up and improving the quasi-real-time analyses performed by the trigger during the data-acquisition stage of the experiment. This is important because the high-luminosity upgrade of the LHC is expected to increase the raw data throughput significantly. The options explored to improve trigger-farm performance are GPU parallelization of the razor-variable analysis and inference based on distributed machine-learning algorithms.</dc:description>
  <dc:subject>CERN openlab summer student</dc:subject>
  <dc:title>Distributed LHC Event-Topology Classification</dc:title>
</oai_dc:dc>
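The razor variables mentioned in the abstract are per-event kinematic quantities built from two "megajets" and the missing transverse momentum, which makes them naturally data-parallel across events. A minimal NumPy sketch of the standard razor definitions (M_R and R²); the function name and array layout are illustrative, not taken from the project's code:

```python
import numpy as np

def razor_variables(j1, j2, met):
    """Compute razor variables (M_R, R^2) for a batch of events.

    j1, j2 : (N, 3) arrays of megajet momenta (px, py, pz),
             treated as massless so E ~ |p|.
    met    : (N, 2) array of missing transverse momentum (px, py).
    Every step is an element-wise array operation, so the same code
    maps directly onto a GPU via a drop-in array library.
    """
    # M_R = sqrt((|p1| + |p2|)^2 - (p1z + p2z)^2)
    p1 = np.linalg.norm(j1, axis=1)
    p2 = np.linalg.norm(j2, axis=1)
    pz = j1[:, 2] + j2[:, 2]
    m_r = np.sqrt(np.maximum((p1 + p2) ** 2 - pz ** 2, 0.0))

    # M_T^R = sqrt((MET*(pT1 + pT2) - MET_vec . (pT1_vec + pT2_vec)) / 2)
    pt1 = np.linalg.norm(j1[:, :2], axis=1)
    pt2 = np.linalg.norm(j2[:, :2], axis=1)
    met_mag = np.linalg.norm(met, axis=1)
    dot = met[:, 0] * (j1[:, 0] + j2[:, 0]) + met[:, 1] * (j1[:, 1] + j2[:, 1])
    mtr = np.sqrt(np.maximum(0.5 * (met_mag * (pt1 + pt2) - dot), 0.0))

    # R^2 = (M_T^R / M_R)^2
    r2 = (mtr / m_r) ** 2
    return m_r, r2
```

For two back-to-back transverse megajets with no missing energy, M_R equals the dijet invariant mass and R² vanishes, which provides a quick sanity check of the implementation.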
                   All versions   This version
Views                    75             75
Downloads                17             17
Data volume            8.9 MB         8.9 MB
Unique views             71             71
Unique downloads         17             17

