13 August 2019
CERN
Europe/Zurich timezone
There is a live webcast for this event.

Contribution List

  1. 13/08/2019, 14:00
  2. Debdeep Paul
    13/08/2019, 14:10

    The project dealt with real-time monitoring of the server hosting the FPGA for Convolutional Neural Network inference in the ProtoDUNE project. I also investigated methods to handle numerical overflow of the weights and activations when working with fixed-point arithmetic on the FPGA.

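    The overflow handling this abstract alludes to can be illustrated with a saturating fixed-point quantizer (a generic sketch, not the project's actual code; the 4.12 bit split is an arbitrary assumption):

```python
def to_fixed(x, int_bits=4, frac_bits=12):
    """Quantize a real value to signed fixed-point with saturation.

    Saturation clamps values that would overflow the representable
    range instead of letting them wrap around -- one common way to
    cope with overflow of weights/activations in FPGA arithmetic.
    """
    scale = 1 << frac_bits
    lo = -(1 << (int_bits + frac_bits - 1))      # most negative code
    hi = (1 << (int_bits + frac_bits - 1)) - 1   # most positive code
    q = int(round(x * scale))
    return max(lo, min(hi, q))

def from_fixed(q, frac_bits=12):
    """Convert a fixed-point code back to a float."""
    return q / (1 << frac_bits)

# An in-range value round-trips exactly; an out-of-range value saturates
# near the top of the 4.12 range (just under 8.0) instead of wrapping.
print(from_fixed(to_fixed(1.5)))
print(from_fixed(to_fixed(100.0)))
```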
  3. Ishank Arora
    13/08/2019, 14:17

    The IT-ST group at CERN runs and evaluates innovative cloud storage technologies for their application to big data problems in high-energy physics research. One of the entities it focuses on is EOS, the CERN multi-Petabyte disk-based storage service built from commodity hardware, heavily used as well by LHC and non-LHC experiments. The massive scale at which EOS runs leads to room for multiple...

  4. Giovanni De Toni
    13/08/2019, 14:24
  5. Andrea Lacava
    13/08/2019, 14:31

    CERN has one of the most heterogeneous networks in the world, and to keep its traffic safe we have to inspect it in real time.

    We already have an Intrusion Detection System that inspects CERN traffic at the firewall, but we are looking to make it more powerful and more reliable.

    In my work I have focused on upgrading the IDS to support multiple hardware vendors and improve its scalability...

  6. Raghav Kansal
    13/08/2019, 14:38
  7. Ms Elisabeth Ann Petit-Bois (Kennesaw State University)
    13/08/2019, 14:45

    OpenStack is a popular open source cloud-computing software platform used widely at CERN. EOS is a disk-based, low-latency storage service powering user, project, and experiment data on services such as CERNBox.

    This project strives to improve user experience by integrating EOS into OpenStack Manila, a shared storage system. This way, users are able to request and access project space via...

  8. Rajula Vineet Reddy
    13/08/2019, 14:52
  9. Akash Gupta
    13/08/2019, 14:59
  10. Leticia Farias Wanderley
    13/08/2019, 15:26
  11. Venkata Ravicharan Nudurupati
    13/08/2019, 15:33
  12. Shreya Krishnan
    13/08/2019, 15:40
  13. Shahnur Isgandarli
    13/08/2019, 15:47
  14. Anwesha Bhattacharya
    13/08/2019, 15:54

    Using Micron's FPGA-based inference engines and FWDNXT firmware and software to compile models and run inference.

  15. Hamza Javed
    13/08/2019, 16:01

    To benefit from modern machine learning in the early stages of data acquisition in a typical HEP experiment, one has to be able to execute ML model inference within the latency of the L1 trigger system. At the LHC, this time is of O(10) μs. The aim of this project is to deploy a set of LHC-related neural networks to Intel FPGAs.

  16. Jayaditya Gupta
    13/08/2019, 16:08
  17. Maksim Artemev
    13/08/2019, 16:15
  18. Foteini Panagiotidou
    13/08/2019, 16:22

    A tool that generates reports on the status of inveniosoftware repositories and suggests suitable actions.
