8–12 Sept 2025
Hamburg, Germany
Europe/Berlin timezone

Sparsity in neural networks

Not scheduled
30m
Hamburg, Germany

Poster Track 2: Data Analysis - Algorithms and Tools (Poster session with coffee break)

Speaker

Dr Mahsa Taheri Ganjhobadi (Postdoc, University of Hamburg)

Description

While physical systems are often described in high-dimensional spaces, they frequently exhibit hidden low-dimensional structure. A powerful way to exploit this characteristic is through sparsity. In this talk, we explore the role of sparsity in neural networks in two key contexts: (1) generative models, particularly diffusion models, where we demonstrate how sparsity can accelerate the sampling process; and (2) structured sparsity, at the level of connections, nodes, and layers, where we analyze its impact on the generalization error of neural networks. We also discuss how concepts of sparsity in machine learning extend to physics, uncovering deep connections between the two fields.
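
To make the structured-sparsity setting in (2) concrete, the sketch below shows one common way node-level sparsity can be induced: a group-lasso penalty on the rows of a hidden layer's weight matrix, which pushes entire neurons (rather than individual connections) toward zero. This is a minimal illustration assuming PyTorch; the model, penalty strength, and pruning threshold are hypothetical choices for demonstration and are not the specific method presented in the talk.

import torch
import torch.nn as nn

class SparseMLP(nn.Module):
    def __init__(self, d_in=20, d_hidden=64, d_out=1):
        super().__init__()
        self.hidden = nn.Linear(d_in, d_hidden)
        self.out = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        return self.out(torch.relu(self.hidden(x)))

def group_lasso_penalty(layer: nn.Linear) -> torch.Tensor:
    # Sum of L2 norms of the weight rows; row i holds the incoming
    # weights of hidden node i, so the penalty acts on whole nodes.
    return layer.weight.norm(dim=1).sum()

model = SparseMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-3  # sparsity strength (illustrative value)

x = torch.randn(128, 20)
y = torch.randn(128, 1)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y) + lam * group_lasso_penalty(model.hidden)
    loss.backward()
    opt.step()

# Nodes whose incoming weight norm is (near) zero can be pruned,
# yielding a structurally smaller network.
active = (model.hidden.weight.norm(dim=1) > 1e-3).sum().item()
print(f"active hidden nodes: {active} / {model.hidden.out_features}")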

Author

Dr Mahsa Taheri Ganjhobadi (Postdoc, University of Hamburg)

Presentation materials

There are no materials yet.