8โ€“12 Sept 2025
Hamburg, Germany
Europe/Berlin timezone

Repurposing Large Language Models

11 Sept 2025, 15:10
20m
ESA C

Oral, Track 3: Computations in Theoretical Physics: Techniques and Methods

Speaker

Daniel Schiller (Institute for Theoretical Physics Heidelberg)

Description

Foundation models have proven very successful on linguistic tasks, so there is a natural desire to develop analogous foundation models for physics data. Currently, existing physics networks are much smaller than publicly available Large Language Models (LLMs), which typically have billions of parameters. By applying pretrained LLMs in an unconventional way, we introduce large networks for cosmological data at a comparatively low training cost.
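The abstract does not spell out the method, but one common way to repurpose a pretrained LLM for non-linguistic data (in the spirit of "frozen pretrained transformer" approaches) is to freeze the large backbone and train only small adapter layers that map numerical data into and out of the model's embedding space, which keeps the training cost low. The sketch below is an assumption-laden illustration of that idea, using a tiny randomly initialised Transformer encoder as a stand-in for the pretrained backbone; all dimensions and layer names are hypothetical.

```python
import torch
import torch.nn as nn

D_MODEL = 64      # embedding width of the (stand-in) backbone
N_FEATURES = 8    # dimensionality of one cosmological data "token"
SEQ_LEN = 16      # number of tokens per example

# Stand-in for a pretrained LLM backbone (here: random weights).
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=4, batch_first=True),
    num_layers=2,
)
for p in backbone.parameters():
    p.requires_grad = False  # frozen backbone: the source of the cheap training

embed_in = nn.Linear(N_FEATURES, D_MODEL)  # trainable input adapter
head_out = nn.Linear(D_MODEL, 1)           # trainable regression head

# Mock batch of physics sequences, one scalar prediction per example.
x = torch.randn(4, SEQ_LEN, N_FEATURES)
y = head_out(backbone(embed_in(x))).mean(dim=1)

# Only a small fraction of all parameters is actually trained.
trainable = sum(p.numel() for p in (*embed_in.parameters(), *head_out.parameters()))
total = trainable + sum(p.numel() for p in backbone.parameters())
print(y.shape, f"trainable fraction: {trainable / total:.4f}")
```

In this setup only the adapters are optimised, so the gradient computation and optimiser state scale with a tiny fraction of the network, while the representation power of the large frozen backbone is still used at inference.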

Significance

We present a novel and unconventional method of utilising LLMs for physics data.

Authors

Ayodele Ore, Caroline Heneka, Daniel Schiller (Institute for Theoretical Physics Heidelberg), Florian Nieser (Heidelberg University), Tilman Plehn

Presentation materials