Scientific principles have played a central role in physics. The principle of relativity, the equivalence principle, the gauge principle, and the correspondence principle, to name a few, form the basis of our best current theories of nature. On the other hand, the apparent failure of the naturalness principle has sparked a crisis about the future direction of particle physics. From a philosophical point of view, however, physical principles remain undertheorized. Can the much more elaborate philosophical debates about the nature of laws and symmetries also be transferred over to principles, or do principles raise novel metaphysical issues? What methodological role have principles played in the historical development of physics and what can be learnt for contemporary practice?
The aim of this workshop is to bring together philosophers, historians, and physicists to discuss these and related issues, and to initiate a debate about principles in physics.
Invited Speakers:
Emily Adlam (Western University)
Alexander Blum (MPIWG Berlin)
Karen Crowther (University of Oslo)
David DiVincenzo (RWTH Aachen University)
Astrid Eichhorn (University of Southern Denmark)
Enno Fischer (Ruhr University Bochum)
Marco Giovanelli (University of Turin)
Andreas Weiler (Technical University Munich)
This workshop is organized within the framework of the DFG Research Unit "The epistemology of the Large Hadron Collider (LHC)".
Consistency is the most basic principle constraining any theory, whether physical or philosophical. This principle comes in various forms, including internal consistency, external consistency, and empirical consistency. It also underlies the generalised correspondence principle in physics. Consistency is, moreover, heavily relied upon in the search for a new theory: quantum gravity (QG). Arguably, the primary motivation for QG is to find a theory that consistently combines our current theories of fundamental physics, general relativity (GR) and quantum field theory (QFT), and describes the domains where both these theories are thought necessary. Additionally, there are perceived inconsistencies within GR and QFT which many physicists take to indicate that these are not in fact fundamental theories, and which thus also motivate the search for QG. The goal of consistency also underpins some of the most basic constraints upon the theory being sought. I explore the role of inconsistencies in motivating and constraining QG, as well as the status of consistency as a principle more generally.
In searching for a quantum theory of gravity, one may follow various candidate principles. I will advocate an approach based on three principles:
i) be conservative;
ii) connect to experiment; and
iii) go beyond research silos.
I will use research activities in asymptotically safe quantum gravity as an illustrative example.
Einstein’s general theory of relativity is often presented as a significant turning point, indicating a methodological shift in theoretical physics towards mathematically based patterns of reasoning and towards theories populated by abstract mathematical structures constructed using non-empirical guiding principles. This approach, which was advocated inter alia by Einstein himself from the mid-1920s, stands in contrast to his early presentation of the theory using the principle of general covariance and a principle of equivalence, which he described as empirical observations extended and promoted to fundamental principles. Einstein’s appeal to these principles, however, was immediately criticized, and never reached a mature form as a coherent construction of the theory. In this paper we follow the lead of the earlier Einstein, presenting a reconstruction of the basic structure of the theory and stressing how one can go a long way towards the general theory via inductive and empirical principles, without invoking geometrical considerations. The key theoretical device which we deploy in order to underwrite our demonstrations is what we call the ‘Methodological Equivalence Principle’, which, together with general covariance, is understood and employed as a straightforward extrapolation of empirical considerations. We show that understanding general covariance and the equivalence principle as methodological principles of theory construction can resolve issues concerning their physical content and applicability in general relativity, while at the same time keeping them as clearly formulated theoretical principles.
In this derivation the violation of general covariance prescribes the introduction of an additional field, to be interpreted as the metric field. We further show that this derivation can be regarded as a template for the later employment of invariance arguments. Thus, applications of the gauge principle similarly do not set off merely from an invariance requirement, but rather from the particular way in which the requirement is violated in an existing successful theory.
Finally, we turn to discuss the place of locality in the argument. Locality is commonly understood as a fundamental physical principle, or at least as a theoretical virtue. Yang and Mills, for example, motivated local gauge invariance by arguing that global invariance is inconsistent with the idea of a local field theory. We argue that this kind of argument misses the point. According to our construction, relativistic gravity and gauge theories share a notion of locality which is not imposed, but is rather an identified characterization of the evidence for the interaction-free theory. The applicability of inertial frames for the description of non-gravitational forces is perceived, in accordance with existing evidence, as a local matter, and therefore explained as such. The applicability of a preferred isospin convention is similarly explained as a local matter, determined by the local values of a conjectured bosonic field (which itself depends contingently on its interaction with fermionic matter fields). Our reconstruction of relativistic gravitation from principles therefore suggests that the case of general relativity shows that the weight of empirical considerations (as opposed to mathematical ones) is sometimes greater than is usually appreciated.
The principle that the dynamics of any open system should be derivable from the fundamental automorphic dynamics of a larger closed system represents what we will be calling, in this talk, the closed systems view. The closed systems view is deeply entrenched in physics. Standard quantum theory (ST) is no exception, and within it the closed systems view finds expression in the principle of complete positivity, a principle governing the dynamics of density operators that has accordingly taken on the status of a fundamental physical principle for many. Although ST is a highly successful theoretical framework that has been used fruitfully for the study of all sorts of systems, there are nevertheless reasons to look beyond it. In particular we will argue in this talk that the proper subject of foundational and philosophical study in quantum theory is what we will be calling the general quantum theory of open systems (GT), an alternative theoretical framework for quantum physics, formulated in accordance with what we call the open systems view, in which systems are fundamentally represented as being in interaction with their environments. In GT, physical systems are represented by density operators evolving non-unitarily in general. As we will argue, complete positivity need not be imposed as a fundamental physical principle in GT, and GT is in this sense a more general dynamical framework than ST, even though it adds nothing to the Hilbert space formalism of quantum theory. That is, GT, unlike ST, straightforwardly allows us to model the non-unitary dynamics of systems in fundamental terms, and in particular allows us to model the dynamics of the universe as a whole as if it were initially a subsystem of an entangled system. We will argue that there are reasons, stemming from gravitational physics and cosmology as well as from applications of ST itself, to take such dynamical possibilities seriously and to adopt GT as the preferred theoretical framework for quantum physics.
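For reference, the principle of complete positivity can be stated in the standard Kraus form: a trace-preserving dynamical map \(\Lambda\) on density operators is completely positive just in case it can be written as
\[
\Lambda(\rho) \;=\; \sum_k K_k\, \rho\, K_k^{\dagger}, \qquad \sum_k K_k^{\dagger} K_k = I .
\]
Dropping this requirement, as GT does, admits linear maps that preserve positivity on the system's own state space but not on every extension of the system to an entangled composite.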
Gauge symmetry is highly successful at predicting the structure of all interactions among particles in the Standard Model. Yet among the principles of QFT (causality, covariance, probability on a Hilbert space), it stands out: it addresses exclusively non-observable entities (gauge potentials and Fermi fields). It may be doubted how “fundamental” such a principle is from an ontological point of view.
Quantum gauge symmetry can only be formulated on auxiliary state spaces with indefinite metric (“negative probabilities”). The BRST method allows one to return to a Hilbert space. Charged interacting fields are not defined on this Hilbert space, because they are not BRST-invariant.
An alternative is presented with the same (indeed superior) predictive power concerning the structure of interactions. It proceeds directly on the physical Hilbert space and allows one to construct interacting charged fields. These necessarily have a weaker localization than the observable fields (which are the same as in the BRST setup).
Joint work with Jens Mund and Bert Schroer, arXiv:2209.06133v2.
One sense in which a physical claim may qualify as a “principle”, as opposed to, say, a law, is that it expresses a typically general-sounding but vague idea. The talk will deliver a case study of this notion. While there is a longstanding discussion about the interpretation of the extended, general principle of relativity, there seems to be a consensus that the special principle of relativity is an absolutely clear statement. However, a closer look at the literature on relativistic physics reveals a more confusing picture. The talk will illustrate this situation by discussing how Einstein uses the special relativity principle in his 1905 paper. It will be pointed out that Einstein applies three different versions of the principle—three different statements with different physical content. The first version is the relativity principle as applied in the magnet-conductor thought experiment, with which Einstein famously begins, and motivates, his analysis. The second version is the requirement of covariance that Einstein uses in deriving the transformation laws of the electric and magnetic field strengths in the electrodynamical part of his article. The third variant is the way in which Einstein applies the relativity principle when deriving the equation of motion of the moving point charge in the closing section of the 1905 paper. It will be shown how each of the three versions is problematic and often vague in its own terms, and how they are manifestly nonequivalent, two of them even being mutually contradictory. Along the way, our analysis will lead us to pose several obvious, but not obviously answerable, questions about the precise meaning of the principle of relativity. To turn the statement of the relativity principle from a vague idea into an unambiguous physical claim, each of these questions requires a sharp answer.
In this talk I will introduce the operational theories approach to research in quantum foundations and undertake a reconstruction of the epistemic significance of this research. I will argue that the space of operational theories is analogous to the space of possible worlds employed in the possible-worlds semantics for modal logic, so that research of this sort can be understood as probing modal structure. I will then argue that operational axiomatisations of quantum mechanics may be interpreted as a novel form of structural realism, and I will discuss the consequences of this interpretation for the philosophy of structural realism and the future of operational theories.
My main job is to make quantum computers (QCs) happen, and I will talk about that. QCs are very low-energy machines, based on the premise that there are extremely accurate effective Lagrangians describing their domain. Trapped-atom devices have been pretty successful for building QCs, and the atomic Lagrangian is indeed highly descriptive -- even considered "fundamental" by some workers, despite requiring many more than 92 arbitrary parameters (atomic masses). The Lagrangian for superconducting devices is, as in nuclear physics, not so descriptive, and troubles constantly occur due to this fact. I will reflect more generally on effective Lagrangians, and pose the question, in the epistemic spirit of "turtles all the way down", whether we can feel we are doing fundamental physics without quarks, and without most of the Standard Model.
Scientific principles in physics can be understood as useful in the context of discovery, but rarely crucial in the context of justification (to recall a distinction due to Hans Reichenbach)---at least not in the physics of space-time and gravity, where principles have seemed particularly important. Consequently neither the need to evaluate principles philosophically nor the apparent failure of some attractive principle(s) generates a crisis for physics or the philosophy of science.
The dispensability of principles in the justification of space-time physics can be understood using two converging lines of research: (1) the particle physics tradition of gravity, which from the late 1930s to the early 1970s generated a more compelling nuts-and-bolts argument for Einstein’s equations, something like an eliminative induction, than is yielded by such principles as relativity, equivalence, general covariance, etc., and (2) historiographic work on the history of General Relativity, which uncovered Einstein’s dual strategy, including his now-famous principles and a less remembered “physical strategy” involving criteria such as an analogy to electromagnetism, a link between energy conservation and gravitational field equations, and an analogy to Newtonian gravity.
The particle physics tradition in gravity, involving key contributions by Pauli, Fierz, Kraichnan, Gupta, Feynman, Weinberg, van Nieuwenhuizen, Deser, Duff, van Dam, Veltman, etc., explored gravity assuming (at least) Poincaré invariance, stability (a key criterion ruling out vector potentials and vector parts of tensor potentials), gravity as described by a symmetric rank-2 tensor potential (the simplest possibility once the bending of light refuted scalar theories such as Nordström’s), and long or infinite range. Excluding negative-energy vector components largely fixes the free gravitational equation to have gauge freedom, which in turn requires that any source be conserved, which leaves the total (material + gravitational) stress-energy as the only plausible candidate. A change of variables then shows that gravity and space-time merge in the field equations. “Graviton mass terms,” though prima facie possible, encounter various devils in the details, a long story with no little historical contingency and much modern (2010s) attention in physics. “Principles” such as equivalence and general covariance emerge rather as theorems, starting from premises involving modest empirical facts and mathematical requirements for viable field theories. Much of this reasoning, one notices, parallels Noether’s discussion of the converse Hilbertian assertion, that the stress-energy complex involves a term vanishing with the field equations and an identically conserved term.
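The pivotal step admits a compact sketch in standard linearized notation (with \(\kappa\) a coupling constant): the free massless spin-2 equation has the gauge freedom
\[
h_{\mu\nu} \;\to\; h_{\mu\nu} + \partial_\mu \xi_\nu + \partial_\nu \xi_\mu ,
\]
and since the linearized Einstein tensor obeys the identity \(\partial^\mu G^{\mathrm{lin}}_{\mu\nu} \equiv 0\), coupling it to a source via \(G^{\mathrm{lin}}_{\mu\nu} = \kappa\, T_{\mu\nu}\) is consistent only if \(\partial^\mu T_{\mu\nu} = 0\).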
Recent historiography on Einstein’s process of discovery, due to Stachel, Renn, Janssen, Norton, Sauer, van Dongen, etc., has recovered Einstein’s above-mentioned “physical strategy.” Van Dongen has argued that Einstein suppressed his physical strategy, which he deemed a failure, in order to bolster his unified field theory program. One notices that Einstein’s physical arguments resemble the ideas later developed in the particle physics tradition, so the physical strategy was indeed viable.
Hence the prominence of principles in space-time physics is historically contingent; physics still primarily rests on mathematics, logic and experiment.
In the context of information-theoretical reconstructions of quantum mechanics, Einstein’s distinction between constructive and principle theories is typically believed to support arguments against realist (or ψ-ontic) interpretations of quantum mechanics. For instance, one argues that since such interpretations have neither the explanatory power of a principle theory, nor the explanatory power of a constructive theory, they must be rejected as explanatorily defective. Realist interpretations typically have a representation-based metasemantics, i.e., they explain why quantum terms have the semantic values they are taken to have by appeal to the very semantic rules that assign those values to the terms. In this sense, realist interpretations make metasemantics easy. The question addressed in this paper is whether information-theoretical reconstructions can keep easy metasemantics despite their rejection of realist interpretations.
After reviewing a couple of arguments advanced by proponents of information-theoretical reconstructions against realist interpretations, I consider in some detail the following answers to that question:
1. Adopt some version of a ψ-epistemic interpretation of quantum mechanics, according to which the nature of the quantum state is such that although it does not represent physical reality, it does represent our knowledge of (or information about) it.
2. Re-conceive the nature of fundamental physical reality. For instance, this may be considered not mechanical, but informational, or as not consisting of objects, but of (informational) structure all the way down.
3. Transfer the easy metasemantics of the general principles to quantum mechanics by reconstruction. If such transfer is possible, then quantum mechanics is said to acquire a precise meaning in virtue of the first principles.
4. Give up easy metasemantics. Reconstructed quantum mechanics can be taken as a non-Boolean probability theory, rather than a theory that describes physical reality. Its meaning then must be determined by something other than semantic rules.
My discussion will focus especially on options 2 and 4. In particular, I argue that it is precisely as a consequence of considering the implications of 2 that one is led to endorse 4. However, I also argue that giving up easy metasemantics is avoidable. Quantum mechanics, reconstructed from general principles, may be considered as a probability theory without blocking easy metasemantics.
I will give a historical overview of the various claims that (certain) physical quantities should be represented by analytic functions. I will show that there were two very distinct reasons for postulating such an “analyticity principle”: (i) the expectation that (our representation of) the world should be both mathematically simple and infinitely smooth, an expectation that came to be disputed, in particular by the French mathematicians Henri Poincaré and Jacques Hadamard, and (ii) the association of analyticity with causality, which began in the work of Hans Kramers and Ralph Kronig and then became central to 1960s S-matrix theory.
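The association of analyticity with causality in (ii) is standardly encapsulated in the Kramers-Kronig dispersion relations: for a linear response function \(\chi(\omega)\) that vanishes before its stimulus, and which (given suitable falloff) is therefore analytic in the upper half of the complex \(\omega\)-plane, the real and imaginary parts determine one another,
\[
\operatorname{Re}\chi(\omega) \;=\; \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty} \frac{\operatorname{Im}\chi(\omega')}{\omega' - \omega}\, d\omega' ,
\]
where \(\mathcal{P}\) denotes the principal value.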
In this paper I will analyse Cassirer’s conception of physical principles in order to argue that (i) they are universal, meaning that they do not entail any definite content belonging to a particular phenomenon or a specific region of the physical domain, but rather refer to the algebraic operations according to which we decipher and understand classes of physical phenomena; (ii) physical principles prove to have a heuristic value in the construction of physical theories, namely, a capacity to orient the inquiry towards the derivation of physical laws for a normative determination of empirical domains; and (iii) the function of physical principles is grounded in the capacity of ‘synopsis’, i.e., the functional connection that is made possible between different phenomena and regions of physical domains. I will then argue that the characteristics of physical principles and their function allow us to shed light on the process of theory construction as a theoretical development independent of any ontological commitment of the underlying physical theory.
In this regard, I will apply Cassirer’s conception of physical principles to the historical development of Schrödinger’s undulatory mechanics, in order to argue that, on the one hand, Schrödinger sought an undulatory mechanics by means of a generalization of Hamilton’s principle, and that, on the other hand, the undulatory ontology results as a by-product of the constitution of the theory itself. I will focus on Schrödinger’s first and second papers on undulatory mechanics: Quantization as a Problem of Proper Values (Ger. Eigenwert) (Part I) and Quantization as a Problem of Proper Values (Part II). In the first paper Schrödinger reflects on the derivation of the whole quantum numbers in the case of the nonrelativistic hydrogen atom, by treating quantization as a variational problem. In the second paper, Schrödinger pursues the formulation of undulatory mechanics further by deepening Hamilton’s mechanical-optical analogy.
Hamilton formulated an analogy between Fermat’s principle of least time for light rays and Maupertuis’s principle of least action for mechanical systems, by working on the extremal laws of geometrical optics and classical particle mechanics. I will show that Hamilton’s analogy between physical principles supplied Schrödinger with two argumentative strategies for the justification of undulatory mechanics. This is because Hamilton’s analogy provided Schrödinger with two formal levels of analogy, i.e., at the level of variational principles and at the level of differential equations. With respect to variational principles, Hamilton’s theory is grounded in the analogy between Fermat’s principle and Maupertuis’s principle; with respect to differential equations, it is based on the analogy between the eikonal equation and the Hamilton-Jacobi equation for the characteristic function W.
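In a standard modern presentation, the two levels of the analogy read (with n the refractive index, and units chosen so that the eikonal W is the optical path length):
\[
\delta \int n \, ds = 0 \quad \longleftrightarrow \quad \delta \int \sqrt{2m\,(E - V)}\; ds = 0 ,
\]
\[
(\nabla W)^2 = n^2 \quad \longleftrightarrow \quad (\nabla W)^2 = 2m\,(E - V) ,
\]
Fermat’s principle corresponding to Maupertuis’s, and the eikonal equation to the time-independent Hamilton-Jacobi equation.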
In conclusion, I will argue that it was reflection on the functions of physical principles, and not the search for a wave picture of nature, that originally paved the way for the construction of Schrödinger’s undulatory mechanics.
Keywords: Cassirer; Physical Principles; Schrödinger; Hamilton’s Analogy; Undulatory Mechanics.
I describe a new principle of relativity involving the topology of space-time. On its basis I derive the following:
1) the gauge group of nature must be SU(5);
2) there must be exactly 3 generations of fermions;
3) after symmetry breaking, the standard model is the only possible low-energy theory in the chiral sector;
4) the dark matter of the universe consists of Dirac triplets, or at least an even number of Majorana triplets.
Scientific principles are not static. As scientific inquiry proceeds, principles can go through a series of phases and processes, including a prehistory, a phase of elevation, and processes of formalization, generalization, and challenge. In this talk I will illustrate this ‘life cycle’ of scientific principles with a few examples from physics. I will also make some tentative suggestions as to whether a principle’s going through these phases and processes might correlate with the principle’s usefulness in scientific research.
Talk based on joint work with Radin Dardashti and Robert Harlander.
Toward the end of 1919, in a two-column contribution for The Times of London, Einstein famously declared relativity theory to be a 'principle theory,' like thermodynamics, rather than a 'constructive theory,' like the kinetic theory of gases. In the last twenty years, this distinction has attracted considerable attention in both the historically and the theoretically oriented scholarship. As it turns out, its popularity has somewhat overshadowed its core message. Einstein introduced not only a distinction between two types of theories, but also a distinction between two strategies for finding theories. Thus, Einstein's 1919 article should be read not so much as an abstract philosophical reflection, but rather as a personal testimony of a practicing physicist. As Einstein once wrote jokingly in a letter to his friend Ehrenfest, he was, with few others, a principle-pincher (Prinzipienfuchser), ready to squeeze as much as possible from a few fundamental principles, rather than a profligate virtuoso, squandering his calculational mastery on trifling puzzle-solving. The distinction between constructive and principle theories allows us to catch a glimpse behind the wizard's curtain, to unveil his most successful trick: instead of proceeding in a constructive way, by directly searching for new theories, Einstein preferred to search first for principles, formal conditions that constrain the number of possible theories. Most, if not all, of Einstein's scientific successes were obtained by following the principle strategy; most, if not all, of his failures happened when he was forced to fall back on the constructive strategy.
Differential time-evolution equations have long been the paradigm examples of laws of nature. Yet, the 20th century saw the rise to prominence of atemporal conditions and principles in many areas of physics. This paper looks at a particularly radical strand of this tendency: the attempt to do away with time-evolution equations entirely in relativistic quantum theory. I make a case for recognising a continuous tradition of these attempts—which I call the long S-matrix programme—stretching from the 1940s to the 1970s.
The story starts with Heisenberg’s introduction of the S-matrix in the 40s. He believed that the divergence difficulties facing quantum field theory (QFT) stemmed from the use of a differential time-evolution equation. Not unreasonably, since the dynamical equations governing locally interacting fields contained products of field operators at coincident space-time points, and this, even today, is seen as the root of many of the foundational difficulties with the theory. Heisenberg’s idea was that one could, instead, set up a relativistic scattering theory by imposing conditions directly on the S-matrix, with unitarity and Lorentz invariance being the two he came up with. Rather than dismissing Heisenberg’s approach as an oddity of pre-war physics which was entirely swept aside by the success of renormalized quantum electrodynamics, I argue that a number of influential programmes in the 50s and 60s were direct continuations of Heisenberg’s dream of a Lagrangian-free physics.
The consensus that emerged amongst these followers was that a causality condition of some kind needed to be added to Heisenberg’s list of principles. Different causality conditions were developed, leading to a family of sibling research programmes:
i) Causal perturbation theory: Stueckelberg and later Bogoliubov developed a causality condition designed to set up a series expansion for the S-matrix without starting from a Lagrangian (see the sketch following this list). This approach remained peripheral in the 50s and 60s but would later be adopted by mathematical physicists attempting to reconstruct the conventional perturbative formalism in a more rigorous way.
ii) Axiomatic quantum field theory: Somewhat surprisingly, the early work of Haag and Wightman can be seen as a continuation of the S-matrix concept, with the microcausality condition now being imposed on the basic structures of the theory. Early axiomatic QFT was also driven by a desire to avoid writing down particular Lagrangians and to progress as far as possible using only general principles.
iii) The 60s S-matrix programme: Chew’s S-matrix approach to strong interactions in the 60s did not come out of nowhere. It was a continuation of work on dispersion relations and n-point functions in the 50s which had often been allied to early axiomatic QFT. The analyticity conditions which were now imposed on the S-matrix were interpreted as expressing a new causality condition, and a new concerted effort was made to construct a Lagrangian-free phenomenological description of strong interactions.
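Bogoliubov’s condition of (i), in the form later standardised in causal perturbation theory, may serve as an illustration: writing the S-matrix as a functional S(g) of a test function g that switches the interaction on and off, causality demands
\[
S(g_1 + g_2) \;=\; S(g_2)\, S(g_1) \quad \text{whenever} \quad \operatorname{supp} g_1 \cap \big(\operatorname{supp} g_2 + \bar V_+\big) = \emptyset ,
\]
i.e., whenever no part of the support of \(g_1\) lies in the causal future of the support of \(g_2\).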
While the dream of completely eliminating Lagrangians from high energy theory never came to pass, many of these sub-programmes were ultimately rebranded and remain influential today, especially in mathematical physics.
Physical principles separate the physical from the unphysical. They come in various forms and functions. The naturalness principle, for instance, is primarily a heuristic for model builders in present-day elementary particle physics, whose specific form and validity have been widely debated in recent years. In some cases, such as Mach’s principle, debates last forever and extend into genuinely epistemological domains. Other principles, among them general covariance, the gauge principle, or quantum locality, are deeply ingrained in the formal structure of the respective theories. Debates about their meanings combine formal and philosophical arguments. This is especially the case if those theories can be given an axiomatic foundation, such that the traditional Carnapian methods of philosophy of science as logical analysis of conceptual frameworks can take hold. While doing so follows in the footsteps of the modern understanding of axiomatics developed by the Hilbert school, the concrete execution of Hilbert’s Sixth Problem, the axiomatization of physics after the model of geometry, has often been less pure than present-day philosophers assume. In my talk, I discuss Hilbert’s own axiomatization of relativity theory, von Neumann’s larger program of the axiomatization of quantum theory, and the early phase of axiomatic approaches in the late 1950s and early 1960s. It will turn out that such axiomatizations are often guided by broader physical and philosophical intuitions, are sometimes more opportunistic than foundational, and are strongly influenced by the availability of suitable mathematical theories.
Physical principles are statements pertaining to physical theories, the possibilities they represent, and what can be inferred from them. One can classify principles in many cross-cutting ways, but doing so according to the principles’ scientific function can be particularly illuminating, illustrating the diversity of roles they can play and how those roles can change over time. Using the theory of relativity (both special and general) as an example, I describe five functions: representational, axiomatic, law-like, focal, and heuristic.
Representational principles delineate how mathematical structures in the models of a physical theory represent objects, properties, or relations. For instance, the “clock hypothesis” of relativity theory asserts that the length of any timelike curve represents the duration of a point-like process along that curve. Thus, if that curve traces the worldline of a particle, the lapse of a parameter tracking this duration serves as a dynamical parameter for the particle’s equation of motion. Representational principles are especially important in what Einstein called constructive theories, where the mathematical models of the theory are given explicitly. In order for such models to play their theoretical role, they must be supplemented with representational principles.
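In standard notation (signature (+,−,−,−), units with c = 1), the clock hypothesis identifies the duration along a timelike curve \(\gamma\) with its length:
\[
\tau[\gamma] \;=\; \int_\gamma \sqrt{\, g_{\mu\nu}\,\frac{dx^\mu}{d\lambda}\,\frac{dx^\nu}{d\lambda}\,}\; d\lambda .
\]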
Axiomatic principles, by contrast, serve at once to specify the theory itself and what possibilities the theory represents. In Einstein’s original formulation of the special theory of relativity, the principle of (special) relativity allows one to generate possibilities by setting a coordinate system into uniform translation. Such principles are essential for what Einstein called principle theories, where the mathematical models are not given explicitly but must be found as those structures that satisfy the principles. Over different formulations of a theory, an axiomatic principle can become a representational one, as the principle of general covariance arguably has.
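For instance, in standard form, the principle carries any solution to another solution under the transformation to a coordinate system in uniform translation with velocity v along the x-axis (units with c = 1):
\[
t' = \gamma\,(t - v x), \qquad x' = \gamma\,(x - v t), \qquad \gamma = \frac{1}{\sqrt{1 - v^2}} .
\]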
Law-like principles constrain or reduce the possibilities that a theory—constructive or principle—would otherwise endorse. As the name suggests, they are sometimes expressed as physical laws, but are often deemed principles when they find expression across many different theories. In special relativity, energy conservation (i.e., that the energy-momentum tensor is divergence-free) expresses such a principle, one that is then subsumed under the Einstein Field Equation in general relativity. Energy conditions are also examples of law-like principles.
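In standard notation (geometrized units), the conservation principle and its subsumption read
\[
\nabla^\mu T_{\mu\nu} = 0 , \qquad G_{\mu\nu} = 8\pi\, T_{\mu\nu} , \qquad \nabla^\mu G_{\mu\nu} \equiv 0 ,
\]
so that in general relativity the divergence-freedom of the energy-momentum tensor follows from the field equation via the contracted Bianchi identity.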
Focal principles draw out or emphasize a particular consequence of a theory for understanding, inference, or calculation. They do not provide extra constraints or meaning to mathematical models, but facilitate explanations and deductions. Action principles for deriving field equations are examples.
Heuristic principles suggest connections with or constraints on future theories. In this way, they are like aspiring axiomatic principles. For instance, the strong principle of equivalence suggests a procedure for taking a special relativistic matter theory and producing a corresponding general relativistic matter theory. The principle of background independence suggests constraints for future theories of quantum gravity.