Within the last decade, progress in quantum information theory (QIT) has revolutionized our understanding of fundamental concepts in physics, and science in general, and expanded our computational capacity. Tools and techniques from QIT are bringing new perspectives into fields such as quantum field theory and cosmology, and are inspiring new questions and research directions. Many of these ideas have the potential to fundamentally shift our thinking about complex quantum systems towards a perspective in which information and information-theoretic concepts play a primary and essential role. This workshop will highlight recent developments in these areas.
The goal of this workshop is twofold:
In this lecture I shall introduce some basic notions needed for quantum information theory, including qubits, density matrices, entanglement, quantum logic gates, and basic quantum algorithms. The lecture will assume familiarity with basic quantum mechanics.
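As a concrete illustration of these notions (my own sketch, not part of the lecture material), a Bell state can be built from the gates named above, and its entanglement diagnosed from the purity of a reduced density matrix:

```python
import numpy as np

# Build the Bell state |Phi+> = (|00> + |11>)/sqrt(2) from basic gates:
# Hadamard on qubit 0, then CNOT (control 0, target 1), acting on |00>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
ket00 = np.array([1.0, 0.0, 0.0, 0.0])
bell = CNOT @ np.kron(H, np.eye(2)) @ ket00

# Density matrix of the pure state, and the reduced state of qubit 0
rho = np.outer(bell, bell.conj())
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out qubit 1

# Purity Tr(rho_A^2) = 1/2 < 1 signals that the pure state is entangled
purity = np.trace(rho_A @ rho_A).real
print(purity)  # 0.5
```

The reduced state being maximally mixed (purity 1/2) is exactly the entanglement criterion for pure states that the lecture's density-matrix formalism makes precise.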
In this lecture I shall introduce some basic applications of quantum information ideas, such as quantum key distribution (QKD), Bell's theorem, and NMR systems as basic quantum processors.
The notion of computational complexity has recently attracted attention in several fields, in part due to its conjectured holographic connection to the volume of black hole interiors. In the context of classical or quantum computers it arises as a natural quantity that counts the smallest number of gates needed to produce a desired output state from a circuit. With this analogy it can be readily computed for many physical systems upon an appropriate identification of input and output state, the gates and their associated cost. This identification is, however, far from unique. Specifically, the choice of gates and their cost is highly ambiguous. A natural question is whether all these choices have some holographic interpretation.
In this talk I will discuss the notion of spread complexity and how it succeeds in providing a less ambiguous approach to complexity than, for example, the Nielsen approach. Particular attention will be given to the physical features of a quantum system it encodes and to its relation to symmetries of the manifold of accessible target states. Finally, I will discuss ways in which spread complexity is related to the Fubini-Study metric, and how one of these relations hints at a possible gravitational dual.
The last few years have seen the birth of multi-messenger astronomy. Two prominent publications initiated the field: the first described the concurrent discovery of gravitational waves (GW170817) and a gamma-ray burst (GRB 170817A) linked to a binary neutron star merger; the second claimed the association of the blazar TXS 0506+056 with extremely-high-energy neutrinos detected by IceCube, in addition to it being a source of very-high-energy gamma rays as observed by H.E.S.S. Locally, astronomers have been hard at work establishing strong collaborations between astronomical facilities in different wavebands: gamma rays (H.E.S.S.), optical (SALT), and radio (MeerKAT). Additional wavebands are accessible via archival data repositories of gamma rays (Fermi LAT) and X-rays (XMM-Newton, NICER, NuSTAR, Chandra). Within this dynamic field, our group's work has centred on modelling pulsars and pulsar-related systems: we study particles and photons from isolated and binary pulsars, pulsar wind nebulae, and pulsars in globular clusters. We have also branched out into radio science, successfully obtaining time on the MeerKAT radio telescope to search for persistent radio emission surrounding fast radio bursts (FRBs); the latter are probably linked to magnetars, which are highly magnetic pulsars. We have also embarked on radio pulsar searches in nearby galaxies using MeerKAT. Many challenges and unanswered questions remain, with precision data providing opportunities to test and fine-tune sophisticated pulsar models that have strong computational requirements. Cross-disciplinary research invoking quantum information technologies (QITs) to increase computational and classification capabilities is a promising avenue. I will review our pulsar modelling research and explore some ideas where QIT may aid this fascinating work.
The observation that distant supernovae are receding faster than expected has changed our perception of cosmology. It now appears that the universe is currently accelerating, as opposed to decelerating as previously thought. Explaining this acceleration is one of the most important issues in cosmology. A new kind of matter with negative pressure, dubbed dark energy, has been postulated. The most widely accepted model is the so-called ΛCDM model in general relativity. Despite being the best model to date, several unresolved issues motivate researchers to search for better models, either in general relativity or in alternative theories of gravity.
In this talk, a review of the standard ΛCDM model in general relativity is first given. Then some alternative models in general relativity and in modified gravity theories are touched upon. Finally, "non-standard" ways of resolving the dark energy problem are briefly mentioned.
We review neutrino oscillations from the theoretical and experimental points of view, analysing how neutrino oscillations were first proposed, how we measure them, and what the current status is.
Some comments on the absolute neutrino mass scale and the mass ordering are also offered.
We discuss mostly the standard three-neutrino case, and briefly introduce the case of light sterile neutrinos.
After a short introduction to cosmology, we discuss how neutrinos affect the expansion of the universe in different epochs, analysing their impact on early- and late-time observables and how we can use cosmological measurements to constrain neutrino properties.
Neutrino decoupling, Big Bang nucleosynthesis, Cosmic Microwave Background and related constraints are presented.
We consider the possibility that the observed neutrino oscillations are due to refraction effects on very light scalar dark matter. Properties of the effective neutrino mass-squared responsible for the oscillation effects are studied. In particular, the dependence of the effective mass on the state of the medium, a cold gas of particles or a classical scalar field, is explored. Cosmological consequences of such a scenario are considered. It is shown that the cosmological bound on the sum of neutrino masses can be avoided in the case of refraction on a bath of particles.
The phenomenon of entanglement and the nonlocal features of quantum correlations were initially introduced as an elegant challenge to quantum mechanics. However, owing to the development of quantum information science, these quantum mechanical features have been reassessed and elevated to resources that may be exploited to achieve tasks not possible within the realm of classical physics. Along these lines, quantum resource theories provide the framework to study and quantify these quantum effects, develop new protocols for their detection, and identify processes that optimize their use for a given application. Due to its weakly interacting nature, a system of oscillating neutrinos can maintain quantum coherence over long distances, which can be detected in long-baseline experiments. Hence, neutrinos may prove to be promising candidates for various quantum information tasks. Moreover, analysing various aspects of quantum information and computation serves as an alternative platform that can reveal important information about several open problems in the neutrino sector. In this talk, I will discuss various such measures of nonclassicality in the context of neutrino flavor oscillations.
Terrestrial and solar neutrino experiments have produced a variety of anomalous data that have resisted clarification. Recently, the significance of the anomaly in measurements of neutrinos from intense sources on gallium has passed 5σ, and other hints from MicroBooNE and elsewhere remain interesting. I will present the latest update on these anomalies. I will then explain the primary reasons why they cannot simply be interpreted as a 1 eV sterile neutrino, due to constraints from other experimental probes, notably solar neutrinos and cosmological data sets. I will present a novel, simple model that evades many of these constraints by adding, beyond a sterile neutrino, one new particle, which is the dark matter, leading to shape-shifting sterile neutrinos.
The Standard Model is widely accepted as one of the most successful predictive theories of physics, providing insight into the fundamental building blocks of the universe. Over the last few decades this model has shown signs of incompleteness, many of which are attributed to neutrinos. Within the confines of the Standard Model a discrepancy exists related to vanishing neutrino masses, which contradicts the experimental observation of neutrino oscillation [arXiv:hep-ph/0211168]. Neutrino oscillation depends on seven parameters: three mixing angles $\theta_{12}$, $\theta_{23}$, $\theta_{13}$, a Dirac phase $\delta_{CP}$ due to CP violation, and the three mass states $m_1$, $m_2$, $m_3$. The values of $\theta_{12}$, $\theta_{13}$, $\Delta m^2_{21}$ and $|\Delta m^2_{32}|$ are well determined, whilst $\theta_{23}$, $\delta_{CP}$ and the mass hierarchy (whether $m_1 < m_2 < m_3$, the normal ordering, or $m_3 < m_1 < m_2$, the inverted ordering) remain undetermined.
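For orientation, the standard two-flavor approximation (a textbook formula, not specific to this talk) shows how a mixing angle and a mass-squared splitting enter the oscillation probability; the sketch below evaluates it numerically with illustrative round numbers:

```python
import numpy as np

def survival_probability(theta, dm2_ev2, L_km, E_gev):
    """Two-flavor survival probability P = 1 - sin^2(2θ) sin^2(1.27 Δm² L / E),
    with Δm² in eV², L in km, E in GeV (1.27 collects the ħ and c factors)."""
    return 1.0 - np.sin(2 * theta) ** 2 * np.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Illustrative numbers only, not a fit: atmospheric-scale Δm² ≈ 2.5e-3 eV²,
# near-maximal mixing θ ≈ π/4, a 1300 km baseline and a 2 GeV beam energy.
print(survival_probability(np.pi / 4, 2.5e-3, 1300.0, 2.0))
```

The full three-flavor probability that the talk's seven parameters govern reduces to this form when one mass splitting dominates.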
Gamma-ray bursts (GRBs) have been detected out to very high redshift (z = 9.4), which makes them interesting cosmological probes. Many studies in recent decades have attempted to use GRBs as cosmological standard candles, like Type Ia supernovae (SNe). These studies explore different phenomenological relations between GRB spectral properties and energetics, such as the Amati and Yonetoku correlations, of which the latter relies strongly on the cosmological model. If GRBs can be used as cosmological standard candles, just like Type Ia SNe, which have been observed only up to z < 2, they can be used to probe the evolution of dark energy over the history of the universe. I will explain how to use different machine-learning techniques to constrain cosmological parameters with GRBs and how to estimate the redshifts of these bursts.
The detection of gravitational waves (GWs) allows the study of massive binary systems that may or may not have electromagnetic (EM) emission. The joint detection of GW170817 and the gamma-ray burst GRB 170817A marked the beginning of GW multi-messenger astronomy. It showed the potential to reveal new insights into the emission mechanisms of GRBs, as well as a more accurate picture of what happens during such mergers, since more information is available than with either messenger alone. In particular, this event confirmed that binary neutron star (BNS) mergers are progenitors of at least some short GRBs (sGRBs). A joint detection rate of 0.3-1.7 per year between the LIGO Hanford, LIGO Livingston and Virgo (HLV) GW network at design sensitivity and the Fermi Gamma-ray Burst Monitor (GBM) was predicted. However, to date the GW170817/GRB 170817A joint detection has been the only event of its kind. In LIGO's third observing run, the BNS merger GW190425 and the black hole-neutron star (BHNS) events GW200115_042309 and GW191219_163120 were detected with no EM counterpart. This study aims to find a reason why there have been no further GW-GRB joint detections, and possibly to provide corrections to the current predicted joint detection rates. We hypothesise that the GW events with no EM counterpart were oriented such that the Earth was not within the viewing angle of the GRB (assuming all BNS and BHNS mergers produce a GRB). We use current Bayesian inference techniques to estimate the inclination angle of these systems and so determine their orientation with respect to Earth. This analysis is performed with Bilby, a user-friendly, Python-based parameter-estimation infrastructure. Using the results of this study, we hope to improve the current estimated joint detection rates.
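The inclination inference can be caricatured with a one-parameter grid posterior. This is a toy stand-in for the Bilby waveform analysis: the amplitude model, noise level and "observed" value below are all invented for illustration and are not the gravitational-wave likelihood used in the study.

```python
import numpy as np

# Grid over the inclination angle ι in [0, π]
iota = np.linspace(0.0, np.pi, 2001)
dx = iota[1] - iota[0]

def amplitude(i):
    # Simplified orientation dependence of the dominant GW amplitude
    return (1.0 + np.cos(i) ** 2) / 2.0

observed, sigma = 0.7, 0.1                 # hypothetical measurement and noise
log_like = -0.5 * ((amplitude(iota) - observed) / sigma) ** 2
prior = np.sin(iota)                       # isotropic orientation prior
posterior = np.exp(log_like - log_like.max()) * prior
posterior /= posterior.sum() * dx          # normalise on the grid

mean_iota = (iota * posterior).sum() * dx  # posterior mean of ι (radians)
print(mean_iota)
```

Because the toy amplitude is symmetric about ι = π/2, the posterior is bimodal (face-on vs face-off), which is itself a known degeneracy in real inclination inference.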
These two pedagogical lectures are aimed to bring attendees up to date with the basic concepts and key results in current theoretical and observational cosmology. The focus is on building physical understanding and making links to modern observations where possible.
The lectures shall begin with an overview of the accelerating universe model (LCDM), from the Big Bang and Inflation to the current era of accelerated expansion. Then we discuss cosmic times, distances, masses and densities and their standard units in order to develop familiarity with the scales of cosmology.
Since General Relativity is not assumed, we use a Newtonian framework to motivate the Friedmann equation. Together with the energy conservation equation, this allows us to solve for the evolution of the universe through the radiation, matter and accelerating eras.
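The Newtonian motivation follows standard textbook steps, sketched here for orientation: for a comoving test particle at radius $r = a(t)x$ on a sphere enclosing mass $M = \frac{4\pi}{3}\rho r^3$, conservation of energy yields the Friedmann equation.

```latex
\frac{1}{2}\dot r^{2} - \frac{GM}{r} = E,
\qquad r = a(t)x
\quad\Longrightarrow\quad
\left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\rho - \frac{k}{a^{2}},
\qquad k \equiv -\frac{2E}{x^{2}} .
```

Combined with energy conservation, $\dot\rho + 3\frac{\dot a}{a}(\rho + p) = 0$, this closes the system used to evolve the radiation, matter and accelerating eras.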
We use basic ideas of Special Relativity to describe the past lightcone of the observer, and thus derive the comoving, angular diameter and luminosity distances. This leads to qualitative descriptions of supernova cosmology and of the baryon acoustic oscillation scale. It also lays the basis to analyse number counts in galaxy surveys.
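A numerical sketch of these distances in a flat ΛCDM model (illustrative round parameter values, not a fit) can be obtained by integrating $d_C = c\int_0^z dz'/H(z')$:

```python
import numpy as np

# Comoving, angular diameter and luminosity distances in flat LCDM.
c = 299792.458              # speed of light [km/s]
H0, Om = 70.0, 0.3          # Hubble constant [km/s/Mpc], matter density
OL = 1.0 - Om               # flat universe: dark energy density

def comoving_distance(z, n=10000):
    """d_C = c * integral_0^z dz'/H(z') by the trapezoidal rule, in Mpc."""
    zp = np.linspace(0.0, z, n)
    Hz = H0 * np.sqrt(Om * (1.0 + zp) ** 3 + OL)
    f = c / Hz
    dz = zp[1] - zp[0]
    return (f.sum() - 0.5 * (f[0] + f[-1])) * dz

z = 1.0
dC = comoving_distance(z)
dA = dC / (1.0 + z)         # angular diameter distance (flat space)
dL = dC * (1.0 + z)         # luminosity distance
print(dC, dA, dL)
```

The factor-of-$(1+z)^2$ relation between $d_L$ and $d_A$ is the distance-duality relation used throughout supernova and BAO cosmology.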
I will also discuss in some detail two of the most urgent problems to be sorted by any cosmological model: the dark energy explaining the cosmological accelerated expansion and the dark matter responsible for many astrophysical phenomena.
In this lecture, I will briefly overview the standard cosmological model and highlight some of its outstanding problems. I will then suggest potential candidates to alleviate the problems, and ways to observationally constrain them using astrophysical data.
Over the past decades, various researchers have indirectly predicted over a dozen super-Chandrasekhar white dwarfs (white dwarfs that violate the Chandrasekhar mass limit) from luminosity observations of peculiar over-luminous Type Ia supernovae. Similarly, recent gravitational wave observations have indicated the possible existence of massive neutron stars. In my presentation, I will explain that phase-space noncommutativity is one of the prominent possibilities for explaining these peculiar phenomena. I will further show that the uncertainty in length scale depends both on the Planck scale and on the Compton wavelength of the underlying particles, in line with Wigner's idea of the scale of uncertainty. This exploration leads to an indirect observational proof of noncommutativity.
Studies of the rest-frame 21 cm spectral emission line of neutral hydrogen (HI; from the hyperfine spin-flip transition) provide an interesting and novel way of studying the large-scale structure (LSS), baryon acoustic oscillations (BAO), cosmological models, and galaxy dynamics and evolution. By modelling the distribution function of HI within dark matter halos and, consequently, the correlation power spectrum from this HI distribution function, it is possible to derive the HI content within galaxies, halos and the universe, as well as the LSS of the universe. To achieve this goal we used the HALOMOD Python package to carefully model the halo occupation distribution (HOD) for discrete and continuous HI tracers, as well as the total HI-galaxy cross-power spectrum. This cross-power spectrum was then fitted, using the EMCEE Python package, to HI-galaxy cross-correlation data from the literature at redshifts 0.400<z<0.459 and scales 0.05<k<0.28. The fit allowed us to model the LSS of the universe, constrain the HOD parameters, and derive several cosmological parameters, e.g. the density fraction of HI and the average, minimum and maximum mass of HI per halo and galaxy. These results improve previous constraints on the structure and HI content of the universe, as well as on star and galaxy formation and evolution models.
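The fitting step can be caricatured with a pure-NumPy Metropolis sampler standing in for EMCEE's ensemble sampler. Everything below (the power-law model, the synthetic "data" and the flat priors) is invented for illustration; the real analysis fits HALOMOD's HOD cross-power spectra.

```python
import numpy as np

rng = np.random.default_rng(0)
k = np.linspace(0.05, 0.28, 20)         # scales matching the quoted k range

def model(k, amp, slope):
    return amp * k ** slope             # toy stand-in for the cross-power spectrum

truth = (2.0, -1.5)                     # hypothetical "true" parameters
data = model(k, *truth) * (1 + 0.05 * rng.standard_normal(k.size))
sigma = 0.05 * model(k, *truth)         # 5% errors on each point

def log_post(theta):
    amp, slope = theta
    if not (0.0 < amp < 10.0 and -3.0 < slope < 0.0):   # flat priors
        return -np.inf
    return -0.5 * np.sum(((data - model(k, amp, slope)) / sigma) ** 2)

theta = np.array([1.0, -1.0])           # deliberately offset starting point
lp = log_post(theta)
chain = []
for _ in range(20000):                  # Metropolis random-walk updates
    prop = theta + 0.02 * rng.standard_normal(2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[5000:]          # discard burn-in
print(chain.mean(axis=0))               # posterior means for (amp, slope)
```

EMCEE replaces this single random walk with an affine-invariant ensemble of walkers, which mixes far better for correlated HOD parameters.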
In this work, we present preliminary results on the statistical detection of the kinetic Sunyaev-Zel'dovich (kSZ) effect using aperture photometry. We use a DESI cluster galaxy catalogue that falls within the BOSS-North and D56 ACTPol regions to measure the kSZ effect.
In this short presentation, I will discuss research on f(R,T) gravity and its solutions under various functional forms of f(R,T) for the Gödel universe.
In this talk, I shall present a novel class of non-Schwarzschild asymptotically flat metrics in exact closed analytical form for pure $R^2$ gravity (Phys. Rev. D 106 (2022) 10, 104004). They were recently derived from a program originated by Buchdahl in 1962. I shall also discuss the existence of Morris-Thorne-Buchdahl wormholes using the metrics.
Most of the energy content of the universe is thought to be dark energy, which makes up about 70% of the total, with dark matter contributing about 25% and ordinary matter, such as planets and stars, about 5%. Since its discovery around 1998, researchers have been trying to determine the nature of this dark energy, but despite many efforts there is still no good explanation. Two possible candidates for dark energy are the Chaplygin gas and bulk viscosity. These two proposals have many similarities. This work explores the relationship between them, showing that although they have different physical interpretations, they are in some ways mathematically equivalent.
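One standard version of this kind of equivalence (a textbook-style sketch under stated assumptions, not the talk's full result): for a pressureless fluid with bulk viscosity $\zeta$ in a flat FLRW background, the effective pressure is $p_{\rm eff} = -3H\zeta$, and choosing $\zeta \propto \rho^{-3/2}$ reproduces the Chaplygin-gas equation of state $p = -A/\rho$.

```latex
p_{\rm eff} = -3H\zeta, \qquad H = \sqrt{\frac{8\pi G}{3}\rho}
\quad\Longrightarrow\quad
\zeta(\rho) = \frac{A}{3H\rho} = \frac{A}{\sqrt{24\pi G}\,\rho^{3/2}}
\quad\Rightarrow\quad p_{\rm eff} = -\frac{A}{\rho} .
```

The two descriptions then drive identical background expansion histories, even though one attributes the negative pressure to an exotic equation of state and the other to dissipation.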
This presentation is mainly devoted to extended theories of Einstein gravity, particularly modified teleparallel gravity, used to explore questions beyond the standard model of cosmology. The accelerating expansion of the universe, linear cosmological perturbations, and the inflationary universe in modified teleparallel gravity will be presented.
Quantum complexity has emerged in the past few years as a candidate diagnostic of quantum chaos. This talk is based on work that appeared last year, in which we show that a notion of quantum complexity (spread complexity) is sensitive to topological phase transitions, at least for the prototypical Kitaev chain. I'll give a brief overview of what we mean by "quantum" chaos and then discuss our results by introducing Krylov subspace methods.
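A minimal Krylov-subspace sketch of spread complexity (my own toy example on a random Hermitian matrix, not the Kitaev-chain computation of the talk): build the Krylov basis by the Lanczos recursion, then evaluate $C(t) = \sum_n n\,|\langle K_n|\psi(t)\rangle|^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 32
A = rng.standard_normal((d, d))
Hham = (A + A.T) / 2                    # toy real symmetric "Hamiltonian"
psi0 = np.zeros(d)
psi0[0] = 1.0                           # initial state |K_0>

# Lanczos recursion: orthonormal Krylov basis {|K_n>} from {psi0, H psi0, ...}
basis = [psi0]
v = Hham @ psi0
v = v - (psi0 @ v) * psi0
for _ in range(1, d):
    bn = np.linalg.norm(v)
    if bn < 1e-12:
        break                           # Krylov space exhausted
    kn = v / bn
    basis.append(kn)
    v = Hham @ kn
    for q in basis:                     # full reorthogonalisation for stability
        v = v - (q @ v) * q
K = np.array(basis)                     # rows are the Krylov vectors

# Spread complexity C(t) = sum_n n |<K_n| e^{-iHt} |psi0>|^2
w, U = np.linalg.eigh(Hham)

def spread_complexity(t):
    psi_t = U @ (np.exp(-1j * w * t) * (U.T @ psi0))
    amps = K @ psi_t                    # overlaps <K_n|psi(t)>
    return float(np.sum(np.arange(len(amps)) * np.abs(amps) ** 2))

print(spread_complexity(0.0), spread_complexity(1.0))
```

$C(0) = 0$ by construction, and its growth measures how the evolving state spreads through the Krylov chain, which is the quantity the talk tracks across the topological phase transition.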