Speaker
Description
With the development of machine learning (ML) algorithms, attempts to use ML techniques such as artificial neural networks (ANNs) for binary black hole (BBH) and binary neutron star (BNS) merger gravitational wave (GW) detection have been made by W. Wei et al. (2021) and many others. Despite the surge of interest in all types of ANN architectures, time-frequency spectrograms remain one of the preferred input data structures because they are compatible with highly efficient and robust image ANN architectures. Traditional Fourier transform (FT) based time-frequency decomposition methods have difficulty identifying continuous frequency changes, since FTs fit the input signal only to waveforms of constant frequency. BBH and BNS merger GW signals, however, are chirp signals whose frequency varies continuously by nature. A transform method that incorporates the rate of frequency change (chirp rate) may therefore be crucial to improving the performance of existing image ANNs for BBH and BNS merger GW signal detection by providing chirp-rate-enhanced spectrograms. Building upon the linear chirp transform (LCT) of O. A. Alkaishriwo & L. F. Chaparro (2012), in this paper we develop a version of the short-time linear chirp transform (STLCT) and two types of the joint-chirp-rate-time-frequency transform (JCTFT) for spectrogram generation. These methods are obtained by replacing the constant-frequency waveform model with a linear chirp model. We validate the STLCT and JCTFTs on noisy BBH merger GW waveforms, with signals generated using the numerical-relativity-corrected effective-one-body (EOBNR) formalism and noise drawn from the Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) zero-detuned noise model. We plan to further demonstrate the positive effects of the JCTFTs on merger-signal-detection image ANNs in follow-up studies.
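The STLCT and JCTFT definitions used in the paper are not reproduced in this description. As a rough, hedged illustration of the underlying idea only, the Python sketch below replaces the constant-frequency atoms of a short-time Fourier transform with linear chirp atoms exp(-i*2*pi*(f*t + c*t^2/2)) and scans a grid of chirp rates c; the function name stlct and all parameters here are hypothetical and are not taken from the paper.

    import numpy as np

    def stlct(x, fs, freqs, chirp_rates, win_len, hop):
        """Sketch of a windowed linear chirp analysis (assumed STLCT-like form).

        Each window of the signal is correlated against linear chirp atoms
        exp(-1j*2*pi*(f*t + 0.5*c*t**2)) over grids of frequencies f (Hz) and
        chirp rates c (Hz/s), generalizing the constant-frequency atoms of the
        short-time Fourier transform.
        """
        window = np.hanning(win_len)
        n_frames = 1 + (len(x) - win_len) // hop
        t = np.arange(win_len) / fs
        # Precompute chirp atoms: shape (n_chirp_rates, n_freqs, win_len)
        atoms = np.exp(
            -2j * np.pi * (freqs[None, :, None] * t[None, None, :]
                           + 0.5 * chirp_rates[:, None, None] * t[None, None, :] ** 2)
        )
        out = np.empty((n_frames, len(chirp_rates), len(freqs)), dtype=complex)
        for m in range(n_frames):
            seg = x[m * hop: m * hop + win_len] * window
            # Inner product of the windowed segment with every chirp atom
            out[m] = np.einsum("cfn,n->cf", atoms, seg)
        return out

    # Toy usage: analyze a synthetic linear chirp and collapse the chirp-rate
    # axis to obtain a chirp-rate-enhanced spectrogram (illustrative only).
    fs = 4096
    t = np.arange(0, 1, 1 / fs)
    x = np.sin(2 * np.pi * (50 * t + 100 * t ** 2))   # frequency sweeps upward from 50 Hz
    freqs = np.linspace(20, 400, 96)
    chirp_rates = np.linspace(0, 400, 9)              # Hz/s
    tfc = stlct(x, fs, freqs, chirp_rates, win_len=256, hop=64)
    spectrogram = np.abs(tfc).max(axis=1)             # best chirp rate per time-frequency bin

One plausible design choice shown here is to keep the full chirp-rate axis (a joint chirp-rate-time-frequency volume) and reduce it only when an image-like spectrogram is needed as ANN input; how the actual JCTFTs handle this axis is specified in the paper itself.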