Counting Cosmic Cycles: Past Big Crunches, Future Recurrence Limits, and the Age of the Quantum Memory Matrix Universe
We present a quantitative theory of contraction and expansion cycles within the Quantum Memory Matrix (QMM) cosmology. In this framework, spacetime consists of finite-capacity Hilbert cells that store quantum information. Each non-singular bounce adds a fixed increment of imprint entropy, defined as the cumulative quantum information written irreversibly into the matrix and distinct from coarse-grained thermodynamic entropy, thereby providing an intrinsic, monotonic cycle counter. By calibrating the geometry–information duality, inferring today’s cumulative imprint from CMB, BAO, chronometer, and large-scale-structure constraints, and integrating the modified Friedmann equations with imprint back-reaction, we infer the number of cycles the Universe has already completed. The finite Hilbert capacity enforces an absolute ceiling: propagating the holographic write rate and accounting for instability channels implies that only a finite number of additional cycles remain before saturation halts further bounces. Integrating Kodama-vector proper time across all completed cycles yields a total cumulative age, compared with the age of the current expansion usually described by ΛCDM. The framework makes concrete, testable predictions: an enhanced faint-end UV luminosity function at redshifts observable with JWST, a stochastic gravitational-wave background with a characteristic frequency scaling in the LISA band from primordial-black-hole mergers, and a nanohertz background with a slope accessible to pulsar-timing arrays. These signatures provide near-term opportunities to confirm, refine, or falsify the cyclical QMM chronology.
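The cycle-count bookkeeping implied by the abstract is simple once two calibration numbers are fixed: the imprint-entropy increment written per bounce and the total holographic Hilbert capacity. A minimal sketch of that arithmetic, with every numerical value a hypothetical placeholder rather than the paper's calibrated figure:

```python
# Cycle counting in the QMM picture: each non-singular bounce writes a
# fixed imprint-entropy increment into the matrix, so the number of
# completed cycles is the cumulative imprint divided by that increment,
# and the remaining budget follows from the finite Hilbert capacity.
# All numbers below are illustrative placeholders (arbitrary units),
# not the paper's calibrated values.

def qmm_cycle_count(s_today, ds_per_bounce, s_max):
    """Return (cycles already completed, cycles remaining before saturation)."""
    completed = int(s_today // ds_per_bounce)
    remaining = int((s_max - s_today) // ds_per_bounce)
    return completed, remaining

completed, remaining = qmm_cycle_count(s_today=300.0, ds_per_bounce=10.0,
                                       s_max=420.0)
```

The monotonicity of the imprint entropy is what makes this a well-defined counter: the quotient can only grow from one bounce to the next.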
- 10.1103/revmodphys.90.045002 · Reviews of Modern Physics, Oct 15, 2018 · 880 citations
- 10.1093/mnras/152.1.75 · Monthly Notices of the Royal Astronomical Society, Apr 1, 1971 · 1445 citations
- 10.1088/1475-7516/2012/08/020 · Journal of Cosmology and Astroparticle Physics, Aug 1, 2012 · 376 citations
- 10.1146/annurev-astro-120419-014455 · Annual Review of Astronomy and Astrophysics, Aug 18, 2020 · 429 citations
- 10.1086/153853 · The Astrophysical Journal, Oct 1, 1975 · 1224 citations
- 10.1103/physrevd.82.123517 · Physical Review D, Dec 17, 2010 · 31 citations
- 10.1088/1475-7516/2013/08/015 · Journal of Cosmology and Astroparticle Physics, Aug 1, 2013 · 29 citations
- 10.1103/physrevd.52.5549 · Physical Review D, Nov 15, 1995 · 183 citations
- 10.1088/1475-7516/2016/05/014 · Journal of Cosmology and Astroparticle Physics, May 1, 2016 · 882 citations
- 10.1007/bf02710419 · Il Nuovo Cimento B Series 10, Jul 1, 1966 · 2688 citations
- Research Article · 10.1088/1475-7516/2021/10/035 · Journal of Cosmology and Astroparticle Physics, Oct 1, 2021 · 9 citations
We investigate the formation and growth of massive black hole (BH) seeds in dusty star-forming galaxies, relying on and extending the framework proposed by [1]. Specifically, the latter envisages the migration of stellar compact remnants (neutron stars and stellar-mass black holes) via gaseous dynamical friction towards the galaxy nuclear region, and their subsequent merging to grow a massive central BH seed. In this paper we add two relevant ingredients: (i) we include primordial BHs, which could constitute a fraction f_pBH of the dark matter, as an additional component participating in the seed growth; (ii) we predict the stochastic gravitational wave background originated during the seed growth, both from stellar compact remnant and from primordial BH mergers. We find that the latter events contribute most to the initial growth of the central seed during a timescale of 10^6–10^7 yr, before stellar compact remnant mergers and gas accretion take over. In addition, if the fraction of primordial BHs f_pBH is large enough, gravitational waves emitted by their mergers in the nuclear galactic regions could be detected by future interferometers like the Einstein Telescope, DECIGO and LISA. As for the associated stochastic gravitational wave background, we predict that it extends over the wide frequency band 10^-6 ≲ f [Hz] ≲ 10, which is very different from the typical range originated by mergers of isolated binary compact objects. On the one hand, the detection of such a background could be a smoking gun to test the proposed seed growth mechanism; on the other hand, it constitutes a relevant contaminant from astrophysical sources to be characterized and subtracted in the challenging search for a primordial background of cosmological origin.
- Research Article · 10.1088/1475-7516/2020/07/021 · Journal of Cosmology and Astroparticle Physics, Jul 1, 2020 · 62 citations
Based on the rate of resolved stellar origin black hole and neutron star mergers measured by LIGO and Virgo, it is expected that these detectors will also observe an unresolved Stochastic Gravitational Wave Background (SGWB) by the time they reach design sensitivity. A background from the same class of sources also exists in the LISA band, which will be observable by LISA with signal-to-noise ratio (SNR) ∼ 121. Unlike the stochastic signal from Galactic white dwarf binaries, for which a partial subtraction is expected to be possible by exploiting its yearly modulation (induced by the motion of the LISA constellation), the background from unresolved stellar origin black hole and neutron star binaries acts as a foreground for other stochastic signals of cosmological or astrophysical origin, which may also be present in the LISA band. Here, we employ a principal component analysis to model and extract an additional hypothetical SGWB in the LISA band, without making any a priori assumptions on its spectral shape. At the same time, we account for the presence of the foreground from stellar origin black holes and neutron stars, as well as for possible uncertainties in the LISA noise calibration. We find that our technique leads to a linear problem and is therefore suitable for fast and reliable extraction of SGWBs with SNR up to ten times weaker than the foreground from black holes and neutron stars, quite independently of the SGWB spectral shape.
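The linear-extraction claim in the abstract above can be illustrated in a few lines: expand the unknown SGWB on a small spectral basis (standing in here for the paper's principal components) alongside a fixed-shape f^(2/3) astrophysical foreground, and recover every amplitude in one least-squares solve. A toy sketch with hypothetical basis shapes and noise level, not LISA's actual noise model:

```python
# Sketch of SGWB extraction as a linear problem: the data are modeled
# as a linear combination of an f^(2/3) stellar-origin foreground and
# a few spectral components, so all amplitudes come out of a single
# least-squares solve. Toy frequency grid, shapes, and noise only.
import numpy as np

rng = np.random.default_rng(0)
f = np.logspace(-4, -1, 200)            # frequency grid [Hz]

foreground = (f / 1e-3) ** (2.0 / 3.0)  # compact-binary inspiral slope
basis = np.vstack([
    foreground,
    np.ones_like(f),                    # hypothetical flat component
    (f / 1e-3) ** 3.0,                  # hypothetical steep component
]).T

true_amps = np.array([5.0, 1.0, 0.2])
data = basis @ true_amps + 0.01 * rng.standard_normal(f.size)

# Because the model is linear in the amplitudes, extraction reduces
# to one call; no assumption on the hidden SGWB shape beyond the basis.
amps, *_ = np.linalg.lstsq(basis, data, rcond=None)
```

The key design point mirrored from the abstract is that the foreground is fitted simultaneously with the unknown background rather than subtracted beforehand.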
- Dissertation · 10.13097/archive-ouverte/unige:27706 · Jan 1, 2013
Solid-state light-matter interfaces based on crystals doped with rare-earth ions have shown great progress in recent years, heading towards quantum technology applications. In this thesis we present three experiments testing the suitability of an atomic frequency comb light-matter interface, implemented in an Nd:YSO crystal, for the storage of quantum information. A dedicated source of entangled photons has been developed, and the storage and retrieval of quantum correlations and entanglement been studied. We show the successful transfer of photonic entanglement to the crystal, and the generation of a matter-matter entangled state between two crystals. These are two of the main steps in a quantum repeater protocol. We also demonstrate faithful storage and retrieval of quantum information encoded in the polarization of heralded single photons. These experiments prove the strong potential of solid-state light-matter interfaces based on atomic frequency combs for quantum communication.
- Research Article · 10.1051/0004-6361/202142208 · Astronomy & Astrophysics, Apr 1, 2022 · 60 citations
The formation of merging binary black holes can occur through multiple astrophysical channels, such as isolated binary evolution and dynamical formation, or the binaries can alternatively have a primordial origin. Increasingly large gravitational-wave catalogs of binary black-hole mergers have allowed for the first model selection studies between different theoretical predictions to constrain some of their model uncertainties and branching ratios. In this work, we show how one could add an additional and independent constraint to model selection by using the stochastic gravitational-wave background. In contrast to model selection analyses that have discriminating power only up to the gravitational-wave detector horizons (currently at redshifts z ≲ 1 for LIGO–Virgo), the stochastic gravitational-wave background accounts for the redshift integration of all gravitational-wave signals in the Universe. As a working example, we consider the branching ratio results from a model selection study that includes potential contributions from astrophysical and primordial channels. We renormalize the relative contribution of each channel to the detected event rate to compute the total stochastic gravitational-wave background energy density. The predicted amplitude lies below the current observational upper limits of GWTC-3 by LIGO–Virgo, indicating that the results of the model selection analysis are not ruled out by current background limits. Furthermore, given the set of population models and inferred branching ratios, we find that, even though the predicted background will not be detectable by current generation gravitational-wave detectors, it will be accessible by third-generation detectors such as the Einstein Telescope and space-based detectors such as LISA.
- Research Article · 10.1088/1475-7516/2017/09/037 · Journal of Cosmology and Astroparticle Physics, Sep 1, 2017 · 284 citations
We study the production of primordial black hole (PBH) binaries and the resulting merger rate, accounting for an extended PBH mass function and the possibility of a clustered spatial distribution. Under the hypothesis that the gravitational wave events observed by LIGO were caused by PBH mergers, we show that it is possible to satisfy all present constraints on the PBH abundance, and find the viable parameter range for the lognormal PBH mass function. The non-observation of a gravitational wave background allows us to derive constraints on the fraction of dark matter in PBHs, which are stronger than any other current constraint in the PBH mass range 0.5–30 M⊙. We show that the predicted gravitational wave background can be observed by the coming runs of LIGO, and its non-observation would indicate that the observed events are not of primordial origin. As the PBH mergers convert matter into radiation, they may have interesting cosmological implications, for example in the context of relieving the tension between high and low redshift measurements of the Hubble constant. However, we find that these effects are suppressed as, after recombination, no more than 1% of dark matter can be converted into gravitational waves.
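The inspiral-dominated backgrounds discussed in these abstracts share the standard Ω_GW(f) ∝ f^(2/3) frequency scaling, so a single reference amplitude fixes the whole spectrum. A minimal sketch of that scaling, with the reference amplitude and detection threshold as hypothetical placeholders:

```python
# Power-law SGWB from compact-binary inspirals: Omega_GW scales as
# f^(2/3) below the merger cutoff, so one amplitude at a reference
# frequency determines the spectrum everywhere in band. The numbers
# below are illustrative placeholders, not measured values.

def omega_gw(f, omega_ref, f_ref=25.0):
    """Inspiral-dominated SGWB energy density, Omega_GW ∝ f^(2/3)."""
    return omega_ref * (f / f_ref) ** (2.0 / 3.0)

# Hypothetical comparison: amplitude 1e-9 at 25 Hz against a
# hypothetical threshold of 2e-9 at 100 Hz.
detectable = omega_gw(100.0, omega_ref=1e-9) > 2e-9
```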
- Single Report · 10.5703/1288284315500 · Dec 15, 2014 · 1 citation
Eliminating expansion joints in the superstructure of integral abutment bridges offers the advantage of reducing the initial and life-cycle costs of the structure. However, such elimination may have an adverse effect on the displacement demand at the pile-abutment connection and on the earth pressures on the abutment wall due to the thermal expansion/contraction cycles of the bridge. These adverse effects have resulted in regulations that impose restrictions on the maximum length and skew angle of integral abutment bridges. This research consisted of an in-depth analysis of the problem that considered soil-structure interaction. The approach was multifaceted, including both experimental and numerical analysis. Upon calibration and verification of the constitutive model, it was used as part of a parametric analysis to provide recommendations for the design limits of integral abutment bridges. The analysis results showed that the active-state earth pressure is reached after the first contraction cycle. The displacement demand on the piles is a function of the abutment wall displacement. A larger displacement demand of the pile at the acute corner, compared to the obtuse corner, was observed during expansion and contraction cycles. The inflection point of the piles' deformed shape was found to be at a relatively shallow depth. Concrete shrinkage and the sequence of loading significantly affected the displacement demand of the supporting piles, with lower displacement demand during the expansion cycle and larger displacement demand during contraction cycles. The analysis showed that a 500 ft bridge with a 60° skew will provide acceptable long-term performance.
- Research Article · 10.1093/mnras/sty2568 · Monthly Notices of the Royal Astronomical Society, Sep 20, 2018 · 83 citations
In Paper I of this series we showed that a large percentage of the binary black hole (BBH) mergers that form through dynamical interactions in globular clusters will have significant eccentricity in the ~10^-3–10^-1 Hz LISA band. In this work we quantify the evolution of these highly eccentric binaries through the LISA and LIGO bands, and compute the stochastic gravitational wave background from the merging, eccentric population. We find that the population of BBHs that merge in between three-body encounters inside their cluster (~50% of all cluster-formed BBH mergers) will have measurable eccentricity for their entire lifetime in the LISA band. The population of BBHs that merge during three-body encounters (~5% of all cluster-formed BBH mergers) will be detectable by LIGO with eccentricities of e ~ 0.1. The gravitational wave background from dynamically assembled BBHs encodes a characteristic bump due to the high initial eccentricities of these systems. The location and amplitude of this bump depend on globular cluster properties.
- Research Article · 10.1103/physrevd.99.023534 · Physical Review D, Jan 30, 2019 · 43 citations
The geometric optics approximation traditionally used to study the propagation of gravitational waves on a curved background breaks down in the vicinity of compact and extended astrophysical objects, where wave-like effects such as diffraction and the generation of polarization occur. We provide a framework to study the generation of polarization of a stochastic background of gravitational waves propagating in an inhomogeneous universe. The framework is general and can be applied to both cosmological and astrophysical gravitational wave backgrounds in any frequency range. We derive an order-of-magnitude estimate of the amount of polarization generated for cosmological and astrophysical backgrounds in the frequency range covered by present and planned gravitational wave experiments. For an astrophysical background in the PTA and LISA bands, the amount of polarization generated is suppressed by a factor 10^-4 (10^-5) with respect to anisotropies. For a cosmological background we get an additional 10^-2 suppression. We speculate on using our approach to map the distribution of (unresolvable) structures in the Universe.
- Dissertation · 10.5451/unibas-006483789 · Jan 1, 2015
Stable quantum information in topological systems
- Research Article · 10.1111/j.1365-2966.2007.11734.x · Monthly Notices of the Royal Astronomical Society, Mar 14, 2007 · 159 citations
The formation, merging, and accretion history of massive black holes along the hierarchical build-up of cosmic structures leaves a unique imprint on the background of gravitational waves at mHz frequencies. We study here, by means of dedicated simulations of black hole build-up, the possibility of constraining different models of black hole cosmic evolution using future gravitational wave space-borne missions such as LISA. We consider two main scenarios for black hole formation, namely, one where seeds are light (~10^2 M⊙, remnants of Population III stars), and one where seeds are heavy (≳10^4 M⊙, direct collapse). In all the models we have investigated, massive black hole binary coalescences do not produce a stochastic GW background, but rather a set of individually resolved events. Detection of several hundred merging events in a 3-year LISA mission would be the sign of a heavy-seed scenario with efficient formation of black hole seeds in a large fraction of high-redshift halos. At the other extreme, a low event rate, about a few tens in 3 years, is peculiar to scenarios where either the seeds are light, so that many coalescences do not fall into the LISA band, or the seeds are massive but rare. In this case a decisive diagnostic is provided by the shape of the mass distribution of detected events. Light binaries (m < 10^4 M⊙) are predicted in fairly large numbers in Population III remnant models, but are totally absent in direct-collapse models. Finally, a further helpful diagnostic of black hole formation models lies in the distribution of mass ratios in binary coalescences. While heavy-seed models predict that most of the detected events involve equal-mass binaries, in the case of light seeds the mass ratios are equally distributed in the range 0.1–1.
- Research Article · 10.1103/physrevd.106.103520 · Physical Review D, Nov 17, 2022 · 61 citations
Light primordial black holes may comprise a dominant fraction of the dark matter in our Universe. This paper critically assesses whether planned and future gravitational wave detectors in the ultra-high-frequency band could constrain the fraction of dark matter composed of sub-solar primordial black holes. Adopting the state-of-the-art description of primordial black hole merger rates, we compare various signals with currently operating and planned detectors. As already noted in the literature, our findings confirm that detecting individual primordial black hole mergers with currently existing and operating proposals remains difficult. Current proposals involving gravitational-wave-to-electromagnetic-wave conversion in a static magnetic field and microwave cavities feature a technology gap of several orders of magnitude with respect to the loudest gravitational wave signals from primordial black holes. However, we point out that one recent proposal involving resonant LC circuits represents the best option in terms of individual merger detection prospects in the range 1–100 MHz. In the same frequency range, we note that alternative setups involving resonant cavities, whose concept is currently under development, might represent a promising technology to detect individual merger events. We also show that a detection of the stochastic gravitational wave background produced by unresolved binaries is possible only if the theoretical sensitivity of the proposed Gaussian-beam detector is achieved. Such a detector, whose feasibility is subject to various caveats, may be able to rule out some scenarios for asteroidal-mass primordial black hole dark matter. We conclude that pursuing dedicated studies and developments of gravitational wave detectors in the ultra-high-frequency band remains motivated and may lead to novel probes of the existence of light primordial black holes.
- Dissertation · 10.5451/unibas-006041271 · Jan 1, 2012
Building a working quantum computer that is able to perform useful calculations remains a challenge. With this thesis, we are trying to contribute a small piece to this puzzle by addressing three of the many fundamental questions one encounters along the way of reaching that goal. These questions are: (i) What is an easy way to create highly entangled states as a resource for quantum computation? (ii) What can we do to efficiently quantify states of noisy entanglement in systems coupled to the outside world? (iii) How can we protect and store fragile quantum states for arbitrarily long times? The first two questions are the subject of part one of this thesis, 'Entanglement Measures & Highly Entangled States'. We devise a particular proposal for generating entanglement within a solid-state setup, starting first with the tripartite case and continuing with a generalization to four and more qubits. The main idea there is to realize systems with highly entangled ground states in order for entanglement to be created by merely cooling to low enough temperatures. We have addressed the issue of quantifying entanglement in these systems by numerically calculating mixed-state entanglement measures and maximizing the latter as a function of the external magnetic field strength. The research along these lines has led to the development of the numerical library 'libCreme'. The second part of the thesis, 'Self-Correcting Quantum Memories', addresses the question of how to reliably store quantum states long enough to perform useful calculations. Every computer, be it classical or quantum, needs the information it processes to be protected from corruption caused by faulty gates and perturbations from interactions with its environment. However, quantum states are much more susceptible to these adverse effects than classical states, making the manipulation and storage of quantum information a challenging task.
Promising candidates for such 'quantum memories' are systems exhibiting topological order, because they are robust against local perturbations, and information encoded in their ground state can only be manipulated in a non-local fashion. We extend the so-called toric code by repulsive long-range interactions between anyons and show that this makes the code stable against thermal fluctuations. Furthermore, we investigate incoherent effects of quenched disorder in the toric code and similar systems.
- Single Book · 10.1017/cbo9781139034807 · Sep 5, 2013 · 494 citations
Quantum computation and information is one of the most exciting developments in science and technology of the last twenty years. To achieve large scale quantum computers and communication networks it is essential not only to overcome noise in stored quantum information, but also in general faulty quantum operations. Scalable quantum computers require a far-reaching theory of fault-tolerant quantum computation. This comprehensive text, written by leading experts in the field, focuses on quantum error correction and thoroughly covers the theory as well as experimental and practical issues. The book is not limited to a single approach, but reviews many different methods to control quantum errors, including topological codes, dynamical decoupling and decoherence-free subspaces. Basic subjects as well as advanced theory and a survey of topics from cutting-edge research make this book invaluable both as a pedagogical introduction at the graduate level and as a reference for experts in quantum information science.
- Research Article · 10.1016/j.jmmm.2017.12.074 · Journal of Magnetism and Magnetic Materials, Jan 4, 2018
Quantum information generation, storage and transmission based on nuclear spins
- Research Article · 10.1109/access.2020.3025619 · IEEE Access, Jan 1, 2020 · 20 citations
Quantum information is prone to suffer from errors caused by the so-called decoherence, which describes the loss in coherence of quantum states associated to their interactions with the surrounding environment. This decoherence phenomenon is present in every quantum information task, be it transmission, processing or even storage of quantum information. Consequently, the protection of quantum information via quantum error correction codes (QECC) is of paramount importance to construct fully operational quantum computers. Understanding environmental decoherence processes and the way they are modeled is fundamental in order to construct effective error correction methods capable of protecting quantum information. Moreover, quantum channel models that are efficiently implementable and manageable on classical computers are required in order to design and simulate such error correction schemes. In this article, we present a survey of decoherence models, reviewing the manner in which these models can be approximated into quantum Pauli channel models, which can be efficiently implemented on classical computers. We also explain how certain families of quantum error correction codes can be entirely simulated in the classical domain, without the explicit need of a quantum computer. A quantum error correction code for the approximated channel is also a correctable code for the original channel, and its performance can be obtained by Monte Carlo simulations on a classical computer.
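The survey's point that Pauli-channel codes can be simulated entirely classically is easy to demonstrate: for the three-qubit bit-flip repetition code under an independent Pauli-X channel, one only tracks which qubits flipped and decodes by majority vote. A minimal Monte Carlo sketch (an illustration of the general technique, not code from the article itself):

```python
# Monte Carlo estimate of the logical error rate of the 3-qubit
# bit-flip repetition code under an i.i.d. Pauli-X channel of
# strength p. Because the channel is Pauli, no quantum computer is
# needed: track only which qubits flipped, decode by majority vote,
# and count the trials where the vote miscorrects.
import random

def logical_error_rate(p, trials=200_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:          # majority vote picks the wrong codeword
            failures += 1
    return failures / trials

p = 0.05
estimate = logical_error_rate(p)
exact = 3 * p**2 - 2 * p**3     # analytic logical error probability
```

The code helps whenever 3p² − 2p³ < p, i.e. for physical error rates below 1/2, which the estimate reproduces.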