The Lambda Cold Dark Matter model stands today as the foundational framework of modern cosmology. It is the product of a century-long journey through some of the most transformative discoveries in physics, from general relativity and cosmic expansion to the faint afterglow of the Big Bang. ΛCDM was not a theory born fully formed. Rather, it is the culmination of successive refinements and additions, each driven by new observations that reshaped our understanding of the cosmos.
The story begins with Einstein's general theory of relativity (GR) in 1915, which provided a new geometric understanding of gravity and spacetime. Initially resistant to the idea of a dynamic universe, Einstein introduced the cosmological constant (Λ) to keep it static. This view was upended in the 1920s, when Edwin Hubble's observations of distant galaxies revealed an unmistakable pattern: galaxies are receding from us, and the farther away they are, the faster they are moving. This discovery of cosmic expansion gave empirical support to dynamic models of the universe, particularly those derived from Alexander Friedmann's solutions to Einstein's equations. Einstein abandoned the cosmological constant, reportedly calling it his "greatest blunder".
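In standard notation, the pattern Hubble found is a simple linear relation, now known as the Hubble–Lemaître law (a sketch for reference; the numerical range quoted is the modern one, not Hubble's original estimate):

```latex
v = H_0 \, d
```

Here $v$ is a galaxy's recession velocity, $d$ its distance, and $H_0$ the Hubble constant, which modern measurements place at roughly 67–73 km/s/Mpc.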
The next milestone came in 1965 with the accidental discovery of the cosmic microwave background (CMB) by Arno Penzias and Robert Wilson. This relic glow from the early universe provided powerful new support for the Big Bang theory, and launched precision cosmology as an observational science. Over the following decades, increasingly detailed measurements of the CMB offered a wealth of information about the universe's age, composition, and geometry. Meanwhile, other observations introduced new puzzles. First hinted at by Fritz Zwicky in the 1930s, the problem became acute in the 1970s, when astronomers found that the rotational speeds of galaxies did not match expectations based on the gravitational effects of visible matter alone. This led to the postulation of "Dark Matter": an unseen form of mass that exerts gravitational influence without emitting or absorbing light. Early candidates included massive astrophysical objects and exotic particles, but whatever its nature, cold (i.e. non-relativistic, slow-moving) Dark Matter became a necessary ingredient for explaining the formation and behaviour of large-scale structure. The final key component entered the picture in the late 1990s, when two independent teams observing distant Type Ia supernovae discovered that the universe's expansion was not, as expected, slowing down under the mutual gravitational attraction of all matter. Rather, it appears to be accelerating. This stunning result revived interest in Einstein's cosmological constant, now reinterpreted as "Dark Energy": a mysterious, repulsive influence driving the apparent acceleration. Incorporating Dark Energy (Λ) alongside cold Dark Matter (CDM), cosmologists arrived at what is now the standard model of cosmology: ΛCDM.
In its current form, ΛCDM describes a universe composed of approximately 5% ordinary matter, 25% cold Dark Matter, and 70% Dark Energy. It posits an initial Big Bang, followed by a near-instantaneous period of cosmic inflation and then a matter-dominated phase of decelerating expansion. During this time, light elements formed via primordial nucleosynthesis, followed by the release of the CMB as the universe cooled and became transparent. Over time, gravity gathered matter to form galaxies and clusters of galaxies. More recently, the expansion of the universe began accelerating again, driven by this “Dark Energy.” This model has proved effective at fitting a broad array of cosmological data, but elegant it is not.
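These proportions map directly onto the expansion history through the first Friedmann equation. A minimal sketch, assuming a flat universe with radiation neglected and round, Planck-like parameter values:

```latex
H^2(a) = H_0^2 \left( \Omega_m\, a^{-3} + \Omega_\Lambda \right),
\qquad \Omega_m \approx 0.05_{\text{(baryons)}} + 0.25_{\text{(CDM)}} = 0.3,
\quad \Omega_\Lambda \approx 0.7
```

At small scale factor $a$ the matter term dominates and expansion decelerates; as $a$ grows, the matter term dilutes away while the Λ term stays constant, and expansion begins to accelerate (when $\Omega_m a^{-3} < 2\Omega_\Lambda$, several billion years ago in these numbers).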
To be blunt, ΛCDM is now heading towards the state that Ptolemaic geocentrism had degenerated into by the 16th century. It consists of an ever-expanding conglomeration of ad-hoc fixes, most of which introduce as many problems as they solve. The following list may seem sprawling, but that is indicative of the intractability of the underlying situation. These problems cannot be cleanly classified because cosmology itself has no unified theory that can make sense of them. Instead, each anomaly is patched in isolation, creating an overall model that is riddled with contradictions.
There are countless ways of restating this question. Why does anything exist? Why isn't there just nothing? What caused the Big Bang? Etc... This is a problem faced by any cosmology, but that doesn't make it any less relevant to ΛCDM. Some sort of answer is required.
The fundamental constants of nature appear to be exquisitely calibrated to allow for the existence of life. Why does the universe appear to be precisely set up to make life possible?
The universe began in an extraordinarily smooth, low-entropy (highly ordered) state, as shown by the near-uniform cosmic microwave background (CMB). Physics does not demand or explain such fine-tuning.
To address problem (3) above (and also problem (6) below), cosmologists proposed "inflation": a fleeting period of superluminal expansion that smoothed the early cosmos. Inflation ends when its driving potential energy decays into matter and radiation, a process called reheating. For today's universe to emerge, this reheating must occur with extreme precision in both timing and efficiency, yet no known mechanism explains this. Inflation therefore fails to solve the fine-tuning problem: it requires more fine-tuning than it removes.
Countless additional fine-tuning issues exist. The universe shows an unusually favourable balance of elemental abundances for stable stars and biochemistry. Galaxies and stars also formed at just the right time – early enough for life to evolve, but not so early as to disrupt cosmic smoothness. Further tunings include the matter–radiation equality and primordial perturbation amplitude problems.
Grand Unified Theories (GUTs) of particle physics predict the abundant production of magnetic monopoles – massive, stable particles carrying a net magnetic charge – during symmetry-breaking transitions in the early universe. The problem is that no magnetic monopoles have ever been observed. Inflation "solves" this by diluting their density to undetectable levels with the enormous expansion of space.
A foundational assumption of particle physics and cosmology is that the laws of nature are nearly symmetric between matter and antimatter. In the earliest moments after the Big Bang, the universe should have produced equal quantities of baryons (matter) and antibaryons (antimatter) through high-energy particle interactions. What we actually observe is a universe composed almost entirely of matter.
This is a large and persistent discrepancy between two independent ways of measuring the current rate of cosmic expansion (the Hubble constant, H0). Early-universe inferences from the CMB yield roughly 67 km/s/Mpc, while late-universe distance-ladder measurements (Cepheids and Type Ia supernovae) yield roughly 73 km/s/Mpc – a tension now at about the 5σ level.
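A back-of-envelope quantification of the tension, assuming Gaussian, statistically independent errors and representative published values (Planck 2018 CMB inference vs. the SH0ES distance ladder; the exact figures are illustrative):

```python
import math

# Representative measurements (illustrative values, km/s/Mpc):
# early universe: Planck 2018 CMB inference
# late universe:  SH0ES Cepheid-calibrated Type Ia supernovae
h0_early, err_early = 67.4, 0.5
h0_late,  err_late  = 73.0, 1.0

# Tension in units of the combined uncertainty (errors added in quadrature)
sigma = abs(h0_late - h0_early) / math.hypot(err_early, err_late)
print(f"{sigma:.1f} sigma")  # ≈ 5 sigma
```

The quadrature combination is the standard way to compare two independent measurements; by this crude metric the two values disagree at roughly five standard deviations, far beyond what statistical fluctuation can plausibly explain.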
This is a persistent mismatch between the level of matter clumpiness predicted by ΛCDM for the early universe and what we actually observe in the late universe. CMB measurements fix a precise value for how strongly structures should have grown, but weak lensing, galaxy clustering, and cluster counts all find a smoother cosmos with a significantly lower S8. The gap has widened as data improved, creating a second major early-versus-late tension that the standard model cannot resolve.
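For reference, S8 is conventionally defined in terms of σ8, the rms amplitude of matter density fluctuations in spheres of radius 8 h⁻¹ Mpc:

```latex
S_8 \equiv \sigma_8 \sqrt{\Omega_m / 0.3}
```

Weak-lensing surveys have typically reported S8 values a few percent below the CMB-inferred one, which is the tension described above.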
Dark energy was invented to account for a surprising set of astronomical observations that contradicted long-standing expectations. A repulsive force appears to be pushing the universe apart at an accelerating rate. Today, Dark Energy accounts for roughly 70% of the total energy density in standard ΛCDM, but its origin, nature, and ontological status are unknown.
Dubbed the "worst theoretical prediction in the history of physics", the cosmological constant problem is a staggering mismatch between the theoretical prediction of the vacuum energy density – the repulsive force described above, as estimated from quantum field theory – and its observationally measured value. Depending on assumptions, the mismatch is between 60 and 120 orders of magnitude.
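The size of the mismatch can be sketched by comparing the naive quantum-field-theory estimate of the vacuum energy density (summing zero-point modes up to some cutoff scale $k_{\max}$, in natural units) with the observed dark-energy density, whose characteristic scale is around $10^{-3}$ eV:

```latex
\rho_{\text{vac}}^{\text{QFT}} \sim k_{\max}^{4},
\qquad
\frac{\rho_{\text{vac}}^{\text{QFT}}}{\rho_{\Lambda}^{\text{obs}}} \sim
\begin{cases}
10^{120}, & k_{\max} \sim M_{\text{Pl}} \sim 10^{19}\ \text{GeV} \\[4pt]
10^{60},  & k_{\max} \sim 1\ \text{TeV}
\end{cases}
```

This is where the "60 to 120 orders of magnitude" range comes from: the answer depends on where one cuts off the zero-point sum, but every plausible cutoff overshoots the observed value catastrophically.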
Dark Matter has never been directly detected, yet it is now thought to comprise approximately 85% of the matter content of the universe. The hypothesis of Dark Matter emerged as a unifying explanation for multiple independent observational anomalies across different astrophysical and cosmological scales. In each case, visible (baryonic) matter alone proved insufficient to account for the observed gravitational effects. After decades of experiments, we still have no clear idea what it is or where it came from.
A central goal of theoretical physics for nearly a century has been the unification of quantum mechanics and General Relativity, but the two most successful theoretical frameworks remain conceptually incompatible.
The James Webb Space Telescope has detected massive, metal-rich, well-formed galaxies at redshifts greater than 13 – meaning they already existed 325 million years after the Big Bang. The abundance, size, composition and apparent maturity of these early galaxies outpace the predictions of hierarchical structure formation, challenging both the timeline and mechanisms of ΛCDM.
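The quoted age can be checked with a quick numerical integration of the flat-ΛCDM expansion history. This is a sketch (radiation neglected, Planck-like parameters assumed), not the analysis pipeline used by the JWST teams:

```python
import math

# Flat LCDM parameters (Planck-like values, assumed for illustration)
H0 = 67.4                                # Hubble constant, km/s/Mpc
OM, OL = 0.315, 0.685                    # matter and Lambda density fractions
KM_PER_MPC = 3.0857e19                   # kilometres per megaparsec
SEC_PER_GYR = 3.1557e16                  # seconds per gigayear
HUBBLE_TIME_GYR = KM_PER_MPC / H0 / SEC_PER_GYR  # 1/H0 in Gyr (~14.5)

def age_at_redshift(z, steps=50_000, z_max=1e5):
    """Cosmic age t(z) = integral from z to infinity of dz' / [(1+z') H(z')].

    Integrated in u = ln(1+z) for stability at high redshift, using the
    midpoint rule; the upper limit z_max stands in for infinity.
    """
    u_lo, u_hi = math.log(1 + z), math.log(1 + z_max)
    du = (u_hi - u_lo) / steps
    total = 0.0
    for i in range(steps):
        u = u_lo + (i + 0.5) * du
        a = math.exp(-u)                     # scale factor at this redshift
        e = math.sqrt(OM * a**-3 + OL)       # E(z) = H(z)/H0
        total += du / e                      # du = dz/(1+z)
    return total * HUBBLE_TIME_GYR           # age in Gyr

print(f"{age_at_redshift(13):.2f} Gyr")      # ≈ 0.33 Gyr, i.e. ~330 Myr
```

The result is consistent with the figure in the text: at redshift 13 the universe was only about 330 million years old, which is why massive, mature galaxies at that epoch are so hard for hierarchical structure formation to accommodate.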
Our theories suggest life should be abundant in the cosmos, but after over a century of intense searching, we have found no sign of it. Where is everybody?
The black hole information problem asks whether information that falls into a black hole is lost when the black hole evaporates via Hawking radiation. Modern approaches suggest that unitarity is preserved, but only by abandoning naïve locality, independent interior–exterior descriptions, or observer-independent global states. This raises a deeper conceptual question: what counts as information, where does it reside, and when does it become physically real?
Human experience and natural processes clearly distinguish past from future, yet the fundamental laws of physics are time-symmetric, treating both directions equally. Why, then, do we perceive a one-way arrow of time? A related puzzle concerns the present moment: in relativity, time is just another dimension, and all events coexist in a four-dimensional block universe with no privileged “now.” Yet the present is all we ever experience.
In QM the state of a system can be mathematically expressed in many different "bases" (ways of describing the states), each providing a valid description of the system’s properties. However, in actual observations, we only ever perceive outcomes corresponding to certain specific bases. What determines the “preferred basis”?
This is a problem for both quantum foundations and cosmology. The relevance for cosmology is as follows.
In the early universe, quantum fluctuations are thought to have seeded the large-scale structures (galaxies and stars) we see today. If the entire universe is a quantum system, there is no "external environment" to cause decoherence. Without an observer or an outside environment to "measure" the universe, how did the early quantum soup settle into the specific, classical distribution of matter we see today? Without a solution to the preferred basis problem in a closed system, our models of Cosmic Inflation struggle to explain how a quantum fluctuation becomes a "real" physical galaxy.
The boundary between physics and philosophy has been blurry for a long time, at least where consciousness and quantum mechanics are concerned. You still find physicalists who insist that both belong entirely to science and always will, but most people who work across disciplines understand that the hardest questions about mind and the quantum world are philosophical. Cosmology is different. Almost nobody thinks the crisis in cosmology could be the result of a serious philosophical mistake, including most critics of the standard model. It is taken for granted that the only people who should touch the problems inside ΛCDM are cosmologists and physicists, leading to a ubiquitous assumption that these are straightforward empirical puzzles that will eventually yield to empirical fixes. Yet those fixes never arrive. The field has been impotently watching the problems multiply since the 1970s.
It is no longer possible to pretend that this situation is healthy. Sabine Hossenfelder has been calling out the culture that rewards papers few people can understand, filled with predictions most people suspect will be ruled out when tested. The work keeps coming because the incentives demand it, not because the theory is converging on truth. The system keeps running, but the crisis is not being resolved.
We can anticipate huge resistance to the coming paradigm shift from the established powers within professional cosmology. It is not that scientific cosmology is going to come to an end. The problem is that a lot of people actually working in cosmology have spent their whole careers busily barking up trees which aren't just the wrong trees, but aren't even in the right forest. The real problem isn't the empirical work but the rotten foundation: physicalism [link TBD].
From the perspective of the old paradigm, which is also the perspective of academic cosmology, the claims I have made in this wiki are outrageous. I am expecting them to be ignored, dismissed and ridiculed, in that order. But the reality is that if we change the metaphysical foundation – rejecting physicalism and replacing it with the hierarchical neutral monism I have described in this book – then the problems that ΛCDM cannot solve either disappear or are fundamentally reframed. The truth is that ΛCDM is like a ship holed beneath the waterline, and as fast as cosmologists try to seal the holes, more holes appear. This ship is heading for the bottom. The only thing that has been keeping it afloat is the absence of an empirically adequate paradigm to replace it. Or rather I should say was, because The Two-Phase Cosmology is that paradigm. This will be no ordinary paradigm shift, though, because physicalism is going down with ΛCDM: ΛCDM is the best model that physicalism can support.