*sans* the universe, and temporal since creation, but am as yet undecided. However, I wanted to look at the scientific evidence for premise 2. Two pieces of scientific evidence can be used to infer that the universe began to exist: the expansion of the universe and the second law of thermodynamics. Regarding expansion, two theorems are cited: the Hawking-Penrose singularity theorem(s) and the Borde-Guth-Vilenkin theorem.

As it stands, there are five exceptions to the Hawking-Penrose theorem:

1. Closed time-like curves.

2. Strong energy condition violated (eternal inflation.)

3. Quantum Gravity models.

4. Generic energy condition violated ("exotic" spacetime.)

5. No closed trapped surface in our past.

Of these, numbers 4 and 5 are not expected to be part of "reasonable" physical models of the universe.

**Closed Time-like Curves**

A Closed Time-like Curve (CTC) is essentially a region where time is "closed." If time were represented as a drawing, rather than being a straight line it would loop back on itself, returning to its starting point. As a physical reality, CTCs function as "time machines": the universe is not cyclical, but rather returns to its original starting point, in essence "creating itself." This is a model proposed by J. Richard Gott and Li-Xin Li, whereby the universe's timeline loops back upon itself to "become its own mother." In other words, the SAME big bang occurs over and over again, rather than a series of different big bangs and big crunches.

Of course, there is a pretty serious physical problem with this model, as it violates the Chronology Protection Conjecture. Gott and Li state that CTCs should exist in a pure vacuum state, with no real particles, Hawking radiation, or bubbles, because this stray radiation would destroy the CTC. The reason is that the radiation would build up to infinity, producing infinite spacetime curvature and destroying the CTC. Kip Thorne describes this using the following example:

"Imagine that Carole is zooming back to Earth with one wormhole mouth in her spacecraft, and I am sitting at home on Earth with the other. When the spacecraft gets to within 10 light-years of Earth, it suddenly becomes possible for radiation (electromagnetic waves) to use the wormhole for time travel: any random bit of radiation that leaves our home in Pasadena traveling at the speed of light toward the spacecraft can arrive at the spacecraft after 10 years' time (as seen on Earth), enter the wormhole mouth there, travel back in time by 10 years (as seen on Earth), and emerge from the mouth on Earth at precisely the same moment as it started its trip. The radiation piles right on top of its previous self, not just in space but in spacetime, doubling its strength. What's more, during the trip each quantum of radiation (each photon) got boosted in energy due to the relative motion of the wormhole mouths (a "Doppler-shift" boost). The radiation keeps piling on top of itself, over and over again with Doppler-boosted energy, making an infinitely strong beam of radiation. This beam would then produce a singularity, thereby (probably) destroying the wormhole and preventing a time-machine from ever existing."

Gott and Li, however, are aware of this problem and have proposed the following solution: a special zero-temperature empty space, called an "adapted Rindler vacuum," as a special initial state for the universe. However, such a move has produced further problems, indicated by William Hiscock and D. H. Coule *et al.* The first problem is that their choice of initial conditions is incredibly finely tuned. Coule notes that the Gott-Li model is only possible in Misner space with identification scale b = 2π, or b = 2πr₀ for the multiple de Sitter case. Not only is this inconsistent with quantum uncertainty, but this parameter is not a constant and is liable to change dynamically; when it does change, the CTC destabilises. Secondly, given more realistic physical force fields, such a vacuum is also unstable.

**Eternal Inflation**

The second exception is inflationary models, such as chaotic inflation. Cosmic inflation was proposed to solve a number of problems that otherwise pervaded the standard model: for example, the horizon, flatness, and cosmic relic problems. Alan Guth's proposal was that the universe underwent a period of exponential, super-rapid expansion. Inflationary theory is now generally well accepted. However, it has led to inflationary models of how the universe came to be, such as Andrei Linde's chaotic inflation model. In this model, the big bang is just a regional event within a multiverse of expanding bubbles, with each bubble giving rise to even more expanding regions, and so on *ad infinitum*. However, a theorem produced by Guth, along with Arvind Borde and Alexander Vilenkin, states that such an inflating universe must have had a beginning. They explain:

"Our argument shows that null and time-like geodesics are, in general, past-incomplete in inflationary models, whether or not energy conditions hold, provided only that the averaged expansion condition Hav > 0 holds along these past-directed geodesics. A remarkable thing about this theorem is its sweeping generality. We made no assumptions about the material content of the universe. We did not even assume that gravity is described by Einstein's equations. So, if Einstein's gravity requires some modification, our conclusion will still hold. The only assumption that we made was that the expansion rate of the universe never gets below some nonzero value, no matter how small. This assumption should certainly be satisfied in the inflating false vacuum. The conclusion is that past-eternal inflation without a beginning is impossible."

Interestingly enough, it has also been argued that eternal inflation cannot be future eternal either: http://arxiv.org/abs/1106.3542. Of course, as with the Hawking-Penrose theorem, there are exceptions to the Borde-Guth-Vilenkin theorem:

1. Infinite contraction (such as de Sitter cosmology.)

2. Asymptotically static models.

3. Infinite cyclicity (such as the Baum-Frampton model, or Penrose's CCC.)

4. Time reversal at singularity (Aguirre-Gratton model.)
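Before examining these, it helps to make the theorem's single assumption concrete. The "averaged expansion condition" Hav > 0 quoted above can be sketched in standard notation (with H the expansion rate and λ an affine parameter along a past-directed null geodesic; this is a schematic rendering, not the paper's full derivation):

```latex
H_{\mathrm{av}} \;\equiv\; \frac{1}{\lambda_f - \lambda_i}\int_{\lambda_i}^{\lambda_f} H \, d\lambda \;>\; 0
\quad\Longrightarrow\quad
\lambda_f - \lambda_i \;\le\; \frac{1}{H_{\mathrm{av}}}
```

That is, any geodesic along which the average expansion is positive has finite affine length into the past, and is therefore past-incomplete. Each of the four exceptions above works by arranging for Hav > 0 to fail somewhere in the model's history.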

**Infinite Contraction**

This is where a spatially infinite universe contracts down to a singularity and then "bounces" into our present expansion. The universe would therefore not be, on average, in a state of cosmic expansion, since the contraction phase 'cancels out' the expansion. This is problematic for several reasons. Not only do the initial conditions have to be extremely fine-tuned, but the collapse phase is so unstable as to require an additional level of fine-tuning so that the collapse can turn around into an expansion. Otherwise, a phenomenon known as "BKL chaos" occurs, which prevents such a bounce from occurring.

**Asymptotically Static Space-time**

An asymptotically static space is one in which the average expansion rate of the universe over its history is equal to zero, since the expansion rate of the universe "at" infinity is zero. Hence the universe, perhaps in the asymptotic past, is in a static state, neither expanding nor contracting. However, we know from observation that the universe has indeed been expanding, so how can it be said to have an average expansion of zero throughout its history? William Lane Craig and James D. Sinclair offer the following example:

"Would not the average expansion rate have to be greater than zero? No, not when we include "infinity" in the average. Consider an analogy in which the local government decides that, henceforth, everyone will pay property taxes according to the average value of property (per acre) in the county instead of on one's individual assessment. This might be good or bad for you, depending on whether you live in the high end district. But suppose that your county suddenly expanded to include the Sahara Desert. The Sahara is worthless and big, hence the average value of property, by the square mile, dives precipitously. Further, the larger the Sahara is, the closer to zero one's property taxes will be. In the limit as the Sahara grows to infinite size, one's property taxes will go to zero. In a similar way, a zero expansion condition at infinity would have the same impact on the average expansion rate."

Ellis *et al.* propose that the universe initially existed in such a state (known as an Einstein Static State, or ESS) and transformed into the universe we see today via an inflationary phase. However, such a model is not past eternal either. First, such an ESS is unstable: small fluctuations in the size of the universe are inevitable (given quantum theory), and thus such a state cannot remain in balance for an infinite time. Secondly, the observable universe is NOT static, and the required mechanism to force a transition between an ESS and our own universe (either a quantum or thermal fluctuation) implies that this initial ESS is not infinite. If one utilises the low-energy solution of loop quantum gravity, then this protects against perturbations of a limited size; however, smaller perturbations would eventually build up, leading to the creation of a universe such as ours. In other words, we can again infer the finitude of the ESS from the mechanism that gets us from the ESS to our current universe.
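The effect the Sahara analogy describes can be made concrete with a toy calculation (illustrative only; the function name and the numbers below are invented for this sketch, not drawn from Craig and Sinclair):

```python
def average_expansion(static_past: float, expanding_era: float, h: float) -> float:
    """Time-averaged expansion rate for a history made of a static phase
    (expansion rate 0) lasting `static_past`, followed by an expanding
    phase lasting `expanding_era` with constant expansion rate `h`."""
    return (h * expanding_era) / (static_past + expanding_era)

# Holding the expanding era fixed, a longer static past drags the
# average expansion rate toward zero:
for static_past in (0.0, 100.0, 1e6, 1e12):
    print(average_expansion(static_past, expanding_era=14.0, h=70.0))
```

As the static past grows without bound, the average tends to zero even though the recent expansion rate is fixed and positive; this is precisely the sense in which an asymptotically static model evades the Hav > 0 condition of the BGV theorem.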

**Cyclic Models**

Cyclic models postulate that our big bang is but one in an infinite cycle of big bangs and crunches. In this respect, our universe never begins to exist; or rather, our big bang is A beginning rather than THE beginning. The overwhelming problem for such models lies in the second law of thermodynamics: with every cycle, entropy increases. In other words, only a finite number of cycles are possible before the universe reaches a state of maximum entropy and suffers heat death. In recent years, there have been serious efforts to revitalise such models with the Baum-Frampton model and (even more recently) Penrose's Conformal Cyclic Cosmology. Paul Frampton and Lauris Baum propose that phantom energy, a type of dark energy with an equation of state (the ratio between pressure and energy density) less than -1, pervades the universe. This leads to a type of expansion that is typically thought to lead to a Big Rip. However, Baum and Frampton propose that, very close to the Big Rip event, the universe splits into noninteracting and causally disconnected patches. Most of these patches contain only phantom energy and are devoid of normal matter and radiation; the entropy content of the universe is contained in the patches with thinly spread-out particles and radiation. The patches containing only phantom energy then contract by an amount exactly equal to the expansion the universe underwent since the Big Bang. Before reaching a singularity, the contracting patches "bounce" into another expansion phase. The process then repeats, with each patch fractioning into more baby universes with each cycle.
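The connection between the phantom equation of state and the Big Rip can be sketched as follows (a standard textbook-style result for a constant equation-of-state parameter w = p/ρ, not taken from Baum and Frampton's paper): for w < -1 the Friedmann equations give a scale factor

```latex
a(t) \;\propto\; \left(t_{\mathrm{rip}} - t\right)^{\frac{2}{3(1+w)}}
```

and since 1 + w < 0 the exponent is negative, so a(t) diverges at the finite time t_rip. This is the Big Rip that the causal-patch mechanism described above is designed to forestall.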

The problems with such a model, however, are legion. First, in order to avoid the BGV theorem, the average contraction along every geodesic must exactly equal the average expansion, which is again something that would require an extraordinary amount of fine-tuning. The second problem is how exactly the various patches remain causally disconnected. Given infinite past time, an infinite amount of matter and radiation would have been produced, and the model avoids infinite entropy only by removing it to different patches. But simply shoving the entropy into other patches raises the question of whether, given infinite time and a countable infinity of patches, these patches must eventually collide. The causal disconnection mechanism that occurs in lieu of the Big Rip does not work, because the disconnected patches do come back into causal contact; after all, causal horizons are not real physical barriers. Thirdly, certain factors would prevent a singular bounce. Contracting space filled with quantum fields has a certain "ergodic" property as it shrinks: the fields become excited and produce chaotic fluctuations which prevent cycling, as spontaneously created matter with a different equation of state comes to dominate the energy field. Lastly, these inhomogeneous fluctuations would result in the appearance of a fluid of black holes, leading to a "Black Crunch."

A second cyclic model is Roger Penrose's Conformal Cyclic Cosmology. Penrose suggests that the initial "singularity" is, in reality, the same thing as the open-ended de Sitter-like expansion which our universe seems about to experience. The main problem with such an equivalence is that the entropy of the initial state of the universe is vanishingly small, whereas the entropy of the de Sitter-like end state is maximal. Penrose has opted to invoke a non-unitary loss of information at black holes in order to equalise this entropy. However, this approach does not succeed, as information is not lost to black holes. Lastly, Penrose has recently claimed observational support in the form of concentric rings of low variance in the CMB data provided by WMAP. However, two papers have been published, both of which were unable to replicate Penrose's results:

http://arxiv.org/abs/1012.1268

and

http://arxiv.org/abs/1012.1305

**Time-Deconstruction**

A fourth way to avoid the BGV theorem is a model in which the arrow of time reverses at the t = -infinity hypersurface, so that the universe expands into both halves of the full de Sitter space. It is possible, then, to evade the theorem through a gross deconstruction of the notion of time. Suppose one asserts that in the past contracting phase the direction of time is reversed: time then flows in both directions away from the singularity. However, this model denies the evolutionary continuity of the universe which is topologically prior to t and our universe. The other side of the de Sitter space is not our past, for the moments of that time are not earlier than t, or any of the moments later than t, in our universe. There is no connection or temporal relation whatsoever between our universe and that other reality. Efforts to deconstruct time thus fundamentally reject the evolutionary paradigm.

**Quantum Gravity Models**

The final exception to the Hawking-Penrose theorem is Quantum Gravity models. The Hawking-Penrose singularity theorem is based on General Relativity, but GR breaks down when we get to the very small. Thus, a quantum description of gravity is required to explain the earliest phases of the universe, where GR breaks down. Quantum Gravity models can be categorised into three groups:

1. String models (such as the Veneziano-Gasperini Pre-Big Bang inflation model and the Steinhardt-Turok Ekpyrotic Cyclic model.)

2. Loop quantum models (such as Bojowald *et al.*'s cyclic LQG model, and Ellis *et al.*'s asymptotically static model.)

3. Semiclassical models (such as Vilenkin's Quantum Tunnelling model and the Hartle-Hawking No Boundary model.)

**String Scenarios**

String theory is probably the most popular area within Quantum Gravity, and perhaps the most publicly well known. String theory postulates that the most fundamental elementary particles are tiny one-dimensional vibrating strings, and it postulates a minimum size known as the Planck length. This area of research has opened up a wide new range of ideas and is a very fruitful one. There are currently two string models that seek to describe a past-infinite pre-Big Bang era: the pre-Big Bang scenario of Gabriele Veneziano and Maurizio Gasperini, and the Ekpyrotic Cyclic model of Paul Steinhardt and Neil Turok.

**Pre-Big Bang Inflation**

This model postulates a pre-Big Bang state much like our post-Big Bang state. Since our state is eternal into the future, the previous state was eternal into the past. Whereas our state will develop into a nearly empty, widely dispersed collection of thin gas and radiation, this is what the previous state was developing from. Through gravitational contraction, regions of the pre-Big Bang universe turned into black holes and, due to quantum effects, once the black holes reached a certain critical density, they underwent a "bounce" into a big bang. However, Veneziano suggests that this beginning is infinitely distant and never reachable. If this is to be interpreted realistically, one might ask how we ever arrived at the present. The model does, however, have an initial phase:

(1) a static (Milne) universe, or string perturbative vacuum (SPV) phase, which is then followed by...

(2) a quasi-Milne phase, which is a "perturbed" SPV.

(3) an inflationary phase.

(4) a Post-Big Bang FRW phase typical of the standard big bang model.

The SPV is unstable, which is why it is not eternally static but leads on to the subsequent phases; it must therefore be finite in the past.

**Ekpyrotic Cyclic Scenario**

The Ekpyrotic Cyclic Scenario postulates a higher dimensionality: in this model, our universe exists within a brane, and these branes collide. Our big bang and subsequent expansion were caused when our brane collided with another. Eventually, our universe is subject to heat death, and then another big bang is started by the next collision. But has such a model been cycling forever? Steinhardt and Turok suggest that the universe began in a singularity but that it has been cycling ever since. The Ekpyrotic scenario successfully evades the BKL chaos, but such a model is not an exception to the BGV theorem, and so requires a beginning. Borde, Guth, and Vilenkin apply their theorem even to higher-dimensional cosmologies such as the Steinhardt-Turok model, and so this does not avert a beginning.

**Loop Quantum Gravity**

Loop Quantum Gravity (LQG) takes the view that space-time is quantised, in other words divided into discrete constituent parts. As with string theory, there is a minimum size in nature that prevents infinite singularities from occurring. It is important to note that Bojowald *et al.* are not committed to a model with an infinite past, so falsifying an infinite cycle would not necessarily falsify LQG *per se*. However, our focus here is whether or not LQG can really be infinite.

There are two problems that the LQG model must overcome:

(1) there is no known physical mechanism for producing a cyclic "bounce."

(2) thermodynamic considerations show that the universe should by the present day have reached thermodynamic equilibrium (aka 'heat death.')

Martin Bojowald, the foremost exponent and defender of LQG today, believes he has solutions to both of these. With regard to the first problem, the major issue is overcoming the aforementioned BKL chaos, and this chaos has been shown to be calmed by an LQG approach. However, the second problem is far more challenging. How can there be truly cyclic behaviour when the second law of thermodynamics predicts that entropy increases from cycle to cycle? On a semi-classical approach, the end of our current cycle should differ in entropy from its beginning by a factor of 10^22. Given no energy input, how can such an outcome be avoided? There are three proposed solutions:

1. The problem is epistemic only.

2. Our current classical understanding of entropy is misleading.

3. Cycle by cycle, the entropy state is genuinely reversible.

Regarding the first solution, Bojowald proposes that there is a large, unobservable part of the initial singularity that is a genuinely generic manifold, that is, a state of maximum entropy featuring random inhomogeneity and anisotropy. The entropies of the initial and final singularities would then be similar, and the inflation of a small patch of this manifold would produce the requisite homogeneity and isotropy. However, it is exceedingly improbable that we are the product of an inflationary event on a generic manifold: if that were the case, we should expect to see a universe roughly one tenth the size of our own, since it is exceedingly improbable for a universe of our size to arise from such an event. The second, bigger, problem is that this epistemic account takes no account of entropy generation during a cycle, which over infinite time would build up. The same goes for the second solution: even if the classical approach to entropy is misleading, a quantum approach would still recognise entropy build-up. The only remaining option, therefore, is that LQG would need to be fully reversible, but, as Bojowald admits, the jury is still out.

In addition to the entropy issue there is the issue of dark energy. If dark energy takes the form of a cosmological constant, then it would have led to open-ended expansion the first time around. If, on the other hand, dark energy takes the form of quintessence, it is possible that it could reverse, consistent with a collapse phase, but this leads to a new problem: different modes of the matter field would become excited, such that each bounce would differ from the preceding one, and some particular cycle would lead to open-ended expansion. Since the universe would eventually reach open-ended expansion, the cycling cannot be infinite.

**Semi-classical Inflation**

Semi-classical models attempt to explain how inflation got started quantum-mechanically. In such models, the universe is treated quantum mechanically and is thus described by a wave function rather than by a classical space-time. The two major examples of such attempts come from none other than Alexander Vilenkin himself, who utilised a quantum tunnelling model, and from James Hartle and the famed Stephen Hawking. These models are interesting in that they describe the universe quantum mechanically, and yet the universe DOES have a beginning in them. How, then, do they overcome the problem of origins?

**Vilenkin's Quantum Tunnelling Model**

Alexander Vilenkin, one of three authors of the Borde-Guth-Vilenkin theorem, describes his approach thus:

"Many people suspected that in order to understand what actually happened in the beginning, we should treat the universe quantum-mechanically and describe it by a wave function rather than by a classical spacetime. This quantum approach to cosmology was initiated by DeWitt and Misner, and after a somewhat slow start received wide recognition in the last two decades or so. The picture that has emerged from this line of development is that a small closed universe can spontaneously nucleate out of nothing, where by 'nothing' I mean a state with no classical space and time. The cosmological wave function can be used to calculate the probability distribution for the initial configurations of the nucleating universes. Once the universe nucleates, it is expected to go through a period of inflation, driven by the energy of a false vacuum. The vacuum energy is eventually thermalized, inflation ends, and from then on the universe follows the standard hot cosmological scenario."

Vilenkin uses the analogy of a particle that quantum tunnels through a potential well. An ordinary FRW universe classically does not have enough energy to escape into an open-ended expansion; in quantum mechanics, however, there is a probability that, instead of recollapsing, the universe will tunnel through the energy barrier into an inflationary phase. Vilenkin's solution is that our universe arose from a state of null topology: a small, closed, spherical, and metastable universe. However, such a state has only existed, and in fact can only have existed, for a finite amount of time. So whilst Vilenkin's model solves the problem of the creation of OUR universe, it merely pushes the problem back a step: the universe clearly has a beginning in this approach.
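The tunnelling language here is borrowed from ordinary quantum mechanics. As a loose analogy only (this is the textbook WKB formula for a square barrier, not Vilenkin's cosmological calculation, and the setup and numbers are invented for illustration), the probability of tunnelling through a classically forbidden barrier is tiny but nonzero:

```python
import math

def wkb_transmission(barrier: float, energy: float, width: float,
                     mass: float = 1.0, hbar: float = 1.0) -> float:
    """WKB estimate of the probability that a particle with the given
    energy tunnels through a square barrier it cannot classically cross."""
    if energy >= barrier:
        return 1.0  # classically allowed: no tunnelling required
    kappa = math.sqrt(2.0 * mass * (barrier - energy)) / hbar
    return math.exp(-2.0 * kappa * width)

# The probability is small but never exactly zero -- the feature Vilenkin
# exploits: a classically recollapsing universe has a nonzero chance of
# "escaping" into open-ended inflationary expansion.
print(wkb_transmission(barrier=10.0, energy=2.0, width=1.0))
```

The lower the "energy" relative to the barrier, the more exponentially suppressed the tunnelling probability, but it never vanishes.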

**Hartle-Hawking No Boundary Model**

A similar approach is utilised by Stephen Hawking and James Hartle. They opt for Richard Feynman's approach to QM, seeking the probability for a certain final quantum state via a path integral, a "sum over histories." This is a superposition of states in which every possible universe history is part of the wave function, and each possible state has an associated probability of becoming actual. They estimate the subset of universes that are expected to dominate the calculations, and describe the earliest state from which the universe emerges as a Euclidean metric, which removes the initial singularity.

These models more or less take the same approach. Vilenkin notes:

"I understand that a universe of zero radius is not necessarily the same thing as no universe at all. But mathematically my quantum tunneling from nothing is described by the same "shuttlecock" geometry as Hartle and Hawking [NB Figure 3.21]. (In fact, the shuttlecock first appeared in my 1982 paper.) This geometry is certainly not past-eternal, and there is no point on the Euclidean sphere which you can identify as the initial universe of zero size. So, if the Hartle-Hawking approach avoids the "paradoxes of creation", I don't see why mine doesn't."

Whilst the authors of these models might think otherwise, there are a number of reasons why such a state prior to the universe is itself finite. Gott and Li offer a number of critiques of these semi-classical approaches (which ironically also serve to undercut their preferred CTC model) and show that such a prior state is not past eternal:

(1) Transitions in QM are always between allowed classical states (Vilenkin and Hartle–Hawking’s approach has a transition from a classically forbidden region to a classically allowed region).

(2) The Vilenkin and Hartle–Hawking approaches should contain realistic energy fields (something closer to what we actually see in nature). If they did, then Heisenberg's uncertainty principle would require that the initial state of their models have a finite and nonzero energy. If that is the case, then semi-classical quantum models actually start in a classically allowed metastable state, rather than "nothing."

The semi-classical quantum gravity approaches of Vilenkin, Hartle, and Hawking thus ironically posit a universe that began to exist, a fact that is either not realised or glossed over (probably a bit of both, considering physicists aren't philosophers, which Stephen Hawking's book The Grand Design makes abundantly clear.) So, yes, there are exceptions to both theorems; however, either the exceptions are not viable, or there is some other condition that requires a beginning.