God vs The Multiverse (Part 2: The Mystery)
(Taken from www.BlogoShiur.com - Rabbi E. Feder & Rabbi E. Zimmer)
Science tries to explain things through a process of simplification. This means explaining one thing in terms of something else more basic. Simplification generally means unifying different phenomena by explaining them in terms of fewer things. For example, Newton's theory of gravity unified the phenomenon of things falling to the ground on Earth with the phenomenon of planets orbiting the sun. Both were explained in terms of one principle (gravity) which is more fundamental.
The most basic things are called 'fundamental'. The most basic laws are called the 'fundamental laws of physics'. The concept of 'fundamental' is of utmost importance in science. Science seeks to explain the most fundamental reality, ideally by explaining everything in terms of one fundamental theory. This "theory of everything" will be the fundamental law of physics, in the sense that all other laws can be derived from it, but it cannot be explained in terms of anything simpler.
The most basic particles, 'fundamental particles', are those that can combine to make everything else that is more 'complex'. These fundamental particles have intrinsic properties like mass. The more mass something has, the more it weighs. Every single electron in the universe has the exact same amount of mass. We can quantify the mass of an electron by comparing it to a proton: every proton is 1,836.15267245 times more massive than any electron, and that ratio never changes. Hence, we call the mass of an electron a 'constant.'
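To make that number concrete, here is the ratio computed from the standard measured masses (reference values, rounded; quoted here just for illustration):

$$\frac{m_p}{m_e} = \frac{1.672622 \times 10^{-27}\ \mathrm{kg}}{9.109384 \times 10^{-31}\ \mathrm{kg}} \approx 1836.15$$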
The term 'constant' is used in physics to refer to a particular number that doesn't change, and tells us how big something is. It could be how heavy an electron is, how fast light moves, how strong gravity is, etc. All these things are finite quantities, which have particular, unchanging values that we only know through measurements and observations. These quantities are called constants.
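For concreteness, here are a few of the constants just mentioned, with their standard values (listed only to make the idea tangible):

$$c = 2.99792458 \times 10^{8}\ \mathrm{m/s}, \qquad G \approx 6.674 \times 10^{-11}\ \mathrm{m^3\,kg^{-1}\,s^{-2}}, \qquad m_e \approx 9.109 \times 10^{-31}\ \mathrm{kg}$$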
How can science explain the value of the above-mentioned constant in terms of something more fundamental? What determines this number? Why isn't it 2,000 or 7.6453 or 0.000001? Why aren't electrons more massive than protons? Can science go any further? How do you explain a number?
Richard Feynman expresses this difficulty in his book QED (page 129) with regard to one of these constants, the fine structure constant. (Don't get scared if you don't understand what the fine structure constant is. It's not essential to the proof. Think about the mass of the electron if it is easier to relate to.)
"There is a most profound and beautiful question associated with the observed coupling constant...It is a simple number that has been experimentally determined to be close to 0.08542455. (My physicist friends won't recognize this number, because they like to remember it as the inverse of its square: about 137.03597 with about an uncertainty of about 2 in the last decimal place. It has been a mystery ever since it was discovered more than fifty years ago, and all good theoretical physicists put this number up on their wall and worry about it.) Immediately you would like to know where this number for a coupling comes from: is it related to pi or perhaps to the base of natural logarithms? Nobody knows. It's one of the greatest damn mysteries of physics: a magic number that comes to us with no understanding by man. You might say the "hand of God" wrote that number, and "we don't know how He pushed his pencil." We know what kind of a dance to do experimentally to measure this number very accurately, but we don't know what kind of dance to do on the computer to make this number come out, without putting it in secretly!"
What was the mystery that all good theoretical physicists worried about for 50 years?
In our current conception of the fundamental laws of physics, there are 25 or so physical constants (specific quantities like the mass or charge of an electron), some of which are dimensionless physical constants (a pure number with no units; this is not as abstract a concept as it sounds, as it basically just means a ratio between two quantities with the same units). One of these dimensionless constants is 0.08542455, which characterizes the strength of the electromagnetic force and is directly related to the charge of an electron. (The bigger the number, the stronger the repulsive force between two electrons would be.) The essential mystery is not tied to the fine structure constant in particular; it is just one of the 25 examples. When Feynman wrote this in 1985, all these constants were shrouded in this tremendous mystery. What sense is there to specific numbers being fundamental?
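For readers who want to see how the two numbers in Feynman's quote fit together (this is just arithmetic on the figures he gives): the 0.08542455 is, in his convention, the square root of the fine structure constant, so

$$\alpha \approx (0.08542455)^2 \approx 0.0072974, \qquad \frac{1}{\alpha} \approx 137.036,$$

which is the famous 137 his physicist friends prefer to remember.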
In order to understand Feynman's question, you have to realize what he is assuming. He is assuming that a number cannot be fundamental. This is because it makes very little sense to say that the most basic existences in reality are 25 arbitrary numbers. What Feynman is asking is: if these numbers are not fundamental, how can science possibly explain these constants in terms of something more fundamental?
An appreciation of this problem is necessary before we can move forward in the story. Specific fundamental numerical values seem to defy any possible form of explanation. It doesn't seem reasonable to believe that any qualitative physical theory will ever spit out a number like 137.03597 (and some of the other numbers are even worse). They seem totally arbitrary. (It would be a different story if the numbers we were trying to produce were 1, 3, or the square root of 2 pi; with numbers like these, maybe we could stand a chance of deriving them from some qualitative concept. For instance, if a number involved pi, we would look for a qualitative law involving circles...) This was one of the biggest difficulties in modern physics. We had absolutely no understanding of these fundamental constants, yet they were essential parts of our equations.
Two solutions were proposed (and are still maintained by a minority of scientists) to try to explain where these arbitrary numbers came from. The first theory simply stated that these 25 numbers were Necessary Existences (this is the theory Feynman is implicitly rejecting). Needless to say, this did not satisfy most physicists. While it is obvious that you will ultimately arrive at an idea which is irreducible and not explainable in terms of simpler concepts, it is one thing when your axiomatic ideas are nice theories such as general relativity and quantum mechanics (or maybe a grand unified theory, if you prefer one eternal existence); it is altogether a different thing to have a pantheon filled by general relativity, quantum mechanics, and 25 arbitrary numbers, all necessarily coexisting.
A second theory speculated that perhaps these 25 numbers were necessary results of some qualitative Master Mathematical Equation that had yet to be discovered. This too did not satisfy most physicists as it does not seem plausible that any qualitative law would naturally generate the specificity of numbers required by observation.
There was a general state of discontent with these forced explanations as they did not provide very much understanding or insight into the values of the constants. What could possibly have determined these numbers? Or, if nothing determined them, how could an arbitrary number be a fundamental part of reality?
God vs The Multiverse (Part 3: The Solution)
The major breakthrough in our understanding of the constants became widespread in 1986 with the publication of Barrow and Tipler's landmark book, The Anthropic Cosmological Principle. In it, they explained the constants using the strong anthropic principle. (It comes in a weak form and a strong form, as well as many other misused forms. Different authors use it in different ways, which has led to much confusion. The key thing is not the labels, but rather an understanding of the different logical arguments employed. See the Hawking article from the introduction for a specific example.)
The significant advance in our knowledge was the recognition that the constants were not arbitrary. Rather, the constants were fine tuned, in the sense that only these specific values, within a very small range of variation, result in a universe with order, structure, complex life, etc. Even slightly different values of the constants would lead to a random, chaotic, meaningless universe.
Some particular examples, among many, deal with stars. Stars produce energy by fusing hydrogen into helium: four hydrogen nuclei combine into a single helium nucleus. During that reaction, 0.7 percent of the mass of the hydrogen (a fraction of 0.007) is converted into energy. If that fraction were 0.006, the universe would be filled only with hydrogen. If it were 0.008, the universe would have no hydrogen, and therefore no water and no stars like the sun.
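A rough check of that 0.007 figure, using standard nuclear masses in atomic mass units (reference values, rounded; shown only as an illustration):

$$4m_p \approx 4 \times 1.00728\ \mathrm{u} = 4.02912\ \mathrm{u}, \qquad m_{\mathrm{He}} \approx 4.00151\ \mathrm{u}$$

$$\frac{\Delta m}{4m_p} = \frac{4.02912 - 4.00151}{4.02912} \approx 0.0069 \approx 0.007$$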
Another example is the fine tuning of the fine structure constant of the previous post. Barrow showed that if the constant were greater or smaller by 4%, the nuclear fusion in stars would not produce carbon, thereby making carbon-based life impossible. (Max Born was actually the first physicist to recognize the key role this constant plays in determining atomic structure; in 1935 he gave a lecture called The Mysterious Number 137. It was only after 1986, however, that this type of explanation for many of the constants became widely understood.)
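To get a feel for how narrow that window is, here is the rough arithmetic applied to the familiar 137 (an illustration only; the 4% is quoted from Barrow's result above, and a 4% change in the constant corresponds to roughly a 4% change in its inverse):

$$137.036 \times 0.96 \approx 131.6, \qquad 137.036 \times 1.04 \approx 142.5$$

Out of all possible numbers, the value had to land in roughly that band for stars to produce carbon.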
One of the deeper ways to look at it is this: if the fundamental laws of physics stayed the same but the values of the constants changed, we would still have physics, but we wouldn't have cosmology, astronomy, chemistry, or biology. Change one number, and right after the big bang the universe either collapses in on itself or blows up too quickly to produce galaxies. Change a different constant and stars don't form. Change a different number and there are no atoms and no periodic table. Change another one and life never evolves. Yet all the constants are fine tuned just right, so that we have these complex phenomena, and areas of beauty and wisdom in addition to physics.
It is important to realize how this teleological explanation (the strong anthropic principle) removes the difficulty presented by Feynman in the prior post. The mystery of the constants was how seemingly arbitrary numbers could be fundamental. What was discovered was that these numbers were not arbitrary, as they had first seemed, but were rather fine tuned, in the sense that only these numbers, in conjunction with the qualitative laws of relativity and quantum mechanics, would lead to the universe we observe.
A teleological explanation is an explanation of something based upon a final cause or a purpose. For example, we could explain why a salt shaker has little holes on its top based upon its purpose of sprinkling salt on people's food. That doesn't tell us what made the little holes, but it does explain why they are there, based upon the concept that the salt shaker was made to serve a certain purpose.
Similarly, the reason why the constants and the laws are designed the way they are is in order for the universe to result from them. Were they even slightly different, all that would exist would be chaotic nonsense. The particular numbers for the constants were chosen because the purpose of the laws and constants of physics is to produce a meaningful universe.
This explanation only became possible once science had an understanding of the laws of physics and the critical role that these quantities play in them. Prior to this understanding, it would have been totally speculative to posit any type of teleological explanation.
The solution to the mystery is that the constants are not ultimately fundamental. The Fundamental of the 'fundamental constants' is an Intelligent Agent who selected the specific values. It is important to understand why this solution is not beset by the problem of having to determine the values of the constants to the 120th decimal place. The demand to explain every last decimal place is only upon the Master Mathematical Equation theory which speculates that there exists some unique mathematical equation which precisely determines the numbers. A unique equation does not determine a range of values. (In fact, the Necessary Existence theory fails, not because it doesn't explain the number to precision, but because it fails to explain why it's even in the range.)
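For context on the "120th decimal place" (a standard figure in the physics literature, cited here only for orientation): the most extreme known example is the cosmological constant, whose observed value is smaller than naive theoretical estimates of the vacuum energy by a factor of roughly

$$\frac{\rho_{\mathrm{theory}}}{\rho_{\mathrm{observed}}} \sim 10^{120},$$

which is why it is often described as fine tuned to about 120 decimal places. (This is the constant discussed in the Susskind video at the end of this post.)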
An Intelligent Agent is able to choose between a range of numbers (say, between 130 and 150), all of which yield the same result. We can explain and understand why He didn't choose 129 or 151: since they are outside the range of values, He wouldn't have accomplished His purpose. Unless we have more knowledge, we can't explain why He picked the exact number 137.03597. If we discover in the future that it mattered more (meaning the range is only 136-138), then we will know why He didn't choose 135. And if it didn't matter which value He chose so long as it was within the range, an Intelligent Agent is capable of choosing one value among many choices that all serve His purpose. (You do it all the time.)
Explaining the constants with a final cause was unacceptable to many scientists. 'Purpose' is something we attribute to an Intelligent Agent. While most physicists were willing to accept eternal, non-physical, non-intelligent laws as the cause of the universe, they were unable to consider that the cause of the universe was an Intelligent Agent who works with a final cause. An Agent that was able to understand the result of His own actions was simply unacceptable.
Nevertheless, the point was clear. The tie between the fine tuning of the constants and the order in the universe was undeniable. It was incumbent upon scientists to either accept a teleological explanation and the clear inference to an Intelligent Cause, or to explain why the universe seemed like it was designed. The fine tuning directly pointed to an Intelligent Designer, and the burden of proof was on those who denied intelligent design to explain the illusion of design based upon some unintelligent mechanism.
The theories mentioned in the previous post, that of the constants being Necessary Existences and that of the Master Mathematical Equation of the Universe, were no longer sufficient in any sense at all. They were developed when the conceptual problem of the constants was one of arbitrariness. Given our new knowledge of the connection between the values of the constants and the resultant order and complexity in the universe, these theories rapidly fell even further out of favor. It is too coincidental to assume that the values determined by the hypothesized Necessary Existences or the Master Mathematical Equation of the Universe happen to be those which result in order and complexity billions of years later.
To illustrate the point, consider the following hypothetical example. After years of unsuccessfully looking for life on Mars, scientists discover "something" which they cannot quite figure out. After years of analysis of its various parts, they realize that it is a one-million-year-old spaceship which is perfectly suited for travelling on and around Mars. Despite the fact that we have not as of yet found life on Mars, the perfect design of the spaceship is clear evidence that it was designed by some intelligent being (about which we would know nothing, other than the fact that it was intelligent). If someone wanted to deny this and claim that it emerged by random chance, or by some master mathematical equation that necessitates spaceships on Mars, the burden of proof would be on them to develop a compelling theory of how this could have happened.
We have included a short video about the cosmological constant and fine tuning with Leonard Susskind (one of the fathers of string theory and an advocate of the multiverse). The cosmological constant is recognized as one of the most striking examples of fine tuning, and also plays a critical role in big bang cosmology. It is an excellent video that will blow your mind (http://youtu.be/i4T2Ulv48nw).