Neon Lights In Motion, Whistler by Bill Gracey http://www.flickr.com/photos/9422878@N08/9070440775/ Attribution-NonCommercial-NoDerivs License

Chapter 8

Science as a Language, the Non-Probativity Theorem and the Complementarity of Complexity and Predictability

In the last chapter, when we studied the nature of the impact of the four spheres on the human condition, we omitted science from our discussion. In this chapter we will study the nature of science, as one of our objectives in this project is to create a dialogue between science, the humanities, and the social sciences, using information as a bridge connecting them.

In particular we will try to answer the questions:

  • What is science?
  • How does it differ from mathematics?
  • What is the relationship of information to science?
  • What is the reliability and truth-value of the information generated by science?
  • Can a scientific analysis prove anything?

At the Humanity and the Cosmos Symposium held at Brock University in St. Catharines, Canada in January 2000 (see the acknowledgement at the end of this chapter), a number of the participants made statements to the effect that science could prove this or that. During the course of our discussions it suddenly occurred to me that science cannot prove anything but can only offer up hypotheses to be explored empirically. This chapter is an elaboration of that thought.

A linguistic analysis and a formal mathematical proof will be presented to show that science cannot prove the truth of a proposition but can only formulate hypotheses that continually require empirical verification for every new domain of observation that is encountered. A number of historical examples of how science has had to modify theories and/or approaches that were once thought to be absolutely true and unshakable are presented, including the shift in which linear dynamics is now considered anomalous and non-linear dynamics the norm. Complexity and predictability are shown to have a complementarity like that of position and momentum in the Heisenberg uncertainty principle. The relationship of complexity and predictability is also shown to be similar to that of completeness and logical consistency within the framework of Gödel’s Theorem.

Science as a Language
Atacama Large Millimeter Array

Because science is a form of organized knowledge, in order to understand the relationship of information to science we need to understand the relationship of information and knowledge. In Chapter 2 we identified information as structured data and knowledge as the ability to use information strategically to achieve one’s objectives. The objective of science is to describe nature as accurately and simply as possible. As Einstein opined, a theory should be as simple as possible, but not too simple. In Chapter 3 we also argued that science and mathematics are languages and therefore part of culture and hence the symbolosphere. Within the framework of this model of the evolution of language, mathematics and science are seen to be distinct languages, each with its own unique informatic objectives.

Mathematics strives to solve equations and to prove the equivalence of sets of propositions involving the semantical elements of its language, namely, abstract numbers (such as integers, other rational numbers, irrational numbers, and imaginary numbers), geometrical objects (such as points, lines, planes, triangles, pyramids, vectors, and tensors), sets, operators, etc. A theorem or a proof is a unique syntactical element of the language of mathematics, which we will show cannot be an element of the language of science. A theorem or a proof establishes, using logic, the equivalence of one set of statements with another set of statements, a proposition whose truth is to be established by the theorem. The first set of statements includes axioms, whose truths are assumed to be self-evident and, at times, other theorems, which have already been proven based on the same set of axioms.

Science, on the other hand, establishes the veracity of a proposition using the technique of the scientific method: observation, generalization, hypothesis formulation, and empirical verification of the predictions that emerge from the hypothesis. The scientific method is a unique syntactical element of the language of science. In addition to trying to provide an accurate description of nature, science also attempts to describe nature in a systematic manner using the minimum number of elements possible. The description of one phenomenon in terms of another is often claimed to be an explanation; this is one way to interpret the reduction of the number of basic elements needed to describe nature, which is a basic goal of science. Science also endeavors to make predictions that can be tested to establish the accuracy of its models. No matter how refined these processes become and no matter how many reductions and simplifications are made, there always remain some irreducible elements that resist explanation or description in terms of simpler phenomena. The process of reduction has to end somewhere. The basic elements in terms of which other phenomena can be described can be thought of as the basic atoms or elements of scientific description (MacArthur, 2000).

Scientists often make use of mathematical language to construct their models of nature, especially in the physical sciences. They often employ mathematical proofs to establish the equivalence of mathematical statements within the context of their models. This has led to the popular belief that science can actually prove things about nature. This is a misconception, however. No scientific hypothesis can be proven; it can only be tested and shown to be valid for the conditions under which it was tested. Each proposition must be continually verified for each new domain of observation that is encountered.

The purpose of this chapter is to make use of mathematical reasoning to show, and actually prove, that science can never prove the truth of any of its propositions or hypotheses. Our proof is based on an axiom proposed by Karl Popper (1959), namely that a hypothesis, proposition or theory is scientific only if it is falsifiable, that is, if it has the possibility of being shown to be false by an observation or a physical experiment; in other words, it is testable with the possibility of a negative outcome.

For the purposes of our study we need to clarify what we mean when we use the word truth by distinguishing two types of truth: empirical or verifiable truth and necessary or analytic truth. Empirical truth arises from the matching of a measurement with a model and is always approximate to some degree or other, depending on the precision of the measurement and the accuracy of the model. Necessary truth arises out of mathematical reasoning or the use of logic and is exact. Although necessary truth is exact, its validity depends totally on the basic axioms from which one starts and which one assumes are self-evidently true. At some point one must rely on belief to establish that an axiom is self-evidently true. The necessary truth of mathematics or logic is therefore artificial. The most one can say about the truth of mathematics and logic is that, subject to the limitations of Gödel’s Theorem, it can only demonstrate the equivalence of one set of propositions with another. Mathematics and logic are therefore our very first examples of virtual reality. Empirical truth, while less precise than necessary truth, at least attempts to describe reality. Scientific models are artificial and are only representations of reality, but they do have to measure up.

To establish our theorem, the Science Non-Probativity Theorem, we will make use of Popper’s basic axiom, namely, that for a statement or an assertion to be considered a scientific statement it must be tested and testable and, hence, it must be falsifiable. If a proposition must be falsifiable or refutable to be considered by science, then one can never prove it is true, for if one did, the proposition would no longer be falsifiable, having been proven true (in the sense of necessary truth), and, hence, could no longer be considered within the domain of science. We have therefore proven that science cannot prove the truth of anything. Any proof of the truthfulness of a proposition would put that proposition outside the realm of science and place it within the domain of mathematics or logic. And as was pointed out by Stephen Clark (2000), “Not all proofs are ever intended as ‘necessitations’. So what counts as ‘proof’ will vary between disciplines and practices.”

The Science Non-Probativity Theorem

Let us repeat the above argument as a formal theorem making use of two axioms.

Axiom 1: A proposition must be falsifiable to be a scientific proposition or part of a scientific theory.

Axiom 2: A proposition cannot be proven necessarily true and be falsifiable at the same time. [Once proven true, a proposition cannot be falsified and, hence, is not falsifiable.]

Theorem: A proposition cannot be proven to be true by use of science or the scientific method.

Proof: If a proposition were to be proven to be true by the methods of science it would no longer be falsifiable. If it is no longer falsifiable because it has been proven true it cannot be considered as a scientific proposition and hence could not have been proven true by science. Q.E.D.
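
The argument is compact enough to be captured formally. The following sketch is my own formalization in Lean 4; the predicate names Scientific, Falsifiable and Proven are simply interpretive labels for the informal notions above, and the derivation mirrors the two axioms and the proof step by step.

```lean
-- A minimal formal sketch of the Science Non-Probativity Theorem.
-- The predicates are interpretive stand-ins for the informal notions in the text.
theorem non_probativity
    {Claim : Type}
    (Scientific Falsifiable Proven : Claim → Prop)
    -- Axiom 1: a scientific proposition must be falsifiable.
    (ax1 : ∀ p, Scientific p → Falsifiable p)
    -- Axiom 2: a proposition cannot be proven necessarily true and be falsifiable at the same time.
    (ax2 : ∀ p, ¬ (Proven p ∧ Falsifiable p)) :
    ∀ p, Scientific p → ¬ Proven p := by
  intro p h_scientific h_proven
  -- If p is scientific it is falsifiable (Axiom 1), but then it cannot also
  -- have been proven true (Axiom 2), contradicting the assumption.
  exact ax2 p ⟨h_proven, ax1 p h_scientific⟩
```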

Spiral galaxy

In the spirit of the Science Non-Probativity Theorem and our distinction between necessary and empirical truth, we cannot be certain that this line of reasoning is absolutely valid or “true”. After all, we have just used the theorem, a syntactical element of the language of mathematics, to establish a proposition about the language of science. The validity of our conclusion is no greater than that of our starting axioms. Our theorem is not scientifically valid, but as a result of mathematical reasoning we have created a useful probe, one that can lead to some interesting reflections and insights into the nature and limitations of science. If it helps scientists, and especially the public, who tend to accept the authority of science more or less uncritically, to adopt a more humble and modest understanding of science, it will have served its purpose. The purpose of this exercise was not, as some have suggested, to challenge the usefulness of science or the validity of its methodologies but to clarify the nature of scientific truth and contrast it with the necessary truth of logic.

All that science can do is to follow its tried and true method of observing, experimenting, generalizing, hypothesizing and making predictions, and then testing its hypotheses and predictions. The most that a scientist can do is to claim that, for every experiment or test performed so far, the hypothesis that has been formulated explains all the observations made to date and that all predictions have been validated within experimental error. Scientific truth is always equivocal and dependent on the outcome of future observations, discoveries and experiments. It is never absolute. I hope these arguments establish that the verification of a scientific proposition through empirical testing or observation is not equivalent to proving the truth of that proposition, as some would claim.

A scientist who claims to have proven anything is being dogmatic. Every human being, even a scientist, has a right to the beliefs and dogmas embodied in the basic axioms upon which their proofs are based. But it does not behoove a person who claims to be a rational scientist, and who claims that science is objective and universal, to be so absolute in their beliefs and in the value of their belief system, science. Scientists are not immune to dogmatic and intolerant views, as Dr. George Coyne (2000) pointed out in his talk at the Humanity and the Cosmos Symposium at Brock University, “When the Sacred Cows of Science and Religion Meet”.

I believe it is altogether fitting and appropriate that scientists should display greater humility and tolerance in the practice of their vocation and calling (Bertschinger, 2000), in view of the lessons to be learned from the following historical vignettes, in which well established scientific theories and dogmas had to give way to newer ones.

Albert Einstein, 1921

Newton’s theory of motion gave way to Einstein’s theory of relativity for velocities that approach the speed of light. The Newtonian picture also underwent major revisions with the introduction of quantum mechanics, needed to describe atomic systems. Neither the contribution of Newton to science nor the validity of his model of dynamics for non-relativistic and non-quantum events was in any way diminished by these 20th century discoveries. In fact, many elements of Newton’s theory survived in both relativity and quantum mechanics, and one cannot imagine how these theories could have been formulated without the pioneering work of Newton. Even the current version of quantum mechanics requires the use of the classical Newtonian Hamiltonian to formulate the energy operator.
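
To make that last point concrete, here is the standard textbook correspondence (my own illustration, not drawn from the original text) by which the classical Hamiltonian becomes the quantum energy operator:

```latex
% Classical Hamiltonian (total energy) of a particle of mass m in a potential V:
H = \frac{p^{2}}{2m} + V(x)
% Canonical quantization replaces the momentum p by the operator -i\hbar\,\partial/\partial x,
% turning H into the Hamiltonian operator of the Schr\"odinger equation:
\hat{H} = -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2}}{\partial x^{2}} + V(\hat{x}),
\qquad
i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\,\psi
```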

Einstein helped to launch quantum mechanics with his explanation of the photoelectric effect in 1905. Despite this pioneering work he turned on the child of his own creation, quantum mechanics, claiming that it was an incomplete theory. Einstein’s objections have given way to the acceptance by the mainstream of the physics community of probability as an intrinsic part of our observation of nature, due to the Heisenberg uncertainty principle. Einstein’s hypothesis that quantum mechanics is an incomplete theory can never be disputed or disproved, according to the Non-Probativity Theorem formulated above. The usefulness of his hypothesis, however, dwindles in the absence of any concrete progress towards a complete non-probabilistic theory of quantum mechanics and atomic systems. And this despite the valiant efforts of David Bohm, Roger Penrose, and others to find the hidden variables or structures that they claim would make quantum mechanics a complete theory. One cannot help but conjecture that perhaps the reason these variables are so well hidden is that they do not exist. But this is only my conjecture and belief and not anything that I could prove.

Einstein, Time magazine’s man of the 20th century, whose name is synonymous with genius, had no problem rejecting one of the elements of his theory of general relativity. He introduced a cosmological constant into his theory in 1917 to describe what he thought at the time was a static universe. When Hubble showed in 1929 that we live in an expanding universe, Einstein immediately dropped this element of his theory. Some contemporary cosmologists have since resurrected it because they find it might serve a useful purpose in their attempts to explain or describe certain specific observations of the cosmos.

Another interesting shift in attitudes within the physics community is illustrated by the recent emergence (pun intended) of chaos theory, complexity, simplicity, plectics, emergence, and self-organized criticality, all of which concern themselves with non-linear dynamic systems. It was once claimed, not very long ago, that the complications that non-linear equations presented were mere details not worthy of attention, since the basic equations of motion, while not soluble in closed form, were at least amenable to numerical analysis if one needed to solve them. In fact Poincaré showed that there is no general closed-form solution to the 3-body problem.

When simple laws govern systems with a large number of variables, the underlying order is obscured by our inability to track every component, and it becomes inaccessible to our limited brainpower. Within the last decade this view of the origin of complexity has been strongly challenged…. At the frontiers of today’s mathematics are startling paradoxes about the way the world can change. In particular, we now know that rigid, pre-determined, simple laws can lead to behavior so irregular that it is to all intents and purposes random (Cohen and Stewart 1994, 20).

With the availability of computers, and especially of microcomputers, which provided researchers with low-cost computing power that allowed them to play, scientists were able to explore and examine the complexity of non-linear dynamical systems and their sensitivity to initial conditions. As a consequence many interesting results were arrived at, and it is now widely recognized that non-linear physics is not a special case or an anomaly of nature but rather the norm that requires detailed attention. The shoe is now on the other foot, and it is realized that it is the dynamical systems that can be described by linear equations that are the anomalies or unusual cases. It was only because they could be described by simple closed-form mathematical equations that they received as much attention as they did.
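
The kind of low-cost numerical play described above is easy to reproduce. The following short sketch (my own illustrative example, not from the original text) iterates the logistic map, one of the simplest non-linear equations, from two initial conditions that differ by one part in a million and prints how quickly the trajectories diverge.

```python
# Illustrative sketch: sensitivity to initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), a standard example of a simple non-linear
# system whose long-term behavior is effectively unpredictable.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from the initial value x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)   # one initial condition
b = logistic_trajectory(0.200001)   # a nearly identical initial condition

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.6f}")

# The difference, initially one part in a million, grows to order one
# within a few dozen iterations.
```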

In light of the Non-Probativity Theorem it is clear that the role of science is to probe and not to prove. It is interesting that the two English words, prove and probe, both derive from the same Latin root, probare, which means to test or prove. The words probability and probable share the same root. This makes Einstein’s rejection of probability in quantum mechanics all the more ironic.

Science, the Language of Metaphor

Science involves the process of representing empirical observations in terms of models, many of which are mathematical. These models, whether or not they are mathematical, are metaphors for and abstractions from nature. The spirit in which scientific models are described as metaphors is the same as that of the proposition that all the words of a spoken language are metaphors. The idea that all communication is based on metaphor “has ancient origins in oral cultures and has been repeated and debated through history” (Gozzi, 2000) by Plato, Vico, Keats, Shelley and many modern linguists. McLuhan (1964) quotes Quintilian: “Nearly everything we say is metaphor.”

Once a scientific model is formulated in terms of some basic axiomatic metaphors, mathematical and/or logical relationships between these metaphors are explored, leading to predictions in the form of new metaphors. The relationships between the axioms and the predicted metaphors have the rigor of a mathematical proof, but the validity of the model is determined by how well the predicted metaphors match the observations of nature. The most one can say, à la Hume, is that the new metaphors, transformed by mathematics from the original axiomatic metaphors of the starting model, make a good match to the observed phenomena of nature. This empirical agreement supports the scientist’s model but does not prove that the model is correct, because one must leave open the possibility that the model can be falsified or refuted, or perhaps just improved.

If, as noted above, all words are metaphors and all scientific models are also metaphors, there is no need to prove that scientific statements are true. One cannot prove a metaphor is true; one can only test whether or not it provides a useful description of nature, one that leads to greater insights, in the case of science to more predictions, or in the case of the arts to deeper insights. It is the natural process of a language to evolve, and the same is true of the meaning of words and metaphors. Words are continually bifurcating, keeping their old meanings and taking on new ones. The new meanings, however, carry with them vestigially some of the structure or meaning of their ancestors, just as animals and plants vestigially retain structures from their ancestors. Scientific theories, which are made up of metaphors, also evolve and bifurcate into new models, which vestigially retain remnants of earlier theories. Relativity and quantum physics still retain much of classical Newtonian physics. Plus ça change, plus c’est la même chose.

All models are abstractions from nature and hence represent a reduced reality. Mathematical transformations of the abstractions or metaphors of a model may further degrade their accuracy and reduce their match with empirical reality.

The role of science is not to prove or even to explain the phenomena of nature but rather to uncover patterns that relate one set of phenomena to another. The mathematicizing of scientific models and metaphors and the process of subjecting them to mathematical operations has proven to be a successful technique in uncovering these patterns especially when predictions are made that can be observed or measured.

The Complementarity of Complexity and Predictability

The assumption that the metaphors contained in the mathematical models used to describe nature can be operated upon using linear mathematical operators to obtain new relationships among the elements of the model, relationships which will then correspond to what is observed in nature, is premised on the notion that the relationship between the elements of the model and the elements of reality is linear. This is an assumption, or basic presupposition, which cannot be proven mathematically; it must be tested empirically and cannot be presumed to be necessarily true.

The effect of a non-linearity between the model and reality can become magnified if the mathematical equations relating the elements of the model are themselves non-linear. A small difference or non-linearity between the mathematical model and the reality being modeled can lead to vastly different outcomes, à la the butterfly effect of Lorenz.
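
The butterfly effect is easy to illustrate numerically. The sketch below (my own illustration, not from the original text; it uses a crude fixed-step Euler integration purely for demonstration) follows two copies of the Lorenz system whose starting points differ by one part in a million and prints their growing separation.

```python
# Illustrative sketch of the Lorenz "butterfly effect": two trajectories
# started from almost identical initial conditions separate rapidly.

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one crude Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

s1 = (1.0, 1.0, 1.0)
s2 = (1.0, 1.0, 1.000001)   # differs by one part in a million

for step in range(1, 6001):
    s1, s2 = lorenz_step(s1), lorenz_step(s2)
    if step % 1000 == 0:
        gap = sum((a - b) ** 2 for a, b in zip(s1, s2)) ** 0.5
        print(f"t = {step * 0.005:5.1f}: separation = {gap:.6f}")

# The separation grows roughly exponentially until the two trajectories
# bear no practical resemblance to one another.
```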

Quantum mechanics and the uncertainty principle have taught us that the process of measuring nature at the atomic scale changes the phenomena we are observing and scrutinizing. Something similar happens with complex processes, which

generate counterintuitive, seemingly acausal behavior that’s full of surprises…. Complexity is an inherently subjective concept; what’s complex depends upon how you look…. Whatever complexity such systems have is a joint property of the system and its interactions with another system, most often an observer and/or controller. (Casti, 269–71)

The modeling of nature using metaphors introduces a new level of uncertainty in matching one’s model with nature especially when one attempts to represent the non-linear phenomena using classical pre-chaotic physics. Paradoxically the introduction of chaos has led to the discovery of new patterns and insights into the nature of non-linear dynamic systems ranging from the behavior of ecosystems to the origin of the universe.

Isaac Newton, Godfrey Kneller, 1689

Within the new physics of chaos or complexity theory, the chaos or uncertainty associated with not being able to make predictions of the behavior of non-linear systems leads, as Prigogine first suggested, to new levels of order. The Heisenberg uncertainty principle in quantum mechanics, which does not allow the simultaneous exact determination of position and momentum, leads to an understanding of the wave nature of particles and the particle behavior of light, and by association to an understanding of the wave behavior of the probability amplitudes needed to describe atomic and sub-atomic particles and make predictions about their behavior. Just as momentum and position (or energy and time) play complementary roles in the Heisenberg uncertainty principle, complexity and predictability seem to play a similar complementary role. Complexity and predictability are hard to quantify in this context, unlike the uncertainties in momentum and position, ∆p and ∆x, in quantum mechanics. But it is the case that one cannot at the same time take into account all of the variability of a non-linear system and still formulate the equations that will predict the behavior of the system.
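
For reference, the quantitative form of the quantum complementarity invoked here consists of the standard textbook relations (my own addition, not from the original text); the analogous trade-off between complexity and predictability, as the text notes, has no comparably sharp numerical bound.

```latex
% Heisenberg uncertainty relations for position/momentum and energy/time:
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\ge\; \frac{\hbar}{2}
```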

Indeed, any theory of complexity must necessarily appear insufficient. The variability precludes the possibility that all detailed observations can be condensed into a small number of mathematical equations, similar to the fundamental laws of physics…. If, following traditional scientific methods, we concentrate on an accurate description of the details, we lose perspective…. Chaos theory tells us that many simple mechanical systems, for example pendulums that are pushed periodically, may show unpredictable behavior. We don’t know exactly where the pendulum will be after a long time, no matter how well we know the equation for its motion and its initial state (Bak, 9–11).

When dealing with non-linear phenomena like the weather, the greater the scope of a model, the more complexity it must embrace, the less predictability it incorporates and hence the greater its chaos. This parallels the Heisenberg uncertainty principle, where the more one knows about the momentum the less one knows about the position, and vice versa. A similar situation holds in dynamic modeling as well. The greater the predictability of a model, the less complex it is and the smaller the number of elements that can be successfully modeled. Consider gravitational systems like the solar system. The two-body problem yields total predictability, as the equations describing the motion can be solved in closed form. With three or more bodies, as the number of bodies increases, the complexity increases and the predictability decreases. Complexity and predictability are complementary in the same sense as momentum and position within the context of the Heisenberg uncertainty principle.
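
To make the two-body contrast concrete, the closed-form result alluded to above is the standard textbook one (my own illustration, not from the original text): the relative orbit of two gravitating bodies is a conic section, whereas for three or more bodies no comparably general closed-form solution exists and the motion must be explored numerically.

```latex
% Closed-form solution of the gravitational two-body problem (relative orbit):
r(\theta) \;=\; \frac{p}{1 + e\cos\theta}
% where p is the semi-latus rectum and e the eccentricity; the orbit is an
% ellipse, parabola or hyperbola depending on the value of e.
```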

The decrease in the predictability of a model of a non-linear dynamical system because of the increase in chaos does not represent a shortcoming of the model but rather an attempt to be complete by including the full complexity of the phenomenon being represented. In the spirit of the Non-Probativity Theorem there is no reason to believe a priori that a model representing nature should be both complete and totally predictable. Gödel’s Theorem can serve as a possible model to better understand the complementarity of complexity and predictability. Gödel’s Theorem states that a formal mathematical system rich enough to contain arithmetic cannot be both complete and logically consistent at the same time. If we think of the predictability of phenomena as a form of logical consistency with the basic laws of nature and consider complexity as a form of completeness, then Gödel’s Theorem also supports the notion that total complexity or completeness of a model precludes complete predictability.

The rejection of chaotics and complexity theory by adherents of the older paradigms of Newtonian physics, relativity and quantum mechanics is due to the fact that the new physics places limitations on the predictability of nature. Einstein critiqued quantum mechanics when he proclaimed, “God does not play dice”. The new physics is even more disturbing to this new generation of skeptics, who have to contend with the notion that not only does God play dice at the atomic and sub-atomic level but he also plays dice at the macro level. Even though the interactions of complex classical systems are causal they are not predictable, because of their complexity and their non-linear dynamics, and therefore they seem as random as quantum effects. As a consequence one must give up on the notion of the prediction of certain phenomena at the macro level, something that not even quantum mechanics required, despite the fact that it made use of probability at the micro level. Equally disturbing to some is the fact that the very existence of human life might also be the result of a random roll of the dice.

The new physics places limitations on the ultimate ability of science to predict certain phenomena critical to human survival, such as the weather and large-scale climatic change, no matter how sophisticated our computational skills become. Buying into the new physics requires accepting the fact that some problems are intractable. This requires a new level of humility on the part of science, which has enjoyed a period of unprecedented success for over 500 years in which it has been able to describe and explain almost every phenomenon it has encountered. Are we willing to sacrifice the sacred cow of predictability and accept a more modest role for ourselves in our quest for understanding our universe? Will we accept a worldview in which chaos and non-predictability are regarded as natural outcomes of the complexity and diversity of our universe, a richness which gives rise to this dilemma? I believe that the next generation of physicists will happily sacrifice this sacred cow and move on to a higher and deeper understanding of nature, in much the same way that the Hebrews gave up the golden calf at Sinai and embraced ethical monotheism, though not without becoming stiff-necked. The only solace that can be offered to those who are disturbed by the lack of predictability of the new physics is that events are still causally connected, but that at the edge of chaos, where self-organized criticality takes place, science will not be able to determine which new form of equilibrium will emerge.

Conclusion

In this chapter we have attempted to show the strengths and limitations of science when regarded as a language with its dual role of communication (description) and information processing (predictability). The Non-Probativity Theorem underscores a long-held belief that scientific truth is not absolute but always subject to further testing. We have tried to link the limitations on predictability within the framework of the new physics of non-linear dynamics with the Heisenberg uncertainty principle and Gödel’s Theorem. We have suggested that the chaos and non-predictability of complexity theory allow a more complete and fuller description of nature.

Acknowledgments: I wish to thank the organizers and participants of the Humanity and the Cosmos Symposium, January 20–22, 2000, sponsored by Brock University and the Brock Philosophical Society, where the ideas for this chapter were incubated. I would also like to thank George Coyne personally for his powerful keynote address, which inspired the ethical dimensions of this chapter. For their insightful presentations and lively discussions I would like to thank individually Hugo Fjelstad Alroe, David Atkinson, Richard Berg, Edmund Bertschinger, Leah Bradshaw, Bruce Buchanan, David Crocock, G.E. Dann, Darren Domski, George Ellis, David Goicoechea, Anoop Gupta, Calvin Hayes, Daniel MacArthur, K. McKay, and M.J. Sinding. I would also like to acknowledge my colleagues on the Media Ecology listserv for a recent stimulating discussion of metaphors: Jim Curtis, Raymond Gozzi, Randolph Lummp, John Maguire, Eric McLuhan, Lori Ramos and Lance Strate. I also wish to acknowledge the help of my former colleague in the Physics Department at the University of Toronto, Prof. Ken McNeill of blessed memory.