The Illusion of Determinism
Abstract
Adequate (or statistical) determinism is an emergent property of a universe that was initially chaotic and remains chaotic at atomic and molecular levels. Consequently, all physical processes are statistical and all knowledge is only probabilistic. Strict determinism is an illusion, a consequence of idealization.
Statistical knowledge always contains errors that are normally distributed according to a universal law that ultimately derives from the discrete quantum nature of matter. The existence of this universal distribution law of errors convinced many scientists and philosophers that the randomness of errors was not real, and that strict deterministic laws would be found to explain all phenomena, including human beings. To the extent that randomness is needed to break the causal chain of strict physical determinism, many philosophers continue to think that free will is the illusion.

The fundamental nature of the universe is discrete (all things are particulate, atoms for example) and chaotic (irreducibly random). In some parts of the universe, however, stable information structures have emerged from a creative process involving indeterministic quantum mechanics (wave-function collapse) and the transport of entropy away from the new structures to empty parts of the expanding universe. This core process of information creation underlies the formation of microscopic objects like atoms and molecules and macroscopic objects like galaxies, stars, and planets.

Physically large objects appear to be continuous and highly deterministic, for example the motion of planets around the sun. Planetary positions are predictable to a very high degree of accuracy, using analytic differential equations of motion. But this apparently perfect determinism is an idealization, an abstraction from reality, which is only statistically deterministic because the indeterministic influences of random quantum events are averaged over.

When small numbers of atoms and molecules interact, their motions and behaviors are indeterministic, governed by the rules of quantum mechanics. Werner Heisenberg's principle of indeterminacy (mistakenly called "uncertainty," as if the problem were epistemic/subjective rather than ontological/objective) gives us the minimum error in simultaneous measurements of position x and momentum p,
Δp Δx ≥ h,
where h is Planck's constant of action. To see how "adequate" determinism emerges for large numbers of particles, note that the momentum p = mv, the product of mass and velocity, so we can write the indeterminacy principle in terms of velocities and positions as
Δv Δx ≥ h / m.
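To make this scaling concrete, here is a minimal numeric sketch (the chosen masses are illustrative values, not taken from the text) evaluating the lower bound h/m for bodies of very different mass:

```python
H = 6.626e-34  # Planck's constant h, in joule-seconds

masses_kg = {
    "electron": 9.109e-31,
    "dust grain": 1e-15,
    "baseball": 0.145,
    "Earth": 5.972e24,
}

for name, m in masses_kg.items():
    # Minimum product of velocity and position indeterminacies, h / m
    print(f"{name:>10}: dv*dx >= {H / m:.2e} m^2/s")
```

For the electron the bound is significant on laboratory scales, while for planetary masses it is tens of orders of magnitude smaller, far below any possible measurement.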
When large numbers of microscopic particles get together in massive aggregates, the mass m in the denominator becomes so large that the product Δv Δx of indeterminacies is vanishingly small. Determinism is thus an emergent property.

The "laws of nature," such as Newton's laws of motion, are all statistical in nature. They "emerge" when large numbers of atoms or molecules get together. For large enough numbers, the probabilistic laws of nature approach practical certainty. But the fundamental indeterminism of the component atoms never completely disappears, and some microscopic randomness may be amplified to appear in the macroscopic world.

Isaac Newton's laws of motion perfectly explain Kepler's observation that a planet moves in an ellipse around the sun. But this result depends on treating the sun and planet as point masses and ignoring the other planets. If they are included, classical mechanics becomes only approximate and the determinism only adequate (albeit accurate to many significant figures). In addition, measurements of planetary position and motion (indeed, all experimental measurements) are only approximate because of observational errors. These errors are a combination of human ignorance (our minds and instruments contain limited information and lack perfect precision) and fundamental randomness, whose source is quantum indeterminacy.

At the other extreme of physically small objects, James Clerk Maxwell and Ludwig Boltzmann successfully described the motions of atoms or molecules in an "ideal gas" by ignoring the details of their interactions (collisions with one another and with the walls of their container). Maxwell and Boltzmann knew very well that their results were only approximate and statistical. But their statistical mechanics provided a quantitative physical explanation for macroscopic thermodynamic observables that seemed to confirm the deterministic nature of physics.

The astounding success of deterministic mechanical theories describing the largest and the smallest objects in the universe appeared to late nineteenth-century scientists and philosophers to confirm physical determinism, and by association the many other forms of determinism. But quantum chaos is the fundamental condition of the early universe and of the present microcosmos. How can we reconcile this fundamental and irreducible randomness and disorder with the appearance of cosmic order, including life and intelligence?

The "adequate" determinism of macroscopic structures is simply a consequence of the very large number of quantum particles involved. Since the fundamental particles follow the laws of quantum mechanics, their macroscopic behavior approaches classical mechanics in the limit of large quantum numbers (the Bohr correspondence principle) and as a consequence of the law of large numbers for objects made up of vast numbers of quantum particles (see the numerical sketch below).

We are quite fortunate that the determinism we have, while not the strict, necessary, logical determinism that scientists and philosophers once assumed, is adequate enough to provide us with a highly predictable and orderly world. Information structures can be stable over time scales of the same order as the age of the universe; parts of DNA have not changed in 2.8 billion years. Living systems have learned to manage the underlying chaos, with sophisticated error detection and correction mechanisms. Far from being the problem that many philosophers think it is, randomness is used by living systems to escape the trap of determinism and to provide the alternative possibilities needed for freedom of action and creativity.
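The law-of-large-numbers side of this argument can be simulated directly. The following is a rough sketch, assuming each microscopic contribution is an independent uniform random variable (an assumption of this illustration, not a claim from the text): the relative fluctuation of the aggregate shrinks roughly as 1/√N, which is the statistical sense in which macroscopic determinism is "adequate."

```python
import math
import random

def relative_fluctuation(n_particles, n_trials=200):
    """Standard deviation of an aggregate quantity, relative to its mean,
    when the aggregate sums n_particles independent random contributions."""
    totals = [sum(random.random() for _ in range(n_particles))
              for _ in range(n_trials)]
    mean = sum(totals) / n_trials
    variance = sum((t - mean) ** 2 for t in totals) / n_trials
    return math.sqrt(variance) / mean

for n in (10, 1000, 100_000):
    print(f"N = {n:>6}: relative fluctuation ~ {relative_fluctuation(n):.1e}")
```

Each thousandfold increase in N reduces the relative fluctuation by roughly a factor of thirty, so for the ~10^23 particles in a macroscopic object the residual indeterminism is utterly negligible in practice, yet never exactly zero.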
The Calculus of Probabilities
Many ancient and modern philosophers rejected chance and randomness as unintelligible ideas. Chance was used to describe situations in which humans simply lacked knowledge of what exactly was going on. Randomness was regarded as an epistemological problem, not a metaphysical or ontological reality.
The limits on knowledge were considered to be a problem only for humans. Theologians were confident that God could know details of which humans were ignorant, and metaphysical chance was regarded as atheistic. Gottfried Leibniz and Pierre-Simon Laplace postulated a super-intelligence that could know the positions and velocities of all the particles in the universe and thus know the complete future. Laplace and contemporary mathematicians were convinced of the deterministic nature of the universe by their discovery of the underlying distribution law that governs chance events: the law of errors (Legendre, Gauss), or normal distribution. Laplace named his theory about random events the "calculus of probabilities" to lend respectability to a subject that had originated in illicit games of chance. Calculating a priori probabilities is a means of justifying degrees of belief, the fundamental basis for epistemology. Admitting the reality of metaphysical chance in the world helps, a posteriori, to explain events that do not agree with our expectations.
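The emergence of this law of errors can be demonstrated numerically. The sketch below is illustrative only (the number of error sources and the sample size are arbitrary assumptions): when each measurement error is the sum of many small independent disturbances, the errors distribute themselves along the familiar Gaussian bell curve.

```python
import random

def measurement_errors(n_causes=50, n_samples=10_000):
    """Each simulated error is the sum of many small independent causes."""
    return [sum(random.uniform(-1, 1) for _ in range(n_causes))
            for _ in range(n_samples)]

# Crude text histogram: the bin counts trace out the bell curve.
samples = measurement_errors()
for lo in range(-12, 12, 3):
    count = sum(lo <= s < lo + 3 for s in samples)
    print(f"[{lo:+3d}, {lo + 3:+3d}): {'#' * (count // 200)}")
```

This is the central limit theorem at work: the Gaussian shape is universal precisely because it does not depend on the details of the individual random causes.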
Epistemology, the study of what we know, is fundamentally probabilistic: theories are probable. Ontology, the study of what exists, is fundamentally statistical: experiments are statistical. Knowledge is (subjective) information in our minds about (objective) information in external things.
Chaos Theory
"Chaos theory" is a deterministic mathematical formalism that describes the dynamics of physical systems near singular points in their motions where infinitesimal differences in position or velocity lead to exponentially large differences at later times. It does not involve quantum uncertainty, simply extreme sensitivity to initial conditions.
Chaos theorists are determinists who think that chaotic behavior is only apparently random. Most complexity theories are also deterministic.
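As a minimal illustration, consider the logistic map, a standard textbook example of deterministic chaos (the map and the parameter r = 4 are chosen here for illustration; they are not named in the text): every step is strictly determined, yet two trajectories that begin one part in a billion apart diverge exponentially until they are unrelated.

```python
def logistic(x, r=4.0):
    """One fully determined step of the logistic map."""
    return r * x * (1.0 - x)

# Two trajectories whose initial conditions differ by one part in a billion.
x1, x2 = 0.400000000, 0.400000001
for step in range(0, 61, 10):
    print(f"step {step:2d}: |x1 - x2| = {abs(x1 - x2):.3e}")
    for _ in range(10):
        x1, x2 = logistic(x1), logistic(x2)
```

The separation grows by roughly three orders of magnitude every ten steps, which is why such systems are unpredictable in practice even though no randomness is involved.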
Free Will
In the 1870s Maxwell noted the occurrence of singular points in hydrodynamical flows and argued that something like them in the mind might allow living creatures to escape from strict determinism.
After the discovery of quantum uncertainty, some scientists (Arthur Stanley Eddington, Arthur Holly Compton, John Eccles, Henry Margenau) proposed quantum randomness as the source of free will. But they all admitted failure if chance was the direct cause of our actions. Free will is a two-stage process: "free" (the random generation of alternative possibilities) followed by "will" (the adequately determined selection of the best action).
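As a hypothetical illustration of this two-stage structure (the candidate generator and the value function below are invented stand-ins, not a model endorsed by the text), chance supplies the alternatives and an adequately determined evaluation makes the selection:

```python
import random

def generate_alternatives(n=5):
    """Stage 1, "free": chance generates alternative possibilities."""
    return [random.uniform(-1.0, 1.0) for _ in range(n)]

def value(action, goal=0.3):
    """Stage 2, "will": a deterministic value function standing in for
    the agent's reasons, motives, and character."""
    return -abs(action - goal)

def decide():
    candidates = generate_alternatives()
    # Selection is adequately determined: given the same candidates and
    # the same value function, the same action is always chosen.
    return max(candidates, key=value)

print(f"chosen action: {decide():+.3f}")
```

The point of the separation is that randomness enters only in generating options, never in the choice itself, so chance is not the direct cause of the action.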