
Paradigm shifts

From EvoWiki




This article gets its name from the thesis of Thomas Kuhn's famous work, The Structure of Scientific Revolutions. That book proposes that the progress of science divides into two modes: "normal science", which is work done inside of a "paradigm", and "scientific revolutions" or "paradigm shifts", in which one paradigm is replaced by another.

However, I (LP) believe that true Kuhnian paradigm shifts are relatively rare, and that new theories tend to subsume or incorporate previous theories rather than replace them -- even when they contain significant departures from the old theories. Thus, if one believes that all of the Earth's surface is dry land, one will be forced to conclude otherwise after running into a body of water. However, the dry-land parts stay dry land.

But first a note on cutting-edge science. It often features the development of conflicting paradigms, with the conflicts being resolved in various ways as work continues. Sometimes one paradigm is right, sometimes different paradigms are right in different circumstances, and sometimes different paradigms are pieces of some larger paradigm. A good example of the latter is the story of the six blind men and the elephant, each one of whom identified a part of that elephant as something else:

Trunk -- snake
Tusk -- spear
Ear -- leaf
Side -- wall
Leg -- tree
Tail -- rope

However, cutting-edge science is work in fields where there are usually no pre-existing paradigms, so despite its paradigm conflicts, it usually does not qualify as having paradigm shifts.

I now turn to some examples of my thesis; I will be including a fair amount of detail as illustration. This detail is not intended to be absolutely comprehensive, since that would increase the volume of this page while obscuring its overall thesis. Thus, it may give the impression that cutting-edge science is relatively linear and straightforward, when it is usually very messy, twisting, and nonlinear. But that false impression is something like how a large-scale map makes coastlines look straighter than a small-scale map does, due to omission of too-small detail, and I believe that it is tolerable for the purpose of illustrating overall trends.

The Elements


The constituents of matter have been wondered about for centuries. The ancient Greeks concluded that there are four elements: earth, water, air, and fire, and that scheme was accepted for well over a millennium. Gold was often thought to be the perfect mixture of those elements, and manufacturing it, or at least seeming to manufacture it, was a major preoccupation of the alchemists. However, by early modern times, it was apparent that the manufacture of gold was a wild goose chase, mixed up with mystical quests and tainted with fraudulent claims, and Robert Boyle renamed the serious science in alchemy "chemistry".

In the course of their experiments, chemists were forced to develop a radically different conception of elements; by comparison, the Greek ones more plausibly resemble states of matter (solid, liquid, gas, and ionized gas (plasma)). In the late eighteenth century, French chemist Antoine Laurent Lavoisier summed up the new conception and proposed a new list of the chemical elements[1], many of which are still recognized as elements or oxides of elements. The main exceptions are light and "caloric" (heat as a fluid).

By the end of the eighteenth century, French chemist Joseph Proust distinguished between different ways that elements can combine, stating that the formation of "true" compounds follows a "law of definite proportions", while the formation of mixtures does not. Thus, burning sugar will consume a quantity of oxygen proportional to its mass, while sugar can be dissolved in water in any proportion less than some critical one where no more can dissolve.
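The law of definite proportions can be illustrated with a small numerical sketch. The combustion of sucrose (C12H22O11 + 12 O2 -> 12 CO2 + 11 H2O) fixes the oxygen-to-sugar mass ratio; the molar masses below are approximate values used as assumptions for illustration.

```python
# Law of definite proportions: a true compound forms from its elements
# in a fixed mass ratio, while a mixture can form in almost any ratio.
# Combustion of sucrose: C12H22O11 + 12 O2 -> 12 CO2 + 11 H2O
M_SUCROSE = 342.3  # approximate molar mass of sucrose, g/mol
M_O2 = 32.0        # approximate molar mass of O2, g/mol

def oxygen_consumed(sucrose_grams):
    """Mass of O2 consumed in completely burning the given mass of sucrose."""
    moles_sucrose = sucrose_grams / M_SUCROSE
    return moles_sucrose * 12 * M_O2  # fixed 12:1 molar ratio

# Doubling the sugar exactly doubles the oxygen: a definite proportion.
assert abs(oxygen_consumed(20.0) - 2 * oxygen_consumed(10.0)) < 1e-9
```

By contrast, no such equation constrains how much sugar dissolves in a given mass of water, short of saturation.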

At the beginning of the nineteenth century, British schoolteacher John Dalton proposed the first rigorous version of atomic theory as a result of his work on gases, eventually discovering that it admirably explained the Law of Definite Proportions. Elements come in atoms, which combine in well-defined proportions to form molecules. He also came up with symbols for the elements that were circles with various marks in them. But Swedish chemist Jöns Jakob Berzelius came up with alphabetic abbreviations, which proved easier to typeset, and which are still used today.

Once atomic theory was accepted, it was soon noticed that each element's atoms had some preferred numbers of other atoms to combine with, something that led to "valence theory". Hydrogen has one bond and oxygen has two, thus, water is H-O-H and oxygen gas is O=O (two bonds at the same time). There are various complications, but the idea is fundamentally sound.
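Valence bookkeeping can be sketched in a few lines: each bond uses up one valence from each atom it joins (a double bond uses two), and in a simple molecule every atom's valences must be exactly consumed. The valence values and the molecule encoding below are illustrative assumptions.

```python
# Minimal valence check: bonds are (atom, atom, bond_order) tuples,
# where an atom is identified as (element_symbol, index).
VALENCE = {"H": 1, "O": 2, "C": 4, "N": 3}  # classic valences (illustrative)

def valences_satisfied(bonds):
    """True if every atom's bonds add up to exactly its valence."""
    used = {}
    for a, b, order in bonds:
        used[a] = used.get(a, 0) + order
        used[b] = used.get(b, 0) + order
    return all(used[atom] == VALENCE[atom[0]] for atom in used)

# Water, H-O-H: two single bonds to the one oxygen.
water = [(("H", 1), ("O", 1), 1), (("H", 2), ("O", 1), 1)]
# Oxygen gas, O=O: one double bond.
oxygen = [(("O", 1), ("O", 2), 2)]
assert valences_satisfied(water) and valences_satisfied(oxygen)
```

The real-world complications mentioned above (fractional bond orders, variable valences) are exactly what this toy model leaves out.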

And how might elements be interrelated? Some sets of similar elements were known, like copper, silver, and gold, but were there any overall patterns? Chemists were groping toward an overall arrangement, but in the 1860's, Russian chemist Dmitri Mendeleev (or Mendeleyev) beat them to the punch, proposing the "periodic table of elements", complete with some spaces for undiscovered elements. When they were discovered, their properties were close to Mendeleev's predictions.

And what was behind all of this? Electrons were discovered in the late nineteenth century, and quantum mechanics and atomic nuclei in the early twentieth century -- and atoms were revealed to be composite entities, making the term a misnomer. By the mid-twentieth century, the qualitative properties of atoms could be successfully accounted for with the quantum mechanics of electrons orbiting nuclei, though detailed quantitative predictions would often require computing power that would only become available later.

Nuclei were discovered by using alpha-emitting radioactive nuclei as particle accelerators and gold foil as the target. In 1909, British physicist Ernest Rutherford discovered that some of the alphas came off at wide angles, or even bounced back toward the alpha source. This indicated that atoms contain massive charged objects much smaller than the atoms themselves -- nuclei.

However, it was quickly discovered that there were more kinds of nuclei than kinds of atoms (chemical elements). British physicist J. J. Thomson discovered in 1912 that the gas neon was composed of two isotopes: Ne-20 and Ne-22, and this discovery was repeated for many other elements. But order was restored when British physicist James Chadwick discovered the neutron in 1932, confirming a couple decades of speculation about "neutral protons" in nuclei.
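Isotopes also explained why measured atomic weights are often non-integer: the measured value is the abundance-weighted mean of the isotope masses. The rounded abundances below are illustrative values for neon (the rare third isotope, Ne-21, is ignored).

```python
# An element's atomic weight is the abundance-weighted mean of its
# isotopes' masses (mass numbers used here as approximate masses).
neon_isotopes = [(20, 0.905), (22, 0.095)]  # (mass number, abundance), rounded

atomic_weight = sum(mass * frac for mass, frac in neon_isotopes)
print(round(atomic_weight, 2))  # close to neon's measured ~20.2
```

So neon's fractional atomic weight of about 20.2 is simply an average over whole-number isotopes.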

By the 1950's, a veritable zoo of elementary particles had been discovered, with physicist Enrico Fermi stating in exasperation: "If I could remember the names of all these particles, I'd be a botanist." However, many of them, the strongly-interacting "hadrons" like the nucleons (proton and neutron), had an order in them that was interpreted as the presence of something that got named "quarks".

What were quarks? In the 1960's and 1970's, high-speed collisions with nucleons revealed that there are electronlike elementary particles inside of them ("partons"), repeating the history of the discovery of atomic nuclei. And these "partons" turned out to have the properties expected of quarks, like their electric charge, weak interactions, and spin. As one nuclear physicist joked about particle physicists: "They've turned us all into chemists!"

What lies beyond the well-tested and currently-accepted "Standard Model" of particle physics? Nobody knows for sure, but if history is any guide, the Standard Model will be some low-interaction-energy approximation of some "better" theory. Even so, there are several hints that it is incomplete.

A variety of more-complete theories are being worked on, from supersymmetric extensions of the Standard Model to Grand Unified Theories (GUT's) to superstring-based theories, but predicting the Standard Model is an important goal of all of them.

But note that there is only one theory invalidated by later theories: the four-elements theory. Every other one is subsumed by later ones in some way.

The Laws of Physics


Before Isaac Newton's great synthesis, theories of mechanics were mostly either empirical results or crude hand-waving. Aristotle had believed that terrestrial objects have "natural" places, which they try to move to, and that when they reach such places or cannot move to them, they then try to come to a stop. He also believed that celestial objects' preferred motion is in circles around the Earth.

Galileo, however, came to some different conclusions. He concluded from his experiments that an object's unperturbed motion is at a constant velocity in a straight line, and that motion was relative. He proposed a thought experiment to illustrate his principle of inertia: if you were inside a ship that was steadily moving, could you tell that it was moving without looking outside of it? This was followed in the mid-1600's by such advances as the discovery of the law of conservation of momentum by John Wallis and others.

Building on this work, Isaac Newton assembled what was known of mechanics into his three laws of motion, and he showed that celestial objects also obeyed them, while interacting by an inverse-square force of gravity. These included comets, which had been widely feared as something evil -- but which were shown to follow predictable paths. When it could be tested, it was an enormous success, and the celestial realm offered some of the highest-precision tests of all.

Newtonian mechanics was an enormous success, but some difficulties started appearing in the latter part of the nineteenth century. James Clerk Maxwell had unified electromagnetism, and showed that electric and magnetic fields in a vacuum can have waves -- waves predicted to travel at c, the speed of light in a vacuum. This, and various experiments on light, showed that light is an electromagnetic wave.

But there was one catch: this is contrary to Newtonian mechanics, which holds that there is no special speed that an object or a wave must always be observed to travel at. This stimulated experiments like the Michelson-Morley experiment, but every such experiment supported Maxwell over Newton wherever the two theories' predictions could be distinguished. Physicists groped toward solutions, but in 1905, a young patent clerk named Albert Einstein saw what they were groping toward, and came up with a revision of Newtonian mechanics in which an object traveling at c will always be observed to have that speed, no matter how the observer moves (by "observer" is meant any physical process that can do some equivalent of taking measurements; no mind or consciousness is implied). Special relativity was quickly accepted, as it explained why electrons' inertia increased as they traveled close to c, as was being observed at the time.
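The invariance of c can be sketched with the relativistic velocity-addition formula, u' = (u + v) / (1 + uv/c^2), here in natural units where c = 1. This is a standard special-relativity result, not anything specific to the text above.

```python
# Relativistic velocity addition: combining velocities never exceeds c.
def add_velocities(u, v, c=1.0):
    return (u + v) / (1 + u * v / c**2)

# Chasing a light beam at 99% of c still leaves it moving at c:
assert add_velocities(1.0, 0.99) == 1.0
# At everyday speeds the formula reduces to the Newtonian sum:
print(add_velocities(1e-7, 2e-7))  # almost exactly 3e-7
```

The second line illustrates why Newtonian mechanics survives as the low-speed limit rather than being discarded outright.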

Einstein then took on gravity, working out general relativity, a theory in which space-time is curved by an amount controlled by its matter content. Like special relativity, this theory was constructed so as to have the "correct" Newtonian approximation. And it has passed some stringent observational and experimental tests[2], with surviving alternatives having "fudge factors" that can make them very close to GR.

Likewise, quantum mechanics was developed in response to various paradoxes of Newtonian mechanics, like why electrons do not spiral into atomic nuclei and why a thermally-glowing object does not glow across the entire electromagnetic spectrum. In 1900, Max Planck proposed a bizarre and ad-hoc idea: that light energy comes in "quanta", with the energy of each quantum proportional to the frequency. This produced a high-frequency cutoff with the right shape. Soon afterwards, Albert Einstein showed how this "quantum" idea explains a puzzling feature of the photoelectric effect: the energy of electrons ejected from metals by light is proportional to the frequency minus some cutoff frequency, and not to the intensity of the light.
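Einstein's photoelectric relation, E = h*nu - W (with W the metal's work function and nu0 = W/h the cutoff frequency), can be sketched numerically. The work function used below is an illustrative assumption of roughly 2 eV.

```python
# Photoelectric effect: ejected-electron energy depends on the light's
# frequency, not its intensity. E = h*nu - W, clamped at zero below cutoff.
H = 6.626e-34  # Planck's constant, J*s (approximate)

def electron_energy(frequency_hz, work_function_j):
    return max(0.0, H * frequency_hz - work_function_j)

W = 3.2e-19  # assumed work function, about 2 eV
assert electron_energy(1e14, W) == 0.0  # below cutoff: no electrons at all
assert electron_energy(1e15, W) > 0.0   # above cutoff: electrons ejected
# Doubling the intensity changes neither result; raising the frequency does.
```

This frequency dependence, with no intensity dependence, was the puzzling observation that Planck's quanta explained.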

Quantum mechanics was then applied to atoms in an initially kludgy "old quantum theory" fashion, but one that reached mathematical elegance in the formulations of Erwin Schrödinger and Werner Heisenberg in the mid-1920's. Elementary entities have interrelated particle and wave properties, both of which can be directly observed in many cases. This not only explained why electrons do not spiral into atomic nuclei, but also explained a variety of details of atomic structure.

And how were quantum mechanics and special relativity related? The answer was relativistic quantum field theory (QFT), a paradigm that has allowed the successful construction of the elementary-particle theories of the Standard Model of particle physics, like Quantum Electrodynamics, the quantum theory of the electromagnetic field and its interactions. One very big success has been the prediction that an electron has a "magnetic moment" (strength of its magnetic field) about one part per thousand larger than one would expect from the simple "Dirac" QFT theory of the electron. Adding QFT effects arising from various self-interactions correctly predicts this "anomalous magnetic moment" to a few parts per billion.
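The leading QFT self-interaction correction to the electron's magnetic moment is Schwinger's one-loop term, a = alpha/(2*pi), a well-known textbook result. Even this first term lands within roughly 0.2% of the measured anomaly (~0.00115965); the parts-per-billion agreement quoted above requires the higher-loop terms.

```python
# Schwinger's leading-order anomalous magnetic moment of the electron.
import math

ALPHA = 1 / 137.035999  # fine-structure constant (approximate)
a_leading = ALPHA / (2 * math.pi)
print(a_leading)  # ~0.0011614, vs. the measured ~0.00115965
```

The tiny remaining gap is precisely what the "various self-interactions" at higher orders account for.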

Despite the successes of the "Standard Model" of nongravitational particle physics, reconciling general relativity and quantum mechanics has been much more difficult. A straightforward QFT treatment of general relativity produces some very troublesome divergences; such difficulties have spurred investigation of theories like "superstring theory" that can incorporate general relativity without much trouble. However, one is saddled with the task of deriving the Standard Model from such theories, which can be a far-from-trivial task.

Looking back, the big break has been between Newtonian and previous physics. Later theories like relativity and quantum mechanics have Newtonian mechanics as a limiting case, so the relationship is more subsumption than replacement.



Cosmology


Going from a flat to a round Earth was perhaps the biggest paradigm shift. Flat-earthism was an essentially universal belief before the last few millennia, and it is not surprising that the Bible contains some flat-earthism[3]. Flat-earthism usually required the celestial bodies to travel either along the Earth's rim or through the underworld from their setting places to their rising places. The Earth's rim was favored by the authors of the non-canonical book 1 Enoch, which clarifies much of the Bible's cosmology, while the underworld was favored by the ancient Egyptians, who created some mythology around the Sun's travels there.

Round-earthism was a big paradigm shift, because it required that the inhabitants on the opposite side be upside-down; Church Father Lucius Caecilius Firmianus Lactantius found the notion absolutely absurd[4]:

How is it with those who imagine that there are antipodes opposite to our footsteps? Do they say anything to the purpose? Or is there any one so senseless as to believe that there are men whose footsteps are higher than their heads? or that the things which with us are in a recumbent position, with them hang in an inverted direction? that the crops and trees grow downwards? that the rains, and snow, and hail fall upwards to the earth? And does any one wonder that hanging gardens are mentioned among the seven wonders of the world, when philosophers make hanging fields, and seas, and cities, and mountains? (Divine Institutes 3:24)

In fairness, the Church never officially supported flat-earthism, though both Copernicus and Galileo used Lactantius's statement as an example of how theologians ought not to pontificate about subjects that they do not understand very well.

Returning to our main subject, round-earthism also required that the Earth be surrounded by the celestial realm; the celestial bodies would be its full-time residents. In the European Middle Ages, a common view was that the Earth was a degraded location surrounded by the exalted celestial realm.

Heliocentrism was another paradigm shift; the Earth no longer had a special location but was essentially yet another inhabitant of the celestial realm. From the medieval point of view, the degraded Earth was now befouling the exalted heavens. This conclusion was strengthened by the success of Newtonian mechanics, which showed that the Solar System, at least, followed the same laws of physics as the Earth and its inhabitants.

Starting with Galileo, telescopic observations of Solar System objects also pointed in that direction. The Moon's splotches could be accounted for as contamination from the earthly realm, but the Moon's mountains make it look very Earthlike. Jupiter, and then Saturn, were discovered to have objects that travel around them. The Sun was discovered to have spots. The celestial realm was no longer anything special. Sending out spacecraft only added to this conclusion; some astronauts landed on the Moon's surface and returned in good health and with lots of Moon rocks, and Mars's surface looks like some Earth deserts.

Looking outside the Solar System, when their distances were measured, the stars turned out to be Sunlike objects, with the Sun being a very ordinary kind of star. Their spectra revealed that they are composed of the familiar chemical elements, and stellar structure and evolution calculations have good agreement with observations of them.

The Milky Way turned out to be an enormous assemblage of stars, shaped like a disk with a bulge in the middle.

And while many nebulae proved to deserve their name, being big clouds of interstellar dust and gases, many others turned out to be distant Milky-Way-like objects or galaxies.

Most of them are moving away from our Galaxy; the farther they are, the faster they move -- in an approximately linear relationship (Hubble's law).
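Hubble's law is the linear relation v = H0 * d. A minimal sketch, using an approximate modern value of the Hubble constant as an assumption:

```python
# Hubble's law: recession velocity grows linearly with distance.
H0 = 70.0  # Hubble constant, km/s per megaparsec (approximate)

def recession_velocity(distance_mpc):
    """Recession velocity in km/s for a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

print(recession_velocity(100))  # a galaxy 100 Mpc away recedes at ~7000 km/s
```

The linearity is what makes the relation naturally interpretable as a uniform expansion of space.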

Many galaxies and clusters of galaxies also have more mass than can be accounted for in their luminous and otherwise-visible parts; if they did not have such mass, they would quickly fly apart on cosmological timescales. The nature of "dark matter" continues to be a mystery, but there are plausible elementary-particle candidates for it.

The Universe's expansion suggests that the Universe was once much more condensed and heated than it is today; in other words, that it originated in a Big Bang. The abundances of the lightest nuclei are well accounted for by Big Bang nucleosynthesis, and the Universe's large-scale irregularities provide tantalizing hints as to what the very early Big Bang was like.

So while cosmology has had some serious paradigm shifts with round-earthism and heliocentrism, it has been mostly discovery rather than true paradigm shift since early modern times.


Genetics


Turning to a more biology-related topic, I discuss the development of genetics.


It has been noticed for centuries that organisms tend to produce organisms much like them; the Bible mentions animals breeding after their kinds. But beyond that was such folklore as "maternal impressions", like the "genetic engineering" that Jacob performed on Laban's livestock in Genesis 30. Laban's livestock were solid-colored, but Jacob agreed to take care of them on the condition that he could have any spotted and streaked ones. So Jacob painted some stripes on some sticks and showed them to Laban's livestock as they were mating. Their offspring were spotted and streaked, and Jacob got himself a nice herd.

The only improvement on such ideas came with Gregor Mendel's crossings of pea plants, where he concluded the existence of genes for various pea-plant features. However, his work was not appreciated until his laws of heredity were rediscovered at the beginning of the twentieth century by Hugo De Vries, Carl Correns, and Erich Tschermak.

This work was improved on by Thomas H. Morgan, who in 1908 started crossbreeding fruit flies, using them as a model system. He showed that their genes reside in well-defined places on their chromosomes, a result that was confirmed for other species. This is not an absolute rule, as later work has shown; a tiny fraction of genes are relatively mobile.

But what was the carrier of heredity? Oswald Avery used a quirk of Pneumococcus bacteria to find out. The smooth-surfaced or "S" strain was lethal to mice, while the rough-surfaced or "R" strain was not. And even heat-killed S bacteria had the ability to turn R bacteria into S bacteria, lethality and all. But what was the "transforming factor" that would turn an R bacterium into an S one? Avery did a series of experiments showing that it was not protein or carbohydrate or lipid or RNA -- it was DNA, a result that he announced in 1944. This result was supported by the discovery that the infectious part of a bacteriophage (bacterium-infecting virus) was its DNA and not its proteins. It has since been abundantly confirmed, with the only known exceptions being RNA viruses -- and RNA is a close chemical relative of DNA.

It was the discovery that heredity is carried by DNA that induced James Watson and Francis Crick to find the structure of the DNA molecule. The most important feature of that structure was that it made possible a simple way of copying DNA molecules -- something absolutely necessary for carriers of heredity. A strand of DNA can serve as a template for assembling a complementary strand, and the complement of the complement of a strand is an identical copy. This idea, coyly expressed in their 1953 paper, was abundantly confirmed by later work.
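The copying scheme above can be sketched in a few lines: each base pairs with its Watson-Crick partner, so templating a strand twice reproduces the original (strand orientation is ignored in this simplified sketch).

```python
# Watson-Crick base pairing: A-T and G-C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand):
    """The complementary strand templated from the given one."""
    return "".join(PAIR[base] for base in strand)

original = "ATGCCGTA"  # an arbitrary illustrative sequence
copy = complement(complement(original))
assert copy == original  # two rounds of templating yield an exact copy
```

That the complement of the complement is the original is the "simple way of copying" that the structure made possible.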

But how are "useful" molecules assembled from DNA? Some of them, like transfer RNA and ribosomal RNA, are direct copies of the appropriate regions of the organism's (DNA) genome. Proteins require an extra layer of translation, from the original DNA to messenger RNA to amino-acid strands, and many organisms have additional complications like "introns", stretches of genetic material that are spliced out of messenger RNA before translation.
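The DNA-to-mRNA-to-protein pipeline can be sketched with a toy transcriber and translator. Only four codons of the standard genetic code are included, and transcription is simplified to copying the coding strand with T replaced by U.

```python
# A tiny fragment of the standard genetic code (illustrative subset).
CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

def transcribe(dna):
    """Simplified transcription: coding-strand DNA -> mRNA (T becomes U)."""
    return dna.replace("T", "U")

def translate(mrna):
    """Read the mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE[mrna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate(transcribe("ATGTTTGGCTAA")))  # -> ['Met', 'Phe', 'Gly']
```

Introns would be an extra step between these two functions: splicing stretches out of the mRNA before translation begins.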

An organism's genome almost always includes non-copied parts; some of them are known to be involved in regulating "gene expression", the act of copying RNA from the genome DNA. François Jacob and Jacques Monod made the first discovery of a regulatory region in the 1960's; it controls expression of the genes in the lac operon or gene group of Escherichia coli, which are for digesting lactose. Since then, numerous other regulatory regions have been discovered; gene regulation is sometimes very complicated.


Conclusion


Some creationists and "Intelligent Design" advocates argue that "scientists have been wrong in the past", with the implication that evolutionary biology will be overthrown the way that other theories have been. Some have even been known to call for a Kuhnian paradigm shift. But over the last few centuries, old theories have generally not been overthrown but subsumed by new theories, as I have described in detail here.

This means that a new theory must somehow subsume evolutionary biology, either as a special case or some sort of emergent order, and creationists have not come close to coming up with such a theory.
