Is modern science socially constructed and forever tentative?
Updated 7 April 2024 (c) 2024

Introduction

Writers from the discipline known variously as “postmodern science studies” or “sociology of scientific knowledge” are often cited in discussions of science, philosophy and religion. Two philosophers of science on whose work this literature draws heavily, namely Karl Popper and Thomas Kuhn, have had significant impact on the practice of scientific research.

Issues such as ensuring proper credit for the scientific contributions of non-Western societies, such as the ancient mathematics of China, India and the Middle East, as well as dealing with the chronic under-representation of women, racial minorities and indigenous people in scientific research, are certainly worth additional attention.

However, other sectors of the “postmodern” literature are very problematic, as we will see below.

Karl Popper

Karl Popper, an Austrian-British philosopher of science, was struck by the differences in approach that he perceived at the time between the writings of some popular Freudians and Marxists, who saw “verifications” of their theories in every news report and clinical visit, and the writings of Albert Einstein, who, for instance, acknowledged that if the predicted red shift of spectral lines due to gravitation were not observed, then his general theory of relativity would be untenable. Popper was convinced that falsifiability was the key distinguishing factor, a view he presented in his oft-cited book The Logic of Scientific Discovery [Popper1959, pg. 40-41]:

I shall certainly admit a system as empirical or scientific only if it is capable of being tested by experience. These considerations suggest that not the verifiability but the falsifiability of a system is to be taken as the criterion of demarcation. … It must be possible for an empirical scientific system to be refuted by experience.

Popper’s ideas remain highly influential in scientific research even to the present day. As a single example, several prominent researchers have recently expressed concern about whether it is prudent to continue pursuing string theory, currently a leading candidate for a “theory of everything” in physics, given that string theorists have not yet been able to derive empirically testable consequences even after 25 years of effort. Physicist Lee Smolin, for example, writes, “A scientific theory that makes no predictions and therefore is not subject to experiment can never fail, but such a theory can never succeed either, as long as science stands for knowledge gained from rational argument borne out by evidence.” [Smolin2006, pg. 352].

However, Popper’s ideas do have some limitations, some of which were pointed out by Popper himself. To begin with, in most real modern-day scientific research, major theories are seldom falsified by a single experimental result. There are always questions regarding the underlying experimental design, measurement procedures, and data analysis techniques, not to mention statistical uncertainties. Often multiple follow-on studies, in some cases extending over many years, are needed to settle the question conclusively one way or the other. For example, 13 years elapsed between 1998, when two teams of researchers discovered that the expansion of the universe is accelerating, and 2011, when the lead scientists of the two teams were awarded the Nobel Prize in physics.

For that matter, if we were to apply Popper’s principle strictly, Copernicus’ heliocentric theory would have been falsified from the start and should not have been considered further, because it could not predict planetary motions quite as accurately as the traditional Ptolemaic system. It gained acceptance only when Kepler modified the theory to include elliptical orbits with time-varying speeds, and when Newton showed that this behavior could be mathematically derived, using calculus, from his laws of motion. In a similar way, Newton’s theory was arguably falsified in the mid-19th century, when certain anomalies were noted in the orbit of Mercury. But it would have been irresponsible to discard Newtonian mechanics at that time, because of its overwhelming success in accurately explaining a vast array of phenomena.

In this sense, scientists are more like detectives, in that they must follow leads and hunches, examine evidence, and tentatively proceed with the most likely scenario. Seldom, if ever, are scientific results black-and-white from day one.

It must also be kept in mind that in most cases, “falsified” theories continue to be extremely accurate models of reality within appropriate domains. For example, even today, over 100 years after Newton’s mechanics and Maxwell’s electromagnetic theory were “falsified” and supplanted by new theories of physics (relativity and quantum mechanics, respectively), they remain the basis of almost all practical engineering and scientific computations, giving results virtually indistinguishable from those of more modern theories. To be sure, relativity corrections are employed in the GPS system, which is used by many automobiles and smartphones, and quantum mechanical calculations are employed in semiconductor design, materials science and computational chemistry; but in most other arenas of the modern world, the classical theories of physics are entirely satisfactory.
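To make the GPS example concrete, here is a back-of-the-envelope sketch in Python of the two competing relativistic effects on a GPS satellite clock. The constants are rounded textbook values, and the operational GPS correction is of course engineered far more carefully; this is an illustration of the magnitudes involved, not the system’s actual algorithm:

    # Rough estimate of relativistic clock drift for a GPS satellite.
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M = 5.972e24         # mass of Earth, kg
    c = 2.998e8          # speed of light, m/s
    R_earth = 6.371e6    # mean radius of Earth, m
    r_orbit = 2.657e7    # GPS orbital radius (about 20,200 km altitude), m

    v = (G * M / r_orbit) ** 0.5                     # orbital speed, about 3.9 km/s
    sr = -v**2 / (2 * c**2)                          # special relativity: satellite clock runs slow
    gr = (G * M / c**2) * (1/R_earth - 1/r_orbit)    # general relativity: satellite clock runs fast

    day = 86400  # seconds per day
    print(f"Special relativity: {sr * day * 1e6:+.1f} microseconds/day")
    print(f"General relativity: {gr * day * 1e6:+.1f} microseconds/day")
    print(f"Net drift:          {(sr + gr) * day * 1e6:+.1f} microseconds/day")  # about +38

Left uncorrected, a net drift of roughly 38 microseconds per day would accumulate into positioning errors of several kilometers per day, which is why the satellite clocks are deliberately offset to compensate.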

Thomas Kuhn

Thomas Kuhn’s work The Structure of Scientific Revolutions analyzed numerous historical cases of scientific advancements, and then concluded that in many cases, key paradigm shifts did not come easily [Kuhn1970]. Kuhn was actually trained as a scientist, receiving his Ph.D. in physics from Harvard in 1949. Thus he was able to bring significant scientific insight into his analyses of historical scientific revolutions.

One difficulty with Kuhn’s writings is that there are really two Kuhns: a moderate Kuhn and an immoderate Kuhn [Sokal1998, pg. 75]. Unfortunately, many modern scholars like to quote only the immoderate Kuhn, such as when he denies that paradigm shifts carry scientists closer to fundamental truth [Kuhn1970, pg. 170], or when he argues that paradigm shifts often occur due to non-experimental factors [Kuhn1970, pg. 135].

Another difficulty is that Kuhn’s “paradigm shift” model has not worked as well in recent years as it did in the historical examples he cited. For example, the “standard model” of physics, the currently reigning fundamental theory of elementary particles and forces, was developed in the 1960s and early 1970s, and was completed in essentially its current form in 1974. By 1980 it had completely displaced previous theories of particle physics, after a very orderly transition — even initial skeptics quickly recognized the new theory’s power, elegance and precision, and soon threw their support behind it [Tipler1994, pg. 88-89].

Kuhn’s writings, like Popper’s before them, have been badly misused by a host of eager but ill-informed writers and scholars who think that they can smash the reigning orthodoxy of modern science. In an interview with Scientific American writer John Horgan, Kuhn expressed deep dismay at having become a patron saint to would-be scientific revolutionaries of this type: “I get a lot of letters saying, ‘I’ve just read your book, and it’s transformed my life. I’m trying to start a revolution. Please help me,’ and accompanied by a book-length manuscript.” Kuhn emphasized that in spite of the often iconoclastic way his writings have been interpreted, he remained “pro-science,” noting that science has produced “the greatest and most original bursts of creativity” of any human enterprise [Horgan2012].

Mathematics and computer science

Some writers have pointed out that even in the purest of scientific disciplines, namely mathematics and computer science, upon which all other scientific research is based, researchers have identified weaknesses and uncertainties. In particular, it has been known since Gödel’s groundbreaking 1931 paper [Hawking2005, pg. 1089-1118] that any consistent system of axioms rich enough to encompass ordinary arithmetic is incomplete (there are true mathematical statements it cannot prove) and cannot demonstrate its own consistency; and it has been known since Alan Turing’s 1937 paper that no computer program can be devised that infallibly decides, for every program and input, whether that program will eventually halt [Hawking2005, pg. 1119-1160]. See these two references for precise statements of these results and full technical details.
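Turing’s result can be sketched in a few lines of code. In the hypothetical snippet below, halts is an assumed oracle that always answers correctly whether a given program halts on a given input; the self-referential contradiction shows that no such oracle can actually be written:

    # Sketch of Turing's diagonal argument. "halts" is a HYPOTHETICAL oracle,
    # assumed (for the sake of contradiction) to decide correctly whether
    # program(argument) eventually halts. No such function can exist.

    def halts(program, argument) -> bool:
        ...  # assumed perfect halting oracle -- cannot actually be implemented

    def paradox(program):
        if halts(program, program):
            while True:      # oracle says "halts", so loop forever
                pass
        else:
            return           # oracle says "loops forever", so halt immediately

    # paradox(paradox) would halt if and only if it does not halt -- a contradiction,
    # so no correct implementation of halts() is possible.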

The results of Gödel and Turing are very specific, yet they are often misinterpreted to justify a large-scale dismissal of modern mathematics and computer science. Note that Gödel’s result is only that the consistency of the axioms of mathematics cannot be formally proven from within the system itself; it does not establish that they are inconsistent, or that any specific mathematical result is flawed. A more significant concern, which any research mathematician or computer scientist would readily acknowledge, is that mathematical theorems and computer programs are human constructions, and thus susceptible to errors of human reasoning. Indeed, some mathematical results have subsequently been shown to be flawed, and bugs in computer programs are sadly an everyday annoyance. But mathematical results can be (and have been, in numerous cases) checked by computer using very exacting tests, and computer programs can be independently written and compared on different systems. See also the next section.

How reliable are modern scientific findings?

Even after properly acknowledging the tentative, falsifiable nature of science as taught by Popper and Kuhn, it is clear that modern science has produced a sizable body of broad-reaching theoretical structures that describe the universe and life on earth ever more accurately with each passing year. As a single example of thousands that could be mentioned here, the numerical values of the magnetic moment of the electron (here expressed as the dimensionless g-factor), calculated from the present-day standard model (in particular, from the theory of quantum electrodynamics) on one hand, and determined from the best available experimental measurements on the other, are [Cliff2024, pg. 98]:

Theoretical: 2.00231930436321

Experimental: 2.00231930436118
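The “one part in one trillion” figure quoted just below follows directly from these two numbers; here is a one-line check, with the values copied from above:

    theory, experiment = 2.00231930436321, 2.00231930436118
    print(abs(theory - experiment) / theory)   # roughly 1.0e-12, i.e., about one part in a trillion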

Is this astonishing level of agreement — to roughly one part in one trillion, comfortably within the level of experimental uncertainty — just a coincidence? Numerous other instances of scientific progress are presented in Progress-science.

Along this line, a researcher recently computed the mathematical constant pi to over 100 trillion decimal digits [Ranous2024]. Large computations of this type are subject to a myriad of possible difficulties: the underlying mathematical theory might have flaws; the principles behind the algorithms implementing the formulas might not be sound; the computer programs (typically many thousands of lines) implementing these algorithms might have bugs; the system software might have glitches; and the system hardware might miscompute or suffer memory errors. In a very fragile computation of this sort, any one of these problems would almost certainly produce a completely erroneous result.

One major step in this computation was to compute pi in hexadecimal or base-16 digits (i.e., with the digits 0123456789abcdef instead of 0123456789) to over 83 trillion digits. Using a program based on one known mathematical formula for pi, running for several months, the researcher found that the base-16 digits of pi starting at position 83,048,202,372,150 were:

4757d05f3f 35d1b41de3 7d8b3b2289 a4c8a3eb18 262cc3818d

Then he employed another computer program, based on a completely different mathematical formula, derived from a completely different line of mathematical reasoning, to compute digits beginning at the same position. The result was:

4757d05f3f 35d1b41de3 7d8b3b2289 a4c8a3eb18 262cc3818d

Again, is this perfect agreement just a coincidence?
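The record computation and its verification used highly optimized programs whose details are beyond the scope of this article, but the basic idea of computing base-16 digits of pi beginning at an arbitrary position can be illustrated with a short sketch based on the Bailey-Borwein-Plouffe (BBP) formula. This toy version uses ordinary double-precision arithmetic, so it is reliable only for modest starting positions, but it conveys how position-specific digit computations are possible at all:

    # Toy BBP-style digit extraction: a few base-16 digits of pi starting at a
    # given position, using the Bailey-Borwein-Plouffe formula
    #   pi = sum_{k>=0} 16^(-k) * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
    # Double-precision rounding limits this sketch to modest positions.

    def bbp_hex_digits(position, count=10):
        def partial(j, d):
            # fractional part of sum_{k>=0} 16^(d-k) / (8k + j)
            s = 0.0
            for k in range(d + 1):
                s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
            k, term = d + 1, 1.0
            while term > 1e-17:
                term = 16.0 ** (d - k) / (8 * k + j)
                s += term
                k += 1
            return s % 1.0

        d = position - 1   # convention assumed here: position 1 = first hex digit after the point
        frac = (4 * partial(1, d) - 2 * partial(4, d)
                - partial(5, d) - partial(6, d)) % 1.0
        digits = ""
        for _ in range(count):
            frac *= 16
            digit = int(frac)
            digits += "0123456789abcdef"[digit]
            frac -= digit
        return digits

    print(bbp_hex_digits(1))   # prints 243f6a8885, the first base-16 digits of pi after the point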

More recent postmodern writings

More recent writings in the postmodern science studies field have greatly extended the scope and sharpness of these critiques, declaring that much of modern science, like literary and historical analysis, is “socially constructed,” dependent on the social environment and power structures of the researchers, with no claim whatsoever to fundamental truth [Koertge1998, pg. 258; Madsen1990, pg. 471; Sokal1998, pg. 234]. Collins and Pinch, for instance, after examining a handful of case studies, assert that “scientists at the research front cannot settle their disagreements through better experimentation, more knowledge, more advanced theories, or clearer thinking” [Collins1993, pg. 143-145; Koertge1998, pg. 258]. Sandra Harding went so far as to describe Newton’s Principia as a “rape manual” [Harding1986, pg. 113].

Here are some other examples of this same thinking:

  1. “The validity of theoretical propositions in the sciences is in no way affected by the factual evidence.” [Gergen1988, pg. 258; Sokal2008, pg. 230].
  2. “The natural world has a small or non-existent role in the construction of scientific knowledge.” [Collins1981; Sokal2008, pg. 230].
  3. “Since the settlement of a controversy is the cause of Nature’s representation, not the consequence, we can never use the outcome — Nature — to explain how and why a controversy has been settled.” [Latour1987, pg. 99; Sokal2008, pg. 230].
  4. “For the relativist [such as ourselves] there is no sense attached to the idea that some standards or beliefs are really rational as distinct from merely locally accepted as such.” [Barnes1981, pg. 27; Sokal2008, pg. 230].
  5. “Science legitimates itself by linking its discoveries with power, a connection which determines (not merely influences) what counts as reliable knowledge.” [Aronowitz1988, pg. 204; Sokal2008, pg. 230].

Scientists counter that these scholars have distorted a few historical controversies, and then have parlayed these isolated claims into a global condemnation of the scientific enterprise [Boghossian2006; Brown2009; Gross1998; Gross1996; Koertge1998; Sokal2008]. In other words, these writers are guilty of the “forest fallacy”: pointing out flaws in the bark of a few trees, then trying to claim that the forest doesn’t exist. See Progress-science for additional discussion.

More importantly, observers of the postmodern science literature have also noted: (a) serious confusion about various scientific concepts; (b) an emphasis on politically correct conclusions over sound scholarship; (c) lengthy discussions of mathematical or scientific principles with which the author has only a hazy familiarity; (d) application of highly sophisticated concepts from mathematics or physics to the humanities or social sciences, without justification; (e) displays of superficial erudition, peppering the text with sophisticated technical terms or mathematical formulas; and (f) lengthy technical passages that are essentially meaningless [Sokal1998, pg. 4-5].

In a curious turn of events, these postmodern science writings, by attempting to undermine scientists’ claim to objective truth, have provided arguments and talking points for the creationism, intelligent design, anti-vaccination and climate change denial movements [Otto2016a]. The far left has met the far right!

The Sokal hoax

The tension between the scientific and postmodernist communities came to a head in 1996, when Alan Sokal, a physicist at New York University, wrote a parody of a postmodern science article, entitled “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity,” and submitted it to Social Text, a prominent journal in the postmodern studies field [Sokal1996a]. The article was filled with page after page of erudite-sounding nonsense, political rhetoric, irrelevant references to arcane scientific concepts, and approving quotations from leading postmodern science scholars. Here are three excerpts:

Rather, [scientists] cling to the dogma imposed by the long post-Enlightenment hegemony over the Western intellectual outlook, which can be summarized briefly as follows: that there exists an external world, whose properties are independent of any individual human being and indeed of humanity as a whole; that these properties are encoded in “eternal” physical laws; and that human beings can obtain reliable, albeit imperfect and tentative, knowledge of these laws by hewing to the “objective” procedures and epistemological strictures prescribed by the (so-called) scientific method. [Sokal1996a, pg. 217; Sokal2008, pg. 7].

In this way the infinite-dimensional invariance group erodes the distinction between the observer and observed; the pi of Euclid and the G of Newton, formerly thought to be constant and universal, are now perceived in their ineluctable historicity; and the putative observer becomes fatally de-centered, disconnected from any epistemic link to a space-time point that can no longer be defined by geometry alone. [Sokal1996a, pg. 222; Sokal2008, pg. 27].

For, as Bohr noted, “a complete elucidation of one and the same object may require diverse points of view which defy a unique description” — this is quite simply a fact about the world, much as the self-proclaimed empiricists of modernist science might prefer to deny it. In such a situation, how can a self-perpetuating secular priesthood of credentialed “scientists” purport to maintain a monopoly on the production of scientific knowledge? [Sokal1996a, pg. 229; Sokal2008, pg. 53].

With regard to the first passage, note that it derides the most basic notions of scientific reality and common sense. With regard to the second, the fundamental constants pi and G certainly do not have varying values. With regard to the third, quantum mechanics, whose effects are significant only at the atomic level, has absolutely nothing to say about the relative validity of cultural points of view.

In spite of its severe flaws, the article was not only accepted by the journal, but appeared in a special issue devoted to defending the legitimacy of the postmodern science studies field against its detractors. As Sokal later noted, “I intentionally wrote the article so that any competent physicist or mathematician (or undergraduate physics or math major) would realize that it is a spoof.” [Sokal1996b, pg. 50]. He resorted to the hoax out of a deeply felt concern that the postmodern science world had taken a complete about-face from its roots in the Enlightenment, which identified with science and rationalism and rejected obscurantism. “Theorizing about ‘the social construction of reality’ won’t help us find an effective treatment for AIDS or devise strategies for preventing global warming. Nor can we combat false ideas in history, sociology, economics, and politics if we reject the notions of truth and falsity.” [Lingua2000, pg. 52].

More postmodern nonsense

In the same issue as Sokal’s piece, a prominent postmodern writer (in a serious article) asserted:

Most theoretical physicists, for example, sincerely believe that however partial our collective knowledge may be, … one day scientists shall find the necessary correlation between wave and particle; the unified field theory of matter and energy will transcend Heisenberg’s uncertainty principle. [Aronowitz1996, pg. 181].

Einstein’s relativity theory was subjected to official skepticism twenty years after the publication of his Special Theory article in 1905; and equally passionate partisans of wave and matrix mechanics explanations for the behavior of electrons were unable to reach agreement for decades. [Aronowitz1996, pg. 195].

In the first passage, the author is seriously mistaken about wave-particle duality: it is inherent in quantum physics and cannot be removed by a “unified field theory.” In the second passage, even his history is in error: the apparent conflict between the matrix and wave formulations of quantum mechanics was resolved within weeks [Gottfried2000]. Also appearing in the same issue as Sokal’s article was the following, written by the chief editor of Social Text:

Once it is acknowledged that the West does not have a monopoly on all the good scientific ideas in the world, or that reason, divorced from value, is not everywhere and always a productive human principle, then we should expect to see some self-modification of the universalist claims maintained on behalf of empirical rationality. Only then can we begin to talk about different ways of doing science, ways that downgrade methodology, experiment, and manufacturing in favor of local environments, cultural values, and principles of social justice. [Ross1996, pg. 3-4].

It is easy to imagine the potentially serious consequences if this extreme cultural relativism were widely adopted in modern science. As a single example, a few years ago the Mexican government encouraged potters, for their own safety, to use lead-free glazes, but the local potters were convinced that the lead issue was only a foreign conspiracy. Unfortunately, as Michael Sullivan has noted, “lead does not care who believes what.” [Sullivan1996].

In other postmodern science writing, researchers have attempted to apply arcane scientific and mathematical concepts to the social sciences and the humanities, often with disastrous results. For example, a leading French postmodern scholar wrote:

This diagram [the Mobius strip] can be considered the basis of a sort of essential inscription at the origin, in the knot which constitutes the [human] subject. … You can perhaps see that the sphere, that old symbol for totality, is unsuitable. A torus, a Klein bottle, a cross-cut surface, are able to receive such a cut. And this diversity is very important as it explains many things about the structure of mental disease. If one can symbolize the subject by this fundamental cut, in the same way one can show that a cut on a torus corresponds to the neurotic subject, and on a cross-cut surface to another sort of mental disease. [Lacan1970, pg. 192-196; Sokal1998, pg. 19-20].

With regard to this passage, “Möbius strips,” “toruses,” “Klein bottles” and “cross-cut surfaces” are terms from mathematical topology, the theory of continuous functions and continuously deformed surfaces. There is absolutely no connection between this arcane mathematical theory and psychology. Yet this author pressed this absurd connection between psychology and topology further in several other writings, hopelessly misusing sophisticated mathematical concepts such as compactness, open sets, limit points, subcoverings and countable sets [Lacan1998, pg. 9-10; Sokal1998, pg. 21-22].

Numerous examples of gratuitous and often meaningless scientific jargon can be cited in the postmodern literature. Here is one example. Readers need not feel bad if they do not understand the following text, for it is complete nonsense; yet it survived peer review in the postmodern science field:

We can clearly see that there is no bi-univocal correspondence between linear signifying links archi-writing, depending on the author, and this multireferential, multidimensional machinic catalysis. The symmetry of scale, the transversality, the pathic non-discursive character of their expansion: all these dimensions remove us from the logic of the excluded middle and reinforce us in our dismissal of the ontological binarism we criticised previously. A machinic assemblage, through its diverse components, extracts its consistency by crossing ontological thresholds, non-linear thresholds of irreversibility, ontological and phylogenetic thresholds, creative thresholds of heterogenesis and autopoiesis. The notion of scale needs to be expanded to consider fractal symmetries in ontological terms. [Guattari1995, pg. 50; Sokal1998, pg. 166].

The debate over scientific merit versus social justice

As mentioned above, the scientific world has long recognized the need to properly acknowledge the heretofore downplayed scientific contributions of non-Western societies. One example of many that could be cited is the recent recognition of key mathematical contributions in ancient China, India and the Middle East. India in particular is now recognized as the birthplace of our modern system of positional decimal arithmetic with zero, developed by unknown scholar(s) at least by 200 CE — see, for example, [Bailey2012]. An even more pressing current issue is to understand and correct the chronic under-representation of women, racial minorities and indigenous people in the scientific and engineering fields.

However, many researchers are concerned that in a headlong rush to promote these social justice issues, solid scientific merit is being compromised. For example, in 2023 a group of scientists published “In defense of merit in science” [Abbot2023], in which they argued:

For science to succeed, it must strive for the non-ideological pursuit of objective truth. Scientists should feel free to pursue political projects in the public sphere as private citizens, but not to inject their personal politics and biases into the scientific endeavor. Maintaining institutional neutrality is also essential for cultivating public trust in science. … Although no system is guaranteed to eliminate all biases, merit-based systems are the best tool to mitigate it. Moreover, they promote social cohesion because they can be observed to maximize fairness.

Clearly, this debate will continue, but a broad range of researchers agree that in the final analysis, the scientific method of objectively evaluating real empirical evidence against proposed theories is the best path forward to uncover truth.

Conclusions

In summary, the works of Kuhn and Popper have provided valuable insights into the process of scientific research. In particular, their observations on falsifiability and paradigm shifts have been largely incorporated into the fabric of modern science, although their writings are often misconstrued. Issues such as ensuring proper credit for the scientific contributions of non-Western societies (such as the ancient mathematics of China, India and the Middle East), as well as responding to the chronic under-representation of women, racial minorities and indigenous people in scientific research, are certainly important and worth discussing. But beyond such considerations, what is generally termed the “postmodern science studies” literature has not been very useful in advancing real scientific research, to say the least. And with the 1996 Sokal hoax, these writers lost considerable credibility in the eyes of the scientific research community.

One criticism that applies rather broadly to present-day literature of this type is that these scholars work almost entirely from outside the realm of real scientific research. Unlike predecessors such as Kuhn and Popper, who were qualified professional scientists, most of the present-day postmodern science studies writers do not have significant scientific training and/or credentials, do not address state-of-the-art scientific theories or methods in technical depth, and do not participate with scientific research teams in performing real research. Their approach is best exemplified by a comment made by Andrew Ross, editor of Social Text during the Sokal hoax episode, in the introduction to one of his published works: “This book is dedicated to all of the science teachers I never had. It could only have been written without them.” [Ross1991].

But this is precisely the point on which virtually all practicing research scientists will sharply disagree. State-of-the-art scientific research is all about the details: underlying physical theories; mathematical derivations; testable hypotheses; state-of-the-art equipment; careful experimental design and data collection; rigorous data analysis and statistical methodology; carefully programmed computer simulations; advanced numerical methods; and, of course, cautiously inferred conclusions. Indeed, the details of the methods underlying a study are often as significant as the conclusions themselves.

Thus, to the extent that the postmodern science studies community studiously avoids delving into the full technical details of leading-edge scientific research, these writers cannot possibly hope to have a tangible impact on the scientific enterprise. And, needless to say, when leading figures in this community openly express their contempt and disdain for scientific work, they are not building bridges that will lead to productive collaborations with real scientists in the future.

According to an ancient account, when Pharaoh Ptolemy I of Egypt grew frustrated at the degree of effort required to master geometry, he asked Euclid whether there was some easier path. Euclid is said to have replied, “There is no royal road to geometry.” [Durant1975, vol. 2, pg. 501]. Indeed. And there is no royal road to modern science either.

Canadian-American physicist Lawrence Krauss summed up his view of these issues in the following terms [Krauss2012a]:

As both a general reader and as someone who is interested in ideas and culture, I have great respect for and have learned a great deal from a number of individuals who currently classify themselves as philosophers. … What I find common and so stimulating about the philosophical efforts of these intellectual colleagues is the way they thoughtfully reflect on human knowledge, amassed from empirical explorations in areas ranging from science to history, to clarify issues that are relevant to making decisions about how to function more effectively and happily as an individual, and as a member of a society.

As a practicing physicist however, the situation is somewhat different. There, I, and most of the colleagues with whom I have discussed this matter, have found that philosophical speculations about physics and the nature of science are not particularly useful, and have had little or no impact upon progress in my field. Even in several areas associated with what one can rightfully call the philosophy of science I have found the reflections of physicists to be more useful. For example, on the nature of science and the scientific method, I have found the insights offered by scientists who have chosen to write concretely about their experience and reflections, from Jacob Bronowski, to Richard Feynman, to Francis Crick, to Werner Heisenberg, Albert Einstein, and Sir James Jeans, to have provided me with a better practical guide than the work of even the most significant philosophical writers of whom I am aware, such as Karl Popper and Thomas Kuhn.

In spite of these difficulties, some scientists and philosophers look forward to a more respectful dialogue between the two disciplines in the future. As physicist Carlo Rovelli recently wrote [Rovelli2012]:

I think there is narrow-mindedness, if I might say so, in many of my colleague scientists that don’t want to learn what is being said in the philosophy of science. There is also a narrow-mindedness in a lot of probably areas of philosophy and the humanities in which they don’t want to learn about science, which is even more narrow-minded. Somehow cultures reach, enlarge. I’m throwing down an open door if I say it here, but restricting our vision of reality today on just the core content of science or the core content of humanities is just being blind to the complexity of reality that we can grasp from a number of points of view, which talk to one another enormously, and which I believe can teach one another enormously.
