If you create a good enough airport—the cargo will come.
What does it take for an individual to do innovative intellectual work, such as scientific discovery? Mere mastery of methods is not good enough.
What does it take for a community or institution to address a volatile, uncertain, complex, and ambiguous world effectively? Mission statements, structures, principles, and procedures are not good enough.
Cargo cult science
Richard Feynman—the foremost physicist of the mid-20th century—gave a famous commencement address on “cargo cult science.”1
During World War II, many Pacific islands that previously had little or no contact with the modern world were used as air bases by the Americans or Japanese. Suddenly, enormous quantities of food, clothes, tools, and equipment, such as the islanders had never seen, appeared out of the sky in magic flying boats. Some of this “cargo” trickled down to the natives, and it was fabulous. Then the war ended, the planes vanished, and—no more cargo!
How to make the cargo flow again? The islanders had observed that, just before cargo arrived, the foreigners performed elaborate rituals involving inscrutable religious paraphernalia. Clearly, these summoned the sky spirits that brought cargo.
Religious entrepreneurs founded cults that duplicated sky spirit rituals using locally-produced copies of the paraphernalia. They imitated the actions of the airstrip ground crews using wicker control towers, coconut headsets, and straw planes (such as the one photographed above). Some cargo cults are still going, generations later, despite their failure to deliver even one landing by the sky spirits. Ha ha, stupid primitive savages!
Except, this is a perfect metaphor for most of what is called “science,” done by people with PhDs.
“Cargo cult science” performs rituals that imitate science, but are not science. Real science sometimes delivers cargo (fame and promotions for scientists; profits for R&D companies; technologies for everyone else). So, you think, OK, what do I have to do to make that happen? How did those guys do it? So you look to see what they did, and you do the same thing. But usually that doesn’t work well.
“Doing what scientists do” is not doing science, and won’t deliver—just as “doing what a ground crew does” doesn’t bring planes. It’s just going through the motions.
But exactly why doesn’t it work? And what does work? What makes the difference between cargo cult science and the real thing?
Cargo cults everywhere
“Cargo cult” describes not just science, but much of what everyone does in sophisticated rich countries. I’m not speaking of our religions; I mean our jobs and governments and schools and medical systems, which frequently fail to deliver. Companies run on cargo cult business management; states run on cargo cult policies; schools run on cargo cult education theories (Feynman mentioned this one); mainstream modern medicine is mostly witch doctoring.
An outsider could see that these cannot deliver, because they are scripted busy-work justified by ideologies that lack contact with reality. Often they imitate activities that did work once, for reasons that have been forgotten or were never understood.
So how do you go beyond cargo cultism? How do you do actual science? Or economics or policy; education or medicine?
And why is cargo cultism so common, if it keeps failing to deliver?
Upgrading
In some video games, you direct the technical and economic development of a handful of hunter-gatherers in straw huts. You start them farming, and they multiply. They build a wooden palisade to keep out hostile strangers. You invent the plow, so their farms become more efficient, and the village grows into a small town. You start them mining, and they build stone houses, and a stone wall to repel invasions. You discover copper smelting and they can make metal plows and swords. And so on—upgrading technology step by step, until eventually your people develop fusion power, take over the whole earth, build spaceships, and set off to colonize the galaxy.
So what about those stupid savages, doing their silly rituals on their Pacific islands?
Suppose they got their imitation runway level enough, and put tarmac on it, and upgraded the control tower from straw to wood to concrete, and installed modern radar and landing control systems, and sent their “ground crew” to Pittsburgh to be trained and certified.
What then?
Imitation and learning know-how
Let’s say you are a new graduate student starting a science PhD program. What you learned as an undergraduate were an enormous number of facts, a few calculation methods, and basic familiarity with some experimental equipment. You learned mainly by being lectured at in classrooms, by reading, by solving artificial puzzle-like problems, and in lab courses where you used the equipment to try to get the known-correct answer to make-believe “experiments.” None of this is anything like actual science: discovering previously-unknown truths.2
Much education rests on the wrong idea that learning consists of ingesting bits of knowledge (facts, concepts, procedures) and storing them up, and that once you have enough, you can make useful deductions using innate human reasoning. A more sophisticated wrong idea is that there are methods of thinking, and once you have learned them, you can use them reliably. Both of these are partly true—you do need to learn and remember and use facts, and learn and practice and use rational methods—but they are not sufficient.
You can’t learn how to do science from classes or books (although what you do learn there is important). You certainly can’t figure out how to do it from rational first principles! No one has any detailed rational theory of how science works.3 More generally, you mostly can’t learn doing from books or classes or reasoning; you can only learn doing by doing.
In doing, ability precedes understanding, which precedes representation. Knowing-how is not reducible to knowing-that.4 Riding a bicycle is the classic example: no amount of classroom instruction, or rational reflection, could enable a novice to stay upright.
How do you learn know-how?
Imitation is one powerful and common way—one that is unfortunately underemphasized in current American theories of education. The Melanesian cargo cults were founded on the accurate observation that imitation often results in new abilities that you do not understand—at first, at least.
In fact, you start doing science—or any serious intellectual work—by imitation, by going through the motions, not seeing the point of the rituals. Gradually you come to understand something of how and why they work. (If you are smart and lucky; many people never do.) Gradually, you find yourself doing the real thing. At some point, you can improvise, step into the unknown, and create your own methods.
In other words, you can only begin your career as a scientist by doing cargo-cult science. Eventually—if you are smart and lucky—you can upgrade. But almost all scientists get stuck at the cargo cult stage; and almost all supposed science is cargo culting.
Cargo cult science, and cargo cult government and management and education, are based on the perfectly sensible principle of imitation. Why doesn’t that work? Why isn’t classroom science instruction plus learning through imitation good enough?
Why isn’t imitation a sufficient upgrade?
Actually… Why don’t the literal cargo cults work? The answer is not quite as obvious as it may seem at first!
The first obvious answer is: Ha ha, straw airplanes can’t fly, and coconuts are not headphones. But that’s wrong. Proper technology is neither necessary nor sufficient for a functional airport:
- I have landed (as a passenger) at a remote airport in Alaska that consisted of a dry river bed with the larger rocks cleared off, plus a closet-sized wooden shed with emergency fuel and repair supplies.
- If someone installed a complete airport facility with all the latest technology on one of the cargo cult islands, and then left, that would be a useless pile of junk. Without a competent ground crew, the buildings and equipment are not an airport.
Better technology would be a significant upgrade—but it is not the whole answer, or even the main one. It would not make the cargo come.
The second obvious answer is: Ha ha, the cargo cultists are only imitating a ground crew; they have no understanding, so they are just going through the motions. But this isn’t right either. Imitating is often a good way of learning, and understanding an activity is often neither necessary nor sufficient for performing it—even for performing it excellently.
You don’t need understanding to ride a bicycle. In fact, almost no one has an accurate mental model of how a bicycle works.5 I am pretty confident that much of what an expert ground crew does, they don’t understand either.
Better understanding, like better technology, would be a significant upgrade for a cargo cult. The same is true in cargo cult science. One commonly suggested antidote is to understand the principles of the field, so you know why its methods work and aren’t just performing experiments as inscrutable rituals. I advocated this in “How to think real good,” and it’s important enough that I’m working on a follow-up post devoted to it.
What are “principles” and how do you find them? If they are so great, why aren’t they just taught in the introductory class? Partly because even the best people in the field can’t quite say what the principles are, because tacit understanding does not always enable explicit explanation. Also, many methods are worked out by trial and error, by many people over many years; they do work, but it’s not clearly known why.
Anyway, I doubt a ground crew knows, or is taught, any profound principles of airport operation. The problem with imitation is not solely or primarily lack of deep understanding.
What is missing?
Feynman found the question awkward:
[Cargo cult scientists] follow all the apparent precepts and forms of scientific investigation, but they’re missing something essential.
Now it behooves me, of course, to tell you what. But it would be just about as difficult to explain to the South Sea islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones.
He goes on to suggest that “utter honesty” is the key. He also describes this as “scientific integrity.” And, he points out ruefully, this is rarely taught:
But this long history of learning how to not fool ourselves—of having utter scientific integrity—is, I’m sorry to say, something that we haven’t specifically included in any particular course that I know of. We just hope you’ve caught on by osmosis.
If this is as important as he—and I—believe, we ought to ask why it is not taught in universities. (I’ll suggest a reason later in this post.)
I vaguely remember being taught something like this in high school, or even grade school. At that point, it’s irrelevant because you can’t understand what scientific honesty even means until you do your own research.
Until then, there’s only what Feynman calls “conventional honesty,” meaning you don’t make things up. If the meter read 2.7, you put 2.7 in your report, even though 3.9 would be much more exciting.
Although I do consider “utter honesty” important, I don’t think it’s quite right that this is what cargo cult science lacks.6 Or anyway, it’s not the whole story. I think it points in a promising direction, however: toward epistemic virtue.
Epistemic virtue and epistemic vice
Honesty is a moral virtue. It is also an epistemic virtue. Epistemic virtues are cognitive traits that tend to lead to accurate knowledge and understanding. Tenacity, courage, generosity, conscientiousness, and curiosity are some other epistemic virtues—which is why I said I think “utter honesty” is not the whole story.7
Cargo cult science is bad science; and “bad” is a moral, or at least normative, term. Upgrading a cargo cult is, I think, a moral responsibility. Doing bad science is wrong—in a specialized way that goes beyond everyday morality.
What did Feynman mean by “utter honesty”? He didn’t explain exactly, but he did say that it’s not mostly about scientific fraud. Avoiding that is a very low bar, and fraud is relatively rare,8 and easy to eschew. Not committing fraud is, as he puts it, “conventional honesty,” not the special “utter honesty” required in science—and, I would argue, in all intellectual work.
Utter honesty, I suspect, means not just telling the truth, but caring about the truth. Feynman uses the phrase “bending over backward” to suggest a higher standard. You will go to extreme lengths to avoid fooling yourself—partly because then you won’t fool others, but more importantly because you really want to know what’s going on.
“Utter honesty” is about overcoming the “good enough” mediocrity of cargo cult science. Mediocrity comes from going along with the social conventions of your field; accepting its assumptions uncritically; using its methods without asking hard questions about whether they actually do what they are supposed to.
Cargo cultism is the bureaucratic rationality of blindly following established procedures and respecting authority. In the moral domain, that can lead ordinary people into committing genocide without reflection; in science, it leads to nutritional recommendations that may also have killed millions of people. When you look into how those recommendations were arrived at, it becomes obvious that honesty would compel the entire field of nutrition science to resign in recognition of its total failure—both scientific failure and moral failure.
Unflinching lustful curiosity
Important as honesty is, I might rate even higher curiosity, courage, and desire. These are not separable from each other, or from honesty, but it may be helpful to present them as facets of epistemic virtue.
Be curious!
Yeah, good, whatever…
Exhortations to epistemic virtue, and lists of virtues, are not helpful by themselves. We need details. For that, we need to look carefully at specific cases in which epistemic virtue or vice led to success or failure. From them, we can extract heuristics and principles.9
Curiosity
Feynman’s best case study is the rat-running one. (It’s a little too complicated to explain here; it’s near the end of his talk if you still haven’t read that!) It seems to me that the scientists who got this wrong weren’t dishonest. They were incurious: they didn’t actually care about rats. They lacked intellectual desire. They lacked the courage to say “maybe we keep getting inconsistent results because our experimental apparatus is defective.” At some level, they understood that admitting this would lead to a lot of boring difficult work, for which there would be no career reward. (As, Feynman says, occurred: the guy who figured out the problem was ignored and never cited.)
Honesty comes out of curiosity, mostly, I think. If you really do want to know, there’s much less motivation to promote a wrong answer—arrived at either through deliberate fraud or sloppy, inadequately-controlled experimentation.
A reliable recipe for “how to be curious” is impossible (and probably undesirable—you need to choose skillfully what to be curious about). However, we can and should give descriptions of what curiosity is like, so you can recognize when you are curious—and when you are not. Cargo cult science comes from merely going through the motions because you don’t care enough about understanding the phenomena you are studying. It is common for graduate students, or postdocs, or professors, to gradually lose interest in their field without even noticing. Then you do bad work.
Desire
Curiosity is not just caring about which facts are true versus false. It is lust for understanding. What matters is that you want, above all else, to figure out what is actually going on.
Where does curiosity come from? It is not “disinterested,” as some philosophers of science would advocate. You want to know what is actually going on because the thing is cool. If you don’t love your phenomenon of study, you won’t care enough to want to understand it. I would guess that liking rats, finding them cute and funny and interesting and enjoying their company, can make you a better rat-running scientist.
I wrote about this in “Going down on the phenomenon,” making an extended metaphor with sexual desire—which is why I use the term “lust” here.
Beyond respect, one must care about the phenomenon. It seems to me that most academic intellectuals I talk to do not genuinely care about their subject matter. They are more interested in getting papers out of it than they are in learning about it. Analogously, many people in approaching sex are more interested in getting something out of someone than they are in learning about another person (and themselves).
Courage
Every scientist (probably—me for sure) sometimes screws up and promotes an attractive idea that isn’t actually right. That’s probably unavoidable. Courage and honesty mean recognizing and admitting this when it happens, and being as transparent as possible so other people can detect it.
Courage and honesty may also demand that you be transparent about going beyond the boundaries of your discipline. That can be a taboo—but breaking it is a virtue, because mindlessly adhering to disciplinary conventions is a main cause of cargo cult science. A seminal and excellent paper on research management10 explains:
Research has come to be as ritualistic as the worship of a primitive tribe, and each established discipline has its own ritual. As long as the administrator operates within the rituals of the various disciplines, he is relatively safe. But let him challenge the adequacy of ritualistic behavior and he is in hot water with everyone.
The first conviction of the research specialist is that a problem can be factored in such a way that his particular specialty is the only important aspect. If he has difficulty in making this assumption, he will try to redefine the problem in such a way that he can stay within the boundaries of his ritual. If all else fails, he will argue that the problem is not “appropriate.” Research specialists, like all other living organisms, will go to great lengths to maintain a comfortable position. Having invested much time and energy in becoming specialists in a given methodology, they can be expected to resist efforts to expand the boundaries of the methodology or to warp the methodology into an unfamiliar framework.
I’ll give one example. It is self-serving, but I hope you’ll forgive that if you find it funny. This happened while I was a graduate student in the MIT Electrical Engineering and Computer Science Department. When anyone asked what I was up to, I replied honestly:
I’m reading about the Balinese Rangda-Barong ritual so I can use existential phenomenology to figure out how to make breakfast.
This was something of a risk.11 It is not the sort of thing EECS students are encouraged to spend their time on. My research was funded by the US Department of Defense. The DOD might not have looked favorably on having their money spent on tantric rituals, phenomenology, or breakfast-making. On the other hand, my understanding of those things led directly to new technical methods and insights that underlie the current generation of military robots. (For better or worse.)
If you do realize that you have lost interest in your field—the fire has gone out of your romance—it may take huge courage to admit that and leave. It’s the right thing to do, though.
Legitimate peripheral participation
Earlier, I asked: why isn’t learning know-how through imitation (plus learning facts through classroom instruction) good enough? Part of the answer is: you need feedback, not just a passive source of emulation.
Consider learning to drive a car. When you take a driving class, you get a bit of lecturing, and there’s a booklet you’re supposed to read, but they don’t tell you anything that isn’t obvious. Are you ready to learn by imitation? No, that would be disastrous. As I wrote elsewhere:12
You need someone to teach you how to drive; someone who will sit beside you and explain the controls, and give directions, and watch you screw up, and tell you what to do instead. The skill can only be transmitted by apprenticeship.
Situated learning theory explains apprenticeship as legitimate peripheral participation in a community of practice.13 Let’s unpack that.
You learn know-how by doing. However, in most cases, just doing on your own is inadequate. Imitation is also mainly inadequate—as the Melanesian cargo cults so dramatically illustrate. Participation means doing with other people who know what they are doing. Typically, we learn from collaboration, not from observing and then accurately duplicating the action by ourselves. We aren’t that smart!
Legitimate means that you are accepted as part of the group activity, and given a role within it that everyone agrees to. If you walk out onto an airport landing strip and start “helping,” you probably won’t learn anything (even if you aren’t immediately dragged away by security dudes). Members of a ground crew have complex, interlocking duties; you have to fit into that schema to participate.
Becoming a junior member of a research team grants you the legitimacy needed for participation in its scientific activity. This cognitive apprenticeship is the only way to learn to be a scientist.
Peripheral means that the group initially assigns you a minor role: simple, low-risk tasks that are nevertheless useful. As you master each, you are given increasing responsibility, and trickier, more central roles.
Legitimate peripheral participation is a major reason someone would bother to tutor you. In formal instruction, teachers get paid. But most learning is informal, and most “teaching” is unpaid. The learner’s valuable labor gets exchanged for tuition. This is part of the science system, too: graduate students and postdocs contribute to their professor’s research program.
Legitimate peripheral participation is a more powerful motivation for accurate feedback than money. If a student’s labor contributes to the success or failure of your project, you want to be sure they are doing it right—and so you will scrutinize their work carefully, and give detailed corrective advice.
Feedback is not the whole story, however. People learn from collaboration in ways that go beyond both imitation and explicit correction. We pick up a great deal “by osmosis,” as Feynman put it. The situated learning research program has observed this carefully in hundreds of diverse contexts, and has gone some way toward explaining how it works.
The problem with the cargo cults is not that they are imitating. It’s that their members are not legitimate participants in airport operation.
Imagine a cargo cult downloaded all the manuals for ground crew procedures from the web, and watched thousands of hours of videos of competent ground crews doing their jobs. Imagine they learned them perfectly, and were able to execute them perfectly.
Still no airline would be willing to use their airport. The cult is not certified for operation; it is not legitimate. The proper bureaucratic rituals have not been observed. These rituals are rational: there has to be a fixed procedure for assuring that a ground crew is competent, and making special exceptions could be disastrous. “These cultists sure seem to know what they are doing; let’s create a set of tests to verify that, without putting them through our usual training regimen”? That would risk airplanes and lives, and would probably end the careers of everyone involved.
Communities of practice
A community of practice develops informally and automatically among any group of people who engage in an activity that requires specialized know-how. Whether you are getting seriously into knitting or tokamak optimization, you want and need to talk to other people doing that.
Informal contact naturally develops into a feeling of community. That typically becomes increasingly structured, with multiple communication channels, central authorities, cliques and factions, scheduled and spontaneous group events, and so on. Leaders may formalize the community into an organization, with defined roles and procedures. Air transport organizations take formal bureaucratic rationality to extremes; science somewhat less so.
A community of practice develops its own culture, worldview, and way of being. That includes its own ethical norms, and its own epistemic norms. These may be partly formalized, but remain mainly tacit. They are absorbed by osmosis, as know-how more than as know-that. They are “the way we do things,” which members can gesture at, but not necessarily explain. Becoming a ground crew member, or a scientist, requires a process of enculturation to acquire this tacit knowledge.
Tacit knowledge often contradicts explicit standards—and therefore could not, even in principle, be learned from manuals. In every workplace, there are the official rules, and then there is “the way we do things,” which involves extensive implicit exceptions.14 Those are not ethical norm violations—from the community’s point of view, at least—because “the way we do things” is the ethical standard of the workplace. In every laboratory, there is the protocol manual’s way to run an assay, and there is the way “we” run the assay. That is not an epistemic norm violation—from this research group’s point of view, at least—because the way “we” run the assay is better; or at least takes a lot less hassle and “works perfectly OK.” (Which may very well be true—or not.)
Every social group has two inseparable aspects: it is an invaluable and inescapable resource, but also a zone of socially-enforced conformity, thought-taboos, and dysfunctional practices and attitudes. Every intellectual community transmits to its members a mixture of epistemic virtues and epistemic vices. Some are far more virtuous than others, but none is perfect, nor perfectly depraved.15
Epistemic virtue and vice are not just learned from a community of practice, they inhere in it. The ways that community members interact, and the way the community comes to consensus as a body, are epistemically virtuous or depraved partly independent of the epistemic qualities of individuals. Just as moral preference falsification can lead a community of good people to do terrible things, epistemic preference falsification can lead a community of smart people to believe false or even absurd things.
The problem with nutrition “science” is not that individual nutritionists are stupid, ill-informed, or malicious. It is that the collective epistemic practices of the community are self-serving, wicked, wanton, paranoid, and deranged.16
Like other eternalisms, Melanesian cargo cults involve ideological “beliefs” that work quite differently from pragmatic beliefs like “my bicycle is blue.” Many Christians profess to “believe” the Rapture is imminent, but usually their actions show that this is not a belief in the ordinary sense. Cargo cultists may “believe” that their rituals will bring cargo, but this “belief” is probably as remote and theoretical as Christians’. Such “beliefs” have important functions in maintaining religious identity, membership, and institutions, and in advancing the careers of religious professionals, but they are not taken literally.
The “belief” that particular ritual activities will bring about scientific breakthroughs is often similarly unconnected with scientific discovery. Yet it is similarly important to the smooth functioning of “scientific” institutions and careers.
The replication crisis: mo’ betta rationality vs. epistemic vice
Clueful scientists have recognized for decades that most supposed science is actually cargo culting—but it seemed little could be done.
As Feynman said, cargo cult scientists “follow all the apparent precepts and forms.” The problem is mostly not disregard for epistemic norms; it is that the norms themselves are inadequate. But it is those very norms that define the epistemic community.
The current replication crisis is driven largely by broad moral outrage.17 That motivates a research practices reform movement, seeking to correct epistemic failures that are due to rampant, collective epistemic vice. The moral character of that vice is stressed by some scientific community members—and resisted by others.
The old guard’s attitude is: We followed all the rules, so we deserve to be rewarded accordingly. To which the rebels say: Yes, but the things you thought you discovered weren’t true.
Leaders of cargo cults—in science as well as religion—usually fight to keep their status, power, and income by opposing attempts at epistemic reform. “Science advances one funeral at a time,” as Max Planck’s observation is usually paraphrased:
A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.18
The current reform movement in academic psychology is led mainly by junior members of the epistemic community, and instigated partly by outsider skeptics. However, some senior members have demonstrated heroic epistemic courage by retracting their own earlier work and advocating epistemic reform.
Reformers—in psychology and other fields such as medical research—advocate better explicit research practice standards. These valuable methods of technical rationality include, for example, more frequent replications; experiment pre-registration; publishing all negative results; reporting effect sizes; and abandoning the famously flawed p<0.05 significance test. If adopted, these will be significant upgrades in epistemic communities that have been practicing mainly cargo cult science. This will be a big win, I think!
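To make one of those flaws concrete, here is a minimal simulation of my own (the base rate, effect size, and sample size are assumptions chosen for illustration, not estimates from any real field). When true effects are rare, a field can follow the p < 0.05 rule with complete conventional honesty and still have about half of its “significant” findings be false:

```python
# Sketch: why p < 0.05 alone is weak. All parameters are illustrative
# assumptions, not measurements of any actual discipline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 10_000
base_rate = 0.10      # assume only 10% of tested hypotheses are actually true
effect_size = 0.5     # assumed effect (Cohen's d) when a hypothesis is true
n_per_group = 30      # a typical small-sample study

false_pos = true_pos = 0
for _ in range(n_experiments):
    is_real = rng.random() < base_rate
    shift = effect_size if is_real else 0.0
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(shift, 1.0, n_per_group)
    _, p = stats.ttest_ind(control, treated)
    if p < 0.05:                  # "publishable" by the old standard
        true_pos += is_real
        false_pos += not is_real

print(f"False share of significant findings: "
      f"{false_pos / (false_pos + true_pos):.0%}")
# Prints roughly 50%: honest rule-following, mostly wrong results.
```

This is the arithmetic consistent with footnote 8’s observation that in some fields more than half of published findings are wrong, and it is one reason reformers want effect sizes and negative results reported.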
Unfortunately… the reform program also embodies the essential epistemological failure of cargo culting. That is the belief that there must be some definite method that will reliably bring the desired results. Then you just need to follow the recipe, and cargo will arrive, summoned by magic out of the sky.
But Campbell’s Law19 says that if you set up any explicit evaluation criteria, people will find ways to game the system. They’ll find ways to excel according to the standards, without producing your desired outcome. They’ll follow the letter of the rules, but not the spirit. John Ioannidis, who has done more than anyone to improve medical research standards, details exactly how and why this happens in his searing “Evidence-based medicine has been hijacked.” Institutional changes cannot guarantee science (or government, or education, or software development) that goes beyond cargo-cultish adherence to procedures.
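Here is an equally minimal sketch of how that gaming works (again my own illustration, not Ioannidis’s analysis): “optional stopping,” peeking at the data and stopping as soon as p < 0.05, follows the letter of the significance rule while quietly inflating the false-positive rate far above its nominal 5%. Pre-registration exists precisely to block moves like this—though Campbell’s Law predicts new workarounds will follow.

```python
# Sketch: gaming the p < 0.05 rule by peeking. There is no real effect in
# either group, so every "significant" result below is a false positive.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_simulations = 2_000
n_false = 0
for _ in range(n_simulations):
    a, b = [], []
    for _ in range(20):                 # add 10 subjects per group, then peek
        a.extend(rng.normal(0.0, 1.0, 10))
        b.extend(rng.normal(0.0, 1.0, 10))
        _, p = stats.ttest_ind(a, b)
        if p < 0.05:                    # stop as soon as the result "works"
            n_false += 1
            break

print(f"Nominal false-positive rate: 5%. With peeking: "
      f"{n_false / n_simulations:.0%}")
# Prints something like 20%: the letter of the rule, not the spirit.
```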
So, better explicit epistemic norms are a significant upgrade, but they aren’t the answer. There is no substitute for actually trying to figure out what is going on. That requires technical rationality—but it also requires going beyond technical rationality.
There is no method: only methods
Do not seek to follow in the footsteps of the wise; seek what they sought.20
“There is no method” is a Dzogchen slogan. Dzogchen is unique among branches of Buddhism in offering no path to enlightenment. This may seem paradoxical at first, because Dzogchen offers innumerable methods—probably more than any other Buddhist approach—and is widely considered the most reliable path to enlightenment.
There is no “The Scientific Method,” and science offers no path to truth. That may seem paradoxical at first, because science offers innumerable, excellent methods, and is the most reliable path to truth.
“The Scientific Method” is the central myth of rationalist eternalism. It is scientism’s eternal ordering principle—the magical entity that guarantees truth, understanding, and control. But no one can say what it is—because it does not exist. No one can explain how or why science works in general, nor how to do it.
We can say a lot about how and why specific methods work—and that is critical. Nevertheless, blind faith in any specific method separates you from the reality of what is actually going on. That is the essence of cargo culting.
The kind of upgrade you need to advance from cargo cult airport operation is critically different from the advance beyond cargo cult science:
- Bureaucratic and technical rationality routinize airport operations, making them reliably good-enough. Conforming to the ritual norms of the ground crew practice community makes you a fully competent ground crew member.
- Science—and any intellectual work involving innovation—addresses the unknown, and therefore must not be routinized, ritualized, or merely rationalized. Conforming to the ritual norms of a practice community does not produce discovery.
The reason Feynman’s “utter honesty” is not taught is that there is almost nothing to say about it—in general. Epistemic virtues are not methods; they are attitudes, and meta to methods.
Recognizing the limitations of rationalist rituals does not mean abandoning them. You have to use methods, and you also have to relativize them. You need meta-rational competence to recognize when a method is appropriate, and when it is not.21 There is no explicit method for that—but, like riding a bicycle, it can be cultivated as tacit know-how. “Reflection-in-action” describes that meta-level learning process.
For the individual, becoming an actual scientist requires two shifts in identity and membership:
- First, you become a cargo cultist: a devout member of the community of practice. Acquiring know-how—explicit and tacit—is most of the work here. The way of being of a cargo cult scientist is social conformity.
- When you have mastered the community’s methods, you see their limitations, and you transcend its epistemic norms, without abandoning them. Developing meta-rational know-how is part of this second shift. However, a shift in your relationship with the scientific community, from mere membership to meta-systematicity, is the key change in the way of being.
Being meta to your community implies critical reflection on its norms. It implies taking responsibility for community development, for upgrading it, while continuing your involvement in it.
Upgrade your community of practice
Despite heroic mythology, lone geniuses do not drive most scientific, cultural, business, or policy advances. Breakthroughs typically emerge from a scene: an exceptionally productive community of practice that develops novel epistemic norms. Major innovation may indeed take a genius—but the genius is created in part by a scenius.
“Scenius” stands for the intelligence and the intuition of a whole cultural scene. It is the communal form of the concept of the genius.
Individuals immersed in a scenius will blossom and produce their best work. When buoyed by scenius, you act like genius. Your like-minded peers, and the entire environment inspire you.22
There is no systematic method for creating a scene, for improving epistemic norms, for conjuring scenius, or for upgrading a community of practice. These are “human-complete” meta-systematic tasks.
There is no method—but there are methods. There are activities, attitudes, and approaches that encourage scenius. These are available to individuals, institutions, or both. Neither can change a community’s epistemic norms unilaterally, but both can contribute to upgrades.
Kevin Kelly describes some scene features that individuals can contribute to:
- Mutual appreciation — Risky moves are applauded by the group, subtlety is appreciated, and friendly competition goads the shy. Scenius can be thought of as the best of peer pressure.
- Rapid exchange of tools and techniques — As soon as something is invented, it is flaunted and then shared. Ideas flow quickly because they are flowing inside a common language and sensibility.
- Network effects of success — When a record is broken, a hit happens, or breakthrough erupts, the success is claimed by the entire scene. This empowers the scene to further success.
Management theorists describe “learning organizations” that don’t base themselves on fixed structures, principles, and procedures. Rather, they conduct continuous meta-systematic reflection on their own commitments, and revise those accordingly. Such organizations also foster the learning and development of their members so they can take on increasingly challenging, interesting, and valuable responsibilities. There are abstract and concrete steps an organization can take to transform itself from a cargo cult into a dynamically innovating scene.23
As one example, making it easier for members to switch fields would represent a major upgrade out of cargo cultism in universities and other large institutions. It would take enormous institutional reforms to allow that, and enormous resources to support people in transition, and that would require enormous institutional courage—but it may pay off enormously, too. Fields often advance rapidly when they are joined by talented outsiders who bring powerful, different ways of thinking. And clearing out the deadwood of people who have fallen out of love with their disciplines would allow vigorous new growth in the fields they leave—without requiring funerals!
Recap: For the win!
Too much of life is wasted going through the motions, playing it by the book, acting according to systems no one really believes in and that fail to reflect a volatile, uncertain, complex, and ambiguous world. This is deadening for individuals, and for society a vast loss of opportunities for prosperity and innovation.
The lesson of cargo cult science for all human activity is that fixed systems are inadequate, because they never fully engage with the nebulosity of reality. We can, and must, upgrade to better ways of thinking, acting, and organizing our communities.
As individuals, we acquire basic competence through legitimate peripheral participation in communities of practice. In becoming a member, we absorb the community’s explicit and tacit norms—including ethical, epistemological, and process norms. Some communities of practice have mainly functional norms; some are highly dysfunctional.
Communities can upgrade their norms—the research practices reform movement is my main example in this post—and individuals can contribute to such upgrades. Still, acting according to even the best norms can produce only routine performance, and it inhibits fundamental innovation and discovery.
For individuals, innovation and discovery demand meta-systematic competence. Once we have achieved mastery of the methods of a community of practice, we can reflect on how and when and why they do and do not work well. Then we can accurately select, combine, revise, discover, and create methods.
Communities (including, but not only, institutions) can take a meta-systematic view of themselves. They can reflect on their own goals, structure, dynamics, and norms. These activities may afford much greater leverage than incremental process optimization.
In plainer words: win big!
- 1. The full text of Feynman’s talk is on the web; it also appears in Surely You’re Joking, Mr. Feynman!, a collection of his wit and wisdom, which I recommend highly.
- 2. Some universities do expose undergraduates to actual science, through legitimate peripheral participation in real research projects.
- 3. The theories of the philosophy of science are laughably inadequate. They were invented by philosophers sitting in armchairs trying to work out from first principles how science ought to work. If you attempt to apply these theories to actual scientists doing actual science, you soon find that they explain nothing. Understanding of how and why science works would have to come from extensive, detailed, theory-neutral observation of scientists going about their business. Tragically little work of that sort has been done; and even the best has been marred by theoretical and political axe-grinding.
- 4. Knowing-how is not in general reducible to knowing-that. There may be exceptions: recipes do work. But, the ability to follow a recipe depends on the ability to, for instance, chop vegetables. You don’t learn that as facts, but by imitation, legitimate peripheral participation (helping a parent make dinner), and experience. More generally: explicit, systematic, rational understanding always rests on tacit, non-systematic, pre-rational skills-in-action.
- 5. Most people who ride bicycles every day don’t even know what one looks like! That may sound preposterous, but it’s supported by strong experimental evidence which I explained here, and which you can easily replicate for yourself as a fun party game.
- 6. A limitation in Feynman’s talk is that it lumps outright woo (astrology) with bad but institutionally sanctioned science (inadequate controls in experiments on rats in mazes). The failures of both may ultimately stem from the same abstract epistemological vices, but this is not obvious. The dynamics are sufficiently different in details that separate analyses would be helpful. Everyday common sense should be adequate to dispel woo, whereas bad institutional science can usually only be corrected with specialized technical methods of rationality. “Utter honesty” is required for good science, but ordinary layperson’s skepticism is adequate to address astrology.
- 7. Feynman’s explanation of “utter honesty” is vague; I imagine that he would have readily agreed that other epistemic virtues are important, or would have included them within “utter honesty.” Tantrikas will recognize a little joke here: “tenacity, courage, generosity, conscientiousness, and curiosity” correspond to the five elements of Vajrayana.
- 8. In some fields at least, more than half of published scientific papers are wrong (according to recent replication studies), but deliberate fraud is the cause of only a tiny fraction of those. The rest are due to cargo culting. Scientific fraud is “rare” relative to incompetence and mediocrity, but still far too common.
- 9. “How to think real good” tried to do this; “What they don’t teach you at STEM school” promises to do more, but also points to books by other authors that contain such case studies.
- 10. The Kennedy and Putt research administration paper was published twenty years before Feynman’s talk, and raises several of the same issues. I don’t know whether he was influenced by it. Kennedy and Putt also anticipate some of the 1980s anthropology-of-technology studies at Xerox PARC and the associated Institute for Research on Learning, which heavily influenced both my PhD research and this post. (The Lave and Wenger work I describe in the next section of the post was done at IRL, for instance.) My thanks to Sarah Perry for drawing my attention to the paper.
- 11. To spread the credit of courage around, I was willing to take this risk because I was at least tacitly supported in it by my supervisor, Rod Brooks—for whom that was probably also a risk, and who exercised the epistemic virtue of open-mindedness by putting up with my eccentricities. I am grateful.
- 12. I also wrote about learning by imitation and apprenticeship in “Robots that dance.” Maybe I keep telling the same stories over and over, but this one is important, I think.
- 13. Situated learning theory, legitimate peripheral participation, and communities of practice were introduced in Lave and Wenger’s Situated Learning: Legitimate Peripheral Participation. That book had a big impact on me—and many others—at the time it was published. Wenger has written several follow-on volumes; they look very interesting, but I haven’t read them.
- 14. The “ethnomethodological study of work,” a research program led by Lucy Suchman at Xerox PARC, investigates the relationship between explicit, systematic bureaucratic rationality and what people in bureaucracies actually do. Its findings strongly influenced my work in artificial intelligence, as Suchman mentions in passing in this interview.
- 15. This is something every freshman should be told on arrival at a university—and the message should be repeated often. “Half the departments at this university teach complete hogwash. However, we can’t tell you which. Your most important task as an undergraduate—especially if you plan to go on to graduate school—is to figure out whether your field of choice is nonsense, or for real. No one knows how to do that for certain, but here are some factors to consider…” I plan a follow-up post about this.
- 16. Many nutrition researchers must know that their field is intellectually bankrupt, but they persist because it’s “the way we do things.” This does show cowardice, at minimum, so there is some individual culpability. By the way, I used precisely five derogatory adjectives here… Why?
- 17. I’m wary of moral outrage in general, and especially collective outrage, but the replication crisis is a case in which it appears both justified and effective.
- 18. A fascinating 2015 study demonstrated the truth of Planck’s principle empirically, by looking at changes in publication patterns after famous scientists’ deaths. The Planck quote is from his Scientific Autobiography and Other Papers.
- 19. Currently better known as Goodhart’s Law, but Campbell’s formulation is closer to what is usually meant.
- 20. On the web, this is widely attributed to the haiku master Matsuo Bashō. However, Bashō himself attributed it to the Buddhist innovator Kōbō Daishi, who lived a thousand years earlier.
- 21. Another way of putting this, in the language of adult developmental theory, is that airport operations require stage 4 (systematic) cognition, but scientific innovation requires stage 5 (meta-systematic) cognition.
- 22. “Scenius” was coined by Brian Eno, who wrote the first paragraph of the quote. The second paragraph is from the linked essay by Kevin Kelly. My thanks to Hokai Sobol for introducing me to the concept.
- 23. I’m thinking here of the work of, for example, Robert Kegan, John Seely Brown, Donald Schön and Chris Argyris, and Etienne Wenger.