How about an irreverent respite? In today’s post we’ll look at the 1973 book, Malice in Blunderland, by Thomas L. Martin, Jr., once the Dean of Engineering at the University of Arizona and later the President of the Illinois Institute of Technology.
We’re all familiar with “Parkinson’s Law” or the “Peter Principle,” but, Martin argues, C. Northcote Parkinson and Laurence J. Peter tried to make their laws too all-encompassing.
They ignored the sheer variety of folly at play in any bureaucratic “wonderland of blunders.”
Note: If you find yourself rubbed the wrong way by some of what follows, please keep in mind that Martin was himself a bureaucrat and the book is (mostly) meant to be humorous.
Part 1: Finagling and Klugemanship
Martin begins by defining a “glitch” as an inherent, built-in defect in any human contrivance. The existence of glitches, he states, has long been acknowledged. In the 1800s, scientist James Clerk Maxwell accounted for such anomalies by joking about “demons.” People then blamed oddities on “Maxwell’s demon” for decades, until WWII, when the Royal Air Force popularized the notion of “gremlins.” It was soon noticed, however, that glitches do not occur in wholly capricious ways. There followed the popular notion of things going wrong in a predictable manner, along with various attempts to capture this idea with general “laws” and “principles.”
The earliest and most famous example has got to be Murphy’s Law (or Laws), whose origin, Martin says, no one really knows. Some postulate the name refers to aerospace engineer Edward Murphy. Martin argues it probably refers to Edsel Murphy. Whoever it was, it was not an original observation. For instance, as stage magician Nevil Maskelyne wrote in 1908, “It is an experience common to all men to find that…everything that can go wrong will go wrong.”
Murphy’s First Law: If something can go wrong, it will.
Murphy’s Second Law: When left to themselves, things always go from bad to worse.
Murphy’s Third Law: Nature sides with the hidden flaw.
Interestingly, though Martin goes on to list a couple of what he calls “Finagle’s Laws,” what he dubs “Murphy’s Second Law” is elsewhere known as “Finagle’s Law,” as popularized by sci-fi author John W. Campbell, Jr.
Campbell was the influential editor of Astounding Science Fiction (later Analog). His novella Who Goes There? was the basis for the movies The Thing from Another World and John Carpenter’s The Thing, as well as the X-Files episode “Ice.” In the classic Star Trek episode “Amok Time,” writer Theodore Sturgeon has Kirk telling Spock, “As one of Finagle’s Laws puts it, ‘Any home port the ship makes will be somebody else’s, not mine.’”
Related to Murphy and Finagle, Martin says, is Berkeley, offering what might be the most succinct summary of the field of Human Factors ever written:
Berkeley’s Second Law of Mistakes: If there is an opportunity to make a mistake, sooner or later the mistake will be made.
More famous than Finagle’s Law is, of course, the “Finagle Factor,” which Martin jokes is likely the single most important element in the emergence of the social sciences to their present status. (For a more serious treatment of this idea, see Stanislav Andreski’s 1972 classic, Social Sciences as Sorcery.)
Finagle Factor: An item introduced to make results match, on paper, the actual desired results. See also:
Zumwalt’s Second Law: No matter what result is anticipated, there is always someone willing to fake it.
This is also a common exercise, Martin observes, for government employees who work with the budget. As we shall see, bureaucracies, being bureaucracies, must expand, which means government costs must expand, which means Finagle Factors, such as “cost of living,” must be applied to the budget, enacted through the massive kluge (defined below) of the taxation system.
Finagle Factors themselves become reality via kluges. (Note: Martin spells the word “kludge,” which The New Hacker’s Dictionary points out is a common mistake. The original and correct spelling is “kluge” [Raymond, 1996]. I know you’re grateful that’s been cleared up.)
Kluge: A clumsy, quick-and-dirty workaround that is costly, inefficient, and difficult to maintain. In Yiddish, קלוג (“klug”), meaning, “Too smart by half.”
This leads to all sorts of advanced equations in the Special Theory of Finagle Relativity. In the third of Martin’s equations, the “results” on the left side cancel, which means multiplying a Finagle Factor by a kluge creates a dimensionless constant that cancels real results, otherwise known as a “SNAFU.”
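Martin’s original equations do not survive in this text; a hedged sketch of the joke algebra, using F (Finagle Factor), K (kluge), and R (results) as my own placeholder symbols rather than Martin’s, might run:

```latex
\begin{align*}
  R_{\text{desired}} &= F \cdot R_{\text{actual}}  && \text{(the Finagle Factor, on paper)} \\
  R_{\text{actual}}  &= K \cdot R_{\text{desired}} && \text{(the kluge, in practice)} \\
  \frac{R_{\text{desired}}}{R_{\text{desired}}} &= F \cdot K = 1 && \text{(the results cancel: a dimensionless SNAFU)}
\end{align*}
```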
SNAFU: An acronym that stands for, “Situation normal, all f’d up.”
“SNAFU” is the bureaucrat’s motto. Their M.O. is to generate enough delay, red tape, and obfuscation so as to root out any effective change and continuous improvement. As they say, “What is good for the goose is good for the gander,” but what is good for innovation is not good for the bureaucrat (and vice versa).
More interesting than a physical kluge, such as the Ford Edsel, is the ubiquitous “managerial kluge.” A bureaucracy must protect itself, after all. Things aren’t top-down enough? Too many people are venturing off and creatively solving problems? Rein them in with some new standards and trainings, and, voilà—managerial klugemanship at its finest.
These efforts, of course, perpetuate “blunderland,” as a bureaucracy’s consistent, continuing, and largely unintentional klugemanship results in the disease of “botchulism,” and the emerging status quo, which is….
FUBAR: An acronym which stands for, “F’d up beyond all recognition.”
There is another level of complexity that must be considered, which is human intervention in reaction to things such as Murphy’s Law. As we all know, human judgment is typically faulty and, further, the road to hell is paved with good intentions. This leads us to….
Sevareid’s Law: The chief cause of problems is solutions.
If one could be accused of cynicism—and Martin most definitely cannot—one might arrive at the lawyeristic conclusion arrived at by Jerome Cohen (a lawyer) and realize it does not matter if perceived glitches, kluges, and SNAFUs are in fact real.
Cohen’s Law: The name of the game is what label you succeed in imposing on the facts.
The especially adroit bureaucrat realizes that Finagle Factors and kluges are wholly unnecessary, since, following Cohen’s Law, the actual results can almost always be portrayed as the desired results.
Part 2: Hierarchiology and Status Quo Vadis
Bureaucracy: Any social arrangement with a power dynamic; e.g., a pecking order.
The most famous statements made about bureaucracy are likely those of C. Northcote Parkinson.
Parkinson’s Law: Work expands to fill the time available for its completion.
The deeper implications here are frequently missed. What Parkinson’s Law demonstrates is that in any bureaucracy, the number of administrators being paid is uncorrelated with the amount of work to be done. As Martin illustrates, given X bureaucrats in Y bureaus in any organization, they can advance their careers by ensuring an increase in the number of:
- People in the bureau.
- Bureaus in the organization.
- Levels in the hierarchy.
Bureaucracies expand because, as Charles Ecker stated, “There is no power in yes.” The power lies in becoming increasingly obstructionist. As bureaucracies grow, of course, so does the hierarchy, which wreaks additional havoc.
Jay’s Second Law of Hierarchy: Given an organization with six levels and 11 direct reports to each manager, there will be about 177k employees, and the top half of the org will consist of only 133 people.
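Jay’s arithmetic checks out; a minimal sketch, assuming one person at the top and a uniform span of 11 direct reports per manager:

```python
# Sketch of Jay's Second Law: a 6-level hierarchy, 11 direct reports per manager.
levels = 6
span = 11

# Headcount per level grows geometrically: 1, 11, 121, 1331, ...
sizes = [span ** i for i in range(levels)]

total = sum(sizes)          # everyone across all 6 levels
top_half = sum(sizes[:3])   # the top 3 of the 6 levels

print(f"total = {total:,}")      # total = 177,156 (i.e., "about 177k")
print(f"top half = {top_half}")  # top half = 1 + 11 + 121 = 133
```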
It’s a battle to climb to the top, and yet, once there, people find themselves at an informational disadvantage. Hierarchical structures are akin to the children’s game of Chinese whispers.
Reich’s Law of Hierarchical Reality: Executives are too busy to find out what’s going on. They only know what they are briefed on, and the briefing is both limited and highly selective.
Combining the above two laws, it follows that given almost any problem with the work itself, the few people at the top of the hierarchy will know the least about it. As Martin puts it, the higher one climbs, the more one doesn’t know that one should know. Hence the growing need for more data, more presentations, more reports, more statements, and more meetings, meaning, of course, more handoffs, busyness, and inefficiencies.
In meetings, however, people have a tendency to abdicate and agree. New and innovative ideas, which tend to be divisive, are therefore shunned. What tends to happen in most meetings, therefore, is not much.
Whyte’s First Law of Meetings: People rarely think in groups. They talk, exchange information, adjudicate, and compromise, but they do not think.
They also routinely misinterpret one another to make such agreement easy.
Dunne’s Law: The territory behind rhetoric is too often mined with equivocation.
Thus, while people likely believe they understand “the message,” they will seldom realize their interpretation of what was said is not really what was meant. As Martin puts it, the more staff meetings that are held, the more comms that are sent out, the more that “communication gap kluges” are employed, the greater the misunderstanding of the org’s actual activities.
From a bureaucratic point of view, at least, that is not entirely a bad thing. More misunderstanding requires more people to help address it. This can be reformulated as follows:
Chisholm’s Third Law of Human Interaction: Purposes, as understood by the purposer, will be judged otherwise by others.
Most of the discussion in meetings, thank goodness, will pertain to issues of lesser importance, as made clear by one of Parkinson’s lesser-known laws:
Parkinson’s Law of Triviality: The time spent discussing an agenda item will be inversely proportional to how much the item costs.
Things that cost a staggering amount of money tend to get funded without much discussion or research, whereas trivial matters that involve negligible sums of money will be discussed ad nauseam in meeting after meeting.
The Law of Triviality was, in part at least, explained by Shimkin.
Shimkin’s Rule: Policy is anything you want to decide yourself. Routine details are anything you do not want to be bothered with.
Returning to Jay’s Second Law and Reich’s Law, they together show that top-down command and control leadership in a large hierarchy is, at best, unfeasible. It will take so long to respond to problems that responses will be stale by the time they are enacted.
Interestingly, Martin states, despite decades of evidence to the contrary, many persist in the illusion that the “solution” to such hierarchical ailments is still “better communication.” Thence follows even more comms, presentations, report outs, and update meetings, only further compounding the issue and proliferating additional confusion.
Martin’s Restated Law of Communication: The more levels there are to a hierarchy and the more attempts are made at interlevel communication, the less people in general will know what the hell is going on.
The irony, Martin concludes, is that contrary to popular belief, the growth of a bureaucracy does not concentrate power at the top—it diffuses power altogether.
Andreski, S. (1972). Social sciences as sorcery. London, UK: Andre Deutsch Limited.
Martin, T. L., Jr. (1973). Malice in blunderland. New York, NY: McGraw-Hill Book Company.
Maskelyne, N. (1908). The art in magic. The Magic Circular, June, p. 25.
Raymond, E. S. (1996). The new hacker’s dictionary (3rd ed.). Cambridge, MA: MIT Press.
Whyte, W. H. (1956). The organization man. New York, NY: Simon & Schuster, Inc.