The Rule of Numbers and the Search for Truth

David Solway
August 23, 2019
Putting numbers to nearly everything is the postmodern world’s way of separating facts and knowledge from mere opinion or superstition. This not only reflects a cramped view of knowledge; it is also false and immensely damaging to rational inquiry, discussion and the dissemination of knowledge. David Solway mounts a counter-argument for quality over mere quantity. Although nominally about the social sciences and aimed at their practitioners, Solway’s essay serves up food for thought for any consumer, customer or target of the social sciences: students, their parents, business people, employers, government officials, voters. In short, all of us.

The privileges, interests, biographies, fetishes and investments of researchers typically remain subtext.

         — Michelle Fine, Working the Hyphens

Most conservative writers concerned with the debasement of education in the West – I’m no exception here – tend to write about the indoctrination programs in K-12, the political devastation of our universities and the ideological perversion of the curriculum. But something else is also going on in the research faculties and protocols of the social sciences that is no less problematic.

It has to do with the classic divide in sociology, anthropology, psychology and education between quantitative and qualitative researchers: simply put, between those who incline to the mystique of numbers and those who put their trust in personal judgment. The numbers-worship that dominates the research industry in education and the social sciences reflects a larger cultural syndrome – the abdication of the self or, in other terms, the substitution of authoritarianism for authority, of collective dominion for personal authenticity, of the many for the one. This prepossession holds across the board, in political activity no less than in intellectual life. The belief in the absolute validity of polls, statistical compilations and quantitative analysis is merely a subset of this compulsion.

The art of thinking: today’s reliance on “scientism” denigrates the human capacity to discover knowledge through judgment, thought and experience.

The quantitative fetish in the cadet disciplines clinging like pilot fish to the body of knowledge – otherwise known as scientism – is an instance of what we may call the “aggregate ethics” that purports to arrive at fundamental truths about human behaviour. As Austin Hughes contends in an essay dealing with the folly of scientism in The New Atlantis, “Of all the fads and foibles in the long history of human credulity, scientism in all its varied guises…pretends to be something very different from what it really is and…has been accorded widespread and uncritical adherence.” Hughes “longs for a new Enlightenment to puncture the pretensions of this latest superstition.”

Superstition, indeed, or even a species of magic. Recite a formula and reality complies.

I’m reminded of a team of colleagues who, conducting a survey of “performance parameters” among college-level students, generated several recommendations for improving “student outcomes”. These included basing the curriculum on popular culture “modules” – movies, hit songs, computer games, TV programs and Netflix blockbusters. Students did poorly, apparently, because they were not sufficiently motivated, many claiming in interview sessions that standard subject matter did not interest them. The upshot was that students needed to be catered to, not challenged. The table of figures produced by the researchers after testing their theories over a two-year span showed a marked improvement in grades and “comprehension scores.” The results were understood as conclusive.

Austin Hughes views scientism as the latest superstition that should be punctured with a modern-day Enlightenment.

What this exercise actually indicated, however, was that students were merely able to respond with greater facility and competence to essay questions and tests focusing on Bono, Game of Thrones and The Hitchhiker’s Guide to the Galaxy – areas with which they were already familiar – rather than on Shakespeare, the Ode on Intimations of Immortality and The Name of the Rose. The writing itself was impenitently fularious, owing more to the Urban Dictionary than to the English language. Grammar and syntax were MIA. As a piece of research, it was fundamentally circular as well as superficial, as I’ll explain.

It never occurred to the researchers to consider a set of variables that would have tempered their conclusions. Credible options were available, such as reverse socioeconomic status. As the journal Psychology Today notes, and as experience tends to confirm, affluence is, paradoxically, often a negative indicator of achievement. Other likely variables would have been personal aptitude and the affective determinants of a techno-visual culture that selects for triviality and distraction as well as cognitive stupefaction. From the standpoint of genuine education – language proficiency, knowledge of history, civics and current affairs – the “outcomes” were disgraceful. Jesse Watters, host of Watters’ World on the FOX News channel, has abundantly demonstrated these widespread deficiencies in his roving video interviews of university students.

In addition to driving research that is merely of poor quality, the social sciences’ obsession with numbers is deceptive – beginning with self-deception. Kenneth Gergen cogently argues in The Saturated Self that purportedly value-free, so-called scientific descriptions “inevitably harbour and sustain the values of their proponents.” Gergen shows, for example, that Solomon Asch’s famous experiments on social conformity are compromised by a set of hidden modernist presuppositions. The behaviour Asch and his colleagues equated with conformity might with equal appropriateness have been described as “socially sensitive,” “socially integrated,” or “harmony seeking”.

Moreover, anthropologists have learned to their cost and embarrassment in the wake of the Margaret Mead fiasco in Samoa that subjects may often be disingenuous in their answers and reports. Mead was famously taken for a ride in her rendition, based on interviews and “field work” in the late 1920s, of Samoa as a trauma-free and innocent sexual paradise. It took until the 1960s for Mead’s incompetence to be discovered and another 20 years for her methods and conclusions to be debunked, but not before a great deal of mischief occurred along the way.

Deception, self-deception or both? Margaret Mead got played by her subjects into believing Samoa was a worry-free sexual paradise.

Paul Stoller remarks in an extended reflection on the issue in In Sorcery’s Shadow, “informants routinely lie to their anthropologists.” One cannot always know what informants have in mind or the extent to which they are unreliable or even self-deluded. The fact that one can cite a majority of respondents confirming one’s thesis and lay out one’s findings in tabular or numerical form is not necessarily a valid technique of analysis.

Neither informants nor researchers can be fully trusted. For one thing, the interviewing process is always skewed. As Janice Radway confides in Reading the Romance, she can only present “my own construction of my informants’ construction of what they were up to.” To complicate the issue further, the most important questions of human experience do not typically yield to statistical methods, experimental verifications, focus group questionnaires or analytical procedures in general. What is called the “psychometric” approach cannot readily encompass the volatile and mercurial dimension of the human spirit. Objective techniques employed by social and behavioural scientists in their investigations of individual, class and cultural patterns of behaviour – what Jürgen Habermas in Theory of Communicative Action calls the “human sciences” – are always problematic.

These caveats apply to both quantitative and qualitative research. As elaborated in the standard teacher-training volume, the SAGE Handbook of Qualitative Research, quantitative analysis entails instruments which attempt to establish conclusions through numerical and statistical means and are regarded as “objective”. This is what Clifford Geertz in “From the Native’s Point of View” calls “experience-distant concepts.” This approach is, or attempts to be, primarily deductive, conveying the aura of intrinsic authority.

Qualitative analysis involves conclusions reached by observation, experience, reading, reflection, dialogue and discussion—what Geertz calls “experience-near concepts.” This process is mainly inductive, which historically was considered the right way to think about the world, but which scientism and other postmodern trends have deemphasized and rendered suspect. The edges between the two methodologies are often blurred, mixed or conflated since the former must involve the intervention of prior conceptions. Social scientists come with their own value-laden baggage.

World-famous cultural anthropologist Clifford Geertz favours the “experience-near” process of individual study and inductive thinking.

In other words, quantitative analysts use qualitative approaches which are subsequently factored out of their published reports. Add to this the undoubted fact that statistics are readily manipulated, that data categories may seem veridical when they are merely reconfigured givens, that unstated assumptions always underlie the process, and that the same set of data can produce conflicting results, and we have good reason to maintain an attitude of robust agnosticism. Quantitative conclusions are frequently subject to “selection effects”, distortions introduced by the apparatus or modus operandi in question or by the manner of collecting and reading the data.
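The claim that the same set of data can produce conflicting results is not hyperbole; statisticians know one of its sharpest forms as Simpson’s paradox. A minimal sketch, using the often-cited textbook figures from the kidney-stone treatment comparison (these numbers illustrate the general phenomenon, not any study Solway discusses):

```python
# Simpson's paradox: treatment A beats B within every subgroup,
# yet loses to B once the identical data are pooled.
# (successes, trials) per treatment, split by case severity.
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

for group, arms in data.items():
    print(f"{group}: A={rate(*arms['A']):.0%}  B={rate(*arms['B']):.0%}")

# Pool the very same counts and the ranking reverses.
pooled_a = (81 + 192, 87 + 263)   # totals for A
pooled_b = (234 + 55, 270 + 80)   # totals for B
print(f"pooled: A={rate(*pooled_a):.0%}  B={rate(*pooled_b):.0%}")
```

Which summary is “the” result depends entirely on an unstated choice about how to slice the data – exactly the kind of hidden assumption the essay warns is seldom disambiguated.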

Keeping all such constraints in mind, I would suggest that judgment-based qualitative research constitutes at least as powerful an “instrument” for social science research as the quantitative chauvinism currently in vogue. Pitirim Sorokin in Social and Cultural Dynamics calls this “quantophrenia”. We should not be deceived by the typical third-person locutions we come across in ethnographic papers, education surveys and social science studies, such as “projections indicate”, “studies suggest”, “experiment teaches”, “research shows”, “science reveals”, “surveys find”, etc. A parsimonious sprinkling of such presumably objective phrases is no doubt harmless but the problem is that the assumptions on which all these projections, studies, surveys and instruments rest are seldom disambiguated. The “I” remains hidden beneath the “it.”

Russia-born Pitirim Sorokin argued for judgment-based science over blind adherence to what he termed “quantophrenia”.

Studies treating of those profound, elusive and fugitive questions that circulate in the value dimension resist objective treatment. The intrinsic fallibility or insecurity of all human enterprises – those involving love, filiation, spiritual growth, education, intellectual development, intelligence – is not always experimentally replicable or statistically accessible. Of course, as Bernoulli’s law of large numbers prescribes, statistics and probabilities become meaningful when dealing with a large number of events, not with individual events. “Extensive trials” enable us, said Bernoulli in The Art of Conjecturing, to “finally attain moral certainty.” The trouble is that the individual event can be submerged by statistical calculations such that, if the individual event is qualitatively unique, something of decisive importance, pivotal influence or great interest can be lost.
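Bernoulli’s point, and Solway’s caveat, are both easy to see in simulation: the aggregate stabilizes as trials accumulate, while the aggregate says nothing about any single trial. A hypothetical sketch (the coin-toss setup and function name are illustrative, not from the essay):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def proportion_heads(n_trials):
    """Proportion of heads in n_trials simulated fair-coin tosses."""
    heads = sum(random.random() < 0.5 for _ in range(n_trials))
    return heads / n_trials

# The proportion hugs the true value of 0.5 ever more tightly
# as the number of trials grows -- Bernoulli's "moral certainty"...
for n in (10, 1_000, 100_000):
    print(f"{n:>7} tosses: {proportion_heads(n):.3f}")

# ...yet no amount of aggregation predicts the next individual toss,
# which remains exactly as uncertain as the first.
```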

Are IQ scores, to take one example, objectively reliable, generating predictive value, as many researchers believe, or does the exception invalidate the rule? (The saying “The exception proves the rule” is regularly misunderstood: “proves” here means “tests”.)

My friend Robert Hübner, a world-ranked chess grandmaster, Professor Emeritus of Papyrology at the University of Köln, Germany, and a master of Greek, Latin and 12 modern languages who never read a book in translation, failed his military service tests by scoring an IQ in the low 80s, or border-line deficiency. Quantitatively, Hübner is a moron; qualitatively, a genius.

The obsession with numbers and their elevation over qualitative techniques is more than a quirk that births some silly anomalies; it is fundamentally wrong-headed and wreaking intellectual havoc. German-American political philosopher Eric Voegelin was surely right when he asserted in The New Science of Politics that the mathematical model “became dangerous because [of the] assumption that the methods of the natural sciences were a criterion for theoretical relevance in general.” This led to the positivistic claim “that realms of being which are not accessible to exploration by the model methods were irrelevant.”

IQ: 80. Chess ranking: world-class grand-master. Robert Hübner.

As I argue, a qualitative approach may treat of complex issues more sensitively than a replicable or quantitative instrument. A meditation based on experience, reading and document analysis, long exposure to a given cluster of problems, common sense, lengthy observation of “subjects” and informed discussion with peers may be superior to a pseudo-mathematical or experimental study.

Consider, for example, the following “narrative of experience” given by François Victor Tochon, author of Signs and Symbols in Education: Educational Semiotics. A teacher finds after grade evaluation that the class divides into two neatly disparate groups. The one cluster has grades of 65, 67, 68 and other close results. The other group is clustered around the values 83, 88, 90, etc. Parametric statistics would regard this class as falling within a normal distribution, with a mean of approximately 75. Yet such a result presents a distorted picture of this particular class’s performance. The actual experience would drive different conclusions and suggest a different intervention than would the final result of quantitative analysis.
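Tochon’s two-cluster class can be put into numbers. A small illustrative sketch (the essay gives only the grades 65, 67, 68 and 83, 88, 90; the extra values filling out each cluster are invented for the example):

```python
# Two disparate clusters, as in Tochon's example; values beyond the
# essay's 65/67/68 and 83/88/90 are made up to fill out each group.
low_cluster  = [65, 67, 68, 66, 64]
high_cluster = [83, 88, 90, 85, 87]
grades = low_cluster + high_cluster

mean = sum(grades) / len(grades)
print(f"class mean: {mean:.1f}")

# The summary conceals the gap: no student actually earned a grade
# anywhere near the mean it reports.
closest = min(abs(g - mean) for g in grades)
print(f"closest actual grade sits {closest:.1f} points from the mean")
```

The single number the parametric summary yields describes a student who does not exist, while the two groups who do exist, and who would need quite different teaching interventions, vanish from the report.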

I do not deny that the qualitative approach is compromised by conditions which cannot be neutralized or dismissed. There is no known way of entirely clarifying the opacity inherent in issues of human relationship, including the relation of the self to itself. In the words of Paul Ricoeur in Fallible Man, “total comprehension” can never be reached: “[T]his limit is never attained, because in man’s precomprehension of himself there is a wealth of meaning that reflection is unable to equal.” Prejudice, in the literal (rather than ideological) sense of pre-judgment, remains constant. It evades even the most disciplined acts of introspection and self-inquiry, since it is structurally constituted as an intrinsic condition of all acts of consciousness.

Albert Einstein’s theory of relativity required a deep level of thought and theory; it wasn’t about compiling statistics.

The issue we are considering is part of a much larger predicament. The gradients and imperatives that dominate cultural life across the board have changed drastically over the last 15 or 20 years. We see this happening everywhere, in every field, every discipline, every walk of life. We are now living in what political scientist Benjamin Barber calls McWorld. It is a world in which people are encouraged to organize rather than to participate and conserve, and to globalize their passions and concerns (spreading them ever-more thinly) rather than to localize them in the various, overlapping environments in which we live. The actual person – the locus of subjectivity and the centre of judgment – has become dispensable in the interests of “maximizing yields” and ensuring “social productivity.”  

Social science research is thus only a symptom of a far vaster cultural malaise or deformity, which Chogyam Trungpa aptly dubbed “spiritual materialism”. The real dilemma is that we are losing a living concept of the self. It is no longer understood as an autonomous centre of integrity, knowledge and memory but as a mere preference-machine or a field of forces that can be studied and quantified with a view to its manipulation.

Born in Tibet, the venerable Chogyam Trungpa lamented the poverty of “spiritual materialism”.

Our subjectivity is no longer grounded in history, the ethics of stewardship, or the labour of self-perfection (not self-improvement, which is a purely surface concept that does not question the value of the returns it envisages). Rather, it tends to float freely in a vast circulation of images, secondary associations and illusions of achievement. Like the current, flawed conception of knowledge, success is defined as the acquisition and enumeration of anything that can be tagged, measured or displayed.

Almost every human activity now attracts some technical method, therapy or system to provide for cognitive mediation in precisely those areas where good sense, natural intelligence and inner strength would be expected to suffice. We now even require counselling techniques and legions of experts to help us grieve and to achieve “closure,” that most absurd of jargonized concepts! We are constantly exhorted to “move on,” as the cliché has it, with the help of specialized treatments and diagnostic routines, toward new states of putative equanimity. Longitudinal studies then confirm the results that are already prefigured or desired.

Taking flight: the Wright Brothers used their imagination and judgment to design, build and fly the world’s first powered aircraft.

Altogether, social scientists, ethnographers, educationalists and their professional kin are busy manipulating their techniques in order to reach quantitative conclusions rather than interrogating their own prejudices. Quantitative methods are often without effect in exorcizing the demon of subjectivity. Results are almost always subject to the hidden lusts of interpretation. In adopting the methods of the natural and mathematical sciences, as Voegelin points out, the social sciences have pulled a fast one. Human truth cannot be definitively measured or reliably described by numbers, objective indices, audit samplings or, in short, purely quantitative surveys.

Eureka! Archimedes discovered a key scientific principle in the bathtub.

In the “human sciences,” for all their precariousness, nothing beats quality, the fruit of personal integrity and critical reflection. This requires a mind capable of introspection and moral rectitude, a rara avis if ever there was one. But nothing else will do. The most dependable research aids in the area we are considering, however scarce and insoluble, remain personal honesty, skepticism and intelligence.

Of course, the social scientist cannot abandon his or her staple practice, but should be wary of putting their entire trust in numerical taxonomies. Charts, tables and columns of numbers cannot supplant the still centre of judgment, the tendency, so to speak, to stay put rather than “move on” in the feverish quest for results. As Pico Iyer mischievously advises in The Art of Stillness: “Don’t just do something. Sit there!”

David Solway is a Vancouver-based poet, songwriter and essayist.
