Future of Science

Tinkering with Time: The Campaign to Conjure Up an “Anthropocene” Epoch

John Weissenberger
February 16, 2024
Time was when nearly everyone – even schoolkids – understood and accepted that geologic time was measured in tens if not hundreds of millions of years, a barely-fathomable vastness that animated our awe over the Age of Dinosaurs or the mysterious arrival of intelligent, upright apes. So how to explain the small but determined scientific movement intent on winning acceptance that geology can now be measured in a comparative blink of an eye, and that humanity has entered a new geological epoch defined by…itself? Applying his professional geologist’s scientific rigour and his amateur cultural historian’s perspective, John Weissenberger urges scientists to maintain a measure of humility, to recall the bitter lessons of past pseudo-scientific fiascos, and to be wary of the pitfalls of activist science pursuing political ends.

Let’s say you just escaped from a Mississippi chain gang and, after days of hardship, reunited with your seven daughters. All is well until you find out your estranged wife told the girls you were dead, hit by a train, and all that was left of you was a “grease spot” on the tracks. This scenario played out for main character Ulysses Everett McGill in the Coen Brothers’ movie O Brother, Where Art Thou?, speaking to our broader fears about life, death and the meaninglessness of it all.

According to one recent survey by major life insurance provider Ethos, about 25 percent of Americans think about death every day and almost half (45 percent) think about it every month. It seems that questions of mortality and the ultimate significance (or insignificance) of our lives swirl around us. These echo in pop culture with vintage songs like “Dust in the Wind” or the hippie invocation that we’re “stardust”. The common theme would be dust, as in “ashes to ashes”. Tradition holds it was King Solomon (in Ecclesiastes) who composed the more poetic assessment of our lives: “All is vanity…A generation goes, and a generation comes, but the earth remains forever.”

It’s unknown if any of this occurred to a group of geologists plumbing the inky depths of Crawford Lake, near Milton, Ontario, last summer, sampling its sediments to establish stronger criteria for the commencement of a proposed new geological epoch, the “Anthropocene”. The apparent findings made Anthropocene advocates positively giddy with excitement. “It’s not just about climate change,” gushed Jürgen Renn, of the Max Planck Institute for the History of Science in Berlin, at a later announcement. “It’s not just biodiversity loss. It’s not just the sediments that humans are moving. It’s all of this together.”

New science: Sampling sediments from Crawford Lake in Milton, Ontario, seeking to strengthen criteria for a proposed new geological epoch, the “Anthropocene” – a time when human impact on the environment became significant enough to be reflected in the geological record. (Source of photo: Rachel Levy-McLaughlin/CBC)

First raised by the late Dutch chemist Paul Crutzen and American biologist Eugene Stoermer over 20 years ago, the Anthropocene is meant to denote a period when human impact on the atmosphere and environment became so significant that it would stand out in the geological record. Hence the “anthro” part, with “cene” from the Greek for “new” indicating that such an epoch is part of the Cenozoic Era commencing 66 million years ago. Yes, that’s “new” to geologists. But when – and even whether – human impact became significant enough to merit such a designation remains fiercely debated.

The “Anthropocene Working Group” is an international team of geologists to which the Canadian lake dredgers belong. They believe the newly proposed epoch began with the radioactive traces left by the atomic and hydrogen bombs, i.e., around 1950. Crutzen, a 1995 Nobel Prize winner for his work on the “ozone hole”, believed the Anthropocene should start earlier, with the beginning of the Industrial Revolution in the late 1700s.

Dutch chemist Paul Crutzen (left) and American biologist Eugene Stoermer (right) introduced the “Anthropocene” concept over 20 years ago; some scientists see it beginning around 1950, with radioactive traces left by nuclear bombs, while others put its origins in the late 1700s with the start of the Industrial Revolution. Shown at bottom, the aftermath of the atomic bombing of Hiroshima, August 6, 1945. (Source of top left photo: Teemu Rajala, licensed under CC BY 3.0)

As a geoscientist, I do find it a bit odd that counterparts from other fields like chemistry would lobby for the definition of new geological time units. Even if this is accepted in the spirit of cross-disciplinary collaboration and generosity, there are innumerable potential areas of study in earth science. So it’s also puzzling why a large group of geologists would focus on this particular question.

Or maybe not so puzzling. The concept of the “Anthropocene” is saturated with society’s current angst about the environment and the belief that we are doing irreparable harm to the planet. If this is true, one might argue, then surely we humans have launched a new geological epoch that will forever stain the earth’s geological record. As with many other parts of these debates, however, this one says as much or more about us and our collective psyche than it does about the planet and its natural history.

Our Reptilian Overlords

If you believe the Earth was once ruled by hyper-intelligent lizard people, you’re likely to get more than a few strange looks and comments on social media (as well as the occasional ringing endorsement). Or if you merely argue, as journalist Graham Hancock has, that an advanced civilization predated the oldest societies known from accepted archaeology, you’re due for a bruising.

The scorn heaped on Hancock by the academic establishment is astounding, even by current standards. His Wikipedia page (obviously written or heavily edited by his critics) features numerous choice phrases suggesting his work contains “confirmation bias supporting preconceived conclusions by ignoring context, cherry picking, or misinterpreting evidence, and withholding countervailing data,” and that it “reinforce[s] white supremacist ideas, stripping Indigenous people of their rich heritage and instead giving credit to aliens or white people.” Hancock is scientific kryptonite.

Interestingly, scientists have investigated how one might go about proving the existence of an advanced civilization in the deep geological past (i.e., one too old to show up in conventional archaeological digs at or just below the surface). Known as the “Silurian hypothesis”, the idea was proposed by mathematician Gavin Schmidt and physicist Adam Frank and inspired by a 1970 episode of the British sci-fi series Dr. Who. Schmidt is now a climate modeller and passionate promoter of consensus climate science.

It was, incidentally, the Anthropocene concept that really got them thinking. Their work attempts to determine what evidence an ancient, forgotten, advanced civilization would leave in the geological record. Their conclusions are sobering. After asserting that the activity of a long-past society might be discernible in the rock record, they go on to describe how vanishingly small the evidence would be. This in turn implies how little of a record our own society will leave.

The “Silurian hypothesis”: Inspired by a 1970 episode of the British sci-fi series Dr. Who (top), mathematician Gavin Schmidt (left) and physicist Adam Frank (right) tried to determine whether an advanced civilization from millions of years ago would leave identifiable traces in the geological record.

The bottom line: chances of palaeontologists in the far-distant future unearthing an intact Cabbage Patch Kid, your old fishing tackle, a relative’s bridgework or some discarded breast implants are virtually nil. The task of hunting for evidence of us would thus fall to geology. Chemically, the rocks would show spikes in CO2 and nitrogen – from industrial activity and fertilizer – some “persistent organic pollutants” such as steroids, leaf waxes, alkenones and lipids, plus radioactive isotopes like Plutonium-244 (with a half-life of over 80 million years).

Yet, even if our high-tech civilization lasts many thousands more years, for reasons of preservation potential discussed below, traces in the rock record will be few. This is partly because, as technology advances, pollution and our overall environmental footprint decrease. If, for example, humanoid dinosaurs had evolved (like the “Dinosauroid” discussed by Canadian palaeontologist Dale Russell in the 1980s), their technological achievements would almost certainly be unknown to us. Faith in our long-lost reptilian overlords would have to remain exactly that. For this reason, future civilizations millions of years from now probably won’t know we existed, either.

Even if dinosaurs had evolved into human-like “dinosauroids” as some palaeontologists theorized, their culture would have left little-to-no trace; accordingly, future civilizations millions of years from now probably won’t know we existed, either. Shown, a model of the hypothetical dinosauroid, Dinosaur Museum, Dorchester. (Source of photo: Jim Linwood, licensed under CC BY 2.0)

A Valid Scientific Concept?

Whether the Anthropocene concept is scientifically valid or more of a political canard can be assessed a couple of ways. One is how the idea stacks up against recognized geological time units: does it meet the criteria that define them? The other is whether human activity is significant enough to leave a recognizable geological record.

Understanding how the “Anthropocene” compares to established periods of planet Earth’s natural history requires wrapping your head around the sheer scale of what’s commonly known as “deep time”. Journalist Peter Brannen discussed this in a recent article in The Atlantic, comparing the Earth’s 4.5 billion years to a marathon. If we were to run the metaphorical 26.2 miles backwards, our first five-foot stride would already put us two Ice Ages ago, 150,000 years before recorded history even began. By contrast, fans of Jurassic Park may know that the “Age of the Dinosaurs” – formally, the Mesozoic Era – lasted 180 million years which, Brannen notes, is 36,000 times the span of recorded human history. Accordingly, Brannen considers the very idea of the Anthropocene “a joke”.
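For the curious, the back-of-the-envelope arithmetic behind Brannen’s analogy works out as follows (treating the five-foot stride and a roughly 5,000-year span of recorded history as illustrative assumptions):

\[
\frac{5\ \text{ft}}{26.2\ \text{mi} \times 5{,}280\ \text{ft/mi}} \times 4.5 \times 10^{9}\ \text{yr} \approx 160{,}000\ \text{yr},
\qquad
\frac{180 \times 10^{6}\ \text{yr}}{5{,}000\ \text{yr}} = 36{,}000.
\]

One backward stride covers on the order of 160,000 years, and the entire span of recorded history fits 36,000 times into the Age of Dinosaurs.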

Geologic time periods are typically defined by changes in fossil assemblages – extinctions and/or significant new faunal appearances – which can coincide with changes in the composition of sediments. And while it’s true there are no set time spans for each unit type (eons, eras, periods, epochs, ages), and that the durations tend to get shorter as you go down the hierarchy, recognizing an “Anthropocene” would be a dramatic departure from accepted practice. The Eocene Epoch, for example, spanned 22 million years. Even the more recent Pleistocene – the time of ice ages and the appearance of hominids and humans – lasted 2.6 million years. The most recently recognized epoch, the Holocene, began 12,000 years ago at the end of the last Ice Age – just after your kid’s movie Ice Age: The Meltdown.

A blip if not an outright joke: Geologic epochs typically span millions to tens of millions of years, making even the current Holocene Epoch questionable, and the proposed Anthropocene – only 80-250 years long – even more so. (Source of table: CBC)

The duration of these units, then, of many thousands to millions of years, is one glaring reason why the Anthropocene concept – an “epoch” 80-250 years long – is questionable and will remain so even if humans are around for thousands more years. The so far 12,000-year-long Holocene is itself a stretch, being roughly 1,000 times shorter than a typical epoch. The duration of the proposed Anthropocene “Epoch” just isn’t sufficient or significant.

As for the second assessment method, preservation of anything in the rock record is a crapshoot due to factors such as past sedimentation rates (which control the likelihood of burying anything of future interest), erosion, plate tectonics and the like. Brannen provides a couple of illustrative examples. First is the billion-year time gap in the rock record of the Grand Canyon. Second, all of the early Cretaceous rocks in the state of Maryland (representing 45 million years) have yielded just a few dinosaur bones. Similarly, there is no ocean floor older than Jurassic (about 200 million years ago) because it was all chewed up by plate tectonics. For things to be preserved they need to be either really long-lasting or earth-shatteringly significant. It takes a lot of rock to notice much of anything.

How much time is represented by how much rock is also variable. The Devonian strata I have studied (350-plus million years old) are only preserved because they started in the ocean and were pushed onto land, again by plate tectonics, and then weren’t all eroded by the time of humans. If you drive west from Calgary to Banff, the thinnest strata visible from the Trans-Canada Highway – maybe 5-10 metres thick – represent up to 100,000 years. Because natural processes can deposit 10 metres of sediment in a fraction of that time – recall the flooding in Alberta a decade ago, when raging creeks and rivers did just that in a matter of hours – we know that most of the time is lost in the gaps between the remaining beds of rock; there’s generally more “gap” than rock.
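A rough rate calculation, using the figures above (call it 10 metres of preserved rock for up to 100,000 years, versus flood deposits of several metres in a matter of hours), shows why the gaps must dominate:

\[
\text{long-term average} \approx \frac{10\ \text{m}}{100{,}000\ \text{yr}} = 0.1\ \text{mm/yr}.
\]

A single flood laying down metres of sediment in hours is many orders of magnitude faster than that average, so the beds that survive can only account for a small fraction of the elapsed time.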

There are some places, however, like the deep ocean or certain lakebeds, that experienced a light, steady rain of sediment lasting thousands or sometimes millions of years which ultimately formed thin, stripey rock. The layers, maybe the thickness of a potato chip, are called varves and often represent one year of deposition. That’s why the Anthropocene-hunting lake plumbers dredged where they did. They were looking for the best possible preservation of a few decades of earth history, hoping to differentiate between, say, the layer deposited around Rocket Richard’s first Stanley Cup in 1944 and the one laid down after the Hiroshima nuclear blast 17 months later.

Geologic “crap shoot”: Layers of rock called “varves”, shown at top in cliffs near the Dead Sea, represent all of human history in about 10 metres of rock, whereas the thinnest strata of Devonian rock (bottom) visible between Calgary and Banff represent 100,000 years; anything preserved in future rock layers would therefore need to be really long-lasting, earth-shatteringly significant or extremely widespread. (Source of top photo: geologictimepics)

Although sediments like varves represent a pretty complete time record, all of humanity’s recorded history would still be accounted for in, at most, about 10 metres of rock – the geological equivalent of a grease spot. This is why even members of the International Commission on Stratigraphy – the people who officially define geological units – think humanity’s record should be called an “event” rather than an epoch. We just aren’t that important.
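As a rough check on that 10-metre figure, assume a varve of one to two millimetres per year and call recorded history 5,000 to 10,000 years:

\[
(1\text{ to }2\ \text{mm/yr}) \times (5{,}000\text{ to }10{,}000\ \text{yr}) \approx 5\text{ to }20\ \text{m}.
\]

Even under near-ideal, varve-style preservation, everything from the first cities to the present squeezes into a band of rock a few storeys high.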

So, rather than our achievements being preserved on the scale and detail of, say, Roman-era Pompeii after the eruption of Mount Vesuvius, it’s most likely that, besides the aforementioned chemical spikes in a thin rock lamination, any signs of our civilization will be crushed and blended among all the other changes since the last Ice Age. That would include the 120-metre sea level rise due to all that melting ice and the extinction of mammoths and other large mammals. In fact, Schmidt and Frank postulate significant preservation of rat skeletons and remnants of other vermin that have thrived alongside mankind, as clues for future scientists, but make no mention of whether things like the First World War ossuary at Verdun or the Cambodian Killing Fields might be geologically preserved.

All of this underscores how transitory is our toiling and how fleeting our legacy. Any rural dweller today knows, as our ancestors did, how hard it is to keep nature at bay, even in inhospitable Canada. If the population of Vancouver were to decamp tomorrow, how long would it take the rain forest to erase any trace of human activity?

Recent discoveries in the Amazon Basin provide a clue. Much to the chagrin of the Graham Hancock haters, convincing traces of a “lost civilization” have been found there. Recent airborne surveying using light detection and ranging (LiDAR) has revealed remains of monumental architecture and towns, evidence of canals and other human-built infrastructure like causeways stretching hundreds of miles, in what’s now dense jungle. What one researcher calls a “fully urbanized Amazonian landscape” is between 600 and 1,500 years old – all overgrown and previously lost to time.

Insatiable nature: remains of past human activity tend to be obscured and erased by natural processes; the Amazonian jungle (left) completely subsumed what was later rediscovered as a “lost civilization”, and recently mapped using aerial light detection and ranging (LiDAR) technology. For these reasons, signs of our own civilization will likely be crushed and blended along with other changes since the last Ice Age. Shown at right, 3D image depicting the urban centre of Cotoca, Bolivia. (Sources: (left photo) Shutterstock; (right image) H. Prümers/DAI)

This illustrates the key fact that any record left by us, all our ancestors and descendants, would show up in the geological record more like an “event horizon” – a geological marker – than an epoch. Think of those artists’ renditions you’ve seen of the giant asteroid that killed the dinosaurs. Various tyrannosaurs and their duck-billed prey gaze up dolefully at the massive fireball streaking across the sky that’s about to wipe them out. That cataclysmic event, which sent debris circling the globe for years, is typically recorded by a rock layer a couple of centimetres thick, like the so-called “magic layer” boundary clay of Wyoming. And it was only discovered 40 years ago.

That global catastrophe, by the way, is geologically defined as a “mere” event. The chemical traces left by humanity’s existence are much punier. On that basis – and, indeed, based on over 100 years of geological practice – it would be barely credible even to be talking about an Anthropocene “event”, much less trying to have it officially recognized as an “epoch”. It’s possible that the established Holocene will be compressed into a sliver or, as geologic time advances, vanish altogether.

The giant asteroid strike that sent debris circling the globe and killed the dinosaurs 66 million years ago is now recorded by a “magic layer” of rock just a couple of centimetres thick (bottom); it is geologically defined as an “event”, not an epoch – and the current chemical traces left by humanity’s existence are much punier. (Source of bottom photo: Canon City Daily Record)

So as interesting as this might all be to some researchers and the geeks who follow them, when measured by established geological criteria the Anthropocene has little to do with science at all. What is it then? It appears to have more to do with politics, social preoccupations and the activism that comes with them. Some earth scientists on the Anthropocene bandwagon are surely well-intentioned, but others are likely to be chasing the research dollars that come from supporting “societally relevant” science and have thus been drawn into this politicized enterprise.

White-Coated Heroes or Group-Thinking Crackpots?

Time was when you could trust just about anyone in a white coat or smock – your doctor, pharmacist, beautician, even butcher. And, of course, those hard-working scientists sweating over boiling test tubes. For the millions of people emerging from the “independence” of rural life – unpredictable weather and harvests, backbreaking work, unsanitary conditions and lack of medicine fuelling high infant and maternal mortality – industrialization, technology and science seemed miraculous. This view transcended politics; inventors like Thomas Edison (the “Wizard of Menlo Park”) and manufacturers like Henry Ford were heroes to millions.

As we got richer and had more leisure time to puzzle over it, we started getting second thoughts. Books like Rachel Carson’s Silent Spring (1962) and Paul Ehrlich’s The Population Bomb (1968) bemoaned the impacts of technology and industry, as well as asserting that growth itself would bring about social and environmental collapse.

Meanwhile, even as science was taking man into space, no less than President Dwight D. Eisenhower raised concerns about scientific research. His farewell address of January 17, 1961 – famous for warning about the “military-industrial complex” – included a much longer critique of government-sponsored science.

The President lamented that “the solitary inventor, tinkering in his shop [was being] over-shadowed by task forces of scientists in laboratories” and that “the free university, historically the fountainhead of free ideas and scientific discovery,” was changing, and not for the better. Eisenhower believed the “huge costs” of research-and-development were causing the search for government contracts to overpower intellectual curiosity and that the “domination of the nation’s scholars by Federal employment, project allocations, and the power of money” was a grave danger. Lastly, he pointed to the “equal and opposite” threat that “public policy could itself become the captive of a scientific-technological elite.”

Scientists on the Anthropocene bandwagon may be chasing potential research dollars that come from supporting “societally relevant” science; in his 1961 farewell address, U.S. President Dwight D. Eisenhower warned of the risk that government-sponsored science would overpower “free ideas and scientific discovery”. (Source of photo: National Archives and Records Administration, 35810768)

There’s growing evidence Eisenhower was right. Given that science is obviously a human pursuit, it is subject to all of humanity’s foibles, beyond just the temptations Eisenhower described. Science is no stranger to petty rivalries, egotism, careerism, group-think or faddism. This is where the bygone image of the noble scientist garbed in dazzling white goes to die.

The most contentious of these issues is group-think, because it’s so perilously close to the radioactive debate around the scientific “C-word”, consensus. Conventional wisdom in various scientific domains has existed from the get-go, as researchers accepted a certain working hypothesis or theory that seemed to best explain the facts. Ideally, that would change through the weight of contrary evidence, but you know how people are. Originators and adherents of theory X become wedded to it and fight to the death those seeking to supplant it.

There are many examples of this. Michael Crichton, author of the aforementioned Jurassic Park, cited numerous historical cases in a 1999 speech critiquing the emerging accepted view on climate. One was the longstanding “miasma theory”, which held that disease was transmitted through “bad air”. Early advocates of the germ theory of disease transmission, such as the German-Hungarian physician Ignaz Semmelweis (1818-1865), had their data and interpretations harshly rejected by the “miasmists”. The debate broke Semmelweis, who died in an insane asylum. Unfortunately, Crichton’s contemporary cautionary analysis of science’s past errors and his advocacy for the scientific method were lost under the tirades of those opposing his views on climate.

Crichton also went after the “nuclear winter” scare of the 1980s. Media-savvy scientists Carl Sagan and (again) Paul Ehrlich had postulated catastrophic consequences to even a limited nuclear exchange between the superpowers, imagining massive clouds of soot and dust blotting out the sun and extinguishing most life on the planet. Sagan even asserted that the oilfield fires set during the First Persian Gulf War of 1990-91 would cause atmospheric cooling. Turns out the nuclear winter claims were based on unsound assumptions and the oilfield smoke idea was wrong. Both had arguably more to do with scientists seeking political relevance than with science itself.

Media-savvy scientists Paul Ehrlich (top left) and Carl Sagan (bottom left) warned that even a limited nuclear war between the superpowers would trigger a “nuclear winter” choking off all life; Sagan even claimed the oilfield fires (bottom right) set during the First Persian Gulf War of 1990-91 would cause atmospheric cooling. Both claims were later shown to have had more to do with scientists seeking political relevance than with science itself. (Source of top right image: Shutterstock)

There are sadly more examples of the amazing achievements of 20th century biology and medicine being retarded by accepted theories we now know to be misguided or criminal. There’s the oft-cited example of Lysenkoism, which rejected ideas of plant genetics in favour of Lamarckian ideas holding that acquired traits (like strong muscles) could be passed on to subsequent generations. Trofim Lysenko’s crackpot theory became the Soviet government’s official science, contributing to disastrous crop failures. Supporters of genetics, meanwhile, were deemed purveyors of “bourgeois pseudoscience”, with thousands of researchers and academics fired, imprisoned or executed.

Similarly, the now-discredited ideology of eugenics – that human genetic traits could be improved through state-approved medical interventions – resulted in tens of thousands of forced sterilizations around the world and contributed to the horrors of the Holocaust. Mental illness was also for decades treated through lobotomy, wherein – the squeamish should skip this bit – a sharp metal instrument was inserted through an eye socket to destroy the front part of the brain. These professionally sanctioned procedures were finally retired in the 1980s.

“No one to gloat over.”

Many try to wave off the dangers of scientific group-think or the relevance of past scientific error, but I’ve seen the effects in my own field. Best-known is the condemnation of Alfred Wegener’s theory of continental drift, which challenged the prevailing belief that the Earth’s crust was relatively static. This had led geologists to “explain” mountain-building through convoluted concepts like the geosyncline, wherein weak parts of the crust would collect sediment and sink while adjacent areas rose isostatically, like ice cubes in a glass. The theory of plate tectonics with its moving continents was not accepted until sea-floor spreading was confirmed in the 1960s, more than 30 years after Wegener’s death. This despite the fact that, as many have observed, even a schoolchild can see that the coastlines of Africa and South America “fit together”.

My personal favourite geological misdiagnosis concerns the massive, post-Ice Age flooding in the Pacific Northwest. The unique erosional topography of Washington State – dry channels and cataracts, coulees, huge boulders randomly dotting the horizon – was described in the 1920s by J. Harlen Bretz as “channeled scablands”. Bretz postulated that only massive floods from rapidly emptying post-glacial lakes could so radically sculpt the landscape.

His theory was shot down by the geological establishment, who held to the “gradualist” school of earth history. Bretz’s model was simply too “catastrophic” for their sensibilities. Bretz was ultimately vindicated after 40 years of acrimony and in 1979, at the age of 96, he received the Geological Society of America’s highest award. Channelling physicist Max Planck’s dictum about the stubbornness of scientists – that science progresses “one funeral at a time” – Bretz told his son, “All my enemies are dead. I have no one to gloat over.”

Ignoring the “consensus”: American geologist J. Harlen Bretz postulated that only massive flooding from post-Ice Age glacial lakes could have sculpted unusual topography in Washington State; colleagues derided his theory for 40 years, with Bretz living just long enough to experience his own vindication. (Sources of photos: (left) Courtesy Cataclysms on the Columbia by John Eliot Allen, retrieved from HistoryLink.Org; (right) Parkfreeon6th, licensed under CC BY-SA 4.0)

The same thing goes on in my own sub-discipline of sedimentary geology, known as “sequence stratigraphy”. Building on the regional stratigraphic concepts of American geologist L.L. Sloss, Exxon geologists in the 1970s proposed a new paradigm: interpreting strata according to the effects of changing relative sea level during sediment deposition. The rock record could then be divided by major events, such as great inundations of continents with rising sea level (depositing marine sediments on what used to be land) or, conversely, sea level drops like that during the last Ice Age. These revolutionary ideas were not given a rapturous reception.

Many, particularly academics, rejected the concepts of sequence stratigraphy outright or cited lack of evidence. One couldn’t help suspecting they were irked because it was industry that had made the breakthrough. A pointed example of this was a paper by one Canadian professor and his psychologist wife. They argued that acceptance of Exxon’s model was more related to “philosophical and sociological assumptions about the nature of human activity” than it was about science. Well, at least they didn’t come right out and say we all had a screw loose. The theory was, however, a genuine breakthrough, confirmed by a growing body of research since the 1980s, and is now widely accepted.

Scientific disciplines have recurringly been beset by group-think and gatekeeping – with researchers being blackballed and/or prevented from publishing, or simply not speaking up in the face of political and social pressures.

Group-think often translates into gatekeeping – researchers being blackballed and/or prevented from publishing – something I’ve also observed personally. One example was a senior American academic geologist who was considered too obtuse and, after having his scientific articles serially rejected by colleagues, decided to start his own journal – which proved credible in its own right. On the flipside, Canada’s largest petroleum geology newsletter published dozens of its editor’s own articles while rejecting a colleague’s single submission, despite the latter’s decades of experience.

Personal pettiness, grudges and even socio-political controversies also intrude into our quiet rock-bound club. One of my former professors very publicly quit the petroleum geology society because it chose to invite a Creationist to speak. Thirty years later, universities (and the federal government) appear to be embracing alternative approaches to science based on cultural and religious practices.

It’s difficult not to have misgivings about scientific rigour and credibility when there is so much evidence of human flaws and malice in your own backyard. It raises the automatic fear that, if things are this bad in a discipline I know, how can it be much different in other fields? Human frailty aside, the examples above also show that political and social pressures on science are nothing new. Separating that chaff from free inquiry and discovery – what science is supposed to be about – becomes that much more difficult. Consequently, when the public sees solemn reports about a new geological epoch – the “Anthropocene” – they might be well-advised to take it with a grain of salt.

The Crises of Science

It turns out that Paul Crutzen, coiner of the term “Anthropocene”, was also among those stoking fears over both nuclear winter and the climatic effects of the Kuwaiti oil fires. More recently he began promoting geo-engineering Earth’s climate to counter atmospheric warming. The latter would involve “seeding” voluminous quantities of sulphur in the atmosphere to reflect sunlight and (ostensibly) cool the planet. The UK newspaper The Independent described Crutzen’s rationale: “That political attempts to limit man-made greenhouse gases are so pitiful that a radical contingency plan is needed.”

Needless to say, despite the advocacy of scientists like Crutzen, the risks and potentially catastrophic harms of atmospheric “modification” or other climate mitigation ideas (like dumping iron into the oceans) remain unknown. It should give pause that the science around ozone itself, the subject of Crutzen’s Nobel prize-winning work, turns out to be more complex than originally thought. Go figure.

Of great concern is how Crutzen and his colleagues, if not popularizers or catastrophists like Sagan and Ehrlich, veer uncomfortably close to activism. Popularizing science is a double-edged sword. On the one hand it informs the public and raises the profile of important topics. But in my experience, it also cheapens the value of foundational scientific work – lunch-bucket science – like data collection and repeated experimentation. In order to get funding, researchers are driven to sensationalize their work and to pick topics more likely to generate a sensation.

Activist science is even worse. It drives researchers to align their investigations with government priorities rather than go where their instinct and experience lead them – precisely what President Eisenhower warned of 60 years ago. Add to this the growing evidence that much experimental work is flawed and cannot be independently verified by repeating or replicating the experiments (the “reproducibility crisis”), and you have what amounts to a full-blown crisis in science.

Scientists who align their work with political priorities and public anxieties veer uncomfortably close to activism; the whole Anthropocene concept smacks of hubris as well, grossly overstating humanity’s potential impact on planet Earth’s future geological record. (Source of left photo: Png Studio Photography/Shutterstock)

The humanities and social sciences have come under sharp criticism for teaching ideologies rather than subjects, with scrutiny falling especially on departments whose names end in “studies”. Unfortunately, the same can now be said of aspects of the hard sciences. Science, properly done, means hypotheses are empirically tested rather than merely asserted and maintained by politics and power. Politicized science is not science.

The pursuit of scientific knowledge is like groping about in the pitch-black night. We know the things we’ve already felt, what we’ve run our fingers across and can describe. Beyond that is a vast realm of truth, just beyond our reach, that could be apprehended through science, if we only allow it. If we can shine a light on it. Holding fast to our current view, shunning those who challenge it, and fixing science to political or narrow social ends can only keep us bound in the darkness. Unless science is reformed, that’s where we’ll remain.

Given all of this, one leans towards the suspicion that the whole Anthropocene concept is another example of politicized science, using earth science to bolster a public policy agenda. Even if this is done with the best of intentions, it can’t help but undermine the scientific enterprise. The “Anthropocene” also smacks of simple hubris, denying the fact that all of humanity’s efforts and aspirations, played out over thousands of years, will geologically end up as a thin layer of clay. In the Coen Brothers’ language, a forgotten grease spot on the train tracks.

John Weissenberger is a Calgary-based geologist with a PhD from the University of Calgary.
