How the media encourages – and sustains – political warfare

Since his inauguration, President Donald Trump has been waging war against the American press by dismissing unfavorable reports as “fake news” and calling the media “the enemy of the American people.”

As a countermeasure, The Washington Post has publicly fact-checked every claim that Trump has labeled as fake. In August, The Boston Globe coordinated editorials from newspapers across the nation to push back against Trump’s attacks on the press. The Associated Press characterized this effort as the declaration of a “war of words” against Trump.

News organizations might frame themselves as the besieged party in this “war.” But what if they’re as much to blame as the president in this back-and-forth? And what if readers are to blame as well?

In an unpublished manuscript titled “The War of Words,” the late rhetorical theorist and cultural critic Kenneth Burke cast the media as agents of political warfare. We found this manuscript in Burke’s papers in 2012 and, working closely with Burke’s family and the University of California Press, prepared it for publication in October 2018.

In “The War of Words,” Burke urges readers to recognize the role they also play in sustaining polarization. He points to how seemingly innocuous features of a news story can actually compromise values readers might hold – whether that’s debating the issues further, finding points of consensus or, ideally, avoiding war.

A book born out of the Cold War

In 1939 – just before Adolf Hitler invaded Poland – Burke wrote an influential essay, “The Rhetoric of Hitler’s ‘Battle,’” in which he outlined how Hitler had weaponized language to foment antipathy, scapegoat Jews and unite Germans against a common enemy.

After World War II ended and America’s leaders turned their attention to the Soviet Union, Burke saw some parallels to Hitler in the way language was being weaponized in the U.S.

He worried that the U.S. might remain on a permanent wartime footing and that a drumbeat of oppositional rhetoric directed at the Soviet Union was making the nation susceptible to slipping into yet another war.

Tormented by this possibility, he published two books, “A Grammar of Motives” and “A Rhetoric of Motives,” in which he sought to inoculate Americans against the sort of political speech that, in his view, could lead to a nuclear holocaust.

Kenneth Burke. Photo: Oscar White

“The War of Words” was originally supposed to be part of “A Rhetoric of Motives.” But at the last minute, Burke decided to set it aside and publish it later. He never did; he died in 1993.

The thesis of “The War of Words” is simple and, in our view, holds up today: Political warfare is ubiquitous, unrelenting and inevitable. News coverage and commentary are frequently biased, whether journalists and readers are aware of it or not. And all media coverage, therefore, demands careful scrutiny.

To Burke, you don’t have to launch social media missives in order to participate in sustaining a polarized political environment.

Instead, the quiet consumption of news reporting is enough to do the trick.

Pick a side

Most people might think that the content of media coverage is the most persuasive component. They assume what gets reported matters more than how it gets reported.

But according to “The War of Words,” this assumption is backwards: An argument’s form is often its most persuasive element.


Burke takes pains to catalog the various forms that news writers use in their work and calls them “rhetorical devices.”

One device, which he calls “headline thinking,” refers to how an article’s headline can establish the tone and frame of the issue being discussed.

Take, for example, an Aug. 21 article The New York Times ran about how Michael Cohen’s guilty plea might affect the 2018 midterms. The headline read: “With Cohen Implicating Trump, a Presidency’s Fate Rests With Congress.”

The next day, the Times ran another article on the same topic with the following headline: “Republicans Urge Embattled Incumbents to Speak Out on Trump.”

Both headlines seek to assail the Republican Party. The first implies that the Republican Party, because it holds a majority in Congress, is responsible for upholding justice – and that if it doesn’t act against Trump, it’s clearly protecting him to preserve its political power.

The second headline might seem less malicious than the first. But think about the underlying assumption: Republicans are only urging “embattled” elected officials to speak out against Trump.

The directive, therefore, isn’t born out of political principle. Rather, it’s being made because the party needs to preserve its majority and protect vulnerable incumbents. The unstated claim in this headline is that the Republican Party exhibits political virtue only when it’s needed to quell threats to its power.

If you side with The New York Times, you may be heartened by its efforts to position the Republican Party as craven in its lust for power. If you side with the Republican Party, you are probably disgusted with the paper for claiming that its representatives lack moral virtue.

Either way, the line is drawn: The New York Times is on one side, and the Republican Congress is on the other.

A rhetorical ‘call to arms’

Another device Burke explores is one that he calls “yielding aggressively,” which involves accepting criticism in order to leverage it to one’s own benefit.

We see this at play in an op-ed published on Fox News on Aug. 22, 2018. The writer, John Fund, concludes that Michael Cohen’s guilty plea will “likely” not lead to an indictment of President Trump.

To support his argument, he cites Bob Bauer, a former White House counsel to President Barack Obama, who has argued that the campaign finance violations aren’t very significant but are instead being used as a political cudgel.

Fund admits that Cohen’s guilty plea will hurt Trump and make things tougher for his supporters, requiring them “to do a lot of heavy lifting when they come to his defense.” Fund’s editorial also concedes minor lapses in Trump’s judgment – particularly in hiring Cohen, Paul Manafort and Omarosa Manigault Newman. It thus yields to popular criticisms of Trump.

But this admission is not a call for accountability; it is a call to arms. Fund ultimately argues that if Trump is indicted, it will not be because he is guilty of violating a serious law. It will be because his opponents seek to vanquish him.

Indictment or not, Fund seems to be saying, Trump supporters should be ready for a ferocious political fight come 2020.

Again, the lines are drawn.

How to survive the ‘war of words’

Burke once wrote about how rhetorical devices like those explored above can sustain division and polarization.

“Imagine a passage built about a set of oppositions (‘we do this, but they on the other hand do that; we stay here, but they go there; we look up, but they look down,’ etc.),” he wrote. “Once you grasp the trend of the form, [you see that] it invites participation regardless of the subject matter … you will find yourself swinging along with the succession of antitheses, even though you may not agree with the proposition that is being presented in this form.”

Burke calls this phenomenon “collaborative expectancy” – “collaborative” because it encourages us to swing along together, and “expectancy” because of the predictability of each side’s argument.

This predictability encourages readers to embrace an argument without considering whether they find it persuasive. They simply sit on one of two opposing sides and nod along.

According to Burke, if you passively consume the news, swinging along with headlines as the midterms unfold, political divisions will likely be further cemented.

However, if you become aware of how the media reports you’re consuming seek to subtly position and influence you, you’ll likely seek out more sources and become more deliberative. You might notice what’s missing from a debate, and what really might be motivating the outlet.

To avoid getting sucked into a dynamic of two opposing, gridlocked forces, it’s important for all readers to make their consciousness a matter of conscience.

Source: The Conversation

Too big to feed: The need to shift our food systems

Pat Mooney is the co-founder and executive director of the ETC Group, and is an expert on agricultural diversity, biotechnology, and global governance with decades of experience in international civil society and several awards to his name.

ETC Group is an international civil society organization headquartered in Canada, with offices in Mexico, the Philippines, Nigeria and the U.S. It has consultative status with ECOSOC, FAO, UNCTAD, UNEP, UNFCCC, IPCC and the UN Biodiversity Convention.

Since 1977, ETC Group has focused on the impact of new technologies on the lives and livelihoods of marginalized peoples around the world. Pat Mooney has almost half a century of experience working in international civil society, first addressing aid and development issues and then focusing on food, agriculture and commodity trade.

He received The Right Livelihood Award (the “Alternative Nobel Prize”) in the Swedish Parliament in 1985 and the Pearson Peace Prize from Canada’s Governor General in 1998. He has also received the American “Giraffe Award” given to people “who stick their necks out.” The author or co-author of several books on the politics of biotechnology and biodiversity, Pat Mooney is widely regarded as an authority on issues of agricultural diversity, global governance, and corporate concentration.

Although much of ETC’s work continues to emphasize plant genetics and agriculture, it expanded in the early 1980s to include biotechnology. In the late 1990s, the work expanded further to encompass a succession of emerging technologies such as nanotechnology, synthetic biology and geoengineering, and new developments ranging from genomics and neurosciences to robotics and 3-D printing. Pat Mooney and ETC Group are known for having discovered and named “Terminator” seeds – genetically modified seeds designed to die at harvest.

The UN climate panel still doesn’t understand technology – and it matters

Source: RethinkX

With the Sixth Assessment Report of the United Nations Intergovernmental Panel on Climate Change (IPCC) being released, it’s important to revisit the climate scenarios that are its centerpiece. These scenarios form the basis of the climate science community’s modeling and projections, which in turn affect governance and investment decisions across the world. Trillions of dollars and the policymaking of the entire planet thus ride upon these climate scenarios, and so the cost of getting things wrong is extremely high.

Scenarios past and present

The previous generation of climate scenarios, published in the Fifth Assessment Report in 2014, comprised the Representative Concentration Pathways, or RCPs. The RCP scenarios were labeled according to the amount of radiative forcing expected by the end of the century in each case. Radiative forcing is the scientific term for the change in the balance between the Earth’s incoming and outgoing energy. The Fifth Assessment Report focused on four of these scenarios, with RCP2.6 having the least warming and thus being the “best case”.

In the eight years since then, a new generation of scenarios has been developed for the Sixth Assessment Report, referred to as Shared Socioeconomic Pathways, or SSPs. The five main SSP scenarios are also labeled according to radiative forcing, but in addition each has a subtitle that tells a story about an imagined future:

  • SSP1-1.9 – Sustainability (Taking the Green Road)
  • SSP1-2.6 – Sustainability (Taking the Green Road)
  • SSP2-4.5 – Middle of the Road
  • SSP3-7.0 – Regional Rivalry (A Rocky Road)
  • SSP5-8.5 – Fossil-Fueled Development (Taking the Highway)

Flaws in climate scenarios

A scenario is only as plausible as the assumptions it makes. Unfortunately, the technology assumptions made in both the RCP and SSP scenarios are not remotely plausible, and as a result they are extremely misleading. If there were even one scenario that made genuinely plausible assumptions, then the others could be useful for comparison. But the lack of any properly plausible one means that, taken together, these scenarios will only cause harm by leading decision-makers and the public badly astray.

First and foremost, all RCP and SSP climate scenarios get technology wrong because they fail to understand the forces that drive technological change, how quickly the shift to new technologies occurs, and how quickly old technologies are abandoned as a result.

Our team at RethinkX has shown that the same pattern of disruption has occurred hundreds of times over the last several thousand years. Again and again, for technologies of all kinds – from cars to carpenter’s nails, from arrowheads to automatic braking systems, from insulin to smartphones – we see that technology adoption follows an s-curve over the course of just 10-20 years. The first phase of the s-curve is characterized by accelerating (or “exponential”) growth, which is driven by reinforcing feedback loops that make the new technology increasingly more competitive while at the same time making the old technology increasingly less competitive.
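
To make that s-curve concrete, here is a minimal sketch in Python, assuming a simple logistic curve rather than any particular RethinkX model; the t_mid and steepness parameters are illustrative choices that complete the disruption in roughly 20 years.

```python
import math

# Minimal sketch of the s-curve described above (an illustration, not
# RethinkX's actual model): adoption grows in proportion both to adoption
# so far (the reinforcing feedback loop) and to the market still unserved.
def adoption_share(t, t_mid=10.0, steepness=0.9):
    """Fraction of the market adopting t years after the disruption begins.

    t_mid     -- assumed year at which adoption crosses 50%
    steepness -- assumed growth constant; higher means a faster disruption
    """
    return 1.0 / (1.0 + math.exp(-steepness * (t - t_mid)))

# Early growth looks exponential, then the curve saturates near 100%:
for year in range(0, 21, 4):
    print(f"year {year:2d}: {adoption_share(year):6.1%}")
```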

Unfortunately, the RCP and SSP climate scenarios show no sign that their authors understand technology disruption at all. For example, the “best case” RCP2.6 scenario in the Fifth Assessment Report published in 2014 assumed that less than 5% of global primary energy would come from solar, wind, and geothermal energy combined in the year 2100.

Source: Adapted from Van Vuuren et al., 2011, and IPCC, 2014.

In reality, the exponential trend in the growth of solar and wind power had already been clear for over two decades at the time the Fifth Assessment was published in 2014, and the trend since then has only continued – as shown in the chart below.

(Note that the vertical axis of the chart is logarithmic, increasing by a factor of 10 at each major interval, which means the trajectory is exponential).

On their current trajectory, which has been extraordinarily consistent for over 30 years, solar and wind power will exceed the RCP2.6 assumption for the year 2100 before 2030, 70 years ahead of schedule on an 86-year forecasting timeframe.
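
A rough way to see why consistent exponential growth overwhelms a static end-of-century assumption is to extrapolate it directly. The numbers below are round illustrative assumptions, not data from the report or from the chart above:

```python
# Rough plausibility check with round, illustrative numbers (assumptions
# of this sketch, not figures from the IPCC report): suppose solar and
# wind supplied about 1.5% of global primary energy in 2014 and their
# share grows about 15% per year, i.e. the straight line on a log axis.
share, year = 0.015, 2014
RCP26_2100_ASSUMPTION = 0.05  # RCP2.6 assumed <5% by the year 2100

while share < RCP26_2100_ASSUMPTION:
    share *= 1.15
    year += 1

print(f"assumed share passes {RCP26_2100_ASSUMPTION:.0%} in {year}")
# With these assumed inputs the year-2100 threshold falls in the early
# 2020s, decades ahead of the scenario - matching the point made above.
```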

This is an egregious error that was entirely avoidable. The energy sector has shown every sign of becoming a textbook example of disruption for more than 15 years, and technology theorists were noticing the signs well before 2014. Indeed, Tony Seba – co-founder of RethinkX – had already published an analysis of the energy disruption in his book Solar Trillions in 2010.

Since 2014, the exponential growth of solar power has become common knowledge, as have similar trajectories for batteries and electric vehicles. It is therefore completely inexcusable that the same mistakes have continued in the new SSP scenarios for the Sixth Assessment Report in 2022. The SSP5-8.5 scenario, for example, is titled “Fossil Fueled Development”. Here is its description:

This world places increasing faith in competitive markets, innovation and participatory societies to produce rapid technological progress and development of human capital as the path to sustainable development. Global markets are increasingly integrated. There are also strong investments in health, education, and institutions to enhance human and social capital. At the same time, the push for economic and social development is coupled with the exploitation of abundant fossil fuel resources and the adoption of resource and energy intensive lifestyles around the world.

This logic around “rapid technological progress” is not just wrong, it’s backwards. The faster we make technological progress, the less fossil fuels we will use. The more global markets are integrated and the more human and social capital we have, the faster we will decarbonize.

The SSP3-7.0 scenario contains the same error:

Technology development is high in the high-tech economy and sectors. The globally connected energy sector diversifies, with investments in both carbon-intensive fuels like coal and unconventional oil, but also low-carbon energy sources.

Again, the basic premise here is false. Technological progress will result in less fossil fuel development, not more. The collapse of coal demand is already well underway in the wealthy countries of the Global North, and all fossil fuels in all countries will follow suit as clean technologies rapidly disrupt the energy and transportation sectors over the next two decades.

The SSP2-4.5 scenario assumes that, “The world follows a path in which social, economic, and technological trends do not shift markedly from historical patterns.” But the authors of this scenario do not understand what those historical patterns of technological change actually are.

As our research at RethinkX has shown, the pattern throughout history has been an s-curve of rapid technology adoption over the course of just 20 years or less once new technologies become economically competitive with older ones – as is now the case for clean energy, transportation, and food technologies. The data throughout history simply do not support the assumption that the shift to new, clean technologies will be slow and linear between now and the year 2100.

The SSP1-1.9 scenario, “sustainability”, is allegedly the most sustainable, but this too is based on false assumptions – namely that lower material, resource, and energy intensity are necessary for reducing environmental impacts, and that they are compatible with increasing human prosperity. Neither is true. The solution to environmental impacts is not less energy, transportation, and food. That would be like thinking that if your house is on fire, the solution is to extinguish some of the flames. That’s madness. The solution is to put the fire out, which means switching rapidly and completely to clean energy, transportation, and food.

If we want to be truly sustainable, we must have a superabundance of clean energy, clean transportation, and clean (i.e. non-animal-derived) food that slashes our environmental footprint and gives us the means to restore and protect ecological integrity worldwide. Any attempt to mitigate our ecological footprint by reducing economic prosperity would be disastrous because the scale of cutbacks needed to have any significant effect on sustainability would be utterly catastrophic to the global economy and geopolitical stability.

Projections to 2100… seriously?

It is worth stepping back a moment and recognizing that the RCP and SSP scenarios make quantitative projections to the year 2100. This in itself is flatly preposterous.

Five thousand years ago, you could have made a reasonably accurate prediction about what life would be like 80 years in the future. After all, not much changed from one generation to the next. Your children’s lives were likely to be very similar to your parents’ lives.

Five hundred years ago, in the year 1522, it would have been considerably more difficult to make an accurate prediction about life 80 years hence. The invention of the moveable-type printing press by Johannes Gutenberg 80 years earlier in around 1440 had helped turbocharge the Renaissance, setting the stage for the Scientific Revolution. Life in 1602 was still quite similar to life in 1522, but an explosion in the growth of useful knowledge was laying the groundwork for massive social, economic, political, and technological transformations to come.

A century ago, in 1922, it would have been very hard for anyone to predict with any accuracy what the world 80 years in the future, in 2002, would be like. Nobody could have imagined the role that nuclear weapons or computers or the Internet would play in our lives, for example.

Today, it is absolutely impossible to predict in any detail what the world will be like 80 years from now, around the year 2100. The rate of technological change is so fast now that our team at RethinkX never makes any quantitative forecasts more than 20 years into the future, because to do so is undisciplined in the formal sense. And technological progress is only accelerating.

Although we cannot know what the world will be like in 2100, we can say that it is implausible to presume the conditions and constraints of today will continue to hold. And this is why we can say that all of the RCP and SSP climate scenarios are implausible: they all presume life in 2100 will be more or less the same as today – still governed by material scarcity, regional resource conflicts, food insecurity, demographic transitions, health and education challenges, and even fossil fuel use. None of these makes even the slightest sense in the context of technologies that we fully expect to see from mid-century onward.

So, what happened? Why did the RCP and SSP climate scenarios get technology so wrong?

Anti-technology sentiments in conventional environmental orthodoxy

At least part of the explanation for fundamental errors and misunderstandings around technology we see in the RCP and SSP climate scenarios is that they were developed by a small group of academic authors operating inside an ideological bubble.

One of the features of this ideological orthodoxy is that it holds long-standing anti-technology sentiments dating back over two centuries to the rise of Romanticism and Transcendentalism. On the one hand, the orthodoxy holds that the arc of history ought to be viewed largely through the lens of human behavior and institutions, minimizing or outright rejecting the causal power of technology to shape societies. There even exists a pejorative term, technological determinism, that is used to label and reflexively dismiss any claims that technology has played a key role in steering the course of human affairs across the ages. Yet, at the same time, this orthodoxy holds technology largely to blame for the massive ecological footprint humanity has imposed upon the planet.

It can’t cut both ways. Either technology has enormous causal power, or it doesn’t.

If it does, then that means technology is also the key to transforming our world in positive ways – including achieving genuine sustainability. We don’t see this accurately reflected anywhere in the RCP or SSP climate scenarios because it runs contrary to the anti-technology sentiments of the prevailing orthodoxy.

When you don’t know enough to know you’re being fooled

The climate science community failed to realize the importance of consulting technology experts in the development of climate scenarios. Instead, they made the mistake of relying on conventional forecasts for technologies like solar and wind power from incumbent energy interests such as the International Energy Agency and the U.S. Energy Information Administration. This would be a bit like relying on Blockbuster Video to accurately forecast the future of streaming video, or Kodak to forecast the future of digital cameras, or the American Horse & Buggy Association to forecast the future of automobiles.

The charts below show the laughably poor forecasting track record of the IEA and U.S. EIA.

[Charts: IEA and U.S. EIA solar and wind forecasting track records]
Note that the unreliability of these two ‘authoritative’ sources was already clear when the Fifth Assessment Report was published in 2014. Would you depend on advice in a critical situation from someone who had gotten things wrong over and over again?

More cynically, it’s very difficult to see how the IEA or U.S. EIA making the same “errors” year after year for almost two decades could be an honest mistake. At the same time, it’s very easy to imagine that there are powerful incentives for these incumbents to ignore technological change, or even to deliberately troll others about it.

Regardless, trusting the wrong sources and failing to consult actual technology experts was an inexcusable mistake that the climate science community is unfortunately continuing to make.

Predicting the future is hard

The future is obviously uncertain, and the further ahead we look, the blurrier the picture becomes. At first, it might seem reasonable to err on the side of conservatism – after all, if you don’t know exactly how the world will change in the future, isn’t it best just to assume it won’t change much from the present? The answer is no, but the reason this logic is flawed is rather subtle.

There are dozens of major dimensions and countless minor ones along which change can occur, all of which move us away from our present condition. The fact that these changes are unpredictable does not imply that the noise will somehow cancel out and leave us close to where we started.

By analogy, imagine assembling a complex machine like a car. If you don’t follow the exact steps in the exact order with the exact parts, you aren’t going to end up with a working car. And if you randomize the assembly process, you’re going to end up with a useless pile of junk. This is why tornadoes don’t spontaneously assemble new cars when they pass through a junkyard. The reason why has to do with entropy: there are almost infinitely more ways to incorrectly assemble things than to correctly assemble them.
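
The arithmetic behind that entropy point is easy to check with a toy example; the 20-part machine below is an assumption for illustration:

```python
import math

# Toy version of the assembly argument: with an assumed 20-part machine,
# count every possible assembly order. Essentially all of them are wrong.
parts = 20
orderings = math.factorial(parts)
print(f"{orderings:.2e} possible orderings of {parts} parts")
# -> 2.43e+18 orderings, of which roughly one yields a working machine.
```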

This analogy helps show why any movement through a large possibility space is only likely to take you away from your current position. This is why the future will be very different from the present, even though those differences are unpredictable.

So, how should we deal with all the uncertainty of the future? The correct response is indeed to construct multiple scenarios that chart the general trajectory and broad outlines of possible futures based on plausible assumptions about what might change between now and then. The trouble with the RCP and SSP climate scenarios, however, is that none of them make plausible assumptions about technological progress.

Refusing to admit past mistakes only feeds conspiracy theories

The climate science community has made very serious technology forecasting errors in its climate scenarios, but has so far refused to acknowledge and take responsibility for them. This is a losing strategy.

Failure to admit and correct the technology forecasting errors in climate scenarios plays right into the hands of conspiracy theorists, because the longer we refuse to admit we’ve made mistakes, the more it looks like they were deliberate. These mistakes are too large to brush under the rug, and so there is no painless option here. We either admit we were fools, or we look like we are liars.

Admitting our mistakes and taking the heat for it is the right move. The alternative only indulges the worst extremist narratives that claim the scientific community has deliberately inflated the threat of climate change and misrepresented our options for solving it in order to advance an agenda of more taxation and more government control over private industry and individual consumer choices.

The public needs to be able to trust the environmental science community, and they can’t do that until we come clean about how wrong we’ve gotten renewable energy and other technologies in our climate scenarios. The longer we pretend nothing happened, the more our legitimacy will erode in the public sphere at a time when trust of scientific authority is already low in the wake of the COVID-19 pandemic.

Getting technology wrong in climate scenarios does real harm

Given the enormous stakes involving trillions of dollars and all of the world’s policymaking, the errors around technology in the RCP and SSP climate scenarios have had serious consequences. They have misled policymakers and the public alike into believing that the only means to solve climate change are punitive – that we must atone for our past environmental sins by sacrificing human prosperity, tightening our belts, and giving up our indulgent personal lifestyles. They have demonized the prosperity of the rich nations of the Global North as unsustainable, and condemned the aspirations of poorer countries of the Global South as unattainable. They have led nations to waste time and resources trying fruitlessly to achieve sustainability through austerity, when this approach is hopelessly counterproductive as I have previously explained.

Austerity cannot solve climate change even in principle, let alone in practice. Prosperity has always been a necessary precondition for solving big problems, both personal and collective, and so it is the only real path to sustainability as well. Technological progress in general will inevitably play an outsized role in bringing the prosperity we need to tackle major challenges to billions worldwide, and specific technologies like solar power and electric vehicles will give us the tools we need to directly reduce emissions and draw down carbon. The IPCC climate scenarios must reflect these facts so that we can all make well-informed decisions about how best to solve climate change together.

Source: RethinkX

The Disruption of Slavery Unveils Fastest Path to End Today’s Wars

Source: RethinkX

Fence in the colours of the national flag of Ukraine. Photo by Tina Hartung on Unsplash

We are now at a crossroads in history, and no path forward looks pleasant. The war in Ukraine is killing innocent civilians, disrupting lives, and shaking the markets in energy, food and other commodities, making us wonder how we let ourselves become so complacent in trading with Russia, whose government has shown so little respect for the rights of its neighbors and its own citizens.

The obvious path seems to be to boost oil, gas, coal, food and metals production from friendly countries. Cut ourselves off from Russian oil, Russian gas, Russian grains, metals and other commodities as much as possible by getting them from elsewhere, and fast.

Unfortunately, this is no easy task. Russia is one of the world’s largest exporters of oil and gas, as well as a bewildering array of other materials: wheat (of which they are the world’s largest exporter), ammonia fertilizers (made from natural gas), iron and nickel (used in making steel), gold and titanium, platinum and palladium (used by the oil industry), neon (for lasers used by the electronics industry), cobalt and rhenium. Do you have any spare rhenium lying around?

The harder answer is to reduce our dependence on oil, gas, coal, wheat, precious metals, rare elements and other such things as much as possible: getting energy from solar power, wind power and batteries (SWB); producing agricultural products through precision fermentation and cellular agriculture (PFCA); and radically reducing our materials use through dramatic increases in the efficiency of transportation and production, via electric vehicles (EVs), autonomous electric vehicles (A-EVs) and Transport-as-a-Service (TaaS).

A rapid reshaping of the world economy will be painful in the short term. Prices of many products we take for granted will go way up. There will be job losses. This ‘worse-before-better’ dynamic is often seen in complex systems. To build a business, you might have to invest and go deeply into debt first – things get worse financially before they turn the corner and get better.

The same will happen with getting out of fossil fuels and industrial agriculture. But in the long run, the build-out in SWB energy and PFCA food and agriculture will mean lower prices, a decentralization of production, and greater political stability.

The steam engine enslaved before it liberated

We have been through similar situations before, where disruptions have had major geopolitical consequences – sometimes negative, sometimes positive. Understanding this complex relationship can help us to navigate the risks and opportunities today.

One of the most geopolitically consequential disruptions took place a couple centuries ago. It illustrates how a disruption can have immediate negative implications but longer term positive impacts that can end up driving a total social, economic and cultural transformation.

For centuries shipwrights filled in the gaps between the boards of their vessels with hemp or other materials. They would slather the surface with tar or pitch to protect the wood from being eaten by worms. Ultimately this did not stop the worms, or keep barnacles and weeds from growing below the waterline, weighing the ship down and slowing its travel. So, periodically, ships would need to be hauled out of the water, scraped clean, and re-tarred.

But a Sunday stroll in May 1765 changed that, and much else, forever. Steam engines had been around for decades, but they were not very efficient. They took a lot of energy to power a pump that was both slow and unreliable. But on that walk in Scotland, twenty-nine-year-old James Watt got the idea to separate the engine’s condenser, which would always be kept cold, from the steam cylinder itself, which was always warm, and to use a valve to connect them. This innovative arrangement was about five times more efficient than existing engines. It took only about 20% as much fuel to do the same amount of work.

This is part of the ‘pattern of disruption’ – a new technology is radically better than an incumbent, and therefore quickly displaces the old technology.

Thanks to Watt’s invention, more-powerful, faster, more-efficient and more-reliable steam engines kicked off an industrial revolution. They were first employed in the mines that supplied the metals to make machines. Their steady power enabled mechanized factories to spin thread and weave fabric, turn wood, and drill metals in a way that was uneconomical with earlier engines. (And, ironically, as they got more efficient, they used more fuel, something we have explored in a previous post.)

In northwest Wales, Parys Mountain was one of the largest copper deposits known in the world at the time. It had been exploited since the Bronze Age because the deposits were near the surface, where they were easy to access. But unfortunately, the ore was not very high grade which meant it took a lot of energy, and therefore a lot of coal, to refine it into metal product. So much coal that it was actually easier to bring the ore to the coal than coal to the ore.

But Watt’s engines changed this equation. They not only made metals easier to obtain and fabric easier to produce, they made it easier to mine for the coal that fuelled those machines and, in later years, to build steam-powered trains and steam-powered ships. They drove a revolution in materials, energy, and transportation all wrapped up in one.

They would also turn out to be the solution to the shipworm problem.

Industrialized mining began at Parys in 1775, and within fifteen years it was the largest copper mine in the world. Ore was loaded onto ships and brought south to Swansea, where there were coal reserves that could be exploited thanks to Watt’s steam engines. By 1790, British mines were producing more than 75% of the world’s copper.

Within ten years of Watt’s steam engine patent, the entire British Navy was clad with copper bottoms, in the space of just two years, from 1779 to 1781. According to Gareth Rees in ‘Copper Sheathing: An Example of Technological Diffusion in the English Merchant Fleet’: “copper sheathing not only solved the problems of worm and hull fouling [like barnacles and weeds], but actually improved sailing speed as an unexpected and welcome by-product.”

This is another part of the ‘pattern of disruption’ – the unintended consequence of a seemingly unrelated problem in shipping being solved by a better water pump.

Amidst all these wonderful unintended consequences was one horrific side-effect. Copper-clad ships travelled about 15% faster, meaning that an 80-day Atlantic crossing could be cut by about 12 days. All ships that went to tropical waters and that needed to move quickly benefited from copper bottoms. Yet there was one kind of merchant ship that benefited financially from greater speed more than any other: slave traders.

Slave trading ships, too, went from fewer than 10% with copper bottoms to more than 70% over a period of just two or three years, at the same time as the Navy fleet. This allowed the trade to become more efficient, because fewer enslaved people died on the ships during transport.

The database at SlaveVoyages.org, a project funded by the U.S. National Endowment for the Humanities, suggests that the death rate on a copper-bottomed ship was about half that of one without sheathing. Prior to 1780, the proportion of enslaved people lost during a trans-Atlantic voyage was approximately 20%. After the early 1780s, this decreased to about 10%.

The unexpectedly rapid demise of the trans-Atlantic slave trade and slavery

Speaking before the UK Parliament in 2019, environmentalist and TV presenter David Attenborough said:

“There was a time in the 19th century when it was perfectly acceptable for civilised human beings to think that it was morally acceptable to actually own another human being for a slave. And somehow or other, in the space of 20 or 30 years, the public perception of that totally transformed.”

Many historians argue that the key driver of the end of slavery was economics: its decline in profitability. Others argue that it was the rise of abolitionist humanitarian campaigns. The heroic resistance of enslaved people is another important factor. But one of the most critical developments that provided the enabling context for all of these factors is usually overlooked. An overarching factor that enabled and amplified many of the other factors bringing about the end of the trans-Atlantic slave trade, and of the institution of slavery itself, was a technology disruption: the steam engine.

This, first of all, was the key disruption that transformed the economics, making slave labor ultimately uneconomical. The same device that greatly improved the productivity of mining, and that enabled the entire factory system of industrial production, also made muscle labor, for the most part, no longer cost-competitive against machine labor – and increasingly so. No wonder, then, that the collapse of the slave trade happened so rapidly, within the same time-frame that disruptions take to scale.

Chart: Number of People Transported per Year in the Atlantic Slave Trade. Data from SlaveVoyages.org

The total history of the slave trade shows a ‘hump-shaped’ curve – a sharp increase from about 1650 to 1750 and a peak era lasting roughly from 1750 to 1850. But then, over only a few years, the entire trade came to a halt. In 1849, 76,654 people were brought from Africa across the Atlantic, substantially more than the average of 63,853 people per year over the 1750-1850 period. But in 1850 the number of people transported was half that of 1849, and in 1851 it was half that of 1850. Fifteen years later, the trade had ended completely.

The technology disruption that had, at first, helped make the slave trade more efficient – contributing to an increase in its profits – ultimately facilitated its complete collapse, as machine labor turned out to be an order of magnitude cheaper and more efficient than slavery.

Of course, this doesn’t mean technology disruption was the only factor – but it’s hard to see how this rapid, sudden collapse of slavery would have been possible without it.

Back in the late 1700s, the UK’s Slave Trade Act 1788 had tried to make the appalling conditions of the trans-Atlantic slave trade more ‘humane’. Sadly, this was all by small, incremental steps, none of which ultimately challenged the institution of slavery itself – rather like the small half-measures and tiny behavioural changes we talk about today in relation to climate change.

The Act, for instance, mandated more space per enslaved person being transported on British slaver vessels – “1.67 slaves per ton up to a maximum of 207 tons burthen, after which only 1 slave per ton could be carried”.

The earliest impact of the steam-engine disruption alleviated the conditions of slavery further. Applying copper-sheathing to the undersides of slave ships made the trade even more humane, by allowing ships to travel faster, thus cutting the death rate per voyage.

But ultimately, while purporting to make slavery more ‘humane’ these measures really only contributed to entrenching its existence. What made the trade most humane was simply ending it, which became not just feasible, but economically desirable thanks to the total transformation of the economics of the industrial landscape following the steam-engine disruption. Those new economics helped lay the foundations for seismic political and cultural shifts as people recognised entirely new possibilities in how to organise labor and run societies.

It was not just the trans-Atlantic slave trade that collapsed quickly. So did the entire institution of slavery. In the UK, the ‘Society for the Mitigation and Gradual Abolition of Slavery Throughout the British Dominions’ was founded in 1823. By 1833, just ten years later, it had succeeded in its goal. In the US, the institution of slavery also collapsed over just a few years, going from being legal in about half of the states in 1860 to illegal everywhere just five years later. The ultimate collapse of slavery did not, of course, happen peacefully. While the transformation of economic forces ushered in by the steam engine changed incentives, risks and opportunities, the eruption of the American Civil War illustrates how the unravelling of slavery as an institution was often a violent process.

Ukraine war as precursor to the Age of Freedom?

The steam-engine, of course, ended up doing a lot more than just disrupting slavery. By ushering in the industrial mechanisms of production and manufacturing, it also led us into the age of fossil fuels – and with it, of course, climate change.

It laid the foundation for the geopolitical order that emerged through the twentieth century, premised on centralized domination of scarce oil, gas and coal resources. Now we are dependent on what economist Nathan Hagens calls ‘energy slaves’ – fossil fuels that, burned in combustion engines, do more work per hour than a person could ever hope to do. A motor vehicle that turns only a small portion of its fuel into motion is bearable when these fuels are cheap and easy to get. These same products also serve as the feedstocks for the fertilizers that grow the grains we feed to cattle, which turn a small percentage of their inputs into milk or meat.

In this system of fossil fuel ‘energy slaves’, Russia is a formidable if not pre-eminent power, given its grip on so much of the world’s oil, gas and grain. The war in Ukraine has highlighted the geopolitical dangers of this system. But the unfolding of the ‘pattern of disruption’ in relation to slavery highlights how quickly this system could be replaced with something far better. Rather than simply focusing on the minutiae of the fossil fuel system and Russia’s domination of it, this pattern suggests that the most effective path ahead is to disrupt the energy sources that are so crucial to Russian power. As RethinkX’s work has shown, that disruption has already begun. Over the next two decades, the disruption of the energy, transport and food systems by SWB, PFCA and TaaS will make the industries that underpin Russia’s geopolitical clout completely obsolete. European net-zero strategies and commitments illustrate the direction in which the global economy is now inexorably moving.

Yet the war in Ukraine also illustrates that freeing ourselves from fossil fuel ‘energy slaves’ might be as difficult as ending the millennia-old institutions of human slavery was in the past, maybe even involving wars as deadly and devastating over the next decade or two of accelerating disruptions.

But when the task is complete and we live in an ‘Age of Freedom’ where we no longer need fossil energy to power our lives, to move ourselves and our goods, or animals to supply our food, we will look back on how we live today and its attendant geopolitical horrors with the same sense of disgusted fascination that we now feel about keeping other people in bondage.

Source: RethinkX

An Apartheid State (of Mind)

Gaia Talks/The Earth Speaks is a Mobilized Co-Production. Produced by Missy Crutchfield and Jeff Van Treese.

Ronnie Barkan is an Israeli dissident and BDS activist. A serial disrupter of events featuring apartheid representatives, he recently stood trial in Berlin along with two fellow activists (the “Humboldt3”) for speaking up against representative Aliza Lavie and her direct responsibility for Israeli crimes. Barkan rejects the patently false liberal-Zionist discourse, offering instead a narrative that challenges the very foundation of the Zionist race-state in Palestine as a supremacist endeavor that has practiced colonialism and apartheid since the day of its inception.

This conversation took place in early February, 2022.

Read more about and from Ronnie Barkan here
