
JUSTICE

Judge Temporarily Blocks Private Prison Company’s Plan For Detention Centers


A private prison company’s plan to expand immigration detention in Kern County by a whopping 350 percent was blocked today, after a federal judge granted a preliminary injunction requested by Freedom for Immigrants and the Immigrant Legal Resource Center (ILRC). The ruling follows the judge’s earlier temporary restraining order in the same matter.

The groups requested the preliminary injunction after the city of McFarland approved the private prison company GEO Group’s request to convert two 700-bed facilities in Kern County into annexes for the Mesa Verde ICE Processing Center, which currently cages up to 400 immigrants in Bakersfield daily. The converted facilities were otherwise primed to open at the height of the COVID-19 pandemic, and would surely have joined other detention centers in their utter inability to protect people from COVID-19.

City officials violated SB 29, the Dignity Not Detention Act, which was drafted and co-sponsored by Freedom for Immigrants and the ILRC in 2017 and passed in partnership with the California Dignity Not Detention Coalition. The act clearly states that entities must provide at least 180 days’ notice before issuing a permit for an immigration detention facility and must hold two open meetings for public comment. The city of McFarland did neither, despite powerful community turnout reflecting how important this issue is to local communities.

SB 29 is one of several laws, including AB 103 and AB 32, passed with the support of the California Dignity Not Detention Coalition.

“City officials clearly violated state law, seeking profits over people, cash over compassion,” said Grisel Ruiz, supervising attorney at the Immigrant Legal Resource Center. “Residents must be given an opportunity to voice their opinion on whether the centers should be brought to their county and they were not afforded that choice. The Dignity Not Detention Act’s aim was to make this process transparent and avoid backroom deals.”

“For too long, private prisons have attempted to circumvent state and federal laws to expand immigration detention and line the pockets of their shareholders. Today, the courts saw through this farce,” said Christina Fialho, an attorney and the co-founder/executive director of Freedom for Immigrants. “This case should remind us all that GEO Group is not a law unto itself, but rather must comply with the laws of any state in which it operates.”




“Earlier this year, hundreds of fearless McFarland community members did everything in their power to organize and plead to their City representatives to not sell them out for a Florida corporation’s added profits,” said Lety Valencia, co-Director of Organizing for Faith in the Valley. “At the 11th hour, their City Council decided to go around their own constituents and cut them out of the decision-making process while the rest of the world was focused on a pandemic. While McFarland’s residents and essential workers continue to work tirelessly to feed themselves and the world, today’s news at least gives them some needed hope that this process is far from over.”

The case will be heard in federal district court. The date of the hearing is pending.

Court documents are available upon request.

The Immigrant Legal Resource Center (ILRC) is a national nonprofit that works with immigrants, community organizations, legal professionals, and policy makers to build a democratic society that values diversity and the rights of all people. Through community education programs, legal training & technical assistance, and policy development & advocacy, the ILRC’s mission is to protect and defend the fundamental rights of immigrant families and communities. www.ilrc.org

Faith in the Valley is a multi-faith, multi-racial coalition of over 100 Central Valley congregations, covering over 200 miles from San Joaquin to Kern counties, organizing in unity for our region’s most vulnerable families and communities, and for a Central Valley that truly protects, values and includes everyone. During this pandemic and at all times, this includes the right for everyone in the Central Valley to work, live and grow in health, shelter and community.

Freedom for Immigrants is devoted to abolishing immigration detention, while ending the isolation of people currently suffering in this profit-driven system. We monitor the human rights abuses faced by immigrants detained by ICE through a national hotline and network of volunteer detention visitors, while also modeling a community-based alternative to detention that welcomes immigrants into the social fabric of the United States. Through these windows into the system, we gather data and stories to combat injustice at the individual level and push systemic change. Visit www.freedomforimmigrants.org. Follow @MigrantFreedom



Food

Too big to feed: The need to shift our food systems


Pat Mooney is the co-founder and executive director of the ETC Group and an expert on agricultural diversity, biotechnology, and global governance, with decades of experience in international civil society and several awards to his name.

The ETC Group is an international civil society organization headquartered in Canada with offices in Mexico, the Philippines, Nigeria, and the United States. ETC Group has consultative status with ECOSOC, FAO, UNCTAD, UNEP, UNFCCC, IPCC, and the UN Biodiversity Convention.

Since 1977, ETC Group has focused on the impact of new technologies on the lives and livelihoods of marginalized peoples around the world. Pat Mooney has almost half a century of experience working in international civil society, first addressing aid and development issues and then focusing on food, agriculture, and commodity trade.

He received The Right Livelihood Award (the “Alternative Nobel Prize”) in the Swedish Parliament in 1985 and the Pearson Peace Prize from Canada’s Governor General in 1998. He has also received the American “Giraffe Award” given to people “who stick their necks out.” The author or co-author of several books on the politics of biotechnology and biodiversity, Pat Mooney is widely regarded as an authority on issues of agricultural diversity, global governance, and corporate concentration.

Although much of ETC’s work continues to emphasize plant genetics and agriculture, it expanded in the early 1980s to include biotechnology, and in the late 1990s it expanded further to encompass a succession of emerging technologies such as nanotechnology, synthetic biology, geoengineering, and new developments ranging from genomics and neurosciences to robotics and 3-D printing. Pat Mooney and ETC Group are known for having discovered and named Terminator seeds: genetically modified seeds engineered to be sterile at harvest so that farmers cannot save and replant them.


Featured

The UN climate panel still doesn’t understand technology – and it matters


Source: RethinkX

With the Sixth Assessment Report of the United Nations Intergovernmental Panel on Climate Change (IPCC) being released, it’s important to revisit the climate scenarios that are its centerpiece. These scenarios form the basis of the climate science community’s modeling and projections, which in turn shape governance and investment decisions across the world. Trillions of dollars and the policymaking of the entire planet thus ride upon these climate scenarios, so the cost of getting things wrong is extremely high.

Scenarios past and present

The previous generation of climate scenarios, published in the Fifth Assessment Report in 2014, were known as Representative Concentration Pathways, or RCPs. The RCP scenarios were labeled according to the amount of radiative forcing, in watts per square meter, expected by the end of the century in each case (so RCP2.6 corresponds to 2.6 W/m²). Radiative forcing is the scientific term for the change in the balance between the Earth’s incoming and outgoing energy. The Fifth Assessment Report focused on four of these scenarios, with RCP2.6 having the least warming and thus being the “best case”.

In the eight years since then, a new generation of scenarios has been developed for the Sixth Assessment Report, referred to as Shared Socioeconomic Pathways, or SSPs. The five main SSP scenarios are also labeled according to radiative forcing, but in addition each has a subtitle that tells a story about an imagined future:

  • SSP1-1.9 – Sustainability (Taking the Green Road)
  • SSP1-2.6 – Sustainability (Taking the Green Road)
  • SSP2-4.5 – Middle of the Road
  • SSP3-7.0 – Regional Rivalry (A Rocky Road)
  • SSP5-8.5 – Fossil-Fueled Development (Taking the Highway)

Flaws in climate scenarios

A scenario is only as plausible as the assumptions it makes. Unfortunately, the technology assumptions made in both the RCP and SSP scenarios are not remotely plausible, and as a result they are extremely misleading. If there were even one scenario that made genuinely plausible assumptions, then the others could be useful for comparison. But the lack of any properly plausible one means that, taken together, these scenarios will only cause harm by leading decision-makers and the public badly astray.

First and foremost, all RCP and SSP climate scenarios get technology wrong because they fail to understand the forces that drive technological change, how quickly the shift to new technologies occurs, and how quickly old technologies are abandoned as a result.

Our team at RethinkX has shown that the same pattern of disruption has occurred hundreds of times over the last several thousand years. Again and again, for technologies of all kinds – from cars to carpenter’s nails, from arrowheads to automatic braking systems, from insulin to smartphones – we see that technology adoption follows an s-curve over the course of just 10-20 years. The first phase of the s-curve is characterized by accelerating (or “exponential”) growth, driven by reinforcing feedback loops that make the new technology ever more competitive while making the old technology ever less competitive.
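
To make that shape concrete, here is a minimal sketch of a logistic s-curve of the kind described above. The midpoint and steepness values are illustrative assumptions, not RethinkX parameters:

```python
import math

def adoption_share(t, t_mid=10.0, k=0.8):
    """Logistic s-curve: market share of a new technology t years
    into a disruption.

    t_mid -- year at which adoption crosses 50% (illustrative assumption)
    k     -- steepness of the curve (illustrative assumption)
    """
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

# The early years look like exponential growth; the curve then
# saturates near 100% within roughly two decades.
for year in range(0, 21, 4):
    print(f"year {year:2d}: {adoption_share(year):6.1%}")
```

Run as-is, this prints a fraction of a percent in year 0, about 17% by year 8, and over 99% by year 16 – the accelerating-then-saturating pattern the disruption literature describes.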

Unfortunately, the RCP and SSP climate scenarios show no sign that their authors understand technology disruption at all. For example, the “best case” RCP2.6 scenario in the Fifth Assessment Report published in 2014 assumed that less than 5% of global primary energy would come from solar, wind, and geothermal energy combined in the year 2100.

Source: Adapted from Van Vuuren et al., 2011, and IPCC, 2014.

In reality, the exponential trend in the growth of solar and wind power had already been clear for over two decades at the time the Fifth Assessment was published in 2014, and the trend since then has only continued – as shown in the chart below.

(Note that the vertical axis of the chart is logarithmic, increasing by a factor of 10 at each major interval, which means the trajectory is exponential).

On their current trajectory, which has been extraordinarily consistent for over 30 years, solar and wind power will exceed the RCP2.6 assumption for the year 2100 before 2030, 70 years ahead of schedule on an 86-year forecasting timeframe.
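
A hedged back-of-the-envelope sketch shows what extrapolating such a log-linear trend implies. The starting share and growth rate below are illustrative assumptions chosen to be roughly consistent with the narrative above, not data taken from the chart:

```python
# Exponential extrapolation of solar + wind's share of primary energy.
# All numbers here are illustrative assumptions, not IPCC or RethinkX data.
start_year = 2014
share = 0.015                 # assumed solar+wind share of primary energy in 2014
annual_growth = 0.18          # assumed ~18%/yr growth (about a 4-year doubling)
rcp26_2100_level = 0.05       # the <5% year-2100 assumption cited above

year = start_year
while share < rcp26_2100_level:
    year += 1
    share *= 1.0 + annual_growth

print(f"Under these assumptions, the RCP2.6 year-2100 level is passed in {year}.")
```

With these assumptions the threshold falls in the early 2020s; even markedly more conservative growth rates still put it well before 2030, which is the point of the comparison.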

This is an egregious error that was entirely avoidable. The energy sector has shown every sign of becoming a textbook example of disruption for more than 15 years, and technology theorists were noticing the signs well before 2014. Indeed, Tony Seba – co-founder of RethinkX – had already published an analysis of the energy disruption in his book Solar Trillions in 2010.

Since 2014, the exponential growth of solar power has become common knowledge, as have similar trajectories for batteries and electric vehicles. It is therefore completely inexcusable that the same mistakes have continued in the new SSP scenarios for the Sixth Assessment Report in 2022. The SSP5-8.5 scenario, for example, is titled “Fossil Fueled Development”. Here is its description:

This world places increasing faith in competitive markets, innovation and participatory societies to produce rapid technological progress and development of human capital as the path to sustainable development. Global markets are increasingly integrated. There are also strong investments in health, education, and institutions to enhance human and social capital. At the same time, the push for economic and social development is coupled with the exploitation of abundant fossil fuel resources and the adoption of resource and energy intensive lifestyles around the world.

This logic around “rapid technological progress” is not just wrong, it’s backwards. The faster we make technological progress, the less fossil fuel we will use. The more global markets are integrated and the more human and social capital we have, the faster we will decarbonize.

The SSP3-7.0 scenario contains the same error:

Technology development is high in the high-tech economy and sectors. The globally connected energy sector diversifies, with investments in both carbon-intensive fuels like coal and unconventional oil, but also low-carbon energy sources.

Again, the basic premise here is false. Technological progress will result in less fossil fuel development, not more. The collapse of coal demand is already well underway in the wealthy countries of the Global North, and all fossil fuels in all countries will follow suit as clean technologies rapidly disrupt the energy and transportation sectors over the next two decades.

The SSP2-4.5 scenario assumes that “the world follows a path in which social, economic, and technological trends do not shift markedly from historical patterns.” But the authors of this scenario do not understand what those historical patterns of technological change actually are.

As our research at RethinkX has shown, the pattern throughout history has been an s-curve of rapid technology adoption over the course of just 20 years or less once new technologies become economically competitive with older ones – as is now the case for clean energy, transportation, and food technologies. The data throughout history simply do not support the assumption that the shift to new, clean technologies will be slow and linear between now and the year 2100.

The SSP1-1.9 scenario, “sustainability”, is allegedly the most sustainable, but this too is based on false assumptions – namely that lower material, resource, and energy intensity are necessary for reducing environmental impacts, and that they are compatible with increasing human prosperity. Neither is true. The solution to environmental impacts is not less energy, transportation, and food. That would be like thinking that if your house is on fire, the solution is to extinguish some of the flames. That’s madness. The solution is to put the fire out, which means switching rapidly and completely to clean energy, transportation, and food.

If we want to be truly sustainable, we must have a superabundance of clean energy, clean transportation, and clean (i.e. non-animal-derived) food that slashes our environmental footprint and gives us the means to restore and protect ecological integrity worldwide. Any attempt to mitigate our ecological footprint by reducing economic prosperity would be disastrous because the scale of cutbacks needed to have any significant effect on sustainability would be utterly catastrophic to the global economy and geopolitical stability.

Projections to 2100… seriously?

It is worth stepping back a moment and recognizing that the RCP and SSP scenarios make quantitative projections to the year 2100. This in itself is flatly preposterous.

Five thousand years ago, you could have made a reasonably accurate prediction about what life would be like 80 years in the future. After all, not much changed from one generation to the next. Your children’s lives were likely to be very similar to your parents’ lives.

Five hundred years ago, in the year 1522, it would have been considerably more difficult to make an accurate prediction about life 80 years hence. The invention of the moveable-type printing press by Johannes Gutenberg 80 years earlier in around 1440 had helped turbocharge the Renaissance, setting the stage for the Scientific Revolution. Life in 1602 was still quite similar to life in 1522, but an explosion in the growth of useful knowledge was laying the groundwork for massive social, economic, political, and technological transformations to come.

A century ago, in 1922, it would have been very hard for anyone to predict with any accuracy what the world 80 years in the future, in 2002, would be like. Nobody could have imagined the role that nuclear weapons or computers or the Internet would play in our lives, for example.

Today, it is absolutely impossible to predict in any detail what the world will be like 80 years from now, around the year 2100. The rate of technological change is so fast now that our team at RethinkX never makes any quantitative forecasts more than 20 years into the future, because to do so is undisciplined in the formal sense. And technological progress is only accelerating.

Although we cannot know what the world will be like in 2100, we can say that it is implausible to presume the conditions and constraints of today will continue to hold. And this is why we can say that all of the RCP and SSP climate scenarios are implausible: they all presume life in 2100 will be more or less the same as today – still governed by material scarcity, regional resource conflicts, food insecurity, demographic transitions, health and education challenges, and even fossil fuel use. None of these makes even the slightest sense in the context of technologies that we fully expect to see from mid-century onward.

So, what happened? Why did the RCP and SSP climate scenarios get technology so wrong?

Anti-technology sentiments in conventional environmental orthodoxy

At least part of the explanation for fundamental errors and misunderstandings around technology we see in the RCP and SSP climate scenarios is that they were developed by a small group of academic authors operating inside an ideological bubble.

One of the features of this ideological orthodoxy is that it holds long-standing anti-technology sentiments dating back over two centuries to the rise of Romanticism and Transcendentalism. On the one hand, the orthodoxy holds that the arc of history ought to be viewed largely through the lens of human behavior and institutions, minimizing or outright rejecting the causal power of technology to shape societies. There even exists a pejorative term, technological determinism, that is used to label and reflexively dismiss any claims that technology has played a key role in steering the course of human affairs across the ages. Yet, at the same time, this orthodoxy holds technology largely to blame for the massive ecological footprint humanity has imposed upon the planet.

They can’t have it both ways: either technology has enormous causal power, or it doesn’t.

If it does, then that means technology is also the key to transforming our world in positive ways – including achieving genuine sustainability. We don’t see this accurately reflected anywhere in the RCP or SSP climate scenarios because it runs contrary to the anti-technology sentiments of the prevailing orthodoxy.

When you don’t know enough to know you’re being fooled

The climate science community failed to realize the importance of consulting technology experts in the development of climate scenarios. Instead, they made the mistake of relying on conventional forecasts for technologies like solar and wind power from incumbent energy interests such as the International Energy Agency and the U.S. Energy Information Administration. This would be a bit like relying on Blockbuster Video to accurately forecast the future of streaming video, or Kodak to forecast the future of digital cameras, or the American Horse & Buggy Association to forecast the future of automobiles.

The charts below show the laughably poor forecasting track record of the IEA and U.S. EIA.


Note that the unreliability of these two ‘authoritative’ sources was already clear when the Fifth Assessment Report was published in 2014. Would you depend on advice in a critical situation from someone who had gotten things wrong over and over again?

More cynically, it’s very difficult to see how the IEA or U.S. EIA making the same “errors” year after year for almost two decades could be an honest mistake. At the same time, it’s very easy to imagine that there are powerful incentives for these incumbents to ignore technological change, or even to deliberately troll others about it.

Regardless, trusting the wrong sources and failing to consult actual technology experts was an inexcusable mistake that the climate science community is unfortunately continuing to make.

Predicting the future is hard

The future is obviously uncertain, and the further ahead we look, the blurrier the picture becomes. At first, it might seem reasonable to err on the side of conservatism – after all, if you don’t know exactly how the world will change in the future, isn’t it best just to assume it won’t change much from the present? The answer is no, but the reason this logic is flawed is rather subtle.

There are dozens of major dimensions and countless minor ones along which change can occur, all of which move us away from our present condition. The fact that these changes are unpredictable does not imply that the noise will somehow cancel out and leave us close to where we started.

By analogy, imagine assembling a complex machine like a car. If you don’t follow the exact steps in the exact order with the exact parts, you aren’t going to end up with a working car. And if you randomize the assembly process, you’re going to end up with a useless pile of junk. This is why tornadoes don’t spontaneously assemble new cars when they pass through a junkyard. The reason why has to do with entropy: there are almost infinitely more ways to incorrectly assemble things than to correctly assemble them.

This analogy helps show why any movement through a large possibility space is only likely to take you away from your current position. This is why the future will be very different from the present, even though those differences are unpredictable.
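
A small Monte Carlo experiment makes this argument concrete: the expected distance of a random walk from its starting point grows with the number of steps rather than cancelling back to zero. The dimension count, step counts, and trial count below are arbitrary illustrative choices:

```python
import random

def mean_distance(dims=30, steps=80, trials=2000):
    """Average distance from the start after `steps` random unit moves,
    each along one randomly chosen dimension of a `dims`-dimensional space."""
    total = 0.0
    for _ in range(trials):
        pos = [0.0] * dims
        for _ in range(steps):
            pos[random.randrange(dims)] += random.choice((-1.0, 1.0))
        total += sum(x * x for x in pos) ** 0.5
    return total / trials

# Distance grows roughly with the square root of the step count:
# unpredictable changes accumulate; they do not cancel out.
for steps in (10, 40, 80):
    print(f"{steps:3d} steps -> mean distance {mean_distance(steps=steps):.2f}")
```

The exact numbers vary from run to run, but the drift away from the starting point is systematic – which is the point of the tornado-and-junkyard analogy.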

So, how should we deal with all the uncertainty of the future? The correct response is indeed to construct multiple scenarios that chart the general trajectory and broad outlines of possible futures based on plausible assumptions about what might change between now and then. The trouble with the RCP and SSP climate scenarios, however, is that none of them make plausible assumptions about technological progress.

Refusing to admit past mistakes only feeds conspiracy theories

The climate science community has made very serious technology forecasting errors in its climate scenarios, but has so far refused to acknowledge and take responsibility for them. This is a losing strategy.

Failure to admit and correct the technology forecasting errors in climate scenarios plays right into the hands of conspiracy theorists, because the longer we refuse to admit we’ve made mistakes, the more it looks like they were deliberate. These mistakes are too large to brush under the rug, and so there is no painless option here. We either admit we were fools, or we look like we are liars.

Admitting our mistakes and taking the heat for it is the right move. The alternative only indulges the worst extremist narratives that claim the scientific community has deliberately inflated the threat of climate change and misrepresented our options for solving it in order to advance an agenda of more taxation and more government control over private industry and individual consumer choices.

The public needs to be able to trust the environmental science community, and they can’t do that until we come clean about how wrong we’ve gotten renewable energy and other technologies in our climate scenarios. The longer we pretend nothing happened, the more our legitimacy will erode in the public sphere at a time when trust in scientific authority is already low in the wake of the COVID-19 pandemic.

Getting technology wrong in climate scenarios does real harm

Given the enormous stakes involving trillions of dollars and all of the world’s policymaking, the errors around technology in the RCP and SSP climate scenarios have had serious consequences. They have misled policymakers and the public alike into believing that the only means to solve climate change are punitive – that we must atone for our past environmental sins by sacrificing human prosperity, tightening our belts, and giving up our indulgent personal lifestyles. They have demonized the prosperity of the rich nations of the Global North as unsustainable, and condemned the aspirations of poorer countries of the Global South as unattainable. They have led nations to waste time and resources trying fruitlessly to achieve sustainability through austerity, when this approach is hopelessly counterproductive as I have previously explained.

Austerity cannot solve climate change even in principle, let alone in practice. Prosperity has always been a necessary precondition for solving big problems, both personal and collective, and so it is the only real path to sustainability as well. Technological progress in general will inevitably play an outsized role in bringing the prosperity we need to tackle major challenges to billions worldwide, and specific technologies like solar power and electric vehicles will give us the tools we need to directly reduce emissions and draw down carbon. The IPCC climate scenarios must reflect these facts so that we can all make well-informed decisions about how best to solve climate change together.

Source: RethinkX


Featured

The Disruption of Slavery Unveils Fastest Path to End Today’s Wars


Source: RethinkX

Fence in the colours of the national flag of Ukraine, photo by Tina Hartung on Unsplash

We are now at a crossroads in history, and no path forward looks pleasant. The war in Ukraine is killing innocent civilians, disrupting lives, and shaking the markets in energy, food, and other commodities, making us wonder how we let ourselves become so complacent in trading with Russia, whose government has shown so little respect for the rights of its neighbors and its own citizens.

The obvious path seems to be to boost oil, gas, coal, food and metals production from friendly countries. Cut ourselves off from Russian oil, Russian gas, Russian grains, metals and other commodities as much as possible by getting them from elsewhere, and fast.

Unfortunately, this is no easy task. Russia is one of the world’s largest exporters of oil and gas, as well as a bewildering array of other materials: wheat (of which it is the world’s largest exporter), ammonia fertilizers (made from natural gas), iron and nickel (used in making steel), gold and titanium, platinum and palladium (used by the oil industry), neon (for lasers used by the electronics industry), cobalt, and rhenium. Do you have any spare rhenium lying around?

The harder answer is to reduce our dependence on oil, gas, coal, wheat, precious metals, rare elements, and other such things as much as possible: by getting energy from solar power, wind power, and batteries (SWB); by producing agricultural products through precision fermentation and cellular agriculture (PFCA); and by radically cutting materials use through dramatic gains in the efficiency of transportation and production via electric vehicles (EVs), autonomous electric vehicles (A-EVs), and Transport-as-a-Service (TaaS).

A rapid reshaping of the world economy will be painful in the short term. Prices of many products we take for granted will go way up. There will be job losses. This ‘worse-before-better’ dynamic is often seen in complex systems. To build a business, you might have to invest and go deeply into debt first – things get worse financially before they turn the corner and get better.

The same will happen with getting out of fossil fuels and industrial agriculture. But in the long run, the build-out in SWB energy and PFCA food and agriculture will mean lower prices, a decentralization of production, and greater political stability.

The steam engine enslaved before it liberated

We have been through similar situations before, where disruptions have had major geopolitical consequences – sometimes negative, sometimes positive. Understanding this complex relationship can help us to navigate the risks and opportunities today.

One of the most geopolitically consequential disruptions took place a couple of centuries ago. It illustrates how a disruption can have immediate negative implications but longer-term positive impacts that can end up driving a total social, economic, and cultural transformation.

For centuries shipwrights filled in the gaps between the boards of their vessels with hemp or other materials. They would slather the surface with tar or pitch to protect the wood from being eaten by worms. Ultimately this did not stop the worms, or keep barnacles and weeds from growing below the waterline, weighing the ship down and slowing its travel. So, periodically, ships would need to be hauled out of the water, scraped clean, and re-tarred.

But a Sunday stroll in May 1765 changed that, and much else, forever. Steam engines had been around for decades, but they were not very efficient. They took a lot of energy to power a pump that was both slow and unreliable. But on that walk in Scotland, twenty-nine-year-old James Watt got the idea to separate the engine’s condenser, which would always be kept cold, from the steam cylinder itself, which was always warm, and to use a valve to connect them. This innovative arrangement was about five times more efficient than existing engines. It took only about 20% as much fuel to do the same amount of work.

This is part of the ‘pattern of disruption’ – a new technology is radically better than an incumbent, and therefore quickly displaces the old technology.

Thanks to Watt’s invention, more powerful, faster, more efficient, and more reliable steam engines kicked off an industrial revolution. They were first employed in the mines that supplied the metals to make machines. Their steady power enabled mechanized factories to spin thread and weave fabric, turn wood, and drill metals in a way that was uneconomical with earlier engines. (And, ironically, as they got more efficient, they used more fuel, something we have explored in a previous post.)

In northwest Wales, Parys Mountain was one of the largest copper deposits known in the world at the time. It had been exploited since the Bronze Age because the deposits were near the surface, where they were easy to access. Unfortunately, the ore was not very high grade, which meant it took a lot of energy, and therefore a lot of coal, to refine it into metal. So much coal, in fact, that it was easier to bring the ore to the coal than the coal to the ore.

But Watt’s engines changed this equation. They not only made metals easier to obtain and fabric easier to produce, they made it easier to mine for the coal that fuelled those machines and, in later years, to build steam-powered trains and steam-powered ships. They drove a revolution in materials, energy, and transportation all wrapped up in one.

They would also turn out to be the solution to the shipworm problem.

Industrialized mining began at Parys in 1775, and within fifteen years it was the largest copper mine in the world. Ore was loaded onto ships and brought south to Swansea, where there were coal reserves that could be exploited thanks to Watt’s steam engines. By 1790 British mines were producing more than 75% of the world’s copper.

Just ten years after Watt’s steam engine patent, the entire British Navy was clad with copper bottoms over a period of just two years from 1779 to 1781. According to Gareth Rees in ‘Copper Sheathing: An Example of Technological Diffusion in the English Merchant Fleet’: “copper sheathing not only solved the problems of worm and hull fouling [like barnacles and weeds], but actually improved sailing speed as an unexpected and welcome by-product.”

This is another part of the ‘pattern of disruption’ – the unintended consequence of a seemingly unrelated problem in shipping being solved by a better water pump.

Amidst all these wonderful unintended consequences was one horrific side-effect. Copper-clad ships travelled about 15% faster, meaning that an 80-day Atlantic crossing could be cut by about 12 days. All ships that went to tropical waters and that needed to move quickly benefited from copper bottoms. Yet there was one kind of merchant ship that benefited financially from greater speed more than any other: slave traders.

Slave trading ships, too, went from less than 10% copper-bottomed to more than 70% over a period of just two or three years, at the same time as the Navy fleet. This allowed the slave trade to become more efficient, because fewer enslaved people died on the ships during transport.

The database at SlaveVoyages.org, a project funded by the U.S. National Endowment for the Humanities, suggests that the death rate on a copper-bottomed ship was about half that of one without sheathing. Prior to 1780, the proportion of enslaved people lost during a trans-Atlantic voyage was approximately 20%. After the early 1780s, this decreased to about 10%.

The unexpectedly rapid demise of the trans-Atlantic slave trade and slavery

Speaking before the UK Parliament in 2019, environmentalist and TV presenter David Attenborough said:

“There was a time in the 19th century when it was perfectly acceptable for civilised human beings to think that it was morally acceptable to actually own another human being for a slave. And somehow or other, in the space of 20 or 30 years, the public perception of that totally transformed.”

Many historians argue that the key driver of the end of slavery was economics: its decline in profitability. Others argue that it was the rise of abolitionist humanitarian campaigns. The heroic resistance of enslaved people is another important factor. But one of the most critical developments that provided the enabling context for all of these factors is usually overlooked. An overarching factor that enabled and amplified many of the other factors bringing about the end of the trans-Atlantic slave trade, and of the institution of slavery itself, was a technology disruption: the steam engine.

This, first of all, was the key disruption that transformed the economics, ultimately making slave labour uneconomical. The same device that greatly improved the productivity of mining, and that enabled the entire factory system of industrial production, also made muscle labor, for the most part, no longer cost-competitive against machine labor – and increasingly so. No wonder, then, that the collapse of the slave trade happened so rapidly, within the same time-frame that disruptions typically take to scale.

Number of People Transported per Year in the Atlantic Slave Trade. Data from SlaveVoyages.org

The total history of the slave trade shows a ‘hump-shaped’ curve – with a sharp increase from about 1650 to 1750 and a peak era that lasted roughly from 1750 to 1850. But then, over only a few years, the entire trade came to a halt. In 1849, 76,654 people were brought from Africa across the Atlantic, a number substantially higher than the average of 63,853 people per year over the 1750-1850 period. But in 1850 the number of people transported was half that of 1849, and in 1851 it halved again. Fifteen years later, the trade had ended completely.
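
As a quick check on that arithmetic, here is a sketch using only the figures quoted above from SlaveVoyages.org:

```python
# Halving arithmetic for the collapse of the trade, per the figures above.
count = 76_654                       # people transported in 1849
for year in (1850, 1851):
    count /= 2                       # each year roughly half the year before
    print(f"{year}: about {count:,.0f} people transported")
# Two successive halvings leave roughly a quarter of the 1849 level --
# the signature of a collapse rather than a gradual decline.
```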

The technology disruption that had at first helped make the slave trade more efficient – contributing to an increase in its profits – ultimately facilitated its complete collapse, as machine labor turned out to be an order of magnitude cheaper and more efficient than slavery.

Of course, this doesn’t mean technology disruption was the only factor – but it’s hard to see how this rapid, sudden collapse of slavery would have been possible without it.

Back in the late 1700s, the UK’s Slave Trade Act 1788 had tried to make the appalling conditions of the trans-Atlantic slave trade more ‘humane’. Sadly, this was all by small, incremental steps, none of which ultimately challenged the institution of slavery itself – rather like the small half-measures and tiny behavioural changes we talk about today in relation to climate change.

The Act, for instance, mandated more space per enslaved person being transported on British slaver vessels – “1.67 slaves per ton up to a maximum of 207 tons burthen, after which only 1 slave per ton could be carried”.

The earliest impact of the steam-engine disruption alleviated the conditions of slavery further. Applying copper sheathing to the undersides of slave ships made the trade even more ‘humane’ by allowing ships to travel faster, thus cutting the death rate per voyage.

But ultimately, while purporting to make slavery more ‘humane’, these measures really only entrenched its existence. What made the trade most humane was simply ending it, which became not just feasible but economically desirable thanks to the total transformation of the economics of the industrial landscape following the steam-engine disruption. Those new economics helped lay the foundations for seismic political and cultural shifts as people recognised entirely new possibilities in how to organise labor and run societies.

It was not just the trans-Atlantic slave trade that collapsed quickly. So did the entire institution of slavery. In the UK, the Society for the Mitigation and Gradual Abolition of Slavery Throughout the British Dominions was founded in 1823. By 1833, just ten years later, it had succeeded in its goal. In the US, the institution of slavery also collapsed over just a few years, going from being legal in about half of the states in 1860 to illegal everywhere just five years later. The ultimate collapse of slavery did not, of course, happen peacefully. While the transformation of economic forces ushered in by the steam engine changed incentives, risks, and opportunities, the eruption of the American Civil War illustrates how the unravelling of slavery as an institution was often a violent process.

Ukraine war as precursor to the Age of Freedom?

The steam-engine, of course, ended up doing a lot more than just disrupting slavery. By ushering in the industrial mechanisms of production and manufacturing, it also led us into the age of fossil fuels – and with it, of course, climate change.

It laid the foundation for the geopolitical order that emerged through the twentieth century, premised on centralized domination of scarce oil, gas, and coal resources. Now we are dependent on what economist Nathan Hagens calls ‘energy slaves’ – fossil fuels that, burned in combustion engines, do more work per hour than a person could ever hope to do. A motor vehicle that turns only a small portion of its fuel into motion is bearable when these fuels are cheap and easy to get. These same products also serve as the feedstocks for the fertilizers that grow the grains we feed to cattle, which turn a small percentage of their inputs into milk or meat.

In this system of fossil fuel ‘energy slaves’, Russia is a formidable if not pre-eminent power – given its dominance over so much of the world’s oil, gas and grain. The war in Ukraine has highlighted the geopolitical dangers of this system. But the unfolding of the ‘pattern of disruption’ in relation to slavery highlights how quickly this system could be replaced with something far better. Rather than simply focusing on the minutiae of the fossil fuel system and Russia’s domination of it, this pattern suggests that the most effective path ahead is to disrupt the energy sources that are so crucial to Russian power. As RethinkX’s work has shown, that disruption has already begun. Over the next two decades, the disruption of the energy, transport and food systems by SWB, PFCA and TaaS will make the industries that underpin Russia’s geopolitical clout completely obsolete. European net-zero strategies and commitments illustrate the direction in which the global economy is now inexorably moving.

Yet the war in Ukraine also illustrates that freeing ourselves from fossil fuel ‘energy slaves’ might be as difficult as ending the millennia-old institution of human slavery was in the past, maybe even involving wars as deadly and devastating over the next decade or two of accelerating disruptions.

But when the task is complete and we live in an ‘Age of Freedom’ where we no longer need fossil energy to power our lives, to move ourselves and our goods, or animals to supply our food, we will look back on how we live today and its attendant geopolitical horrors with the same sense of disgusted fascination that we now feel about keeping other people in bondage.

Source: RethinkX


Gaia Talks The Earth Speaks

An Apartheid State (of Mind)


Gaia Talks/The Earth Speaks is a Mobilized Co-Production. Produced by Missy Crutchfield and Jeff Van Treese.

Ronnie Barkan is an Israeli dissident and BDS activist.
As a serial disrupter of apartheid representatives, he recently stood trial in Berlin along with two fellow activists (“Humboldt3”) for speaking up against representative Aliza Lavie and her direct responsibility for Israeli crimes.
Barkan negates the patently false liberal-Zionist discourse, offering instead a narrative that challenges the very foundation of the Zionist race-state in Palestine as a supremacist endeavor which has practiced colonialism and apartheid since the day of its inception.

This conversation took place in early February, 2022.

Read more about and from Ronnie Barkan here
