Abstract – Energy is the currency of life and the foundation for all digital preservation activities. It is energy, in the form of electricity, which allows us to write physical marks on various substrates and energy which allows us to read those marks. The industrial revolution, powered by coal and oil, supercharged our technology by greatly increasing the amount of energy available to us at only the cost of extraction. This surge of cheap energy into our civilization was what allowed us to advance our technology and develop computers and the digital realm.
As we move into the middle of this century, energy will affect our ability to preserve digital materials in two critical ways. First, the age of cheap oil is coming to an end. As oil becomes more expensive to extract, less and less net energy will be available. Second, the byproducts of burning fossil fuels, such as carbon dioxide, methane, and nitrous oxide, are greenhouse gases (GHGs) which have warmed our atmosphere and destabilized our climate. The more energy we use from fossil fuels, the more we create an inhospitable planet.
To proactively plan for the future ahead of us, the digital preservation community needs to become aware of the risks inherent in our critical dependency on energy and enact adaptation strategies to address those threats. Methodologies, like those presented in the Decision Making Under Deep Uncertainty (DMDU) suite, provide a way of selecting strategies that are adaptive to multiple futures. While there’s no way of knowing what the future will bring, the current state of our climate guarantees that it will not be business as usual.
Keywords – energy, climate change, decision making, digital preservation
This paper was submitted for the iPRES2024 conference on March 17, 2024 and reviewed by Maureen Kenga, Andrea Goethals and 2 anonymous reviewers. The paper was accepted with reviewer suggestions on May 6, 2024 by co-chairs Heather Moulaison-Sandy (University of Missouri), Jean-Yves Le Meur (CERN) and Julie M. Birkholz (Ghent University & KBR) on behalf of the iPRES2024 Program Committee.
Energy is the currency of life and the foundation for all digital preservation activities. It is energy, in the form of electricity, which allows us to store data by writing physical marks on various substrates and energy which allows us to read those marks. The industrial revolution, powered by coal and oil, supercharged our technology by greatly increasing the amount of energy available to us at only the cost of extraction.
The use and availability of energy is a critical factor in the rise of sociopolitical systems and in the fall of civilizations. [1] Humans have always depended on the free energy of the sun to grow the plants we and our domesticated animals eat, as well as the wood we burn to keep us warm. Ten thousand years ago, as agricultural practices grew, we learned how to harness that energy and store it in the form of grains. With the ability to store surplus energy, we were able to build and develop cities and more complex forms of society. [2] The development of our current modern industrial civilization was supercharged with the discovery of dense energy deposits found below the crust of the earth: coal and then later oil. Instead of relying only on the physical labor provided by humans or animals, we could combine the power of fossil fuel energy with engines to do work. Crude oil was especially potent: it provides more energy (joules) per kilogram than coal, its liquid state allows it to be transported conveniently through pipelines, and it can be extracted by machines rather than mined by humans. [3] As the amount of energy available for human use increased, the human population exploded, as did our economic growth. [4]
Buckminster Fuller coined the term "energy slave" to put humanity's use of fossil fuels into context. In a 1940 article in Fortune magazine, he calculated the yield of an energy slave by taking the energy used from minerals and water consumed by industry and dividing it by the energy a human being can provide, which is approximately 2,000 kilojoules per eight-hour day. One energy slave represents one unit of human labor. [5] Global fossil fuel use is equivalent to at least 800 billion humans working for 8 hours a day, or roughly 100 energy slaves per person on Earth. But because earth's resources aren't distributed equitably, the average American has some 300 energy slaves working for them while the average Haitian has only one. [6]
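Fuller's unit lends itself to simple arithmetic. The sketch below reproduces the per-person figure from the paragraph's own totals; the world population value is an assumption added for illustration, and the 2,000 kJ figure is the one cited above.

```python
# One "energy slave" ~ 2,000 kJ of work per eight-hour day (figure cited above).
KJ_PER_SLAVE_DAY = 2_000

def energy_slaves(joules_per_day: float) -> float:
    """Convert a daily energy use (in joules) into energy-slave equivalents."""
    return joules_per_day / (KJ_PER_SLAVE_DAY * 1_000)

# Sanity check: one human day's work is exactly one energy slave.
print(energy_slaves(2_000_000))  # -> 1.0

# Reproducing the per-person figure from the totals above:
global_slaves = 800e9   # "at least 800 billion humans working 8 hours a day"
world_population = 8e9  # assumed world population, for illustration
print(global_slaves / world_population)  # -> 100.0 energy slaves per person
```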
As the human population grew and more and more humans were freed up from providing the agricultural labor needed to feed the growing population, technology advanced and the increased specialization of professional roles ushered in the age of computers. The digital preservation community largely ignores the role of non-renewable fossil fuels in providing the energetic basis that allowed our profession to specialize into existence, and generally starts its discussion with the age of computing. [7] The Open Archival Information System Reference Model (OAIS), the foundational standard for the profession, does not even mention the word "energy". [8] This is not an oversight limited to the digital preservation profession; it is also applicable to modern industrial civilization as a whole. Our civilization tends to be "energy blind" and overlooks the important role cheap energy from burning fossil fuels has played in our advances, focusing instead on human ingenuity through technological innovations. [9]
Yet energy is the foundation of all things digital, and thus all things digital preservation. It takes energy to write bits on a disk. It takes energy to read them. It takes energy to cool the physical substrate that the bits live on so that they don't get too hot or too humid. It takes energy to run fixity checks to ensure that the bits have not changed over time. Information and Communications Technology (ICT) infrastructure relies on the power grid it connects to, and 60% of the electricity generated in the United States comes from fossil fuels. [10] And that is just the operating energy needed. Digital materials also require energy for the creation of their storage media, known as embodied energy. Oil is transformed into plastic cases and cables. Drills dig into the earth to extract minerals and metals used to create the electronic elements. In general, resource extraction takes place in the global south (accompanied by the deleterious environmental effects of mining), and the raw materials are transported to China for production before the final products are delivered to the global north for use. [11] Oil is needed to fuel the ships transporting these materials around the globe and to transport the manufactured goods to and from shipping docks. Industrial energy consumption is largely powered by fossil fuels and accounts for about 25% of all carbon dioxide emissions. [12]
For a profession so dependent on energy, especially one which has a practice of risk assessment as a core activity [13], the lack of attention paid to our energy dependency is perplexing, if understandable. After all, energy is the invisible backbone of modern industrial civilization. Yet if we broaden our perspective outside of our practice, we can begin to identify larger, systemic risks that will affect digital preservation practice across the board.
The term "peak oil," the point at which the maximum rate of global oil production occurs, after which it will start to decline, was coined by M. King Hubbert in 1956. [14] Peak oil is less an issue of dwindling oil reserves than of Energy Returned on Energy Invested (EROEI), or net energy. Essentially, there are different grades of oil quality and different difficulties in extracting oil, and the amount of energy required to extract oil can be compared to the amount of energy that it provides. In the early days of oil production, when large amounts of conventional, easily extracted oil were available, the EROEI was approximately 100:1. For every one measure of energy needed to extract the oil, 100 measures were returned. High quality oil that was easy to extract was extracted first, with the result that these oil reserves have largely declined. Today, fracking and other technologies allow us to extract oil from sources which were trickier to exploit in the past, such as tar sands and shale rock. [15] These processes are more energy intensive, and thus the EROEI today is around 10:1 to 6:1. [16] [15] While it is unlikely that we will use up all the oil in the earth, it is likely that the energy used to extract it will at some point be so close to the energy it provides that extraction will no longer be profitable. Oil industry specialist Arthur Berman believes that this could happen as early as a decade from now. [15] Humans have proved to be quite successful in "kicking the can down the road" by implementing new technologies or tapping new resources just in time, but this is a key area to track for reasons discussed below.
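The EROEI figures above translate directly into the share of extracted energy left over for society: at a ratio of r:1, one unit out of every r produced is consumed by extraction itself. A minimal sketch, using the ratios quoted above:

```python
def net_energy_fraction(eroei: float) -> float:
    """Fraction of gross energy left for society after extraction costs.

    For an EROEI of r:1, one unit invested yields r units out,
    so the net fraction is (r - 1) / r.
    """
    if eroei <= 0:
        raise ValueError("EROEI must be positive")
    return (eroei - 1) / eroei

for r in (100, 10, 6):
    print(f"EROEI {r}:1 -> {net_energy_fraction(r):.1%} net energy to society")
```

The drop from 100:1 to 10:1 looks modest in percentage terms (99% to 90% net), but the share of energy diverted to extraction rises tenfold, and the curve steepens sharply as ratios approach 1:1 (the so-called "net energy cliff").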
The fact that the extraction of oil may be nearing its end may come as welcome relief to those who are concerned about global warming. After all, oil that is not extracted cannot be burned, and burning oil releases carbon dioxide into the atmosphere, effectively warming the planet. Global warming is an existential threat to humanity and the creatures that share this planet with us. Yet oil is an essential part of our economy and how we’ve constructed our lives. The North American lifestyle is built on the car, and even if all cars magically became electric tomorrow, fossil fuels are required to maintain the roads. Cement and steel are both manufactured using high temperatures only economically feasible by burning fossil fuels. [17] In most industrial agricultural and manufacturing environments, there is no substitute for fossil fuel products. Even the production of substrates used to capture renewable energy, like wind turbines or batteries, relies on oil. Oil derivatives are used to create plastics, which are a vital part of food packaging and distribution as they keep food fresher for longer. [18] Polyester, which is made from petroleum, is the most widely used fiber worldwide and accounted for over 54% of fiber production in 2021. [19] A drastic reduction of the oil supply in the near future would cause economic contraction, severe market turbulence, food production and distribution issues, and cascading effects such as political unrest. All of which, of course, would also affect our ability to preserve digital materials.
This is modern industrial civilization's catch-22: Our world is built on access to cheap fossil fuel energy. Without it, we face serious hardships. Yet the more fossil fuels we burn, the more we damage our biosphere, our only home. Green energy may assist in the transition, but it has been predominantly adopted to generate electricity and as such cannot replace the many other fossil fuel uses mentioned above. "Renewable" technology isn't renewable; the sun or the wind it captures is renewable, not the technology. The physical components of a battery or a wind turbine require nonrenewable resources like minerals and metals, which also exist on earth in finite quantities. [20] And unfortunately, Jevons Paradox, an economic concept which states that gains in the efficiency of resource usage (such as energy) cause an increase -- not a decrease, as commonly expected -- in the use of those resources, appears to be holding true. This is because adding more of a resource to the market generally decreases its cost. Thus, adding green energy to the market increases the amount of energy available, thereby driving down costs and increasing overall energy demand. This increased demand makes the transition to low-carbon energy sources more difficult. [21] The exuberant adoption of Artificial Intelligence (AI) also promises to drastically escalate energy demand. [22]
In 2015, 196 parties adopted the Paris Agreement, a legally binding international treaty on climate change, with the overarching goal of keeping global warming to well below 2°C relative to the pre-industrial period of 1850-1900. [23] In 2018, the Intergovernmental Panel on Climate Change (IPCC) released the Special Report on Global Warming of 1.5°C, indicating that the effects of reaching 1.5°C warming had previously been underestimated and stressing the importance of immediate and deep cuts in GHG emissions. [24] In May of 2023, the IPCC's latest report, the Sixth Assessment Report, indicated that we are likely to reach 1.5°C in the "near-term," meaning the early 2030s. [25]
In November 2023, renowned climate scientist James Hansen published an article stating that "under the present geopolitical approach to GHG emissions, global warming will exceed 1.5°C in the 2020s and 2°C before 2050." [26] Hansen's article states that earth is much more sensitive to GHGs than the IPCC accounts for in its models and illustrates the recent nonlinear jump in temperature rise which drastically alters our expected warming pathway.
Although only time will tell how accurate Hansen's predictions are, they are bolstered by the extraordinary air and sea temperatures recorded since June of 2023. In 2023, temperatures averaged a record 1.48°C above pre-industrial levels, with each of the last six months of the year averaging above 1.5°C and several days breaking the 2°C barrier for the first time. [27] This is significantly higher than the 1.1°C warming already recorded. [28] The last twelve months (February 2023 - February 2024) averaged 1.56°C warmer than pre-industrial levels. Sea temperatures have also been rising, reaching a record high this past February. [29]
This does not necessarily mean that the goal of staying at or below 1.5°C warming is dead, since the IPCC temperatures are averaged out over 20 years; we won't know until more time has passed. These temperatures also reflect the current El Niño weather pattern which is known to cause hotter weather. However, it is becoming increasingly clear that we are now “entering uncharted territory” and that as an “alarming and unprecedented succession of climate records are broken.... we are entering an unfamiliar domain regarding our climate crisis, a situation no one has ever witnessed firsthand in the history of humanity.” [30]
This nonlinear rise in temperatures also suggests that tipping points, or nonreversible changes in ecological or climate systems, may be at play. [26] Tipping points can increase the heat in the atmosphere without additional GHG emissions. For example, Arctic ice has a high reflectivity (albedo). The less ice we have in the Arctic, the less of the sun's energy is reflected out of the atmosphere; that energy is instead absorbed by the oceans and land. Tipping points are generally not well understood and are hard to model, but their potential potency suggests that the more carbon we allow into the atmosphere, the more likely it is that a disproportionate amount of warming will occur.
Even though temperatures are skyrocketing, records are being broken, and tipping points may be tipping, global GHG emissions are still increasing. [31] This failure to enact the promises made in the Paris Agreement is perhaps best summed up by the title of the United Nations 2023 Emissions Gap Report: Broken Record: Temperatures hit new highs, yet world fails to cut emissions (again). [32]
There is some literature concerning climate change and digital preservation, and it is generally focused on mitigation: reducing the GHGs we release into the atmosphere. Pendergrass et al. provide an excellent overview of the literature on sustainability and cultural heritage organizations. [33] While mitigation is essential for preventing further warming, it is becoming increasingly evident that adaptation, or dealing with the climate change we are unable to prevent, [34] is needed just as urgently. Many nations in the global south are already in the midst of climate crises, even though they are also the least responsible for GHG emissions. [35] Some activities, like moving physical locations or data centers due to sea level rise or wildfire risk, can take years to plan. The sooner we can address the changes we need to make and build resilience into our institutions and preservation practices, the more successful we'll be. But first we must recognize the impending risks climate change poses and be willing to address them head on. The question then becomes, what changes will we need to make?
As mentioned earlier, digital preservation activities incur both embodied energy costs and operating energy costs. Embodied energy consists of the energy needed to manufacture and deliver the storage mechanisms used for preservation purposes. This energy includes mining for minerals and metals, refining oil or natural gas into plastics, transporting raw materials to storage manufacturers, the manufacturing process itself, and shipping finished products to where they'll be used. Each of these supply chain steps presents risks to preservation practitioners, primarily because supply and production issues can increase costs. For example, in 2011, floods in Thailand shut down production at Western Digital's hard disk drive (HDD) manufacturing plant. At the time, Western Digital produced 25% of the global HDD supply. With that much supply off the market, prices increased 47%. [36] One of the major effects of a warmer world is rising sea levels, which will cause more floods. Conversely, climate change can also cause droughts, like the recent one affecting shipping traffic through the Panama Canal, a major supply route. [37] The raw materials needed to produce digital storage may also become increasingly difficult to source. The fuels - predominantly petroleum derivatives - needed to transport materials and finished products around the world are likely to increase in cost as their EROEI decreases. The minerals and metals needed for the production of digital materials come from non-renewable sources and are likely to face shortages in the future. [38] All these factors point to a future in which digital storage is increasingly expensive. Storage needs for preservation are also increasing. [39] Because of the cascading risks climate change presents, we are also likely to face economic contraction, leading to decreased budgets.
Thus, the risks that embodied energy presents to us predominantly consist of a variety of potential supply chain issues which increase the cost of storage. This increased cost comes at a time when we are likely to face economic contraction while actively growing our collections.
How can we adapt to increasing storage needs, increasing storage costs, and decreasing budgets? One obvious answer is to reduce our collecting purview and cull materials according to established criteria. Reducing the size of our preservation corpus also reduces the size of its environmental footprint. In the 2022 NDSA Fixity Report, 17 survey respondents reported preserving collections over one petabyte in size, 6 of which had collections over five petabytes, before any redundant copies were made. [40] Given the rich and voluminous literature available on archival appraisal, methods of reducing the amount of preserved material are outside the scope of this paper, but they should be more fully investigated and discussed given the ethical considerations of the environmental impacts of preservation. Stewarding a significant amount of data should be considered a risk factor, especially if no consideration has been given to selecting the most essential materials to preserve. Without such selection, or discussion of what criteria "essential" entails, what gets preserved may be left to the fortunes (or misfortunes) of the stewarding institutions. Any reduction in energy expenditure makes an organization better adapted to future circumstances where less energy is available.
The second grouping of energy costs for digital preservation is operating costs, or the energy needed for preservation activities after the digital storage is manufactured. The source of this energy is largely dependent on electrical grids, although transportation costs for employees can also be considered. Both climate change and reduced energy availability pose significant threats to the continuation of digital preservation activities. Furthermore, these threats may be elevated depending on the specific geographical locations of a stewarding institution and the data centers it employs. For example, extreme temperature variations stress electrical grids as more power is needed to heat or cool living spaces. Data centers present immense electrical demands and compete with local populations for potable water. [41] Extreme weather also includes hurricanes, floods, wildfires, and tornadoes, which can all damage the physical buildings which hold the data storage and affect the employees who manage it. There are a variety of tools available to start identifying geographically based risks such as sea level rise and risk of wildfire. These tools provide a means for starting to understand the climate vulnerability profile of an organization and its contracted data centers.
Diversifying the geographical redundancy of preservation copies is a recommended best practice in digital preservation, and it’s important to update our location-based risk analysis to include climate change factors. For example, sea level rise is expected to flood 235 data centers in the US by 2035. [42] These data centers are in areas - Seattle, New York, and Miami - that may once have been considered to be geographically distinct. Evaluating them through sea level rise projections illustrates that they all share the same threat even though they are over 1000 miles (1609 km) from one another. In addition to data center locations, connections between data centers are made vulnerable by climate change. In 2018, Durairajan et al. found that 4,067 miles of fiber conduit will be underwater due to sea level rise, and 1,101 nodes, which include points of presence (POPs) and colocation centers, will be inundated with water by 2035. [42] Subsea fiber-optic cable systems which connect continents are also at risk of damage by storm surges, waves, cyclones, floods, submarine landslides, and ice scour. [43] When evaluating redundant locations for preserved objects, potential connection vulnerabilities should also be assessed.
The direct effects of climate change are easier to project for the variables of geographical location and expected degree of warming due to the availability of data sets and visualizers. However, the effects that occur as part of a risk cascade are incredibly complex, and it is much more difficult to assess appropriate adaptation measures. How does an organization determine the amount of political conflict it will face in the future and assess how that conflict may affect its business activities, including preservation? Similarly, there are many other effects of risk cascades - food insecurity, economic contraction, etc. - that could render the continuing operations of preserving institutions untenable. How can an institution predict its own failure and adapt preservation methods to secure continued access to materials? As Kemp et al. state: "Climate damages lie within the realm of 'deep uncertainty': We don't know the probabilities attached to different outcomes, the exact chain of cause and effect that will lead to outcomes, or even the range, timing, or desirability of outcomes. Uncertainty, deep or not, should motivate precaution and vigilance, not complacency." [44]
While we can’t predict the future, we are also not the only profession challenged by global warming. There are tools we can adopt from other vocations to help us analyze and model potential scenarios, select paths that present the most promise and least resistance, and provide critical evaluation points for adjustments. The suite of tools known as Decision Making Under Deep Uncertainty (DMDU) has seen recent adoption by local water authorities, transportation planners, and other agencies in charge of infrastructure, and represents methodologies which we can employ to evaluate preservation plans in the face of such threats. Deep uncertainty occurs when there is no knowledge of, or consensus on, the likelihood of future events and the effects of actions taken. The basic DMDU principles are: 1) Consider multiple futures in your planning and use them to stress test your plans, 2) Seek robust plans that perform well over many futures, 3) Create plans that are flexible and adaptive, 4) Use analytics to explore options and futures. [48] Currently, digital preservation decisions are made by identifying the likelihood of certain future threats and taking steps to mitigate those threats, known as the "predict then act" model. In contrast, DMDU methods involve proposing plans and then iteratively testing them against a variety of potential futures. Four examples of DMDU methods, described in brief below, are Scenario Planning, Adaptive Pathways, Robust Decision Making (RDM), and Decision Scaling.
Scenario Planning begins with identifying the decision challenge, identifying the forces at work -- such as social, economic, political or technological issues -- fleshing out scenario narratives, and then using those scenarios to develop a robust plan. [46] [47]
Adaptive Pathways provides a framework for creating adaptive plans that change as circumstances evolve. Adaptive Pathways can help time the sequence of various plan options (i.e., paths), and help identify when these paths create lock-in (i.e., restrict alternate pathway options) or require dependencies (other actions that are necessary to enact). Adaptive Pathways planning includes identifying signposts, or adaptation “tipping points” that indicate when plans need to change. [48]
Robust Decision Making (RDM) uses a quantitative process to stress test strategies over many plausible futures. Based on a two-by-two "XLRM" matrix constructed by stakeholders, it uses computer simulations to evaluate potential strategies and identify those that are the most robust, or perform well over many scenarios. While the quantitative modeling involved in this method may be outside the normal skill set of a digital preservationist, it represents a promising area of collaboration with data analysts. [49]
Decision Scaling provides a structured means of performing climate stress tests. The analysis starts by identifying goals and the climate conditions that could prevent them from being met, and involves three steps: decision framing, the climate stress test, and estimating climate-informed risks. [50] Decision Scaling centers on the question "what do we want to achieve and what will prevent this?" It generally starts with data from the historical climate record and then perturbs climate variables to test strategies. [51]
DMDU methods assess performance relative to other strategies, often using "regret" as a measure. Regret is the difference between a strategy's performance in a given future and that of the optimal strategy in that future. [51] "Low regret" strategies are ones that will likely provide success or benefits under most future scenarios, while "no regret" strategies provide success or benefits under all future scenarios. [52] No regret strategies may come at a higher cost, which should factor into the decision making.
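To make the regret measure concrete, the following sketch scores three hypothetical preservation strategies across three invented futures and selects the strategy with the smallest worst-case regret (the minimax-regret criterion). All strategy names and scores are illustrative, not drawn from the DMDU literature.

```python
# Hypothetical performance scores (higher is better) for three preservation
# strategies across three illustrative futures; all numbers are invented.
performance = {
    "status quo":        {"stable": 9, "energy crunch": 2, "rapid warming": 3},
    "reduce corpus":     {"stable": 6, "energy crunch": 7, "rapid warming": 6},
    "diversify storage": {"stable": 7, "energy crunch": 5, "rapid warming": 7},
}

futures = ["stable", "energy crunch", "rapid warming"]
best = {f: max(s[f] for s in performance.values()) for f in futures}

# Regret = gap between a strategy's performance and the best achievable
# performance in that future; minimax regret picks the strategy whose
# worst-case regret is smallest.
regret = {name: {f: best[f] - scores[f] for f in futures}
          for name, scores in performance.items()}
worst_case = {name: max(r.values()) for name, r in regret.items()}
robust_choice = min(worst_case, key=worst_case.get)
print(robust_choice, worst_case[robust_choice])  # -> diversify storage 2
```

Note that "status quo" wins in the stable future yet fares worst overall: robustness across futures, not optimality in one, drives the choice.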
DMDU tools can require significant stakeholder involvement and time investment but are also flexible enough to be introduced incrementally. [51] Even small steps, such as defining signposts which indicate when actions need to be taken, can increase the robustness of a digital preservation strategy. For example, a stewarding institution may decide to expand the geographical diversity of its preservation corpus to international locations should a political candidate be elected who promises to thwart efforts to move to renewable energy sources. In this case, the election results serve as the signpost indicating the need for change.
Data size and number of replications should also be considered a factor in these decision-making methods, so that institutions can assess how the total size of their preserved content may present challenges to continued stewardship. For example, if future scenarios show increases in preservation storage costs over time as well as decreases in budget available for preservation, the institution can reduce the overall amount of data preserved in their strategy and then stress test the new strategy to evaluate its robustness, eventually ending with better knowledge about how much data they can preserve within the acceptable level of regret.
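A stress test of the kind described above can start very simply. The sketch below projects annual storage spend against a shrinking budget under one invented scenario and reports the first year the budget is exceeded, showing how culling the corpus buys time. Every parameter is a made-up example for illustration, not a recommendation.

```python
def years_until_budget_exceeded(data_tb, growth, cost_per_tb, cost_trend,
                                budget, budget_trend, horizon=20):
    """Return the first year preservation spend exceeds budget, or None."""
    for year in range(1, horizon + 1):
        data_tb *= (1 + growth)          # corpus grows each year
        cost_per_tb *= (1 + cost_trend)  # storage cost per TB drifts
        budget *= (1 + budget_trend)     # budget drifts (here: shrinks)
        if data_tb * cost_per_tb > budget:
            return year
    return None

# Invented scenario: corpus growing 10%/yr, storage cost rising 5%/yr from
# $50/TB/yr, budget shrinking 2%/yr from $60,000/yr.
base = dict(growth=0.10, cost_per_tb=50.0, cost_trend=0.05,
            budget=60_000.0, budget_trend=-0.02)
print(years_until_budget_exceeded(500, **base))  # full 500 TB corpus   -> 6
print(years_until_budget_exceeded(350, **base))  # culled 350 TB corpus -> 8
```

In practice, the same function would be run over many sampled scenarios rather than one, with the spread of failure years informing the acceptable level of regret.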
“Not everything that is faced can be changed, but nothing can be changed until it is faced.”
- James Baldwin
The age of cheap “energy slaves” provided by fossil fuels is coming to an end, and we are caught between needing to reduce our greenhouse gas emissions while also being critically dependent on the sources that emit them. The hidden basis for all digital preservation activities is energy, whether embodied energy or operating energy, and it is only by understanding that dependency that we can recognize the risk it presents for us.
Both impending oil shortages and climate change will dramatically alter our lifestyles, our institutions, and our civilization, in deeply uncertain ways. DMDU tools present new methods that help test a variety of strategies against plausible future scenarios, thus helping make them more robust in many different futures. Simultaneously, it’s important to recognize that digital preservation is an ongoing commitment to energy expenditure and resource extraction; there are ethical considerations to make when we decide to preserve or continue preserving materials and these considerations should not be taken lightly.
We are in the realm of 1.5°C warming above pre-industrial temperatures and quickly headed toward 2°C. This level of baked-in warming requires us to shift our thinking to adaptation measures. The longer we wait, the less time we have to enact potential strategies to preserve our digital materials, and the less likely it is that the resources to do so will be available.