Resilient Computing - Expanding Use Scenarios to Catastrophic Events
Lately, I’ve been trying to get some new work started that sits at the intersection of disaster, computation, the essence of humanity, and a variety of other perspectives. There’s a massive tangle of shadows sitting in the spaces between disciplines that is becoming more and more necessary to unravel, but there’s little to use as a guide, so experimentation is in order. This is connected to a spatial computing piece focusing on digitizing emergency management, so I thought I’d start with some leftover bits from there and see where we go.
Computation is born of war, following a prolonged acceleration of logistics begun by Napoleon Bonaparte’s theft of the technologies manifested by the Cassini-driven Composite of France. As such, at the center of computation is a requisite foundation of geospatial calculation, fueled by an ever-increasing race toward efficiency, complexity, and fidelity.
Out of this process, we saw a brief moment in time where there was some concern about the type of labor these new machines would birth, followed by a rather complete destruction of literally any criticality toward them. The result is essentially the treatment of the computer as something like the Spice in Frank Herbert’s Dune: we are addicted. In fact, disconnecting from computers produces something very much like withdrawal symptoms that are difficult to understand, and now that society itself is a user, none of the ways we govern ourselves are possible without this machine.
The potential of computation, the manner in which it has spread across culture, and its impact have been lost to our attention due to two events:
- The attachment of cultural power to technological complexity.
- The computer inevitably coming home and becoming mobile.
And more recently, a third event began:
- Because of the first two, we are inherently unable to defend ourselves should someone with resources and without ethical or moral bounds decide to use those two events to their own personal advantage.
We could call this the “inevitable attraction of the corruptible to the lack of restraint we place on computation” process if we want to keep the Dune metaphors coming. However, it’s a bit simpler than that. Because we lack an ability to be critical of computation in general, we simply haven’t had the time, resources, or ability to give this problem attention long enough to understand what a defense might even look like. It is here that we see the hoisting of responsibility onto a third party, though in this case it is the Democrats whom folks are constantly screaming at to DO SOMETHING while absolutely no one is really capable of even starting to know what that something might be.
Now, some folks will say that the loss of the humanities in higher ed, the slow receding of the social sciences, or a sort of loss of popularity for human-focused majors is the cause of this issue. Unfortunately, the truth is closer to this: the grandparents, or great-grandparents, of the humanities, the social sciences, and the researchers around those domains had a significant hand in creating so many of the problems we have now. The forced separation between culture and material culture did not allow us to actually deal with computation as it happened.
The hoisting of that legacy onto the fields that emerged from a lack of action or involvement (e.g., human-computer interaction, human factors, computer science, software engineering, and the like) has switched the discussion from how these things fit with culture to how these things now reflect culture. As a result, we can do nothing but react or respond as we are carried down a stream without any sort of ground to stand on or build from.
The focal point of resilience is being able to sustain ourselves under increasingly dire circumstances. And this is important because we live in a world where severe weather is getting bigger, more frequent, and less predictable. And this is important because, for the most part, humans build cities in some of the dumbest places possible, like:
- Right next to the water.
- In the forest.
- Next to a fault line.
- In or near a desert.
And we mostly did this because we didn’t know how the earth’s forces worked, but we also haven’t corrected anything since we did learn about them. So what happens is that we build great things but have no protection against the inevitable event that will happen there, because waters will almost always rise, wood burns, and the ground will shake. Pair this with other kinds of human-focused hazards, like riots born of socio-political turmoil, a lack of funding for critical infrastructure, and an inability to keep researching the topics involved in understanding the earth as well as the humans living on it, and you have a do-or-die point for the current issues computing has created for us.
For example, the DHS very recently announced that the BRIC program was terminated because it was wasteful and inefficient. BRIC, which stands for “Building Resilient Infrastructure and Communities,” marked a very notable investment in fostering resilience, that is, in fostering the capacity of human settlements to resist disruptions. This is notable because one thing we’ve learned recently is a very simple formula: $1 = $6.
Or more directly: for every $1 we spend on mitigation, on enhancing the resilience of an area, we gain $6 back when a hazard event and subsequently declared disaster requires response. So in other words, if we spend $1 million on enhancing the resilience of an area and it’s impacted by something like a hurricane, we save ourselves $6 million in responding to, recovering, and rebuilding the impacted area.
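To make that arithmetic concrete, here is a minimal sketch in Python (not from the original post; the function name and the fixed 6x multiplier are illustrative assumptions based only on the rule of thumb above, not an official FEMA or BRIC calculation):

```python
# Minimal sketch of the "$1 of mitigation saves $6 in response" rule of thumb.
# The 6x multiplier and the names below are illustrative assumptions only.

MITIGATION_SAVINGS_MULTIPLIER = 6  # $6 avoided per $1 spent on mitigation


def estimated_savings(mitigation_spend_usd: float) -> float:
    """Estimated avoided response/recovery cost for a mitigation investment."""
    return mitigation_spend_usd * MITIGATION_SAVINGS_MULTIPLIER


if __name__ == "__main__":
    spend = 1_000_000  # the $1 million example from above
    print(f"Spend ${spend:,.0f} on mitigation -> "
          f"avoid roughly ${estimated_savings(spend):,.0f} in response costs")
```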
Now, we could also see something like the loss of the BRIC program as a benefit if that cash were re-allocated toward weather modeling, balloons, and the monitoring of weather patterns. And yet we additionally see cuts to NOAA, the NWS, and the finishing off of the series of tools we’ve used to predict weather patterns for decades. The rationale is that we already have enough data to train models on. The issue is that while many of you would point toward climate change, the anthropocene, or even the solar maximum as reasons to keep collecting data, the technologist says this is pointless because we have a huge amount of data and just need to use it, because the models are more advanced now. We can replace everything with AI because humans aren’t needed in those spaces anymore.
So in essence, while we can say that the attachment of cultural power to technological complexity and the computer being available to everyone was something of a useful idea, we are now entering into a new phase of the consequences of those actions. From a critical standpoint, the tenets of technological determinism have now been deployed en masse.
The belief in predictive modeling and AI/Machine Learning/Whatever is itself a piece of determinism in that we have somehow become so modern that we have won against nature itself. However, we are so far away from understanding the underlying processes of anything in our existence that trying to model something is a lot like asking folks to draw Homer Simpson from memory but instead of it being funny, it has 6 fingers and blends like a character in the movie Annihilation.
We do not really have a term yet to collect the criticism that will probably start to appear, written by folks who spent most of their careers ignoring it because determinism was useful to those careers. If someone does name the space (probably something silly like neo-post-colonialism computing), I haven’t decided whether I’ll just mute it or become one of them.