Vulnerability, Resilience, and Technology

Why Emergency Management Should Be a Leader in Technology

I am a researcher of the socio-technical who has made the jump from the social sciences (the socio) to the technical. This direction of the jump is arguably the harder one, because the social sciences are not as strict, literal, or frustrating as a computer, compiler, or interpreter. As my work in technical spaces solidified and found a focus in emergency management (as a crisis informatics researcher), I have been left wondering what emergency management's place is with regard to technology. For a while I felt that place would be accessibility and technology, since disasters, crises, and uprisings can leave us immobile or altogether disabled in the ways that domain focuses on.

But as I considered this, I thought more about how to reconcile an important aspect of my training in the sociology of race, gender, and ethnicity: that statistics began with eugenics, and that machine learning, artificial intelligence, and algorithms are (in general) applied statistics. With this in mind, emergency management may be able to work on accessibility while also being a critical part of fostering new ways of seeing the world.

Perhaps a little hyperbolic…but only a little.

The easiest way to begin is to unpack that claim and then re-configure its contents for repacking. Since I'm in and around the study of emergency management, I thought it would be a useful vehicle through which to engage another idea simultaneously: that emergency management understands technology about as well as my cat does (which is not a knock against EM at all; my cat can be an amazing cat without technology).

I think this makes sense because, fundamentally, a lack of understanding of why and how technology matters is part and parcel of how EM will change in the coming years. This is due (overwhelmingly) to the fact that technological development occurred so quickly that it could never have been integrated in the first place. Yet the turn to integrating technologies into emergency plans, moving emergency management toward greater technological savviness, is inevitable.

Regardless of its speed, we can see now that there have been decades of slowly increasing vulnerability concurrent with decreased resilience to disruption. This has resulted in a fragile communications infrastructure that is a single coordinated attack away from being destroyed.

For this piece, I'd like to consider three central actors relevant to this discussion: technology, statistics, and ignorance.

A picture of the cover of Thicker Than Blood: How Racial Statistics Lie by Tukufu Zuberi
This book is something emergency management folks engaged in analysis should read. An additional text, White Logic, White Methods, is helpful in digging deeper.

Bruno Latour said that, "Technology is society made durable." This idea is a powerful one because it forces us to think about how technology (in this case the computer, or information communication technology generally) has moved from being a tool we used to do work to mediating society itself.

Yet as emergency management practitioners and academics, we understand that society itself is not terribly sturdy. In fact, the shift from "natural disasters" to "there is no such thing as a natural disaster" is representative of just how little consideration goes into the safety and protection of, and planning for, the people who will live with those decisions. This shift is important because it allows emergency management to become not only responders, but important actors in the development of technology.

If there is no such thing as a natural disaster, emergency management allows us to ask of any technology:

  • What parts of it enhance vulnerability to predictable events like tornadoes, floods, and earthquakes?
  • What aspects of it could exacerbate existing strain?
  • And more importantly, what parts of society shouldn't have been made durable in the first place?

This brings us to another aspect of "society made durable": the question of time. What society are we discussing? What time period? And this is where technology, discussions of the power embedded in technology, and the unpacking of all those things become a little difficult.

"The language of statistics is full of terms and symbols that have no meaning to social scientists across disciplines and usually are not important to anyone who is not an expert in the specific discipline's practice of statistics. But the foundations of statistical applications to the study of society are the same and can be understood and debated on the basis of basic mathematics and logical statements. Yet there is no set logic in the methods themselves."
— Zuberi's introduction to White Logic, White Methods

At the core of information communication technology is statistics (I am making a simplified argument for the sake of time), especially applied statistics. These are a sort of "language of the present," as machine learning, artificial intelligence, and every form of analytics you can think of are built on statistical measures created by the social sciences. For a longer discussion of the complexities I am over-simplifying, click here.

If technology is society made durable, then the methods of analyzing groups, methods that let us understand and make inferences based on sampling those groups, form the basis of technology. Yet statistics was created by researchers who (like most scientists at the time) were out to prove, objectively, that white groups were superior to all others by orders of magnitude. This eugenics-based foundation is still part and parcel of the toolkit of statistics in that:

“…’statistical significance,’ [which was] for decades the measure of whether empirical research is publication-worthy, can be traced directly to the trio [of its creators: Pearson, Galton, and Fisher].”
https://nautil.us/issue/92/frontiers/how-eugenics-shaped-statistics

And while applied statistics is different from inferential statistics, their foundation is the same. These issues surface in discussions of algorithmic bias, a continuation of the history of statistics as a way to codify and measure difference among social categories such as race. As a result, when we integrate technologies that rely on these kinds of applications, we inadvertently perpetuate a society we'd like to forget.
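The mechanics of this kind of bias can be made concrete with a toy sketch (entirely synthetic data and hypothetical groups, not any real system): a simple threshold "model" chosen to maximize overall accuracy ends up encoding the majority group's pattern while failing a minority group whose pattern differs.

```python
# Toy illustration (synthetic data, hypothetical groups) of how a model
# optimized for aggregate accuracy inherits the majority's patterns.
# Group A (90 samples): feature > 5 predicts the positive label well.
# Group B (10 samples): the relationship is reversed.

group_a = [(x, x > 5) for x in range(10) for _ in range(9)]   # 90 samples
group_b = [(x, x <= 5) for x in range(10)]                    # 10 samples
data = group_a + group_b

def accuracy(rule, rows):
    return sum(rule(x) == y for x, y in rows) / len(rows)

# "Train": pick the single threshold rule with the best overall accuracy.
rules = [lambda x, t=t: x > t for t in range(10)]
best = max(rules, key=lambda r: accuracy(r, data))

print(accuracy(best, data))     # 0.9 -> high overall accuracy...
print(accuracy(best, group_a))  # 1.0 -> ...because it fits the majority
print(accuracy(best, group_b))  # 0.0 -> ...while failing the minority
```

Nothing in the "training" step mentions groups at all; the disparity emerges purely from optimizing an aggregate measure over an unbalanced population.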

What's more, statistics most generally seeks the space of least disagreement. Mean, median, and mode allow us to locate that space. By default, it will typically end up where the majority's norms dominate, simply because of their majority status. And this influences…everything.
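A minimal sketch (with made-up numbers) shows how these measures of central tendency gravitate toward the majority: mix 90 observations from a "majority" group with 10 from a "minority" group whose values differ sharply, and the summary statistics land on, or very near, the majority's value.

```python
from statistics import mean, median, mode

majority = [1] * 90   # hypothetical majority group's responses
minority = [10] * 10  # hypothetical minority group's responses
sample = majority + minority

print(mean(sample))    # 1.9 -> pulled only slightly toward the minority
print(median(sample))  # 1.0 -> exactly the majority's value
print(mode(sample))    # 1   -> the majority's value
```

The minority's values barely dent the summary: the "space of least disagreement" is, numerically, the majority's space.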

Thankfully, some of this has become less pronounced over time thanks, in part, to equal rights movements. Despite that, we still ask the question, "What is your ethnicity?" We ask it as a static variable despite the fact that race is socially constructed and immensely variable across time, space, and place. This can be seen in the history of the US Census, where different groups have gained categories in that question over time, mostly due to societal pressure from those groups. For a discussion of this, see the books Thicker Than Blood: How Racial Statistics Lie or White Logic, White Methods.

Finally, this is where ignorance comes into play. To be clear, this is not a pejorative term, though it is often used as one. Ignorance is, basically, a lack of knowledge, understanding, or information about something (per https://dictionary.cambridge.org/us/dictionary/english/ignorance).

We see ignorance all the time. At gas stations, customers and clerks celebrate not understanding those infernal point-of-sale machines. Online, we celebrate not understanding technologies like social media. In emergency management Facebook groups, we see folks celebrate not understanding ransomware attacks or cyber-vulnerabilities. And while this is one way that users are ignorant, the same applies to researchers, technologists, and academics.

There are entire swathes of researchers who want to bring technology to EM because everyone has a mobile device and wants to use it for just about everything. Yet ignorance of history, of the foundations of statistics, and of how emergency management actually works often creates a cycle: technologists declare they will provide emergency management technology, then call emergency management "backward" and "in the stone ages" while shutting down its participation.

This calls back to the idea that technology is society made durable.

As we can see at the moment, society is a giant fragile mess that is only getting more fragile. The integration of technology into EM, then, is problematic in ways that could be useful. The state of things offers us a way not only to strengthen the parts of society that are made durable, but to do so in a way that helps prepare society for the mess it created for itself amid this ignorance. Emergency management can do this because it understands the ways that vulnerabilities, resiliencies, and weaknesses manifest. As a result, we can (with a lot of work) become a way to foster technologies that make society more durable.

And more importantly, undo some of the history that got us here in the first place.

PhD: Information Science. Programming Pedagogy, Data Science, Crisis-Informatics, Map Interfaces, Science and Technology Studies, Play, and Game Studies.
