Sunday, November 23, 2014

The (Human) Value of Technology: Heidegger, Winner, Mumford, & Ellul

In the last post, we began looking at the Philosophy of Technology, focusing on an existential problem of our time: the conflict between our humanity and our machines.  This conflict is often personified by famous movie villains—the HAL 9000 computer, Darth Vader, the Terminator machines, and other mechanical monsters.

Heidegger's writings on technology 

(from Amazon)

Think of the problem this way: are our technologies serving us, or are we becoming slaves to our technologies?  The German philosopher Martin Heidegger referred to this problem as “enframing” [Gestell in German] in his famous essay, “The Question Concerning Technology.”  Modern technology, warns Heidegger, may “enframe” us, by which he means it may objectify, isolate, and control us, treating us as if we all were mere “human resources” to be quantified, calculated, rationed, and perhaps eventually discarded.  (See the previous post for my review of Heidegger's provocative essay.)

When we serve technology (instead of it serving us), we lose our humanity by overemphasizing some criteria at the expense of other human values.  For example, a potential danger of modern technology is that it overemphasizes efficiency and expediency at the cost of robustness and resilience.  This danger is not just a philosophical discussion; it’s a practical matter.  I live in the Twin Cities, Minnesota, where the I-35W Bridge collapsed in Minneapolis in 2007.  According to officials from the Federal Highway Administration, the bridge collapsed because its steel plates were inadequately designed to support its load and weight.  In other words, the bridge collapsed because its design sacrificed robustness to efficiency.  In business, we call that "cutting corners," and in this case the consequences were literally matters of life and death.

Try as we might, we cannot separate the analysis of technology from the study of human values.  And this is essentially what the Philosophy of Technology is all about.  Besides Heidegger, the Philosophy of Technology includes many other important writers that have thought about technology from the standpoint of human values.  In this post, I want to bring three more writers to your attention.

Autonomous Technology, by Langdon Winner

(from Amazon)

An important writer is Langdon Winner, author of Autonomous Technology.  Like Heidegger, Winner stresses that technologies are more than just artifacts.  Technologies are, more broadly understood, cognitive and social activities.  My computer, keyboard, and modem are not just mechanical objects; they are means to participate in digital activities, such as composing and blogging.

As a political scientist, Winner studies how technologies affect social relationships.  He's especially fascinated by what he calls “autonomous technologies,” which are “self-governing, independent, not ruled by an external law or force.”  An ‘autonomous technology’ is basically a Frankenstein monster: a machine over which we’ve lost control, an instrument that lets loose unintended consequences like a Pandora’s box.  Thus, the invention of nuclear power also gave us the danger of nuclear bombs.

A theme in Winner’s book is that technology is never value neutral.  Technologies are parts of human activities.  Human activities form human relationships.  And human relationships embody human values.

Myth of the Machine,
by Lewis Mumford
(from Amazon)

A similar writer is Lewis Mumford, who wrote a classic book on technology and human values called The Myth of the Machine.  In this very detailed, two-volume work, Mumford explores what he calls the history of “technics,” which basically means how we use technology throughout history.

Looking at examples from the Neolithic period and Ancient Egypt to the 20th century, Mumford analyzes how we use technology to create machines.  For Mumford, however, a machine is not only a thing but also a social organization.  In his words, a machine is “a combination of resistant parts, each specialized in function, operating under human control, to utilize energy and to perform work.”

If a machine is both a piece of technology and a social organization, then Mumford is particularly interested in what he calls “the megamachine.”  The megamachine is a huge social organization that uses technology to expand itself continuously like an empire.  For example, there are “labor machines” that perform collective work for a common purpose—the Egyptian kings of the Third Millennium B.C. used labor machines to create the grand pyramids.  We may think of many modern corporations as megamachines, using organized labor to make large-scale products.  (Mumford also discusses ‘military machines’ like the army and ‘communications machines’ like governments.)

Mumford really wants us to question whether megamachines are becoming too big and powerful.  Recall Eisenhower's warning about the military-industrial complex.  Or consider what economists now call too-big-to-fail corporations.

The Technological Society

by Jacques Ellul 

(from Amazon)

Another writer worthy of mention is Jacques Ellul.  Like Heidegger, Winner, and Mumford, Ellul investigates in his book, The Technological Society, what technologies do to our social lives.  In particular, Ellul looks at how we use technologies to produce what he calls "techniques" of “absolute efficiency."  By "technique," Ellul specifically means "the totality of methods rationally arrived at and having absolute efficiency"—a pretty narrow definition of 'technique,' but one that serves his own rhetorical purposes.  For Ellul, the question is, do we really control these technologies and techniques, or do they control us?

A question like that reminds me of mobile technologies such as smartphones, which give us new communication techniques.  Do users really control them, or do they control users?  I'm reminded of friends and family members who carry their phones everywhere they go and check them constantly.  Even worse, they always have their phones connected to the Internet, receiving instant updates from text messages, e-mail, Facebook, Twitter, Instagram, Pinterest . . . and God knows what else!  The result: not a minute goes by without their attention being disrupted by some kind of screen notification, often of a petty nature.

Maybe my example helps illustrate Ellul's point: "technological society" is developing in such a way that we are not the ones controlling our technologies; they are controlling us.  In short, we are losing our freedom and agency in this new "technical milieu."  For the record, Ellul believes in human freedom, but he doesn't believe it can effect social change.  Hence, he exclaims, "Technique is essentially independent of the human being who finds himself naked and disarmed before it."  Passages like that one in The Technological Society strike me as a bit hyperbolic.  Nevertheless, the book is, in a sense, a careful discussion about what happens when we design technology to overemphasize “efficiency” or “efficient techniques” at the cost of other important human values, such as human freedom and social engagement with others.

So were these writers right?  Well, as I mentioned, I occasionally find writers like Ellul a bit hyperbolic.  Don't get me wrong: reading them is a tremendous learning experience.  However, they often convey a tone that's fatalistic, pessimistic, and oozing subversive criticism while offering no creative solutions.  I agree with these writers that we shouldn't be overly optimistic about technology or technological progress, but we shouldn't necessarily be overly pessimistic either.  I prefer a middle way.  The philosopher William James called such a middle way (between optimism and pessimism) "meliorism"—the position that it's possible (but not inevitable or futile) to improve our lives (e.g., with technology).

So to end with a cliffhanger, swing by next time to read about this meliorism that William James and his tradition of American Pragmatism gave us, because its application to technology is indispensable for students and professionals alike.

Saturday, November 8, 2014

From Monsters to Machines: What is Philosophy of Technology?

2001: A Space Odyssey (image from IMDb)

When you read ancient myths from Greece, Scandinavia, or Britain, the monsters are dragons or giants.  In today’s stories, they are machines.  Take, for example, many of our science fiction and superhero movies.

In 2001: A Space Odyssey, we see the HAL 9000 computer murder an entire space crew, with the exception of Dave Bowman, who barely survives.

Darth Vader (image from IMDb)

In the Star Wars movies, we see Anakin Skywalker’s ruinous fall as Darth Vader, becoming, in the words of Obi-Wan Kenobi, “more machine now than man.”

Leviathan monster from The Avengers 

(image from Marvel Movies Wiki)

And in the latest plethora of superhero movies like The Avengers, we see cybernetic soldiers and mechanical monsters attempt to destroy human civilization.

Need I even mention the Terminator machine?

Terminator (image from IMDb)

If today's mythical monsters are machines that threaten human existence, then there is clearly a conflict between our technology and our humanity.  For artists and philosophers alike, this conflict raises an existential question.

How do we relate to modern technology without losing our humanity?

Heidegger's writings on technology

This question characterizes what’s called the Philosophy of Technology.  Perhaps the most important introduction to it is a famous essay by 20th-century German philosopher Martin Heidegger.

In his essay, “The Question Concerning Technology,” Heidegger gives insights into how we use technology, or how technology uses us.  Now Heidegger’s writings are notoriously difficult, so here I’ll give you the gist of the essay—then you can guide yourself through it, if you dare.

Basically, Heidegger asks, what is the meaning or “essence” of modern technology?

His answer: Gestell, which is German for “enframing.”

For Heidegger, “enframing” [Gestell] is when we use technology to isolate something in nature.  We take entities like trees and waterways (which we then refer to as “natural resources”) and treat them as a “standing reserve” [Bestand]—that is, as a “stock” of utilities to be stored for later use.

Hydroelectric dam in China 

(image from Wikipedia)

Heidegger was writing before the Digital Age, so he gives mechanical examples to illustrate “enframing.”  Take the example of a hydroelectric plant, which isolates a river and transforms it into a power supplier.

In the enigmatic words of Heidegger, this transformation “sets upon nature . . . in the sense of challenging it. . . .  This setting-upon that challenges the energies of nature is an expediting. . . . Yet that expediting is always itself directed from the beginning toward furthering something else, i.e., toward driving on to the maximum yield at the minimum expense.”

Say what?  Allow me to translate . . .

Heidegger is telling us that technology is more than a tool.  It’s a way of relating to the world.  In particular, modern technology is an expedient way of relating to nature, because it objectifies nature and turns it into a natural resource that can be quantified, calculated, and rationed.

Now the tone here may sound alarmist, but Heidegger was no hippie (quite the opposite, but that’s another story).  There’s nothing necessarily wrong with using technology to “enframe” nature.  But there's a real danger: we may use technology to “enframe” ourselves.  Heidegger warns us:

“As soon as what is unconcealed no longer concerns man even as object, but exclusively as standing-reserve, and man in the midst of objectlessness is nothing but the orderer of the standing-reserve, then he comes to the very brink of a precipitous fall, that is, he comes to the point where he himself will have to be taken as standing-reserve.”

In other words, we may use technology to turn ourselves into “human resources.”  And when technology threatens to turn us into human resources, we may try to compensate by pretending we are masters of the universe, or so says Heidegger: "man, precisely as the one so threatened, exalts himself to the posture of lord of the earth."  (This self-exaltation was, of course, the fate of Anakin Skywalker/Darth Vader.)

As I pointed out in the previous post, this understanding of modern technology ignores the original meaning of technology, which was art or artistic skill (from the Greek word techne).  Heidegger explains this older understanding of technology as art:

The Aqueduct Bridge of Segovia,
a Classical example of technology as art
(image from Wikipedia)

“technikon means that which belongs to techne . . . techne is the name not only for the activities and skills of the craftsman, but also for the arts of the mind and the fine arts. Techne belongs to bringing-forth, to poiesis; it is something poetic.”

So originally, technology was understood as a kind of art.  My favorite examples of technology as art include Classical architecture.

Heidegger, like a good philosopher, ends his essay with some advice.  He argues we should not give modern technology a monopoly on how we relate to the world (i.e., relating to the world through the value of expediency, seeing nature as nothing more than a resource).  For instance, we can also relate to the world artistically or poetically:

“There was a time when it was not technology alone that bore the name techne. Once that revealing which brings forth truth into the splendor of radiant appearance was also called techne. Once there was a time when the bringing-forth of the true into the beautiful was called techne. The poiesis of the fine arts was also called techne.”

Here's the bottom line: modern technology is useful and necessary, but we can balance it with a more holistic perspective on technology, especially an artistic perspective.

In this light, it may be a meaningful coincidence that cinema has embodied Heidegger’s advice by producing movies that use technology to make art, express our humanity, and help us to think about philosophical questions concerning technology.  (For instance, what do mechanical monsters tell us if they are metaphors for the conflicts between our machines and our humanity?)