Justin Daba
The language of purpose seems to have lost its significance in a world dominated by progress. Scientific advancements propel the world forward through innovative instruments like AI, meant to make life better, without ever specifying exactly what a better life looks like. Maybe we’ve mistaken a life of perpetual improvement and efficiency for a good life, forgetting exactly what it is science seeks to attain and why. There is no doubt that science has improved various aspects of life, but perhaps the desire to master nature has eclipsed the quiet inclination to live in accordance with it. As we fly closer to the sun and our wings grow weak, will we long for the soil as we once longed for the sun?
AI seems to be the latest iteration of our desire for absolute freedom through technical mastery, using the scientific method to transcend the realities we find ourselves confined to. But what is AI moving towards? How can we attempt to measure the trajectory, or the purposive outcome, of an instrument of unprecedented use and impact? To understand this question, we need to familiarize ourselves with the essence of technology, and how it is shaping our understanding of the world.
In his essay “The Question Concerning Technology,” the 20th-century German philosopher Martin Heidegger sought to explore the effects of technology on society, penetrating into its very essence in order to understand its most fundamental nature and its implications for meaningful living.
The essence of technology, for Heidegger, is a way of revealing the world. As he states, “the essence of technology is by no means anything technological.” The essence of technology isn’t a practice, a branch of science, or a set of principles; rather, it is a particular worldview.
By understanding technology as a way the world reveals itself to us, Heidegger was trying to shed light on the ways in which technology defines and distorts how we understand and relate not only to the world, but to others, and even to ourselves. What makes his perspective relevant is that he wasn’t merely concerned with how our everyday use of technology distorts human experience. His analysis pierced deeper, reaching the ontological presuppositions that justify and naturalize a world dominated by science, as if it were the only way to truth and enlightenment. In other words, he wanted to illuminate the worldview predicated on the cold, calculative, and purported “objectivism” of scientific thought, a worldview responsible for denigrating reality through a failed attempt at technical mastery.
Heidegger turns to key Greek concepts to show that the distortions of modern science arise from a deeper metaphysical shift: a drive to dominate and instrumentalize nature rather than to let beings emerge in their own way. The Greek word poiesis signals a bringing-forth, a way of allowing something to exist and become in accordance with its nature. Think of a flower blossoming. Techne, on the other hand, is a way of revealing through artistry or craft, bringing something into being through technical wisdom. Think of a woodworker creating a chair through technical skill and artistic expression. Techne once accorded with the essence of poiesis: things were made so as to bring forth their natural elements, allowing rather than distorting their nature through creation. The woodworker’s chair was created from bits of wood through care and poetic configuration, and through this mode of revealing, its creation mimicked the logic underlying the blossoming of a flower. Thus techne was infused with the spirit of poiesis; technical mastery worked with, and not against, nature.
Modern technology departs sharply from these modes of revealing through what Heidegger calls enframing (Gestell). Rather than bringing-forth as in poiesis, modern technology challenges-forth nature, ordering it to yield energy, data, and utility. It does not simply allow beings to emerge according to their own possibilities; instead, it positions them within a framework of control and optimization. Enframing imposes a “frame” onto reality, one defined by the instrumentalization of all entities as resources. Within this mode of disclosure, the world presents itself as standing-reserve (Bestand): objectified material awaiting extraction, calculation, and use.
I believe AI is a neutral instrument. However, it gains a particular ethical orientation depending on the actors using it and the values of the world it is brought into. As things stand now, it seems that AI accords completely with the logic of enframing. It functions on seemingly limitless resources, rendering reality as a source of infinite input for the sake of infinite output. The natural world isn’t seen as a source of life, spirit, and thought; it’s denigrated and objectified as a source of endless material to propel illusions of progress. Not only does AI transform the world around us, it begins to reshape the very structure of our own thinking and self-understanding. Authentic reflection risks being drowned out by computational thinking; meaningful thought is being supplanted by efficiency. We are becoming creatures who think for the sake of reducing our need for thought, propelled by a logic that values results and ends over meaningful processes.
How reality reveals itself to us has undergone a fundamental metamorphosis. Without reckoning with the impacts of enframing and the extent to which it taints our understanding of the world, how can we, both individually and collectively, begin to formulate informed and unbiased approaches to the use and regulation of AI?
In other words, I believe Heidegger’s analysis is warning us about the foundations of current AI discourse. AI is the product of a larger movement; it is the latest instantiation of modern science’s attempts at mastering the human condition. Its essence is bound up with the essence of modern technology: enframing. We cannot begin to speak of regulations and policy recommendations without reckoning with a more fundamental set of questions, questions attuned to the essence of AI and to the assumptions underlying our belief that it has a place in our future. We need to uproot ourselves from a world dominated by the hegemonic influence of enframing, cleanse our perspective, and re-orient the moral foundation that has given rise to a world destroying itself, materially and spiritually.
I believe the question of AI presupposes the question of the sort of world we wish to collectively realize. And it seems that the trend of modern science has been to perpetually overlook moral considerations for the sake of technological advancement, and the price we pay is the degeneration of being. Let us begin by constructing a moral apparatus that tends to AI as a mode of revealing reality in tandem with a form of techne grounded in poiesis, not enframing. Let us ground policy in practice, realizing our visions of a better tomorrow through our everyday uses of and interactions with AI, and begin forming the building blocks of a larger system of values to ground the future of AI.
Let’s all take a step back, examine the ontological presuppositions that colour the world we take to be neutral, and proceed with clearer eyes.