Future Farmer

The following comes from the first chapter of my dissertation. Part of why I chose to look at farming technology stems directly from my own upbringing. In this section, I am actually trying out my own version of an “un-disciplined” philosophy of technology.


From a climate-controlled tractor cab, eye level nearly nine feet above the field, the disconnect between the soil—the object to be worked, tilled, planted, sprayed, and harvested—and me stands out as starkly as the hills and the edge of the horizon. I have come to sit in the tractor because my father was a farmer. He still is, of course, but the past tense serves to delimit the work he did fifteen years ago from the work he does now—a topic particularly relevant to my last chapter. He raised me with machines and equipment—farm implements—that kept me warm in winter cold and cool in summer heat as we—the machines and I—performed the tasks set before us.


I learned to drive a tractor before I could ride a bicycle without training wheels. I came to trust the “me + tractor” hybrid (though I certainly did not understand the relationship in that way then) more than I trusted other “me + machine” combinations. On a bicycle or skateboard, for instance, I was conscious of falling in part because I knew I was, largely, in control. With the tractor, I understood it did not need me to balance it, nor did it need my energy to make it move. Indeed, I quickly apprehended that I did not even need to be at the controls for it to move through a field. Learning to drive the tractor well, of course, required much practice, and learning to use it for farming practices like planting and harvesting necessitated even more time. From an early age, though, I learned the perspectives a large tractor affords: elevation, supervision, domination. As you climb into a tractor cab, each step up takes you farther from the thing—the soil—that is meant to be manipulated.


Surveying the field from the tractor, the operator can simultaneously feel in control while also experiencing a kind of surrogacy: the tractor and implements will do the ‘work’ of tilling, planting, harvesting, etc., and the operator will guide and manage the equipment. Rather than a neat separation of duties, as clean as the purported separation of the human and the equipment, the operator and the machines work in tandem, although it is not hard to imagine how the person, as farmer, serves as a proxy for the things that perform the labor.


My father, certainly, does not imagine himself as a resource for the objects doing the farming. He does not consider his feeding the tractor—filling it with diesel fuel—on the same level as his feeding the cows that roam the farm’s pastures. When I use such language with him, he laughs and reminds me machines and animals are not the same things. I should not confuse diesel fuel and hay bales; cows do not, so why would I?


Making the familiar less familiar, taking things/processes out of their quotidian contexts and rendering them strange—what Viktor Shklovsky (1917/1965) termed “defamiliarization”—allows us to experience those things and processes in greater detail, perhaps revealing a complexity, meaning, and perspective lost through habitualization. Well-trodden ground in Science and Technology Studies,[1] the act of making the familiar somewhat unfamiliar invites us to reimagine how we envision the shared, even co-dependent and symbiotic, relationships between humans and technologies. For Shklovsky (1917/1965),


The purpose of art is to impart the sensation of things as they are perceived and not as they are known. The technique of art is to make objects “unfamiliar,” to make forms difficult, to increase the difficulty and length of perception because the process of perception is an aesthetic end in itself and must be prolonged. (p. 5)


I understand classical philosophy of technology[2] as, at least in part, an exercise in making the supposedly familiar less familiar. Classical philosophy of technology challenges readers to question their relationships with the artifacts and techniques that permeate their lives. Its practitioners explore the kinds of experiences we have with, through, and because of technologies—a practice taken up by postphenomenologists and which I examine in Chapter 3. Though I will argue that technologies facilitate and mediate the human experience, and in important ways are extensions of ourselves, I do not make the ontological claim that humans are technologies (or vice versa), that the two are actually one.


The project of classical philosophy of technology—macro analysis and criticism of human-technology relationships—deserves renewed attention, and this dissertation participates in that intellectual project. The normative and speculative qualities of classical philosophy of technology—attempting to understand and explain the “right relations” humans should have with technologies—provide the interlocutor with a substantial position to critique, debate, espouse, decry, etc. Much contemporary philosophy of technology, conversely, offers little more than description. Postphenomenology, for instance, often avoids normative judgments and pronouncements.[3] Only on a methodological level is postphenomenology normative: descriptions of human-technology relations require the use of certain concepts, and the investigator should try to imagine numerous perspectives on the issue.[4] It tells us how to investigate human-technology relations, but it offers no guidance on what to do with the resulting description. It does not advise us on how we should act, think, work, play, etc., in relation to technologies. Although I do not agree with the pessimism often found in classical philosophy of technology (see Chapter 2 for more on this topic), I do appreciate its explicit normativity.


In Chapter 4, I identify writers who speculate about our potential futures and offer normative judgments about how humans should live and act in a world mediated by technologies. Though these writers—whom I will label “un-disciplined” philosophers of technology—do not reside in traditional academic departments, their perspectives, their narratives, deserve attention from the community of philosophers who seek to make philosophy of technology relevant to more than just academics (Wittkower, Selinger, and Rush, 2014).


These “un-disciplined” philosophers of technology make familiar techniques and technologies unfamiliar by offering narratives that challenge, for instance, the notion of clean divides between humans and technologies. They propose that we have always been intimately linked with technologies and could not have reached our present states without them (Hayles, 1999; Kelly, 2010), or that we will soon reach a point where biology no longer limits (trans)human development (Kurzweil, 2005). “Un-disciplined” philosophers of technology motivate their audiences to change how they see themselves and their world in the present, but they also urge their audiences to imagine potentials beyond the current horizon. A tractor cab might seem a strange place to reflect on such topics, but it was the feeling of connection with the machines, and at times the lack thereof, that motivated me to seek out thinkers who explored how I should imagine my relationship with the machines. Once I found them, I realized I held a perspective that needed updating, that required adjustment to fit the pieces—me, the tractor, the field, the equipment—together into a coherent whole. This project attempts to explain how I assemble the disjointed chunks of metal, plastic, dirt, and flesh—matter all—into a narrative that helps situate me with the objects that surround, support, and guide me.

[1] Cf. Latour and Woolgar (1979/1986), Latour (1987/2003), and Collins (1985/1992) for studies that ask readers to set aside our preconceptions of, for instance, science “as the locus of certain knowledge” and to imagine it instead “as a cultural activity” (Collins, 1985/1992, p. 1).


[2] Throughout this work, I follow Hans Achterhuis’s (2001) demarcation of philosophy of technology. He distinguishes classical philosophy of technology, as practiced by Martin Heidegger (1979), Jacques Ellul (1964, 1990), Herbert Marcuse (1964/1991), and Lewis Mumford (1964), from contemporary philosophy of technology, as practiced by Peter-Paul Verbeek (2005, 2011), Don Ihde (1979, 1993), Andrew Feenberg (1995), Philip Brey (2010), and Bruno Latour (1992, 1993a). Carl Mitcham’s Thinking through Technology (1994) distinguishes engineering philosophy of technology from humanities philosophy of technology. Though these are useful distinctions, his “Notes toward a philosophy of meta-technology” (1995) begins to demarcate philosophy of technology in ways that closely resemble how Achterhuis (2001), Brey (2010), and Verbeek (2011) distinguish classical from contemporary philosophy of technology, now a commonly accepted distinction.


[3] Robert Rosenberger (2015) makes explicit the need to include phenomenological accounts of human-technology relations in our explanation of “phantom vibration syndrome.” He offers a detailed description of how study participants use cell phones, how human brains interpret the stimuli associated with the phone’s notifications, and even how social and cultural norms and motivations impact the person in terms of pressure, stress, and anxiety. He argues that any attempt to understand “phantom vibration syndrome” requires all of these accounts because “what is at issue here . . . is whether pointing to the brain itself as the noun committing the behavior in question is the most helpful way to frame our explanation” (p. 130). At the cusp of shifting from description to prescription, however, he stops. He may not have intended to provide normative analysis; he may have only intended to explicate specific aspects of the issue that have gone unobserved. His account is detailed, insightful, and compelling, but he refrains from making a broader connection to how people in general should use cell phones.
[4] In Chapter 3, I examine this methodological principle further.


A Future with Fewer Borders

The above is a very tentative title for a chapter I have started working on; it might go to press in 2017 (the book is still awaiting approval). The book will (future tense rather than the conditional *might*) bring up issues regarding our attitudes toward risk when planning for future generations. Of course, I am going to imagine that ‘future generations’ need not fit standard humanist visions of the term and will instead promote a more posthumanist perspective.

The potential book’s editor, Steve Fuller of Warwick University, recently engaged in a debate with Rupert Read of the University of East Anglia. Audio of that debate and the ensuing dialogue with audience members is here.

Below is a quick abstract I have worked on for the chapter. I find a number of the implications of the proactionary v. precautionary debate fascinating, and Fuller has forced me to question a number of ideas I did not realize I un-reflectively supported, not least of which is the notion that evolution has some normative standing. I seem to privilege a normative conception of nature, where the ‘natural’ is somehow intrinsically good. “I had not considered things in that way” is a phrase I usually repeat when reading, hearing, or conversing with Fuller.


In any case, my draft abstract:

Posthumanism collapses boundaries, particularly between humans and technologies, with the concept of technogenesis: technology is simply part of what makes up the human. Conflating humans and technologies removes one border, and in doing so it might enable a different perspective on the precautionary v. proactionary debate. When considering ‘our’ future, the stakeholders must first be identified and the speculative goals assessed based on their interests. At first glance, Posthumanists seem closer to the precautionary side of the debate, with Transhumanists on the proactionary side. I will examine proactionary and precautionary principles from a Posthuman perspective—contrasting it with a broad Transhuman perspective—and look at one or two specific examples: automated (driverless) vehicle adoption in the U.S. and/or the use of CRISPR-Cas9 to modify human embryos. I side with the Posthuman perspective in many situations, but can that position be maintained while promoting widespread adoption of automated vehicles now and limiting the use of CRISPR-Cas9 for the time being? How should posthumanists deal with the risks associated with automated vehicles and gene manipulation? Does the collapse of the border between humans and technologies provide any normative guidance in considering our adoption or rejection of these technologies?

Borderlands: Between the Self and Others

Driving west from Virginia to California this summer, we encountered only one border patrol station. On Interstate 80, perched high in the Sierra Nevada near Truckee, traffic passes through what looks like a two-story house with garage-door-sized tunnels carved into it. When we arrived, my truck sporting Virginia license plates and pulling an unwieldy U-Haul, the border agent politely asked where we were heading and what we had in the cargo trailer. He wanted to know, specifically, whether we had brought plants with us from beyond the imaginary lines that separate California from the rest of the country.

My reply that we had nothing of the kind in our possession only prompted him to ask me to open the back of the trailer for him to check. Why, I wondered, did he even bother to ask me about plants? He was going to want the vehicle and its trailer opened for his inspection no matter what I claimed.

Passing through the borders of West Virginia, Kentucky, Indiana, Illinois, Missouri, Kansas, Colorado, Utah, and Nevada had not even required our covered wagon (truck and trailer) to slow down. The turgid, roiling Mississippi flowed beneath and offered no resistance. Crossing the soaring Rocky Mountains provided a far stiffer test than the rest of the drive, and that only because of the weight of the trailer we pulled. California, though, land’s end for the western U.S., would require an explanation of purpose and a quick look-see into what I had until then considered private belongings.

The border agent was courteous, amiable, and sympathetic: even he appeared to regret having to shove around the trailer’s contents, which had shifted over the 2,000-mile drive. Yet rifle through he did, and after a couple of minutes he pronounced us safe to pass through the—to me, at least, now-real—invisible line that demarcates California from Nevada.

I began to reflect on our first-world to first-world border crossing after listening to an eloquent and poignant lecture by Frances Stonor Saunders on the London Review of Books site. Entitled “Where on Earth Are You,” the lecture engages a subject that caused me to reflect on the borders between our human selves and our quantified selves. As Saunders piercingly articulates, who we are is inextricably linked to what and even where we are: fingerprints, credit scores, geolocations (down to the meter), intelligence quotients, bank account numbers, facial metrics (for instance, the distance between nose and lips—the reason we can no longer smile in official documents), and myriad more voluntary and compulsory bits of information that, to everyone outside our own heads, apparently, define us.

If I am all these things, then I am already code. I am already the technology that philosophers of technology have been on about for over a century. Quibble with the ontology—technology is not the same as human—all you like; I side more with N. Katherine Hayles’s position: we already are posthuman.

If you deny that assertion, if you grasp at the frayed ends of an identity that excludes that plethora of ones and zeros used to define you, then you will need to transcend the flesh/data. And yes, I am conflating flesh and data, emotion and data, desires and data, even habit and data. I refuse to distinguish between these because, in part, no one seems to want them separated. We want our others (our technological extensions) with us (tiny personal computers, née cellphones), part of us (pacemakers, prostheses, artificial organs), guiding us (digital maps predicting our destinations), and tracking our food intake and exercise (Fitbits, etc.). I challenge you to imagine an aspect of our lives not mediated by technologies, our other selves that, to my mind, do not at all seem so other after all.

A gentle warning, though: if you do find that aspect of you or your life that escapes technological mediation, you might want to keep it to yourself. Once you share it, the race to mediate and/or automate it will begin.


One might think that working on a project for years would have made my ideas and arguments clearer. Strangely, though, the closer I get to defending this dissertation—ostensibly my only real idea in the last few years—the further away I get from understanding exactly what I am arguing. That is an exaggeration, but it feels that way sometimes. Now is one of those times.

After writing over 59,000 words on a topic, pulling together a 350-word abstract for an academic audience ought to be simple. What was I on about for nearly 60,000 words if I cannot summarize it that succinctly? I should have an answer to the most obvious question of all: what is your argument? I do, of course, have an answer. It just no longer seems like a very good one. In fact, the more I examine and critique it, the poorer it seems.

So, interlocutor, I offer a sacrifice. In exchange, I hope for illumination (and yes, I realize that the sacrifice I am making is likely not worth the cost of illumination—be generous with me). I offer you an abstract (in more than 350 words); I’ll be satisfied if you give me back a bologna sandwich.


Philosophy of technology (PoT) analyzes the nature of technology, its significance and consequences, and its mediation of human experiences of the world. Classical and contemporary PoT propose hard divisions between humans and technologies where, unquestionably, the human matters most in human-technology relations. Conversely, what I label “un-disciplined” philosophy of technology (UPoT) touts the seamlessness of human-technology connections, blurring divisions between humans and technologies. Thus, UPoT denies disciplined philosophy’s first critical maneuver: define and demarcate.


UPoT enables conversations and debate regarding the ontological and moral consequences of imagining humans and technologies as hybrid, co-dependent things. UPoT builds upon environmental and animal rights movements, and upon postphenomenology, to offer pluralist accounts that emphasize the dynamism of human-technology relations. UPoT argues that we should imagine technologies as extensions/parts of living things: they do the shaping and are shaped in turn. I argue that such thinking reinforces the insight, already proposed by contemporary PoT, that emerging human-technology relations demand active interpretation and engagement because the relationships constantly change. Thus, we need to imagine a moral theory that best matches the hybrid/connected condition of the present century. Increasing automation in agriculture and surgery, for instance, exemplifies technologies mediating human experiences of food and health, thus affecting how we understand and define these categories.


UPoT explores human-technology enmeshing and embraces potentially non-anthropocentric values and ethics. If we imagine technologies as extensions/parts of living things, do some deserve moral patient (even moral agent) status? Disciplined PoT responds axiomatically: technologies are not alive, so they have no moral status. This response unnecessarily limits the purview of philosophical reflection on human-technology relations, however. Instead, our hybridity should compel us to revisit ontology and moral theory. To serve inhabitants of an increasingly automated world, philosophy of technology should be un-disciplined.


I argue that axiomatically privileging humans over any “other”—whether nonhuman animals/life, the environment, or technology—masks our responsibility and co-dependence, and promotes an instrumental view of technologies that leads us away from discussing technologies as producers, conveyors, and sites of value-formation.


For the past four years, I have been explicitly working on one project. The project, my dissertation, has developed sporadically and in unexpected ways. My topic has changed radically in four years; it is now an unrecognizable distant relative of the original. I’m rather happy with that, as where I began was taking me in directions I did not really wish to venture (for example, away from STS—and that’s a big tent—and toward a more strictly-Education focus).

I still feel pulled by articles about online education—how it can and should be approached and utilized. I also recognize the tugging on my imagination whenever I hear about or consider STS pedagogy. I want to write about, elaborate upon, explore, and, hopefully, offer insights into those areas. No one has told me I cannot. Yet I demur, knowing that I abandoned those paths once. I must have had decent reasons.

I did. I still do. Those topics, particularly the latter, are not resolving themselves. Sufficiently intelligent people are working on them, and if I had anything to offer, I would have produced it already. I have not.

This past year I left the place where I have lived the longest in my life. Since I still had to finish the dissertation, though, I never really escaped. My weekly and monthly emails traveled back there even as I did not. My ideas, developed in my dissertation and in that town, are tethered there still. After so long working on one thing, and not finishing it, some rightly thought I was abandoning the project, the ideas, the goal.

Looking back now, I think I needed to leave. Of course, I would say that. Indeed, I have said that. More than a time or two: after leaving Cali the first time, Flag, Oaxaca, and D.F. My relationships have term limits. Be they towns, people, or ideas, my relationships sputter after a few years. In no way do I mean to ‘passive voice’ that tendency: I know I am the one making them fail. Strangely, though, I am okay with that.

This topic matters to me now because I might have turned some proverbial corner. I’m still going to leave wherever I am in a limited number of years. I’m still going to flirt with other ideas and even career paths. My promiscuity in these areas, however, might be in the service of a larger, and still present, project. If I get this dissertation right, I’ll be working on it the rest of my life.

Though I will not make the hyperbolic, and simply false, claim that I have been working on some new philosophy, I think the ideas I have been struggling to explicate and understand do constitute a direction I can follow for a long time. It is flexible (read: vague) enough to encompass those intellectual mistresses I unceremoniously dropped some years ago, but it should also shelter the as yet unrealized thoughts I pursue down the road.

All that last sentence really means is that the work I have been doing in recent years is already changing my predilections and penchants: it influences what I read and how I think about what I have read. For one who trades in prose and ideas, such influence is consuming. I have inflicted it on myself. I am glad I have.