Security and Autonomous Systems

Users of autonomous systems, or just about anyone using a computer (desktop, laptop, tablet, handheld), can easily comprehend the importance of keeping their devices secure. What, exactly, that security entails depends, of course, on the device and on its ability, and need, to communicate with other machines and systems.

For makers and users of increasingly automated vehicles, like cars, keeping malicious programs and people away from the vehicle's controls should matter more than any aesthetic choice and as much as environmental concerns. Users must be aware that people and programs can break into the operating systems of their vehicles and make them behave in unanticipated and dangerous ways, and the makers of the vehicles and their software must work constantly to keep such intrusions to a minimum.

That the software of such vehicles is vulnerable to outside programs seems an unintended but unavoidable consequence of the technology itself. Just as markets, elections, and choices in general can be rigged (by marketing, for instance), so can technology. A drug like piracetam, for instance, has specific targets when prescribed by a physician. Since the drug can be purchased without a prescription, however, its ‘off-label’ uses are vast and hard to trace. To me, piracetam and the autonomous vehicle have a few things in common, and one is the importance, for the consumer, of investigating what she is purchasing and the risks it involves for herself and others.

For more on this topic, see the case of one researcher trying to raise awareness of these risks: http://venturebeat.com/2016/11/12/before-you-sign-up-for-a-self-driving-car-pay-attention-to-hacker-charlie-miller/


Power Belongs to Programmers

The following is inspired by a lovely article found here.


CRISPR-Cas9 gives choices and options to people. It allows for a sense of control. We want to imagine that we have control over our lives, our bodies, our habits, our proclivities, and our goals. But tools like CRISPR are made by powerful elites, and they offer only the illusion of empowerment when really we remain dependent on the companies, the programmers, making the tools: the software and the hardware. We fall under the spell of control, of supposed choice, seduced by our own wants and wishes, not by the tools themselves. These tools have their ethos, to be certain: use me to become better, to fulfill your hopes and dreams. Yet the dreams are pre-programmed. They, like the tools, are given to us like preset buttons on a radio; you may choose only from the limited options (the AM/FM stations).

Herbert Marcuse labeled this one-dimensionality. Jaron Lanier and Evgeny Morozov recognize the same one-dimensionality masquerading as openness, freedom, and independence. The problem for Marcuse, Lanier, and Morozov, and for philosophy of technology in general, is gaining the attention of the masses and encouraging them to self-reflect while the digital, economic, and political environments continue to bombard them with so many demands that seem so necessary, so time-sensitive.

We should not be surprised that we go for the quick fix (CRISPR) and trust that the science will catch up and solve the unintended consequences our quick fixes usher along. The proactionary imperative glorifies the just-in-time mentality, a faith that is well-founded. After all, have technological advances not improved our lives? Have they not made food procurement simple, shelter ample, and luxury as close as our screens?


Advertisers and app designers are better schooled in the psychology that underpins our wants and motivations than most of us are. They play on these right under our noses.

Counter-intuitively, the ‘right’ design or ethos will also be a bully. It will push people to see the world and themselves in what Heidegger might call the ‘right relation to technology.’ That right relation is worth seeking, but it will not be one-size-fits-all, and that means we must each put effort into finding it. We must fiddle with our behaviors until we come to a posthuman view that promotes symbiosis. I do not claim this is the natural, true, or only perspective. It should be the preferred perspective, though.


How do we learn to pay attention? To see our technologies as extensions of ourselves, not as solutions in themselves? We do not need a new philosophy of technology. We need a philosophy of technology that engages broader audiences, that promotes self-reflection, and that exposes the seducers. We must listen to Marcuse's mantra. We must accept our dependence on each other and on our communities, which include our machines, as opposed to some supposed freedom that we are told lies just a click away, an edit away, a hack away.


In education, we make learning a game, an app to download, but unlike games, the penalty for failure falls on our future selves. We mortgage our future for quick fixes because it is easier than trying hard; I am not immune. The siren song of the technology companies and advertisers tells us when we’re happy, because their employees study us more closely than we reflect on ourselves, turning our phones into slot machines we cannot stop playing. We are, per Postman (1985), amusing ourselves to death. The game is tilted toward the producers (Tristan Harris), and our economy runs on the same operating system. The operating system becomes a metaphor for control. Whoever controls the message, the menu, and the reward has the power. We are just players. And we are all in.