By IAN BOGOST, contributing editor at The Atlantic, February 23, 2017
“Precarity” has become a popular way to refer to economic and labor conditions that force people—particularly low-income service workers—into uncertainty. Temporary labor and flexwork offer examples, including hourly service work in which schedules are adjusted ad hoc and just in time, so that workers don’t know when or how often they might be working. For low-wage food-service and retail workers, that uncertainty makes budgeting and time management difficult. Arranging for transit and childcare is harder, and more costly, for people who don’t know when—or if—they’ll be working.
Such conditions are not new. As union-supported blue-collar labor declined in the 20th century, the service economy took over its mantle absent its benefits. But the information economy further accelerated precarity. For one thing, it consolidated existing businesses and made efficiency its primary concern. For another, economic downturns like the 2008 global recession facilitated austerity measures both deliberate and accidental. Immaterial labor also rose—everything from the unpaid, unseen work of women in and out of the workplace, to creative work done on spec or for exposure, to the invisible work everyone does to construct the data infrastructure that technology companies like Google and Facebook sell to advertisers.
But as it has expanded, economic precarity has birthed other forms of instability and unpredictability—among them the dubious utility of ordinary objects and equipment.
The more technology multiplies, the more it amplifies instability.
“Technology’s role has begun to shift, from serving human users to pushing them out of the way.
Facebook and Google, so the saying goes, make their users into their products—the real customer is the advertiser or data speculator preying on the information generated by the companies’ free services. But things are bound to get even weirder than that. When automobiles drive themselves, for example, their human passengers will not become masters of a new form of urban freedom, but rather fuel to drive the expansion of connected cities, spreading further the gospel of computerized automation. If artificial intelligence ends up running the news, it will not do so to improve citizens’ access to the information necessary to make choices in a democracy, but to further cement the supremacy of machine automation over human editorial judgment in establishing what is relevant.
There is a dream of computer technology’s end, in which machines become powerful enough that human consciousness can be uploaded into them, facilitating immortality. And there is a corresponding nightmare in which the evil robot of a forthcoming, computerized mesh overpowers and destroys human civilization. But there is also a weirder, more ordinary, and more likely future—and it is the one most similar to the present. In that future, technology’s and humanity’s goals split from one another, even as the latter seems ever more yoked to the former. Just as people are ignorant of the plight of ants, and just as ants are incapable of understanding the goals of the humans who loom over them, technology is becoming a force that surrounds humans, that intersects with humans, that makes use of humans—but not necessarily in the service of human ends. It won’t take a computational singularity for humans to cede their lives to the world of machines. They’ve already been doing so, for years, without even noticing.
IAN BOGOST is a contributing editor at The Atlantic. He is the Ivan Allen College Distinguished Chair in Media Studies and a professor of interactive computing at the Georgia Institute of Technology.