Updated: Jun 29, 2020
Several years ago, during my tenure as Head of Forsyth School, I chose Jaron Lanier's book, You Are Not a Gadget, for our faculty and staff summer reading project. In essay-like chapters, Lanier illuminates the vagaries of digital technology with the lights of psychology, philosophy, mathematics, and music theory. A chewy read for polymaths, it's available to the lay reader by fingernail grasp and thus--I learned the hard way--a sketchy choice for a faculty/staff summer reading project.
What was I shooting for? First, I'll admit to a constitutional revulsion to leadership literature, a genre more revealing of our contemporary hangups than anything else. I also confess to a similarly snobby aversion to popular psychology and sociology and the breathless rhetoric one finds in the genre. Lanier's ideas in You Are Not a Gadget, by comparison, seemed to offer something simultaneously more immediate and timeless.
About brain research, for instance, Lanier is sobering: scientists know a few things about essential brain functions; they know almost nothing about how we think, nor about true artificial intelligence. This contention is what I was hoping would register with the faculty, who were deeply reluctant (as was I) to embrace without close consideration the digital technologies other schools had rushed towards: the glowing eye of the interactive board, second graders clutching iPads, the paradoxical damages to community in "sharing." In the end, what our educational community got from Lanier's book that year, and what the rest of the United States is understanding now in the wake of the Cambridge Analytica debacle, is Lanier's contention that more often than not, the "user" in contemporary digital applications is truly a cog in a larger engine designed to make money. The "user" is, in fact, the "used."
As a head of school, I wanted children to be able to evaluate media and use digital tools, but more than anything else, I wanted them to be able to discern the role in which they were being cast when using any digital application. The current Facebook/Cambridge Analytica conflict is evidence of how far away from this understanding we are in this nation. Facebook users are justifiably aghast that their data was sold to a company bent on meddling with the American political process, but they seem sanguine (or ignorant) about the fact that selling data is the essential business model of social media and many other kinds of interactive media, including educational applications. For the micro-bursts of dopamine that these platforms deliver in increasingly sophisticated ways, we blithely share everything from the most banal to the most profound details of our lives. For some users--people witnessing entire live concerts through the 3x5 screens of their phones, or spending their vacations posting pictures and monitoring likes--the impulse to share occupies a liminal space in which IRL experiences become actualized only once they have been recorded and shared online. Once harvested and sifted, this data is sold off to advertisers and other entities who want something from us (votes, for instance) and fed back to us in increasingly customized ways to re-confirm and shape our purchases, our biases, our political convictions.
If the purpose of education is to help children become autonomous, to become their own stars, as Emerson has it, then we must help them understand their place in digital spaces just as we help them build their place among their peers and in the wider world.