HEWN, No. 317

A pedagogy controlled by algorithms can never be a pedagogy of care, integrity, or trust

Google announced this week that it was adding a plagiarism detection feature to Classroom, its pseudo learning management system. The company is calling the feature “originality reports,” not plagiarism detection — an attempt to reframe it from a punitive “gotcha” into a tool designed to help students understand citation practices. But let’s call “bullshit” on that. I don’t think the rebranding really means much, to be honest. It’s more a rhetorical sleight of hand, an attempt to sound “collaborative” while furthering some of the worst practices of surveillance and distrust.

Google insists that, unlike TurnItIn (the best-known plagiarism detection software), it will not amass a database of all student work. Instead it will offer schools the ability to amass a “repository” of their students’ work. This should still prompt us to ask, of course, who really owns students’ work. As creators and scholars and artists, students should. But their intellectual property rights are rarely respected — by schools or by publishers or by testing companies or by software companies. Indeed, plagiarism detection software, along with the pedagogical culture from which it emerged decades ago, views students as cheaters and fakers and frauds.

The business of cheating — cheating and cheating prevention — is big business. Ownership of TurnItIn’s parent company, iParadigms, has changed hands a number of times over the years. The most recent transaction came in March of this year when the company was sold, for $1.75 billion, to the media conglomerate Advance Publications (which also owns Condé Nast) — one of the largest sales in ed-tech history. In addition to selling plagiarism detection software to schools, iParadigms has expanded its offerings recently, peddling automated essay grading and writing assistance tools. It’s all in the service of classroom efficiency.

These products — plagiarism detection, automated essay grading, and writing assistance software — are built using algorithms that are in turn built on students’ work (and often, too, on the writing we all put up somewhere on the Internet). It is taken without our consent. Scholarship — both the content and the structure — is reduced to data, to a raw material used to produce a product sold back to the very institutions where scholars teach and learn.
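To make the mechanics concrete, here is a minimal sketch in Python of how a generic similarity-based check works. This is an illustration only, not how TurnItIn or Google actually build their products, and the “repository” of prior submissions below is invented. The point is simply that student writing is the raw material: without the stored essays, there is no product.

    # A toy similarity check (illustrative only; not any vendor's actual method).
    # The "repository" of prior student submissions is invented for this sketch.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    repository = [
        "The passenger pigeon was once the most numerous bird on the planet.",
        "Surveillance capitalism reduces human experience to behavioral data.",
    ]
    new_submission = "The passenger pigeon was once the most numerous bird on Earth."

    # Every prior essay is vectorized alongside the new one...
    vectorizer = TfidfVectorizer(ngram_range=(1, 3))
    vectors = vectorizer.fit_transform(repository + [new_submission])

    # ...and the new essay is scored against the stored corpus.
    scores = cosine_similarity(vectors[-1], vectors[:-1]).flatten()
    for text, score in zip(repository, scores):
        print(f"{score:.2f}  {text[:50]}")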

In her book The Age of Surveillance Capitalism, Shoshana Zuboff calls this “rendition,” the dispossession of human thoughts, emotions, and experiences by software companies, the reduction of the complexities and richness of human life to data, and the use of this data to build algorithms that shape and predict human behavior.

Knowledge production has a new police force: digital technology.

I finally finished Zuboff’s book this past week, and I’m planning on writing a longer review on Hack Education. One of the things I kept thinking as I worked my way through its 500+ pages — other than “this book is too damn long” — was how folks can still insist that digital technology (particularly education technology) is good and necessary, how folks can still insist that the future demands that students submit to these forces of data extraction and behavior modification. What the hell kind of future is that?! (I think we already know…)

In the case of plagiarism detection and automated essay grading software, it’s not a future that values students’ thinking and students’ voices. It’s not one that, even as Google tries to rebrand its new product, encourages “original thinking.” Rather it’s a future where students will be compelled to conform to the rules of the machine — rules we know are deeply biased, based on extraction and profiteering and information imbalances that have put democracy at risk.

In other Google news this week:

Teachers: stop uncritically adopting and promoting Google products, for crying out loud. It doesn’t make you innovative or progressive. It makes you a shill for surveillance capitalism. You’re not preparing your students for a better future simply by using the latest shiny tech. You’re aiding a company — indeed a system — that’s stealing their future.

Elsewhere in tech and ed-tech:

Also from The NYT, the 1619 Project — a look at the 400th anniversary of the beginning of American slavery. And from The Guardian, Susie Cagle on “‘Bees, not refugees’: the environmentalist roots of anti-immigrant bigotry.” History matters.

This week’s pigeon is a Szegedin Highflyer (also known as a Crested Tippler). (Image credits)

Yours in struggle,
~Audrey

HEWN, No. 316

Educelebrities and stool pigeons

This week’s pigeon is Martha, the last surviving passenger pigeon, who died in 1914 in the Cincinnati Zoo. (Image credits)

Some scholars trace the phrase “stool pigeon” — an informant or a narc — to the passenger pigeon and to the decoys that were used to lure the birds for capture or slaughter. A pigeon would be tied to a stool, sometimes with its eyes sewn shut, and as it flapped its wings in distress, other pigeons would fly over to investigate, making it easy to shoot or snare them en masse. The passenger pigeon was once the most numerous bird on the planet — some three to five billion of them. But they were hunted to extinction.

I can’t help but think of those poor birds — blinded and trapped, but making enough of a ruckus that others are instinctively drawn to them, even at the cost of their own safety — in light of some of the discussions folks have had in recent weeks about “edu-celebrities” and concerns that these high-profile speakers are misleading the flock. There is even a Twitter account now (@educelebrity) satirizing the pithy emptiness of these consultants and buzzword-hustlers.

It’s easy, perhaps, to dismiss these figures as mostly harmless. They perform at professional development pep rallies and in their daily affirmations on social media, shouting and flapping and trying to attract attention.

Like the passenger pigeons, we humans are social creatures. We are drawn to the flutter. And like the passenger pigeons, we often misread the signals. We don’t see the danger. We aren’t so good at telling when these “edu-celebrities” and their punditry will simply serve to beguile us rather than liberate us.

I don’t know that the field of education is any more susceptible to this sort of clap-trap than other fields. Plenty of people in a variety of sectors fall for those who peddle Big Ideas™. Plenty of people in a variety of sectors fall for those who are (supposedly) the most Brilliant Minds™.

The scientists who Jeffrey Epstein surrounded himself with — or the scientists who clamored for Epstein's money and attention — seem to be a case in point. And I read this week’s story in Slate about what that was like with utter horror and revulsion.

The story features a couple of education/technology's Brilliant Minds™: namely, Marvin Minsky and Roger Schank. The former was just linked to Epstein’s sex trafficking circle in unsealed documents today. And the latter provides some of the most breathtakingly awful justifications for Epstein’s predatory behavior that I think I’ve ever read. And the kicker... well, the kicker to that Slate piece is something else.

These two men — Minsky and Schank — are viewed as luminaries in education technology, as founders of the field of AI. And this is how they act and think. This is how they construe agency and autonomy and personal integrity. The foundation is rotten to its core.

I’ve seen Epstein described elsewhere as a “stool pigeon,” tasked with ratting out other billionaire pedophiles as part of a deal he struck (or hopes to strike) with the government. It seems to me there are several stool pigeons here among his science cronies too, especially those who set out some decoy version of “intelligence” — dare I say “artificial intelligence” — hoping we don’t sense the danger or notice that their eyes and minds and hearts are sewn shut.


It has been another hellish week of loss and trauma. To help us navigate and understand all that, we turn to writers. And we’ve lost one of the greatest ones, Toni Morrison. Rest in power.

Recommended reading:

Yours in struggle,
~Audrey

HEWN, No. 315

A long-neck pigeon, an abbreviated newsletter

This week’s pigeon is an English Carrier, a domesticated pigeon that Charles Darwin noted was “a fine bird, of large size, close feathered, generally dark-coloured, with an elongated neck.” (Darwin wrote quite a bit about pigeons as he worked out his theory of evolution. Sneer at pigeons all you want — they have been at the center of science and culture for a long, long time.) (Image credits)

Long neck, short newsletter. It’s been one helluva week for me, and I’ve barely paid attention to the news or to the Internet. I did watch the first night of the Democratic debates, but I tapped out after about ten minutes of night two and avoided the punditry altogether. (Exception: Lili.) It does feel as though many in the media are steering the information ship directly onto the same rocky shoals as in 2016. 2020 — You decide: seasickness or death.

(The ed-tech news is all déjà vu and I-told-you-so these days too: “Pearson Hack Exposed Details on Thousands of U.S. Students,” “She Was Arrested at 14. Then Her Photo Went to a Facial Recognition Database,” and whatnot.)

Light reading: “The Last Days of John Allen Chau” by Alex Perry. “Candy Land Was Invented for Polio Wards” by Alexander B. Joy. “Athleisure, barre and kale: the tyranny of the ideal woman” by Jia Tolentino, whose new book, Trick Mirror, I cannot wait to read.

Yours in struggle,
~Audrey

HEWN, No. 314

Feral pigeons and the mindfulness racket...

This week’s pigeon is a feral rock pigeon. (Image credits)

This week’s recommended reading also includes bird-related content: “The Crane Wife” by CJ Hauser. This felt like a week where the list of what not to read was quite long. You know which ones I mean — the one about the former Senator; the one about the Harvard professor; all the hot takes on Robert Mueller; anything about Boris Johnson. Avoid. Try these instead: David A. Banks and Britney Gil on the bad metaphor “Community.” Amy Martyn on the repo men impounding electric scooters in San Diego. “Engagement Is the Enemy of Serendipity,” says Dan Cohen. Ed Yong writes about how “The Human Brain Project Hasn’t Lived Up to Its Promise.”

I wrote a little bit last week about other brain-related promises, namely the hype around Elon Musk's brain-computer interface company, Neuralink. I’ve been thinking a lot about brains — or “minds” more accurately — as of late, as I try to work through revisions to my manuscript and make sure I have the story straight about education psychology’s move from behaviorism to cognitive science. I maintain that behaviorism never really went away and, despite all the talk otherwise, it remains central to computing — particularly educational computing. And as Shoshana Zuboff argues, of course, behaviorism remains central to surveillance capitalism.

How we talk about “the mind” does seem to have shifted, and many folks — and certainly many of those pithy stars of “edutwitter” — seem increasingly obsessed with promoting proper mindsets and mindfulness. Indeed, there’s a whole mindfulness industry out there — a $4.2 trillion industry, according to an article in New Statesman this week. “...[M]indfulness has become the perfect coping mechanism for neoliberal capitalism: it privatises stress and encourages people to locate the root of mental ailments in their own work ethic. As a psychological strategy it promotes a particular form of revolution, one that takes place within the heads of individuals fixated on self-transformation, rather than as a struggle to overcome collective suffering.”

The Wall Street Journal reported that snack makers are now marketing mindfulness: savor that Oreo cookie; chew that Triscuit slowly. And Pinterest launched what it’s calling “compassionate search” to offer “resources that may help improve your mood.” The latter isn’t really about mindfulness per se, and there is a lot of slippage in how the word gets used. But across many of these products, there is a confusion of slow and careful thinking with actual care; a Buddhist-inspired presentism stripped of any commitment to justice or social change. Remember how the e-cigarette maker Juul peddled a mindfulness curriculum to schools? That’s not even irony; that’s precisely how this industry works — to prop up rather than challenge the things that are making us miserable by training us to breathe deeply and accept it. Or even better, breathe deeply and no longer notice.

Yours in struggle,
~Audrey

HEWN, No. 313

Pigeons and primates...

Kin and I stopped recording our weekly podcast over a year ago. I was in the middle of my Spencer Fellowship at Columbia’s J School, taking Sam Freedman’s infamous book writing seminar, and I just didn’t have time. I’d also grown so tired of the tech industry’s bullshit, so exhausted from listening to their bombastic promises (and then their empty apologies). Each week, each episode, I felt like I was repeating myself with some variation of “Facebook has done something terrible” or “Elon Musk said something ridiculous.” Again.

We’ve re-started the podcast. I know. I know. Everyone has a podcast. We’ve altered our format a little. We’ll record episodes every other week. Rather than just kvetch about the terribleness of the latest tech news, we are going to focus on some of the beliefs and stories that buttress the industry (and more broadly, a digital society). There’ll be some history. Some analysis. And some cussing. We’ll try to unwind some of these powerful narratives that many folks take on faith. Episode 79: “Software is Eating the World” (Or So Venture Capitalists Want You to Think). Next up: No, Technology Is Not Changing Faster Than It’s Ever Changed Before. Then: None of This Is Inevitable, No Matter What Gartner Tells You. Or something along those lines.

I am still trying to avoid as much of the (ed-)tech industry news as possible. But I did catch wind that Elon Musk is at it again, so just a few words on the latest from this techno-dystopian P. T. Barnum.

On Tuesday night, in an Internet livestream, he offered an update on his plans to connect people’s brains to computers (and eventually, bring about human-to-human telepathy). He announced that Neuralink, a company in which he’s invested some $100 million, is ready to submit an application to the FDA and begin clinical trials on humans — hopefully next year, according to the marketing hoopla. The company, which has so far experimented on mice and monkeys, will seek volunteers — not “patients,” as this seems to be much more a commercial endeavor than a therapeutic one — who’ll let Neuralink drill some holes in their skulls and insert threads into their brains, which will pass data to an implant near their ear. I can’t really speak to the technology or the neuroscience — MIT Technology Review talked to others in the field about what problems Neuralink really has solved and whether this is actually a breakthrough (or just well-funded PR).

What caught my attention, no surprise, was the invocation of The Matrix and the claim that these sorts of computer-brain interfaces will be the future of learning. (I’ve written previously about this particular fantasy.) The company said that the surgery would be the kind of thing you’d “recommend to family and friends,” “basically an experience like getting Lasik” — a voluntary surgery that’ll cost you several thousand dollars. Even though Musk insists computer-brain implants won’t be mandatory, as he outlines his vision of the future, it seems they surely would be. Computer-enhanced human brains will be necessary in order for humans to keep up with artificial intelligence, he argues. You’ll need them for work; you’ll need them for play; you’ll need them to communicate. So people will go into debt to obtain this cyborg enhancement, or they’ll be left out, left behind. “I think it’s safe to say you could repay the loan with superhuman intelligence,” Musk insists. “I think it’s a safe bet.”

Oh. Okay then.

Speaking of education technologies tested on primates in what I can only imagine were utterly terrifying scenarios...

I don’t have much to say about the 50th anniversary of the Apollo 11 mission (although I did like this article by Alexis Madrigal because it gets to the heart of one of those clichés about technology that folks repeat without ever questioning). Me, I guess I’m just not as fascinated by space exploration as some folks. I don’t care how cool the math was; I can’t help but think of the costs of imperialism. Hearing and reading all the recollections this week have made me think of Gil Scott-Heron’s poem “Whitey on the Moon.” It also made me think about a newspaper clipping that stopped me dead in my tracks as I pored through the papers in B. F. Skinner’s archives: “Teaching Machine Tutored ‘Ham’ For 5000-Mile Ride in Rocket.” Because when I think about Ham, the first hominid in space, I think of this picture:

I’d like to write something more about this some day, I think. The horrors of Ham. The horrors of education technology and education psychology. (The only thing I know of that’s been written on Ham is The Right Stuff. Ugh.)

Recommended reading:

This week’s pigeon is the Nicobar pigeon, the closest living relative of the dodo. (Image credits)

Yours in struggle,
~Audrey
