HEWN, No. 296

"The problem is not that there is evil in the world. The problem is there is good because otherwise who would care" -- V. M. Varga, Fargo, Season 3

I’m juggling so many ideas in my brain right now, it will be a relief to begin to put some of them on paper — not just as the notes that I’ve been jotting down during my research for Teaching Machines, but to finally craft longer sentences, paragraphs, arguments. It is time to write this book.

But first, a few more tasks...

Most of the reading I’m doing right now in my final weeks of research I’d describe as “contextual” — that is, I’m reading the bestsellers and articles that reflect ideas influencing and influenced by and adjacent to teaching machines and behaviorism in the 1950s and 1960s. Needless to say, I’ve been reading a lot about cybernetics — something that totally colored how I thought about the article Mike Caulfield published this week on “The Homeostatic Fallacy and Misinformation Literacy.” Homeostasis is a cornerstone of cybernetic (and information) theory. And yet here we are, thanks to data-driven “feedback,” all out of whack.

All the reading on cybernetics — Norbert Wiener, Claude Shannon, Gordon Pask, and the like — has colored too how I thought about the 50th anniversary of “The Mother of All Demos” that rolled around last weekend, an event that many in the tech industry like to point to as unveiling or foretelling the future of personal computing. I’ve been thinking, I suppose, about how and why the augmentation of human intellect, to borrow a phrase from Douglas Engelbart himself, is most often now traced to Silicon Valley and to Xerox PARC as opposed to other places, people, institutions.

And so I’m stewing a bit on how “intellect” (yes, I’m reading some Richard Hofstadter as well) in Silicon Valley might actually matter far less than “innovation.” Scholarship (and perhaps democracy) matters even less. I think there’s something wrapped up in all this marketing and mythology that might explain in part why the tech industry (and, good grief, the ed-tech industry) is so incredibly and dangerously dull. You can’t build thinking machines (or teaching machines for that matter) if you’re obsessed with data but have no ideas. (That’s a Theodore Roszak insight there. Reading him too.) You can’t augment humankind, turns out, if you loathe and if you undermine humanity.

Anyway, all these thoughts and more and the book and such are prompting me to make some big changes to my website Hack Education. I'll write more about that next week...

Reading recommendations, in the meantime: “His Only Living Boy” by Casey Parks. “The Rise, Lean, And Fall Of Sheryl Sandberg” by Anne Helen Petersen.

This week’s pigeon is a Jacobin pigeon (but you could probably tell by the ruffle and wicked side-eye):


Yours in struggle,
~Audrey

HEWN, No. 295

"The hard focus on information which the computer encourages must in time have the effect of crowding out new ideas, which are the intellectual source that generates facts" -- Theodore Roszak

I’ve been off of social media for a week now. I don’t know if other people have noticed my absence, but the platforms sure have. Facebook now sends me daily emails, trying to lure me to log back in with vague references to what I’ve missed. One message. Nineteen notifications. Four mentions. Facebook wants me to know that Tommy has uploaded a photo, confident, I suppose, that I need to use Facebook to see how his very first trip to the UK is going. (I don’t.) Facebook wants me to know that Tressie has commented on Tim’s status update. I haven’t talked to Tim in a while, and Tressie has a book coming out soon. I should email both of them. Thanks for the nudge, Facebook, but I won’t sign in.

I am a little mortified with how much more productive I’ve been this week without the constant checking in on Twitter too, without the incessant “pulling to refresh.” I didn’t really (want to) recognize how many times a day I picked up my phone and casually clicked on the app and scrolled up and down, somehow both bored and enraged by what I saw there.

I have read five books this week, cover-to-cover, and started on a sixth. I made a major breakthrough (“discovery” isn’t the right word) in my research that has sent me down a whole new teaching-machine rabbit hole. (Of course, since I’ve given myself until December 31 to complete all my research, I would uncover this angle now.) I made the most delicious meal I’ve ever cooked in my life. All the sentences in this paragraph could have been social media status updates, I suppose. But they weren’t, and they never need have been. Unremarkable and unremarked upon online, these events did not demand your attention or algorithmically shape your “feed” or mine. I spared you. (Instead you get an email. So it probably all works out in the wash.)

I’m rather glad, in addition to my level of personal productivity, to have been AFK (or, rather, away from the Internet) this week, as I truly wanted to avoid the veneration of the 41st President of the United States, a person whose politics I found to be loathsome in almost every way and whose canonization by the pundit class struck me as ahistorical at best. (See: “The Media Got George H. W. Bush Wrong in Life and in Death” by John Cassidy and “George H. W. Bush’s Presidency Erased People with AIDS. So Did the Tributes to Him” by Masha Gessen.)

While campaigning for office in 1988, Bush famously remarked that “I want to be the education president,” and there is no doubt he laid the groundwork for education reform (as undertaken by his successors) and for the privatization and “entrepreneurial-ization” of public education — even though Bush himself failed to pass any education legislation during his single term in office.

The selective amnesia doesn’t surprise me. The rewriting of Bush history, even by historians, doesn’t surprise me either (particularly when they are implicated in the story). And more broadly — with regards to this man or that decade or that system or that practice — why think deeply or critically about “the past” when “the future” is so gaudily lucrative?

Some reading recommendations (that are not books): “Eve Ewing’s Lesson in Grassroots Sociology” by Nawal Arjini. “The Case for Letting Malibu Burn” by Mike Davis (not a new piece, but a timely one). “The Last Curious Man” by Drew Magary. And good grief, just read everything that Lili Loofbourow writes.

This week’s pigeon is an English Trumpeter pigeon:


Yours in struggle,
~Audrey

HEWN, No. 294

"Tell me your troubles and doubts, giving me everything inside and out..."

I’m taking a break from social media — as one does — until I finish the draft of Teaching Machines (due in April). I do plan to continue to email you HEWNs, although depending on how the writing goes, their sending might become a bit irregular.

Social media isn’t just where I post my little mini-rants or links to the articles I’ve written; it’s where I watch for others’ rants and writings too. Social media has been a key element in my workflow. Twitter and Facebook are important sources, along with RSS, for the news and information that feed my own analysis of the sad state of education and education technology. But daaaaaamn, I can’t tell you how relieved and excited I am to not have to pay (close) attention for a while.

A friend said that he hoped that when I return to social media that I have a better attitude. I laughed daggers pointedly in his direction and crossed his name off the list of people I might mail handwritten letters to while I’m away. Asshole. But it’s true — so much of what I read about online infuriates me, even if I just stumble upon something casually. And all that crap does wear on a person.

Like this tweet, for instance, that I saw this week from the Sesame Workshop, extolling the virtues of entrepreneurialism, venture capitalism, and venture philanthropy in early childhood development by parodying (haha) the Shark Tank brand. It pretty much ruined my day.

Sesame Workshop (@SesameWorkshop): “On this episode of Street Tank, a few of our furry friends are pitching their ideas around early childhood development! Learn more about #EarlyFutures and watch the full video: https://t.co/HtM26Fpmu6 https://t.co/s7HHq7hT8r”

Sesame Street has veered violently off course from its original mission in recent years. New episodes are now released on the premium cable channel HBO rather than on public broadcasting. It has set up a venture investing firm, funding popular ed-tech trends like coding and private tutoring — the latter being precisely the sort of inequitable educational activity that Sesame Street was originally created to forestall.

It’s bad enough, of course, that one of the high profile investors on Shark Tank has been associated with some of the ongoing #MeToo revelations. But what this short video and larger initiative underscore too is how the folks at Sesame Workshop (and the venture capitalists who’ve locked arms with it) now see early childhood development as product development (not to mention how they see children as consumers not citizens).

One of the VC “sharks” in this video insists that “All of us need to be thinking about the entire venture framework — distribution, customer, technology, fundraising, business models, team, and ultimately social impact.” No, bro. We don’t. The reason kids struggle isn’t that we aren’t doing venture frameworks right. Even more upsetting to me as I watched this piece of social media content™ was the total absence of a core element from Sesame Workshop’s long history: research.

As Joan Ganz Cooney, founder of the Children's Television (now Sesame) Workshop, wrote in the foreword to “G” is for Growing: Thirty Years of Research on Children and Sesame Street, “Without research, there would be no Sesame Street.” Indeed, Sesame Street may well be the most researched television show in history — and not just researched after-the-fact to ascertain how it has affected young children’s literacy and numeracy skills, but researched throughout its entire design and development process. That is something that has made (or had made) Sesame Street the antithesis of the venture capital-funded ed-tech startups, many of whom parrot the Mark Zuckerberg maxim “move fast and break things,” many of whom have been openly hostile to educational theory and research.

Early childhood researchers were central to the design of the show from the outset — in the development of the script, the human and puppet characters, the lessons and goals of each episode. There was attention to repetition and sequencing. There was careful consideration of when to use straightforwardness and when to use fantasy, and of how dramatic tension and humor might affect young children's comprehension. There was thought paid not only to the intellectual but also to the emotional development of young viewers. (So please spare me, ed-tech entrepreneurs and investors, with your posturing that you've invented "social emotional learning," something no one paid attention to until you came along.) With a specific mission of reaching preschoolers of color, Sesame Street purposefully cast actors of color. The curriculum was relevant and timely and meaningful and age-appropriate. There was formative research, and there was summative assessment with test audiences into whether and how all of these pieces “worked.” In other words, the thought that went into the development of a single segment of a single episode of Sesame Street was far far far far far far more thoughtful and in-depth than the kind of A/B testing that today’s tech entrepreneurs think they’re so clever to utilize when gauging what gets the most clicks and “engagement.”

Know your history, folks. Be wary of the tech world’s deep-pocketed storytellers and story-funders. They're up to no damn good. (Because, if nothing else, for the time being, I’m not going to be around as much to point out the moose diarrhea salesmen.)

Related: Duff McDonald on “The Miseducation of Sheryl Sandberg” and why Harvard Business School is pretty much the worst. “How the Silicon Valley set fell in love with sourdough and decided to disrupt the 6,000-year-old craft of making bread, one crumbshot at a time” by Dayna Evans. “A Business with No End” by Jenny Odell, a story that features Olivet University, Newsweek, Amazon resellers, and more. “How a future Trump Cabinet member gave a serial sex abuser the deal of a lifetime” by Julie K. Brown. “The Future of the Public Mission of Universities” by Robin DeRosa, the transcript of a recent talk she delivered that I think ties in quite nicely with the privatization and venture capitalization of Sesame Street. And this last recommendation is an article that’s over a decade old, but as one of the themes of this week’s HEWN seems to be illusion and misdirection, I’ll urge you to re-read this wonderful profile of Ricky Jay (may he rest in peace): “Secrets of the Magus” by Marc Singer.

This week’s pigeon is a crested pigeon:


Yours in struggle,
~Audrey

HEWN, No. 293

"Wear your heart on your skin in this life" -- Sylvia Plath

About two minutes in, the tattooer looked up at me and said, “Thigh tattoos are surprisingly painful, aren’t they.” He was right. Holy shit, was he ever right. I wasn’t expecting it to be quite so intense. And I still had almost five hours to go.

The boastful “you bet it hurt!” attitude has always felt a bit overwrought to me — a performance of machismo that never quite matched my experiences with tattooing, even though, yes, I’ve been tattooed in some spots that made me scrunch up my face with an “okay okay, it hurts” wince. There’s an ivy leaf that’s part of a band wrapping around my left arm that sits closer to the tender part of the armpit than I would have liked — I didn’t notice when the stencil was placed or I might have said something. The cherry blossoms that run across my chest weren’t so bad all-in-all, except for the few flowers that rest right on my sternum. There, the vibration of the tattoo machine felt like the delicate blossoms were shaking me to my very core.

But damn, this fucking pigeon tattoo hurt. It hurt real bad. I didn’t think the thigh was going to be a sensitive area, and so I did not balk at all as I chose the largest and most complicated design I’ve ever had inked into me, and I did not bat an eye when the tattooer said it would take about five hours for the sitting. And it wasn’t just the skin trauma that hurt, either. It felt like someone stuck a drill into my leg and ground up my whole leg muscle. Three days later, I’m still feeling tender and bruised. And the dreaded itching hasn’t even started yet.

But this is the most badass tattoo — based on a photo of an early twentieth century spy bird — and I love it. I loved watching the photo turn into a stencil and then transfer, painful line by painful line, onto my skin. I loved watching the artist change needles for outlining and shading and adjust the black ink to the various shades of gray. I loved explaining the history of pigeon technology to him. I loved that, bless his heart, he said my book sounded really interesting and he would probably read it.

I did not love the twenty-plus minute walk home from the tattoo shop with my mangled leg muscle and the stupid saran-wrap bandage that tattooers seem to use these days; I did not love trying to sleep that night with it throbbing; I did not love standing on it for hours the next day, cooking Thanksgiving dinner. And yeah, I am not going to love it when it starts to itch and peel over the course of the next few days. But then — then! — in a couple more weeks when it’s fully healed, this bird will serve as a motivation for writing my book. Ideally, I’ll look down at my leg and reassure myself, “oh, you got this.”

More likely, I’ll still whine like hell.

I wrote a couple of things related to the book this week: a review of sorts of Joy Lisi Rankin’s book A People's History of Computing in the United States and a call, once again, for help in finding more information about teaching machine inventor Norman A. Crowder. But this week (in addition to the five hours in the tattoo studio), I spent most of my time thinking about cooking, buying things to be cooked, preparing to cook, and cooking. I realize that Silicon Valley wants us all to think that robots are going to make care-taking and food-making better, faster, cheaper. They’re not. They’re going to make things worse — robot-made pizzas are quite the perfect, horrible example of that.

Elsewhere in robots: “The World Will Be Our Skinner Box,” cautions Michael Sacasas. Raffi Khatchadourian writes in The New Yorker about how “a scientist’s work linking minds and machines helps a paralyzed woman escape her body.” (A long read.) But not really, not permanently. Pigeons and lab rats — that’s all of us, it feels like, in some awful experiment that, forty years from now, will likely be recast in another New Yorker article, scrubbed of the violence and the hubris. There, someone will have re-written the history and insist that we’re all post-Skinnerian now — bravo us — and we’ll be asked to forgive everyone involved with the old behaviorist social media because they were young and how could they possibly have known better. “In the Web’s Hyperreality, Information Is Experience,” says Mike Caulfield. “The Facebook War Did Not Take Place” and such.

This article isn’t related to anything else I’m linking to this week, but know that whenever you see Casey Parks’ byline, you should drop everything and read her work. “A College Degree More Than Fifteen Years in the Making.” Casey’s the most talented storyteller I know.

I’m thankful for a lot of things this week, as custom dictates. I’ve muttered the list of things under my breath. But I’ll type this one out loud: I’m incredibly thankful for folks like Casey who watch and listen and then piece words and feelings together in ways that change the world. (I’m also thankful, in a rough week for birds, that this bedazzled pigeon made it home safely.)

Yours in struggle,
~Audrey

HEWN, No. 292

"All we have to decide is what to do with the time that is given us" -- Gandalf

I should have written something, I realized last Sunday, to commemorate the 100th anniversary of the end of World War I. (Thankfully, Michael Sacasas did: “Technology After the Great War.”) I think I shared a link on social media to a story of the war’s “incredible carrier pigeons.” But I should have written something (and not just something on war pigeons. I’ve delved into that topic at length already, and I’ll probably mention the birds again next week when I show off photos from the new tattoo I’m getting on Wednesday.)

Always with the pigeons.

The war hero Cher Ami, stuffed and on display at the Smithsonian.

I should have written about World War I and education technology — because I can think of few events in the 20th century that had a more significant impact on the development of the field.

I realize that isn’t how the story usually gets told. Many folks prefer to emphasize Sputnik or the computer. (Or, if you’re one of those darling education entrepreneurs, you likely insist the most significant event in the history of ed-tech was when you arrived on the scene.)

Entry into the First World War demanded the United States expand the military rapidly, and because of the changing demographics of the US population — thanks to previous decades of immigration and urbanization, for starters — there were vast cultural, educational, and linguistic differences among soldiers. One question became: how to standardize and how to differentiate their training? (And how to do so efficiently and scientifically.) As such, the war saw educational psychologists like Robert Yerkes and Edward Thorndike attempt to fundamentally reshape how military recruits and military officers were assessed and ranked — on a quality called “intelligence,” not simply on the idea of “character.” The new “science” of “intelligence” and intelligence testing needed new instruments and it needed new machinery — and following the Great War, its proponents needed new markets and new objects of inquiry as well. That market was found in schools, where some of the same questions arose: how to standardize, and how to individualize education?

I ranted a bit about some of this on Twitter — a response to the latest hoopla about “personalized learning,” which is certainly what these educational psychologists in the 1920s and onward thought they were doing by measuring and ranking students and building machines so that these students could move through material at their own pace. Khan Academy didn’t invent this.

Mark Zuckerberg is just one of the ed-tech reformers bankrolling the messaging around “personalized learning,” which should absolutely strike fear in your heart if you look at how he and other Facebook leaders have “personalized” and weaponized news and information to the detriment of democracy. This week’s NYT article on Facebook is stunning. Simply stunning. “Facebook Is a Normal Sleazy Company Now,” says Siva Vaidhyanathan. Arguably it has been all along, but many folks seem to have believed the sunnier stories that Silicon Valley preferred to peddle about itself. Read Alexis Madrigal on “When the Tech Mythology Collapses” and what the history of falling out of love with another industry might teach us.

Elsewhere in tech dystopia: Michelle Alexander on “The Newest Jim Crow” and on algorithmic sentencing and “e-carceration.” As if California doesn’t have it hard enough right now, here’s what happens “When Elon Musk Tunnels Under Your Home” by Alana Semuels. “I Found the Best Burger Place in America. And Then I Killed It,” Kevin Alexander confesses. “Tech C.E.O.s Are in Love With Their Principal Doomsayer” by Nellie Bowles. Perhaps because, contrary to all the myth-making about “openness” and progress, these folks were more invested in war and authoritarianism all along.

Yours in struggle,
~Audrey
