Schoolchildren and peace campaigners release doves and pigeons in Belfast, Northern Ireland. (Adrian Dennis via Getty Images)

Free Bird: On Elon Musk and Twitter

In more than a decade on Twitter, Walter Kirn watched as the platform manipulated reality in real-time. Now? ‘Let the wild rumpus begin.’

Let me tell you about my strange time on Twitter, a so-called “social-media platform” of the early 21st century. Given what I suspect is the increasing likelihood that the digital records of our lives will someday be erased, mislaid, or virtually composted in such a way as to make them irretrievable, the little story I’m about to tell may make little sense to future generations. It takes place during the period when the American establishment sought to control the flow of information in much the same way it once pursued dominion over resources such as land and oil.

This power grab caused a sort of virtual cold war, and I, as a consumer, a producer, and a conduit of information, got caught in it. Here is how the experience felt, how it went, and what I hope will happen next, now that the purchase of Twitter by Elon Musk promises a better, freer day.

I joined Twitter in 2009, finding it a worthy novelty: a bulletin board for stray thoughts and observations, the spillover of my hyperactive mind. The writer Gertrude Stein once said that “remarks aren’t literature,” but I have always disagreed (ironically, Stein is best remembered now for two such quips, the other one being “there’s no there there”), so I set out to make remarks to my new audience. It was small in the beginning, consisting of my wife and a few friends, but it grew as I pushed on. 

The platform belonged to celebrities back then, who hawked their movies, albums, and TV shows in words that were their own, supposedly, fostering in fans a dubious intimacy with figures they knew only from interviews. One of these stars, an investor in the platform, was Ashton Kutcher, the prankish, grinning actor who became omnipresent for a spell and then, stupendously enriched, largely vanished from public consciousness. It seemed that Twitter had sped up fame such that it bloomed and died in record time.

The power of the new platform struck me first in 2012. Two incidents. The first one, a small one, occurred in Indianapolis, where I’d gone to watch the Super Bowl. I attended a party the night before the game at which many Hollywood folk were present, including an actor on a cable TV show who played a roguish businessman. The actor was extremely drunk, lurching about and hitting on young women, and it happened that my wife, back home, whom I’d texted about the scene, was able to read real-time tweets about his antics from other partygoers. A few hours afterward she noticed that these tweets had disappeared. Instant reality-editing. Impressive. 

I concluded that Twitter was in the business not only of promoting reputations, but of protecting them. It offered special deals for special people. Until then, I’d thought of it as a neutral broker.

The next illuminating incident happened while I was reporting on the 2012 Democratic Convention in Charlotte, North Carolina. The political magazine I worked for, The New Republic, had reserved a row of seats in the upper reaches of the stadium in a press section used by several publications. When the then-First Lady rose to speak, I watched my colleagues, who had their laptops open, log in to Twitter, almost to a person. They monitored the reaction to the speech, chattering among themselves when they found comments that suited the directions of the columns and opinion pieces they were already typing into their machines. Within minutes, a consensus formed that the speech was a triumph, moving, eloquent—perhaps the best such performance of its kind in living memory.

Not being on Twitter, just focused on the words I was hearing, I found these conclusions unjustified. But the tide of superlatives kept swelling. An extra step had been added to the process of commenting on political events: consulting the chorus, excerpting its views, and rolling them into a tidy super-narrative. When I couldn’t bring myself to see what others saw, I felt distinctly isolated, particularly after spitting into the wind and publishing my sour reaction online. Remember the playground scene from Hitchcock’s The Birds?


My own habits on Twitter changed around that time. Observational humor had been my mainstay mode, but I realized that Twitter had become an engine of serious opinions on current affairs. On election night in 2016, while working at another journal, Harper’s, I was given control of the magazine’s Twitter feed and asked to think out loud about events while following them on cable news. I saw early that Trump was on his way to victory—or at least he was doing much better than predicted—and I offered a series of tart remarks about the crestfallen manners of various pundits who couldn’t hide their mounting disappointment. 

The official election results were still unknown—Clinton retained a chance to win, in theory—but before the tale was told, my editors yanked my credentials for the account and gave them to someone else. The new person swerved from the storyline I’d set (which reflected reality) and adopted a mocking tone about Trump’s chances, even posting a picture of a campaign hat sitting glumly on a folding chair at his headquarters in New York City. 

It struck me at first as pure denial. Later I decided that it was far more intentional—that my left-leaning magazine wished to preserve the illusion for its readers that the election’s outcome was unforeseeable, possibly to maintain suspense or so it could later act startled and disturbed in concert with its TV peers. Its Twitter feed, as a record of its reactions, had to align with this narrative. 

I grew convinced that night that Twitter meant trouble for me. It had become an opinion-sculpting instrument, an oracle of the establishment, and I knew I would end up out of step with it, if only because I’m of a temperament that habitually goes against the flow to challenge and test it, to keep it honest. Mass agreement, in my experience both as a person and as a journalist, is typically achieved at a cost to reality and truth.

My forebodings were confirmed with the launch of the “Russiagate” investigation. I strongly doubted its premises from the start, but when I voiced these doubts on Twitter, curious things occurred. My tweets on the subject, my followers reported, were often invisible to them, and yet, to my eye, they drew engagement. Strange. The Twitter users who “liked” my tweets tended to have tiny followings, I found, and they didn’t follow me. Their profile photos were often stock images. I ran an experiment one night and sent out a tweet of a controversial nature which I expected would be suppressed or screwed with, and then, when it was, I used screenshots of the mischief to prove to my followers that Twitter was dishonest.

I looked crazy. Concerned DMs arrived. One accused me of grandiosity for thinking I mattered enough to provoke intervention from on high. Innocence about Twitter still prevailed then; its cheerful bluebird logo still charmed the public mind. We had yet to learn, as we finally did this week (in a manner that confirmed my worst suspicions), of the hidden but direct coordination between Twitter’s management and the government, including the Department of Homeland Security, to suppress and guide opinion on topics from war to public health. (“One could argue we’re in the business of critical infrastructure, and the most critical infrastructure is cognitive infrastructure,” as one government official put it.)

One definition of “paranoia” is suspecting the truth too early, before your therapist reads it in The Times. I spent years in this uncomfortable state, imagining what I couldn’t prove and occasionally doubting my own doubts. Then Covid struck, Biden was elected, the era of the “current thing” commenced (meaning the sanctioned mania of the moment), and there followed an all-out culling of the platform, championed by the prestige press, for instances of wrongthink, large and small. 

It was around this time that an old friend, a guy I’d known since college, phoned my wife and worried aloud to her that certain irreverent tweets I’d made about the sainted Dr. Fauci suggested I’d lost my mind. I called him back to prove I hadn’t, patiently responding to his charges in mature and modulated tones. By avoiding the use of keywords that triggered the algorithms installed to weed out oddballs and dissenters, I had survived the Twitter purge, but not the secondary old-buddies purge. My friend and I haven’t spoken since.

The end of this period of Twitter—with its creepy secret agents, sponsored mob attacks, whipped-up propaganda drives, and canned applause tracks for approved ideas—could not have come fast enough for me. I’m a cynic by nature, but I see no good reason—for now, at least—to question Musk’s proclaimed intention to turn Twitter’s claustrophobic dungeon back into an airy public square. I expect the transition will not go smoothly, though it does seem to be going quickly. Musk has already fired the company’s C-suite and dissolved its board. Users, however, may need time to adjust to the less inhibited new platform. Emerging from the dimness of Plato’s Cave into the dazzle of daytime may take a while—we grew sleepier than perhaps we even knew. I say let the wild rumpus begin. 


Walter Kirn’s last piece was on the phoniness of A.I.-generated art.
