The Chronicle Review

The Souls of the Machine

Clay Shirky says the Internet revolution has only just begun


June 13, 2010

Clay Shirky has just climbed down from a seven-foot ladder after adjusting a theater-quality stage light. He wants the ambiance to be just right for the senior-project presentations—a weeklong tradition here at New York University's Interactive Telecommunications Program, where graduating students get 20 minutes each to demonstrate working prototypes of wearable mood meters, art-finding iPhone apps, computerized art installations, or whatever else they have dreamed up in what Shirky calls "the land of misfit toys."

This unusual graduate program occupies the loftlike seventh floor of a university building on Broadway—more like a workshop than an office—where one student on a recent afternoon is soldering electronics, one is sewing a costume, and another is drilling holes in a wooden block for some creation or other. It's a place where the goal is to study technology and society by making gadgets that challenge assumptions about how machines fit into daily life and get people interacting. The program is in the Tisch School of the Arts, but these days it turns out social-media entrepreneurs, some of whom came specifically to study with Shirky, a Web-culture guru who has become the program's star professor.

In the nine years he's taught here, Shirky has made "the floor," as students call the program offices, his second home—he's often rearranging the furniture to see how small adjustments alter the social dynamics among students and professors. As one of his colleagues tells it, Shirky once wondered aloud whether making one of the tables longer might encourage people to stay and talk more. Or, to use a maxim he often repeats, "Behavior is motivation filtered through opportunity."

He seeks a similar feng shui for the Internet, encouraging programmers and Web designers to build online systems that will get visitors talking, sharing, and creating, rather than passively reading and watching.

That shift may sound subtle, but Shirky sees it as world-changing. He argues that as Web sites become more social, they will threaten the existence of all kinds of businesses and organizations, which might find themselves unnecessary once people can organize on their own with free online tools. Who needs an academic association, for instance, if a Facebook page, blog, and Internet mailing list can enable professionals to stay connected without paying dues? Who needs a record label, when musicians can distribute songs and reach out to fans on their own? And so on.

In other words, the Internet revolution has just gotten started—and the most radical changes are yet to come.

"We are living in the middle of the largest increase in expressive capacity in the history of the human race," he has written. "More people can communicate more things to more people than has ever been possible in the past, and the size and speed of this increase, from under one million participants to over one billion in a generation, makes the change unprecedented."

That argument brought Shirky into the limelight in 2008 with his first book, Here Comes Everybody: The Power of Organizing Without Organizations (Penguin Press). But he was already influential among Web entrepreneurs and programmers, who knew him through his provocative essays in blogs and magazines and from YouTube videos of his riveting talks.

He has become a kind of spiritual guide to the wired set, with a message that those who make playful social networks improve society more than all those now-unnecessary offline organizations. His interest is as much in souls as in machines, in meaning as much as in medium. He even looks the part, with a shaved head and glasses, his fashion a kind of nerd chic—when I talked with him he wore jeans and a T-shirt bearing a joke about Wikipedia.

And he calls his followers to action in his latest book, Cognitive Surplus: Creativity and Generosity in a Connected Age, scheduled to appear from Penguin Press this month. In it, he urges companies and consumers to stop clinging to old models and embrace what he characterizes as "As Much Chaos as We Can Stand" in adopting new Web technologies. He presses programmers and entrepreneurs to throw out old assumptions and try as many crazy, interactive Web toys as they can—to see what works, just as the students here do.

"What the fight seems to me now is around cultural expectations of ourselves," he told me. "It's actually about changing the culture we are part of in ways that take the new medium for granted."

It has worked for his students, some of whom have brought their odd ideas to market—one recently created a service called Foursquare, where people can tag locations with virtual graffiti that others can see with their cellphones. Many analysts see the company as one to watch in social networking.

But to upset the old order, Shirky will have to overcome what he sees as the biggest enemy of progress: reruns of Gilligan's Island.

Shirky's new book began as a 17-minute talk he gave at a summit on social networking in 2008, which was posted online. With his signature blend of jokes, aphorisms, and anecdotes, he blamed television (and specifically its purest form, the sitcom) for a decades-long worldwide brain drain.

Television emerged just as people had more free time—cognitive energy they didn't know what to do with, he says—and so people watched. And watched. And watched.

"Desperate Housewives basically functions as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat," he argued in the talk, getting a hearty laugh from the audience of assorted geeks and business leaders.

By Shirky's back-of-the-envelope calculations, Americans now spend 200 billion hours each year of this "cognitive surplus" on Gilligan and his ilk, with 100 million hours of that squandered each weekend watching just advertisements.

"This is a pretty big surplus," he deadpanned.

"Did you ever see that episode of Gilligan's Island where they almost get off the island and then Gilligan messes up, and then they don't? I saw that one. I saw that one a lot when I was growing up. And every half-hour that I watched that was a half-hour I wasn't posting to my blog, or editing Wikipedia, or contributing to a mailing list."

Of course those technologies didn't exist in Shirky's childhood (he's 46). Now that they do, he sees the potential to siphon off Gilligan-units and apply them to building Wikipedias and other valuable crowdsourced tools.

He figures all of Wikipedia, his gold standard for group activity online, took about 100 million hours of thought to produce. So Americans could build 2,000 Wikipedia projects a year just by writing articles instead of watching television.

People have railed against television for decades, and it's an easy target. The fresher aspect of Shirky's extended argument is that if a critical mass started shifting time from TV to Wikipedia-like creativity and sharing, society itself would change, and he thinks for the better. Those new activities—and he gives plenty of examples in the book of projects already under way—could center on charity, civic engagement, coping with diseases, and more.

Flash back to the invention of the printing press in the 15th century. He points out that in the several decades immediately following Gutenberg's first Bible, not much really changed in European information society. Much later, some world-changing ideas came along on how to use the printing press, like the Invisible College.

In 1645, that group of early scientists, which included Christopher Wren, Robert Boyle, and Robert Hooke, vowed to cast aside the ideas of alchemy and believe only things that could be proved by repeatable experiments. "Within a few years, several members of the Invisible College had produced advances in chemistry, biology, astronomy," Shirky writes. "The problem with alchemy wasn't that the alchemists had failed to turn lead into gold—no one could do that. The problem, rather, was that the alchemists had failed uninformatively."

"This is what makes the ways society shares knowledge so critical, and what helped give the Invisible College such an edge on the alchemists," he adds. "Even when working with the same tools, they were working in a far different, and better, culture of communication."

The implications are clear. Today's open-source software and the hypersharing of social networks represent a new, better order. And we're only starting to see the impact of those inventions.

Shirky's critics laugh at the Gilligan jokes but disagree with his analysis.

"I don't see that we have this cognitive surplus," says Tom Slee, a Canadian author who wrote No One Makes You Shop at Wal-Mart. He says that his own children spend any free time they have playing video games, in addition to watching TV, so, if anything, there's less extra time to join creative social networks.

"My beef is that in focusing on the technology, he tends to forget the role of money when it comes to the new social institutions that get built," says Slee. While he says he is all for encouraging "progressive things" online, he fears that in many cases the people participating online are "in effect working for free for large companies" that run the Web services, like Facebook or Flickr.

Nicholas Carr, author of the new book The Shallows: What the Internet Is Doing to Our Brains, argues that Shirky paints growing up with Gilligan more darkly than it really was. "Most of the people I knew were doing a whole lot of 'participating,' 'producing,' and 'sharing,' and, to boot, they were doing it not only in the symbolic sphere of the media but in the actual physical world as well," Carr wrote on his blog in response to the Cognitive Surplus talk. "They were making 8-millimeter films, playing drums and guitars and saxophones in bands, composing songs, writing poems and stories, painting pictures, making woodblock prints, taking and developing photographs, drawing comics, souping up cars, constructing elaborate model railroads, reading great books and watching great movies and discussing them passionately well into the night, volunteering in political campaigns, protesting for various causes, and on and on and on. I'm sorry, but nobody was stuck, like some pathetic shred of waterborne trash, in a single media-regulated channel."

Carr offers a darker interpretation of what Web 2.0 is doing to culture: funneling urges for sharing into commercially acceptable channels, making it "an engine for social passivity."

Shirky had a blunt rebuttal when I followed up with him by e-mail. "That account is wrong," he said.

"The data is pretty unambiguous: Everywhere in the developed world, the three commonest activities are work, sleep, watch TV. Whatever participation Nick is describing, it was fitted into the free time left over after we'd watched a half-time job's worth of television every week."

As to whether people sharing their photos on Flickr are dupes to its owners: "I take the view of Ed Deci et al. that a considerable amount of human effort isn't labor in that sense, and in fact can't be labor if the participants are to enjoy it," Shirky said. "We don't regard the denizens of a bar as 'in effect working for free' for the bar owner, even though it's their presence that creates the convivial enough environment to make a 200-percent markup on booze worth paying."

Essentially, says Danah Boyd, a researcher at Microsoft Research and a longtime friend, Shirky thinks Karl Marx got it wrong. While critics like Slee may read any online social participation as economic exploitation, Shirky argues that people are motivated by love, not money. She points to Wikipedia: "People contribute because they enjoy the process," she says. Or academe. "Are we doing it for the pay?"

"There's a lot of labor of love. People like being a part of cultural production on every level."

Shirky will watch about eight hours of student presentations the day I visit, with breaks to make a few phone calls for his other gig—consulting for major technology and media companies, including Nokia and the BBC. "I try to watch all of them," he says of the presentations, as he takes a seat in the second row and opens a netbook on his lap. "This is the one time in the institution where everything stops except for each graduating student getting up and saying, 'This is what I've been up to recently.' It's a really important piece of attention."

The first demo is by Cameron Cundiff, one of Shirky's advisees. While showing a PowerPoint presentation, the student explains the iPhone application he built that allows visitors to Beacon, N.Y., to publish events, restaurant reviews, and other information to encourage New Yorkers who trek to the new arts center there to walk down Main Street as well. His app automatically pulls in comments and reviews from other Web sites and encourages people to add thoughts of their own.

Soon after is Gloria Kim, who begins with a PowerPoint about her interactive art installation. It involves giant acrylic polyhedrons that utter recorded words when visitors touch their various sides.

After a few slides, Ms. Kim says she wants to introduce herself properly, and sits down in a chair at the front of the room. She slowly puts on a blindfold, places a pair of scissors on her lap, then drops her hands to her side. As she sits motionless, words of a poem flash across the screen behind her—their basic message: It's impossible to really know what another person is thinking.

Shirky turns to me and explains that the projects fall on a broad spectrum between practical and conceptual. "That one's probably at the extreme of conceptual integration," he says.

He says the professors in the program come from a similarly rich mix of intellectual backgrounds. He teaches the program's only sociology course most semesters, and he shares an office with a professor who builds wearable computers.

The syllabus for Shirky's class, called "Social Facts," includes readings by the French sociologist Émile Durkheim, the philosopher John Searle, and the political economist Francis Fukuyama. But when a class discussion last year veered too far into abstraction, Shirky reminded the students that the goal was to make suggestions for how technologies could improve society, rather than float off into purely theoretical air, according to Cundiff, the student who made the tourism iPhone app. "He completely readjusted the trajectory of the course, adding exercises and having us design around social problems," Cundiff told me.

Shirky admits that theorists don't recognize him as one of their own. "In part because for theoreticians, saying 'Then I understood what was going on' is the end of the sentence," he says. "Around here there's a comma at the end of that phrase: 'Then I understood what was going on, and here's what I did about it.'"

Shirky got the job at NYU because of a talk he gave at a technology conference in the late 1990s, while he was working as a freelance computer programmer and Web designer. The topic was the societal importance of peer-to-peer technology, an easy way to share large files, which ended up sparking the widespread downloading of popular music. He took an unorthodox approach, illustrating his talk with doodles on a white board, and he struck a tone that warned of possible dangers ahead but was ultimately optimistic.

"He manages to pull you back down to the ground when you're floating up to the sky, and then he gives you a reason to stand on your tiptoes" is how Cory Doctorow, a popular technology blogger and science-fiction writer who was there, puts it.

After the speech, Red Burns, founder and director of the NYU program, walked up to Shirky and asked him to turn the talk into a class.

"It was the most clear explanation that I had heard," she told me. "I had been frustrated with every explanation of networks that I had ever heard. His ability to make complicated concepts clear was just wonderful."

Shirky says he wondered whether she was serious, but when he followed up the next week, she hired him on the spot.

At first he was an adjunct professor, but later he came on full time (his title is associate teacher, a nontenured position). "I just hung around here so long that they finally found me an office, basically," he tells me with a laugh.

An art major from Yale University, Shirky taught himself computer programming in the 1990s, while working as a theater director by day. "I'd go home at about 11 at night when the show was over, and I'd log in and I'd be teaching myself Perl until 3 or 4 in the morning," he says, referring to the popular programming language.

He found the Internet during a turning point in his life. "I was divorced, I was in the middle of a lousy long-distance relationship, and I had moved to New York to go into the theater, and I had hit a dry spell that was dry enough for long enough that I thought, Maybe I'm not blocked, maybe I'm done," he says, adding that he was on the verge of leaving New York (said in a tone that suggested the move would be akin to death).

"I discovered the Internet at the absolute personal and professional nadir of my life, and I glommed onto it like a drowning man to a life preserver," he says. "And I thought, well I can either call myself an addict, and try and quit, or I could try and figure out whether there's a way I can make a living off this thing."

He ended up joining a technology-consulting firm that works with media companies, and later hired himself out as a freelance programmer.

Drawn to the classroom, he approached Yale in 1995 about teaching a class there on online social groups. Though students there backed the idea, he says, a university committee turned him down. "They killed it because they said it doesn't really make sense to talk about community online because those people aren't really meeting each other," he says.

Shirky, who does not have a Ph.D., got his start teaching at Hunter College, where he was its first professor of new media and helped administrators design a master's program called Integrated Media Arts. But he left in frustration when his work became too technology-focused for what the department was doing at the time, he says.

Even today he's essentially a professor without a discipline—not fully accepted in computer science, sociology, or art.

That may be why he fits so well at the NYU program, where, as he puts it, "we never use the word 'interdisciplinary,' because we're not even disciplined enough to get to the 'inter' part."

From this land of misfit toys, he'll continue to call for a rethinking of those disciplines and all other institutions we take for granted.

"What matters most now is our imaginations," he writes at the end of his new book. "The opportunity before us, individually and collectively, is enormous; what we do with it will be determined largely by how well we are able to imagine and reward public creativity, participation, and sharing."

Jeffrey R. Young is a senior writer at The Chronicle.