
The Souls of the Machine

Clay Shirky says the Internet revolution has only just begun

Photograph by Robert Caplin

Clay Shirky has just climbed down from a seven-foot ladder after adjusting a theater-quality stage light. He wants the ambiance to be just right for the senior-project presentations—a weeklong tradition here at New York University's Interactive Telecommunications Program, where graduating students get 20 minutes each to demonstrate working prototypes of wearable mood meters, art-finding iPhone apps, computerized art installations, or whatever else they have dreamed up in what Shirky calls "the land of misfit toys."

This unusual graduate program occupies the loftlike seventh floor of a university building on Broadway—more like a workshop than an office—where one student on a recent afternoon is soldering electronics, one is sewing a costume, and another is drilling holes in a wooden block for some creation or other. It's a place where the goal is to study technology and society by making gadgets that challenge assumptions about how machines fit into daily life and get people interacting. The program is in the Tisch School of the Arts, but these days it turns out social-media entrepreneurs, some of whom came specifically to study with Shirky, a Web-culture guru who has become the program's star professor.

In the nine years he's taught here, Shirky has made "the floor," as students call the program offices, his second home—he often rearranges the furniture to see how small adjustments alter social dynamics among students and professors. As one of his colleagues tells it, Shirky once wondered aloud whether making one of the tables longer might encourage people to stay and talk more. Or, to use a maxim he often repeats, "Behavior is motivation filtered through opportunity."

He seeks a similar feng shui for the Internet, encouraging programmers and Web designers to build online systems that will get visitors talking, sharing, and creating, rather than passively reading and watching.

That shift may sound subtle, but Shirky sees it as world-changing. He argues that as Web sites become more social, they will threaten the existence of all kinds of businesses and organizations, which might find themselves unnecessary once people can organize on their own with free online tools. Who needs an academic association, for instance, if a Facebook page, blog, and Internet mailing list can enable professionals to stay connected without paying dues? Who needs a record label, when musicians can distribute songs and reach out to fans on their own? And so on.

In other words, the Internet revolution has just gotten started—and the most radical changes are yet to come.

"We are living in the middle of the largest increase in expressive capacity in the history of the human race," he has written. "More people can communicate more things to more people than has ever been possible in the past, and the size and speed of this increase, from under one million participants to over one billion in a generation, makes the change unprecedented."

That argument brought Shirky into the limelight in 2008 with his first book, Here Comes Everybody: The Power of Organizing Without Organizations (Penguin Press). But he was already influential among Web entrepreneurs and programmers, who knew him through his provocative essays in blogs and magazines and from YouTube videos of his riveting talks.

He has become a kind of spiritual guide to the wired set, with a message that those who make playful social networks do more to improve society than all those now-unnecessary offline organizations. His interest is as much in souls as in machines, in meaning as much as in medium. He even looks the part, with a shaved head and glasses, his fashion a kind of nerd chic—when I talked with him he wore jeans and a T-shirt bearing a joke about Wikipedia.

And he calls his followers to action in his latest book, Cognitive Surplus: Creativity and Generosity in a Connected Age, scheduled to appear from Penguin Press this month. In it, he urges companies and consumers to stop clinging to old models and embrace what he characterizes as "As Much Chaos as We Can Stand" in adopting new Web technologies. He presses programmers and entrepreneurs to throw out old assumptions and try as many crazy, interactive Web toys as they can—to see what works, just as the students here do.

"What the fight seems to me now is around cultural expectations of ourselves," he told me. "It's actually about changing the culture we are part of in ways that take the new medium for granted."

It has worked for his students, some of whom have brought their odd ideas to market—one recently created a service called Foursquare, where people can tag locations with virtual graffiti that others can see with their cellphones. Many analysts see the company as one to watch in social networking.

But to upset the old order, Shirky will have to overcome what he sees as the biggest enemy of progress: reruns of Gilligan's Island.

Shirky's new book began as a 17-minute talk he gave at a summit on social networking in 2008, which was posted online. With his signature blend of jokes, aphorisms, and anecdotes, he blamed television (and specifically its purest form, the sitcom) for a decades-long worldwide brain drain.

Television emerged just as people had more free time—cognitive energy they didn't know what to do with, he says—and so people watched. And watched. And watched.

"Desperate Housewives basically functions as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat," he argued in the talk, getting a hearty laugh from the audience of assorted geeks and business leaders.

By Shirky's back-of-the-envelope calculations, Americans now spend 200 billion hours of this "cognitive surplus" each year on Gilligan and his ilk, with 100 million hours of that squandered each weekend watching just the advertisements.

"This is a pretty big surplus," he deadpanned.

"Did you ever see that episode of Gilligan's Island where they almost get off the island and then Gilligan messes up, and then they don't? I saw that one. I saw that one a lot when I was growing up. And every half-hour that I watched that was a half-hour I wasn't posting to my blog, or editing Wikipedia, or contributing to a mailing list."

Of course those technologies didn't exist in Shirky's childhood (he's 46). Now that they do, he sees the potential to siphon off Gilligan-units and apply them to building Wikipedias and other valuable crowdsourced tools.

He figures all of Wikipedia, his gold standard for group activity online, took about 100 million hours of thought to produce. So Americans could build 2,000 Wikipedia projects a year just by writing articles instead of watching television.
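
The 2,000 figure is simple division of the two rough estimates Shirky cites. A minimal sketch of that arithmetic, in Python, with the variable names my own labels for the numbers reported above:

us_tv_hours_per_year = 200_000_000_000  # Shirky's estimate: ~200 billion hours of American TV watching a year
hours_per_wikipedia = 100_000_000       # his estimate of the total thought that built Wikipedia

wikipedia_equivalents_per_year = us_tv_hours_per_year / hours_per_wikipedia
print(wikipedia_equivalents_per_year)   # 2000.0 -- the "2,000 Wikipedia projects a year" in the text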

People have railed against television for decades, and it's an easy target. The fresher aspect of Shirky's extended argument is that if a critical mass started shifting time from TV to Wikipedia-like creativity and sharing, society itself would change, and he thinks for the better. Those new activities—and he gives plenty of examples in the book of projects already under way—could center on charity, civic engagement, coping with diseases, and more.

Flash back to the invention of the printing press in the 15th century. He points out that in the several decades immediately following Gutenberg's first Bible, not much really changed in European information society. Only much later did world-changing ideas about how to use the printing press come along, like the Invisible College.

In 1645, that group of early scientists, which included Christopher Wren, Robert Boyle, and Robert Hooke, vowed to cast aside the ideas of alchemy and believe only things that could be proved by repeatable experiments. "Within a few years, several members of the Invisible College had produced advances in chemistry, biology, astronomy," Shirky writes. "The problem with alchemy wasn't that the alchemists had failed to turn lead into gold—no one could do that. The problem, rather, was that the alchemists had failed uninformatively."

"This is what makes the ways society shares knowledge so critical, and what helped give the Invisible College such an edge on the alchemists," he adds. "Even when working with the same tools, they were working in a far different, and better, culture of communication."

The implications are clear. Today's open-source software and the hypersharing of social networks represent a new, better order. And we're only starting to see the impact of those inventions.

Shirky's critics laugh at the Gilligan jokes but disagree with his analysis.

"I don't see that we have this cognitive surplus," says Tom Slee, a Canadian author who wrote No One Makes You Shop at Wal-Mart. He says that his own children spend any free time they have playing video games, in addition to watching TV, so, if anything, there's less extra time to join creative social networks.

"My beef is that in focusing on the technology, he tends to forget the role of money when it comes to the new social institutions that get built," says Slee. While he says he is all for encouraging "progressive things" online, he fears that in many cases the people participating online are "in effect working for free for large companies" that run the Web services, like Facebook or Flickr.

Nicholas Carr, author of the new book The Shallows: What the Internet Is Doing to Our Brains, argues that Shirky paints growing up with Gilligan more darkly than it really was. "Most of the people I knew were doing a whole lot of 'participating,' 'producing,' and 'sharing,' and, to boot, they were doing it not only in the symbolic sphere of the media but in the actual physical world as well," Carr wrote on his blog in response to the Cognitive Surplus talk. "They were making 8-millimeter films, playing drums and guitars and saxophones in bands, composing songs, writing poems and stories, painting pictures, making woodblock prints, taking and developing photographs, drawing comics, souping up cars, constructing elaborate model railroads, reading great books and watching great movies and discussing them passionately well into the night, volunteering in political campaigns, protesting for various causes, and on and on and on. I'm sorry, but nobody was stuck, like some pathetic shred of waterborne trash, in a single media-regulated channel."

Carr offers a darker interpretation of what Web 2.0 is doing to culture: funneling urges for sharing into commercially acceptable channels, making it "an engine for social passivity."

Shirky had a blunt rebuttal when I followed up with him by e-mail. "That account is wrong," he said.

"The data is pretty unambiguous: Everywhere in the developed world, the three commonest activities are work, sleep, watch TV. Whatever participation Nick is describing, it was fitted into the free time left over after we'd watched a half-time job's worth of television every week."

As to whether people sharing their photos on Flickr are dupes to its owners: "I take the view of Ed Deci et al. that a considerable amount of human effort isn't labor in that sense, and in fact can't be labor if the participants are to enjoy it," Shirky said. "We don't regard the denizens of a bar as 'in effect working for free' for the bar owner, even though it's their presence that creates the convivial enough environment to make a 200-percent markup on booze worth paying."

Essentially, says Danah Boyd, a researcher at Microsoft Research and a longtime friend, Shirky thinks Karl Marx got it wrong. While critics like Slee may read any online social participation as economic exploitation, Shirky argues that people are motivated by love, not money. She points to Wikipedia: "People contribute because they enjoy the process," she says. Or academe. "Are we doing it for the pay?"

"There's a lot of labor of love. People like being a part of cultural production on every level."

Shirky will watch about eight hours of student presentations the day I visit, with breaks to make a few phone calls for his other gig—consulting for major technology and media companies, including Nokia and the BBC. "I try to watch all of them," he says of the presentations, as he takes a seat in the second row and opens a netbook on his lap. "This is the one time in the institution where everything stops except for each graduating student getting up and saying, 'This is what I've been up to recently.' It's a really important piece of attention."

The first demo is by Cameron Cundiff, one of Shirky's advisees. While showing a PowerPoint presentation, the student explains the iPhone application he built that allows visitors to Beacon, N.Y., to publish events, restaurant reviews, and other information to encourage New Yorkers who trek to the new arts center there to walk down Main Street as well. His app automatically pulls in comments and reviews from other Web sites and encourages people to add their thoughts as well.

Soon after is Gloria Kim, who begins with a PowerPoint about her interactive art installation. It involves giant acrylic polyhedrons that utter recorded words when visitors touch their various sides.

After a few slides, Ms. Kim says she wants to introduce herself properly, and sits down in a chair at the front of the room. She slowly puts on a blindfold, places a pair of scissors on her lap, then drops her hands to her side. As she sits motionless, words of a poem flash across the screen behind her—their basic message: It's impossible to really know what another person is thinking.

Shirky turns to me and explains that the projects fall on a broad spectrum between practical and conceptual. "That one's probably at the extreme of conceptual integration," he says.

He says the professors in the program come from a similarly rich mix of intellectual backgrounds. He teaches the program's only sociology course most semesters, and he shares an office with a professor who builds wearable computers.

The syllabus for Shirky's class, called "Social Facts," includes readings by the French sociologist Émile Durkheim, the philosopher John Searle, and the political economist Francis Fukuyama. But when a class discussion last year veered too far into abstraction, Shirky reminded the students that the goal was to make suggestions for how technologies could improve society, rather than float off into purely theoretical air, according to Cundiff, the student who made the tourism iPhone app. "He completely readjusted the trajectory of the course, adding exercises and having us design around social problems," Cundiff told me.

Shirky admits that theorists don't recognize him as one of their own. "In part because for theoreticians, saying 'Then I understood what was going on' is the end of the sentence," he says. "Around here there's a comma at the end of that phrase: 'Then I understood what was going on, and here's what I did about it.'"

Shirky got the job at NYU because of a talk he gave at a technology conference in the late 1990s, while he was working as a freelance computer programmer and Web designer. The topic was the societal importance of peer-to-peer technology, an easy way to share large files, which ended up sparking the widespread downloading of popular music. He took an unorthodox approach, illustrating his talk with doodles on a white board, and he struck a tone that warned of possible dangers ahead but was ultimately optimistic.

"He manages to pull you back down to the ground when you're floating up to the sky, and then he gives you a reason to stand on your tiptoes" is how Cory Doctorow, a popular technology blogger and science-fiction writer who was there, puts it.

After the speech, Red Burns, founder and director of the NYU program, walked up to Shirky and asked him to turn the talk into a class.

"It was the most clear explanation that I had heard," she told me. "I had been frustrated with every explanation of networks that I had ever heard. His ability to make complicated concepts clear was just wonderful."

Shirky says he wondered whether she was serious, but when he followed up the next week, she hired him on the spot.

At first he was an adjunct professor, but later he came on full time (his title is associate teacher, a nontenured position). "I just hung around here so long that they finally found me an office, basically," he tells me with a laugh.

An art major from Yale University, Shirky taught himself computer programming in the 1990s, while working as a theater director. "I'd go home at about 11 at night when the show was over, and I'd log in and I'd be teaching myself Perl until 3 or 4 in the morning," he says, referring to the popular programming language.

He found the Internet during a turning point in his life. "I was divorced, I was in the middle of a lousy long-distance relationship, and I had moved to New York to go into the theater, and I had hit a dry spell that was dry enough for long enough that I thought, Maybe I'm not blocked, maybe I'm done," he says, adding that he was on the verge of leaving New York (said in a tone that suggested the move would be akin to death).

"I discovered the Internet at the absolute personal and professional nadir of my life, and I glommed onto it like a drowning man to a life preserver," he says. "And I thought, well I can either call myself an addict, and try and quit, or I could try and figure out whether there's a way I can make a living off this thing."

He ended up joining a technology-consulting firm that works with media companies, and later hired himself out as a freelance programmer.

Drawn to the classroom, he approached Yale in 1995 about teaching a class there on online social groups. Though students there backed the idea, he says, a university committee turned him down. "They killed it because they said it doesn't really make sense to talk about community online because those people aren't really meeting each other," he says.

Shirky, who does not have a Ph.D., got his start teaching at Hunter College, where he was the first professor of new media and helped administrators design a master's program called Integrated Media Arts. But he left in frustration when his work became too technology-focused for what the department was doing at the time, he says.

Even today he's essentially a professor without a discipline—not fully accepted in computer science, sociology, or art.

That may be why he fits so well at the NYU program, where, as he puts it, "we never use the word 'interdisciplinary,' because we're not even disciplined enough to get to the 'inter' part."

From this land of misfit toys, he'll continue to call for a rethinking of those disciplines and all other institutions we take for granted.

"What matters most now is our imaginations," he writes at the end of his new book. "The opportunity before us, individually and collectively, is enormous; what we do with it will be determined largely by how well we are able to imagine and reward public creativity, participation, and sharing."

Jeffrey R. Young is a senior writer at The Chronicle.

Comments

1. paievoli - June 13, 2010 at 07:10 am

This is a great article and a person after my own heart. The NYU/ITP is the mothership of all interactive media programs. I direct the graduate IMA program at LIU/CWpost. We are very similar in theory and application to this program and it is great to see that we have focused on the exact same things as this prestigious program. Red Burns and all of her group knew this would be where we are 30 years ago. A true visionary and a strong leader.

2. catherine_ford - June 13, 2010 at 05:13 pm

What outstanding insights! Thank you for interviewing Mr. Shirky and for sharing your thoughts.

3. arrive2__net - June 14, 2010 at 05:40 am

This seems like an example of how a really great and contemporary college program develops, by combining recent developments in society with a need to advance what is known and possible. The professor is hired for knowledge, experience, and ability to develop and cultivate the subject-area and students, rather than based strictly on credentialism. However it seems like the program is so based around the personality and tastes of the professor, to where it has become his "second home", according to the article.

"Back in the day", I thought of Gilligan's Island, Three's Company, and similar sitcom escapes as mental relaxation from what had often been a emotionally and mentally challenging day. Sometimes you just need to kick back and laugh, far from the input of the 'productivity terrorists' ... so I have some reservations about that part of what Shirky has to say. None-the-less for those who find Shirky's productive endeavors their source of relaxation and recharge, I say that's great.

Bernard Schuster
Arrive2.net

4. citizenwhy - June 14, 2010 at 10:20 am

Great guy but such a Westerner. Wants to fill up everything with positive space, eliminating the passive, creative negative space where the mind quietens and refreshes. Why not look at TV as one aspect of the negative space of the mind? Nothing happens, and that's great!

P.S. I almost never watch TV. I do have 2-3 favorite shows which I catch when I want to on Hulu.

5. eggdawg - June 14, 2010 at 11:20 am

The mind doesn't quiet when one watches TV. The viewer gets reformatted, sometimes a little, sometimes a lot. I played a game with my kids called "What are they selling you now?" They became ad literate and critical thinkers of the space between content. And now that the content contains product placement, the game is trickier and even more important. "Do not swallow whole" is the sticker that should be on every TV and screen, for consumers and for citizens. Because the worst way that screens reformat our brains is this: They convince us that solutions to life's problems are purchases rather than products and services and policies made by people. So we sit around while we wait for elected reps to make the right decision. But their options are poor because they don't do stuff hands-on like run a small business or a physician's office or a branch bank or a gas station or a hair salon or eatery salon or saloon or pet store. They don't see the same person, everyday over time, go from good times to lost job to long term joblessness. They don't see the college grad come out with $200k in debt and waitressing as her option.

When busy powerful people watch TV, including Sunday morning DC TV they see the echo chamber. Lots of busy people repeating the conventional wisdom of the top dogs in government, industry, finance and the media. Those folks have all lost touch together. Instead of watching and funding their lifestyles and their level of quality, let's take our TV time and start giving them specific information on how things really work or don't and how policy that seemed good at the time never took into account the unintended consequences. Even if the crowd isn't productive, it can be informative...if we build tools to harvest and visualize data from people to add to the data we harvest from machines and sensors. The polling companies are sellers and their offerings over the past 40 years have turned into garbage: support rather than illumination. Not understanding of what's real, but talking points for people and institutions with vested interests. It's not illegal. It's just tearing this country apart, that's all.

So let's take good helpful data from people and machines and combine it with good assembled ideas from people. Who? People who got amazing things done before computers in every industry. Then figure out what things are working now and what things could be better on digital platforms. Get rid of the rest. It's digital red tape that's turned doctors and educators and business people into slaves of the screen. Their faces need to be back with people, not with computers. Because good people pick up valuable cues by being with people in their environments. Oh, now we're back to housecalls.

Combine the old farts with young people who have digital skills and they will make stuff that helps us all perceive and imagine and decide more clearly. We are in a commerce-induced fog and it's blinding us. And industries are hurting themselves by becoming unsustainable and failing because they too are working off bad data: The C-Suite no longer talks to the frontline worker who deals with...the customer.

While we're at it, instead of watching people on TV who have antisocial behaviors and empires that depend on our money and our kids' money, let's put on old reruns of shows that featured effective problem solvers in the workplace and the home.

Clay is right: TV is brain drain, but in so many fine-grained ways that are insidious. And Cass Sunstein is right: We have built echo-chambers and we swallow what we already believe. And that makes us vulnerable to the scam artists of the world like Madoff. Because we no longer know how things work, how systems work, how components work. It's all become black box which is good for the companies and the politicians but bad for an educated consumer or citizen.

So let's unleash not the wisdom of the crowd, but the talent of the crowd and the experience of the crowd and the productive streamlining and troubleshooting ability of the crowd. Let's spend smarter: our money and our time.

Good IT can bring this system to life on every screen including every cell phone. It can boost inventive productivity and slash inefficiency and screen-bred red tape. Instead of going to the moon we can do what NASA did: streamline, cut bulk, create new stuff that launches innovation for decades. We can ban college assignments and homework that's made-up and focus kids on what's real. They can apply their ingenuity now before it gets overly channeled and boxed in. And if they come up with terrific ideas, and they will, let's honor them and pay them what it's worth. Let's start by asking them what we should jettison from education: which systems of belief and which scaffolds and which last-century best practices no longer work in a world that's on the trajectory we are on. Because our system is about to spend a lot of time and taxpayer money reinforcing testing and technology that is judgmental for life, reinforcing of outdated, outmoded ways of thinking and doing.

Put it this way: If our education system is so good, how come all the smart people running the biggest things are struggling to keep Detroit alive, journalism alive, banking alive, housing alive, Seaworld trainers alive, Olympics lugers alive? Could it be it's finally time with technology to involve in policy anyone whose life is directly affected, every day, by decisions made by people who live far away and couldn't do the job if you paid them double-overtime? Maybe America is showing the world what happens when the Decision Class of Society turns out to be the Hands-Off Class: the theoretical researchers who observe and judge but do not do. Here's a thought: their verdicts are killing America and jobs and our kids.

Ask the fishermen who live in the Gulf area how well the Decision Class of BP and the government and media is doing. Ask the survivors of Katrina or the hurricane that devastated Miami the same thing. Ask how well big organizations used their time to understand the local situation and local resources including the local people. Ask anyone who covered Haiti the same thing. Big needs small. But over the past 40 years, big has become Titanic and Titanic fails to stay humble. Titanic fails to listen to the millions of students and teachers who ARE the guts of the education system. Titanic fails to make it easy for Toyota workers and Countrywide workers and SeaWorld workers and Olympic athletes and BP workers and coal miners to report that something fishy is going on...before customers' lives are ripped apart.

TV used to listen. TV used to actively seek out the truth and share it and hold authority figures accountable. Ditto for newspapers. But no more. Now the profit line defines their survival and their judgments. Now they are just as vulnerable as the consumer and the citizen because they have forgotten something: They are there to do a job and then make money, not the other way around. They have forgotten that a good story, a useful story, a "we're about to get scr*w*d by Titanic business or Titanic government" story IS what drives readership and viewership and listenership. Not coincidentally, when the greats of news retired, the next generation couldn't wait to take the wheel. But because they have never gone through truly tough times like World Wars or the Depression or massive violations of the Constitution, these "media leaders" are soft and dislike discomfort and full of hubris instead of humility.

They keep young digitally savvy employees out of the early process where key decisions are made, decisions that are commonsense by last-century assembly line top-down standards and deadly to innovation in the fastpaced, hyper-networked world we live in. This you can see everywhere. Rather than pull in retirees with chops and young employees, Titanic institutions stay the course. Their leaders stay in denial as the whole thing starts to sink. And do they snap out of it? No, because their friends and family and neighbors, the people who form their little bubble, their cocoon of "what makes sense", those folks all are on the top level of Titanic where the brandy and cigars and nice linen line the table. Meanwhile, the folks locked below in steerage, the lower and middle classes, they are in rising water that's chest deep and they are angry.

TV is keeping the folks on the top deck complacent. That's what's wrong with TV. That's why we need alternate versions of sensing that are linked with tools of doing. We do not need to encourage civic engagement, we need to pave the way and get out of the way. Our institutions by design make it hard for single voices to be heard or even sensed. Titanic is designed to work in a big fat scaled up world, not to deal with little stuff like a tip from the father of the underwear bomber or the FBI field tips that preceded 9/11. Our IT and computers and cellphones could give Titanic government and Titanic business a genuine relationship with Americans on an everyday ongoing basis. The technology makes it practical.

Let's start with having kids weigh in on what works and what doesn't in public schools, and let's take them seriously. Maybe then they and their parents will watch less TV. Maybe then America will reverse its nosedive in international ratings of every kind.

6. fluffysingler - June 14, 2010 at 02:00 pm

Is it possible to be an active watcher of television, to question what is presented, even the advertisements? At my house, we are constantly rewriting the ads as they're presented on tv, talking back to them and revising their intended message, which I then turn back into blogs and comments on my facebook and poetry and such. Television is, for the most part, vile, but as I like to say, even bad art is instructive, if you talk about it and don't just passively take it in. The new media can be used not only to talk about and promote mass culture, but to critique it. It can be even more symbiotic than it is now.
