[Illustration by Randy Enos for The Chronicle Review]
Someday we'll tell our grandchildren about our first interaction with the Web. We'll tell them we called it "surfing," and regale them with mythic tales of newspaper sites offering all their content free. Music, movies, games, and videos flowed constantly across our screens. The occasional unobtrusive (and easy-to-ignore) advertisement barely hindered our momentum as we marveled at the overflow of information and misinformation, amusement, interaction, and engagement.
Perhaps we might recall those amazing retailers—such as the original MotherNature.com or webvan.com (a grocery-delivery service)—that gave away products to anyone who simply stopped by their Web sites. We didn't need more than one password or login, and the "captcha" technology required to differentiate humans from robots didn't yet slow us down. All we needed was a computer, a decent connection, and curiosity.
Our grandchildren will roll their eyes. They won't believe us. The Web they know will be a place where they are permanently logged in—on Twitter, Facebook, and Google—under their real names. It will be a sphere of retail, primarily, where "one-click shopping," "premium content," and "enhanced access," easily purchased by verified identities, will be standard. The disappearance of the early Web's anonymity will matter little to them because they never will have experienced it. They won't understand how morons once lost their jobs by posting public inanities on Facebook, because throughout their lives, the appropriate boundaries of behavior on social media will have been taught to them by their schools and parents. Our stories will bore them—just like the Web.
We are at a pivotal moment in the Web's evolution. Over the past year, numerous Web sites, blogs, and content providers have started to meter and restrict content to enhance revenue. Because of the slow growth of online advertising, revenue enhancement is now critical to survival. The free-content gravy train is grinding to a halt. The Web is starting to feel constricted and channeled as each new gate and tollbooth appears. Recently, popular blogs such as Talking Points Memo and Andrew Sullivan's Daily Dish have launched pay models, and venerable, popular Web sites like the Atlantic and Slate are reported to be considering new ways of monetizing content. Everyone, it seems, views the transition as inevitable, whether they believe it is progressive or regressive.
But why do we agree to less free content, with more intrusive advertising and surveillance? How does restriction gain acceptance in the land of the First Amendment?
Every communication technology herds, channels, networks, packages, and manipulates its users. Audiences are generally unaware of the process, as they are "narcotized" (in Marshall McLuhan's memorable phrasing) by the medium's dazzling content. They miss the machinations behind the curtain. In exchange for free stuff—like I Love Lucy or a Facebook account—audiences implicitly (and now explicitly) agree to allow their consciousnesses and identities to be packaged and sold by media corporations. This is no conspiracy theory; it is the basis of the American system of commercialized communications, as detailed by scholars like Dallas Smythe, Paul Starr, Tim Wu, and others. They alert us to the clearly discernible pattern of restriction structuring the historical trajectory of all American mass communication.
The restrictions may involve governmental regulation (such as the establishment of broadcast licensing, in 1927), or they may result from the operations of commercial entities (such as when the Big Five—MGM, Paramount, RKO, 20th Century Fox, and Warner Bros.—consolidated control over both movie production and distribution). American audiences generally, if sometimes reluctantly, acquiesced when corporate titans sold them on the benefits of restriction. The concepts, if not the buzz phrases, of "premium content" and "enhanced user experiences" are not new. Innovative efficiencies are always promised when channeling freedom—whether it's adding sound to silent films or adding pictures to radio.
A proper remembrance is in order before the old Web fully slips away. But while the old radio stars cussed the coming of television, and the silent-film actors lamented the birth of the talkies, this transition does not call for a jeremiad. There was no "golden age" of the Web, any more than there was really a golden age of television. To speak of TV's golden age is to celebrate three black-and-white channels with little audience empowerment or interactivity. Today's television production (at its best) is far more visually stimulating, and the writing far more complex and engaging. The media environment is always improving and deteriorating at the same time—thus the simultaneous existence of Mad Men and Toddlers and Tiaras. To argue the purity of either decline or progress ignores the nuances of transition. We're better off pausing to remember and describe the Web as it was, for better and for worse, for the generations that can't experience it the way we did.
Gather 'round, we'll say to those grandkids. Unpaste your cranial connectors from your aural sensor-piercings and let me tell you 'bout the very early days of the Wild, Wild Web.
Even before Facebook, Twitter, Pinterest, and mobile devices served by omnipresent Wi-Fi, the Web offered unlimited creativity, vitriolic outbursts, mystery, and intrigue. It was accessed only at a desk, at work or at home. Much of it was crap, as primitive graphics, shocking background colors, and stilted animation vied for attention. It housed racist kooks and conspiracy theorists of all kinds, and the sheer volume of uncollected, unorganized, and unverified information could both dazzle and overwhelm.
Long before we followed the recommendations of our "friends," we were thrilled and anxious as each click hypertexted us into the unknown. We'd use something called a mouse to venture toward enlightenment or mortification. Malware and viruses lurked in dark alleys. But you could also stumble on whole communities you'd never otherwise have found.
The big problem was trust. Credibility and reliability were rare and difficult to establish. Some brand names, like CNN, proved useful for navigating the immensity, but, in general, it was a crazed superhighway with few reliable signs or maps.
For many newbie users, the relative freedom of the Web in its first decade was surprisingly problematic. Some early Web visionaries understood how overwhelmed the uninitiated would be. "Now here is the curious thing," Timothy Berners-Lee, the Web's inventor, told an audience in 1997. "There is so much data available on Web pages that there is a market for tools that 'reverse engineer' that process," he explained, implicitly envisioning the rise of search engines like AltaVista, Yahoo, and Google. The problem, he said, was that such "reverse engineering" ran directly counter to the Web's purpose. "Universality is essential to the Web: It loses its power if there are certain types of things to which you can't link."
Universality and freedom, however, came at a price. Users traded and republished copyrighted materials, while some created forums for nonjudgmental discussions of everything from anorexia to pedophilia. Pornography became ubiquitous. As the Web became more unruly, only obscene cranks and libertarians preferred such messiness to order and efficiency. The government stepped up surveillance to curb lawlessness. Corporations became more vigilant about copyright enforcement to protect against piracy. Identity theft had to be countered with various new forms of verification and encryption. Retailers, advertisers, and service providers also needed to confirm identities—to sell, and sell to, their audiences. "On the Internet, nobody knows you're a dog," read the caption of a famous 1993 New Yorker cartoon. But soon the joke was on us.
Understand, my ether-lads and digi-lasses, every generation wistfully recalls the media of its youth. Our ancestors heard on the radio Orson Welles's electrifying War of the Worlds or news of Japan's World War II surrender. Or they huddled in front of black-and-white TV sets through the melancholy weekend following a young president's assassination.
In You Can't Go Home Again, Thomas Wolfe explains that home is never a place, dwelling, or community; rather, it is a forever-receding moment. Revisiting old places with new eyes will always doom us to disappointment, whether in Fitzgerald's Gatsby or the Pretenders' "My City Was Gone."
And so it goes with the Web. Yours is not what ours was. Sure, you can still purchase airplane tickets, publish recipes, and blog. But you'll never know the unruly, untamed, riotous public space filled with knaves and fools, demagogues and geeks, gawkers and exhibitionists. You could take or leave Farmville and planking. But the mayhem had its charms.