Three unemployed technology marketers are sitting around the ping-pong table in their garage in the year 2008, depressed and trying to come up with a new idea to reboot their careers. One of them says to the group, “I’ve got it!” The others perk up and direct their attention to the proverbial light bulb over his head. “I hear that there are several new technology startups developing the next big thing in Internet software, but they have no idea what to call it.” The others nod their heads in eager agreement. “We can offer to promote their companies if we just come up with a name for what they are doing, and I’ve already got the best idea. Nobody’s ever thought of it.” Rife with anticipation, the other two marketers clamor for him to spit it out. With grandiose vigor, he stands up and proclaims, “Web 3.0!”
Web 2.0, Search 2.0, Life 2.0, World 2.0. The metaphor of software versioning to describe technological and social phenomena was once clever. But as with all clever sayings, it became overused and is now cliché. The draw of terms like “Web 2.0” is, of course, that they strongly imply a “next generation” of something good enough to have earned a second run. The trouble with such monikers, though, is their post-modern tendency to be merely “what came after.”
Enlightenment thinking was clear and organized. There were disagreements amongst the thinkers of the Era, but the Era itself was definable. Post-modernism cannot be defined except by saying what it is not. It is not modern; it is what came after the Enlightenment. “Web 2.0” suffers from the same malaise. People across the globe are publishing countless articles and books to try to define Web 2.0, but like its underpinning philosophy, it is not easily defined. In fact, to put it into a box would be to contradict its very nature.
The abandonment of the Enlightenment by the Post-Moderns was not a revolution in Philosophy, but a rebellion against it. The Post-Moderns concerned themselves with the demolition of power relationships, authority structures, even the architecture of language itself. The results have been decidedly mixed. The nihilism of The Bomb, the ethical bankruptcy of eugenics, and similar traffics in human suffering are examples of its negative effects. On the other side, however, are the emancipation of women, racial equality movements, gay rights, youth voting rights, and so on.
As we watch the advent of the Post-Modern Internet embodied in the Web 2.0 movement, we will see its effects reverberate throughout society. Web 2.0 proclaims itself the era of the User. The power structures that defined Web 1.0 produced a destination-driven experience, one created not by users but for users, with little input or insight from them at all. The rebellion has been quite different. Blogging has created pressrooms of one. Social networking empowers ordinary individuals to reach mass audiences and peer groups through a series of simple clicks. Video-sharing has made it possible for lay people to produce satire and political speech on budgets of almost nothing.
There is little doubt (in my mind at least) that Web 2.0 will continue to annihilate the current strangleholds on power and influence of the Mainstream Media, traditional movie production studios and distribution agencies, political parties and interest groups, teachers, scholars, religious and educational institutions, corporations, and governments. The Post-Modern Internet, Web 2.0, and its leaders have a responsibility to mature in their power, however. Web 2.0 can take two distinct directions, and it is perhaps the rhetoric of it all that will define the path. Web 2.0 can be the French Revolution of Technology or it can be the American Revolution of Technology.
Joseph Schumpeter’s winds of creative destruction are blowing especially hard in the Internet technology world today, bringing remarkable improvements to our daily lives. But these winds can blow too hard too often, and an even older economic law, the Law of Diminishing Returns, begins to take over. Our wild-eyed radical phase must ultimately give way to some replacement. We cannot permanently be the rebels. At some point, people will get “2.0 fatigue.” That point may already be upon us. People eventually want stability. The problem with successful rebellions is that rebels rarely know how to govern, or else they take up the mantle of those against whom they rebelled and, like Orwell’s pigs in Animal Farm, begin to sleep in the old rulers’ beds.
Ultimately, therefore, the success of Web 2.0 depends less on what it accomplishes in the present and more on what groundwork it lays for the future. Indeed, it is rather ironic that the final metric for Web 2.0 is what comes after it. The early-twentieth-century British essayist G.K. Chesterton once observed, “The object of opening the mind, as of opening the mouth, is to shut it again on something solid.”
Web 2.0 companies would do well to take his advice to heart.