Bloggers, pundits, and industry analysts have been earnestly debating the question for a while: What will Web 3.0 be? Of course, they have their critics, those who call the term a lot of hype. Unfortunately, those critics aren't harsh enough. Web 3.0 is worse than a meaningless buzzword; its use is bad for communication, bad for the interactive field, and simply stupid.
Today, everyone is obsessed with Web 2.0. Yes, if you work in "the field," you may say the term is falling out of use. If you take a walk outside of the industry, however, it's actually gathering steam. Organizations are asking to Web 2.0-ify their sites — not quite knowing what they're asking for, but well aware that everyone else is doing it.
As someone who has to communicate with nontech people, I'm tired of explaining that Web 2.0 doesn't require a special browser. I'm tired of explaining it's not a thing but a concept … well, a bunch of concepts, even though people don't always agree about which concepts are included in the bunch.
Yes, we widely agree that "social media" is a big part of it. Many people include tools like AJAX and Flex, which allow for more dynamic presentations of content and make websites look and feel more like desktop applications. Others include XML and open APIs, which set content free from form and allow us to mix and mash content from different sources in ways the original creators may never have dreamed.
The problem with using the term Web 2.0 (and the burgeoning 3.0) is that it's applying software syntax to communications issues. Software developers carefully track versions to mark distinct changes in the technology. (Developers out there may argue that it's not that clear, but that's another debate.) The web doesn't have a development team to decide when the next release is. More to the point, when we talk about Web 2.0, we're not talking about a change of technologies. We are talking about changes in the ways people use the web, changes that have been facilitated by many new technologies.
The web also differs from software in that what is new does not necessarily replace what came before it. Often it augments it. Adobe Creative Suite 3 was intended to replace CS2. Windows Vista is intended to replace XP. Web 2.0 did not replace 1.0.
The ramifications of this are significant. I can't count the times I've seen organizations that are heavily invested in Web 2.0 but have missed the Web 1.0 boat. Their information architecture is awful. Their user interfaces are unusable, and their web presence is completely out of brand. They have blogs, but are missing the true fundamentals.
If you're a company that's using social media to drive potential customers to your site, hopefully leading to purchases or other conversions, this is disastrous. If you are an information-rich site and nobody can find that information, what's the point of compiling all the data? If you look clunky to an increasingly interface-savvy population, you've lost your credibility — whether you run an e-commerce site or an online magazine.
About a year or so back, I was talking with my friend El, a usability analyst. Some of my earliest web projects were writing functional copy to improve the usability of sites her company was developing. When we were working in the late 1990s, she recalled, you really needed to test everything. Different companies' users could have radically different responses to a web interface. That's changed. Though usability testing is still often critical, there's been significant codification of the language. People have expectations, some of them quite simple: if there's a company logo in the top left of the screen, users will assume it's a link back to the homepage. (I still see Web 2.0-invested companies that put their logos at the top of their sites but don't use them as links.) That language was developed in the retroactively named Web 1.0, and it hasn't gone away, though it does continue to evolve. The belief that we are in a Web 2.0 world has caused some to overlook the basics we learned during the web's first decade.
I also wonder if all this versioning is limiting our accomplishments by narrowing our view. Communications 101 tells us that language shapes the way we think; we are more likely to make our reality fit our words than the other way around. By defining what Web 2.0 is, have we stifled other applications of interactive technology? Well intentioned though it may be, the struggle to define Web 3.0 is a struggle that could stuff innovation into the confines of a box.
If our obsessive labeling doesn’t get in the way, there won't be a Web 3.0. Instead, there will be thousands of Web 3.0s. The technologies that fuel interactive media and the ways they are implemented are going in as many directions as there are creative minds to push them. The web is breaking out of its mold in every possible direction — from internet-enabled applications to mobile devices, from mash-ups to customized homepages to applications that function in browsers but feel like they're on your desktop.
With open APIs, XML, and a host of related technologies, people are creating tools whose real purpose is to allow others to find a purpose for them. Yes, Flickr is cool and so is Google Maps, but they're a whole lot cooler when you find they can be mashed together. What is Twitter for? Depends on who uses it. Don't like Twitter's web interface? Download a desktop application. Is MySpace a social site or a marketer's new frontier? Both. I SMS to my blog and receive IMs on my phone. In short, we are creating a web that is truly worldwide, one that stretches beyond computers and beyond information.
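For readers who haven't built one, the mashup idea can be sketched in a few lines of code: take two data sources that know nothing about each other and join them into something neither was designed for. The data, field names, and join logic below are invented stand-ins for illustration — they are not real Flickr or Google Maps API responses.

```python
# Hypothetical mashup sketch: join photo metadata (one "service")
# with map coordinates (another "service") by place name to
# produce map pins -- a purpose neither source anticipated.

photos = [
    {"title": "Brooklyn Bridge at dusk", "place": "New York"},
    {"title": "Fisherman's Wharf", "place": "San Francisco"},
]

coordinates = {
    "New York": (40.71, -74.01),
    "San Francisco": (37.81, -122.42),
}

def mash(photos, coordinates):
    """Combine the two sources: each photo becomes a map pin."""
    return [
        {"title": p["title"], "lat_lon": coordinates[p["place"]]}
        for p in photos
        if p["place"] in coordinates  # skip photos we can't locate
    ]

pins = mash(photos, coordinates)
for pin in pins:
    print(pin["title"], pin["lat_lon"])
```

In a real mashup the two dictionaries would come back as XML or JSON from open APIs, but the essence is the same: the value emerges from the combination, not from either source alone.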
The whole point of defining Web 2.0 was to figure out where we are. Unfortunately for those who like buzzwords, we are everywhere. The whole point of discussing Web 3.0 is to figure out where we are going. Well, here's the news: We're not all going to the same place, and that is the beauty of this medium (or perhaps these mediums). The possibilities are endless and will continue to defy labels. We are just at the beginning of this "internet thing," and what comes next is going to be many things — some will die anonymous deaths and others will change the very nature of the way we communicate.