Friday, October 19, 2007

Is this the Death-Knell For (Relative) Silence on Airplanes?

The BBC carried a report yesterday that raises the alarming possibility of extending cellphone use on board airplanes from just either end of a journey to the entire duration of the flight.

Now before I go any further let me just say that I do not go as far as the British essayist Pico Iyer, who once wrote, in "The Eloquent Sounds of Silence":

Silence is sunshine...company is clouds;
silence is rapture...company is doubt;
silence is golden...company is brass.

But I will admit to a horror of being surrounded by people talking on their cellphones while the rest of us are trying to enjoy the latest movie or catch up on work, on life...or on sleep.

Here's how it would work, according to Ofcom, the UK's telecommunications regulator:


The key to the whole thing – the technical trick that circumvents the problem the CAA identified in 2003, namely that mobile phone signals can skew navigation bearing displays by up to five degrees – is that cellphones in the plane are not allowed to connect to any base stations on the ground.

The proposed system utilizes an on-board base station in the plane which communicates with passengers' own handsets. The base station – called a pico cell – is low power and creates a network area just big enough to encompass the cabin of the plane. The base station routes phone traffic to a satellite, which is in turn connected to mobile networks on the ground. A network control unit on the plane ensures that mobiles in the plane do not connect to any base stations on the ground: it blocks the signal from the ground so that phones cannot connect to terrestrial networks and remain, with respect to them, in an idle state.
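Purely by way of illustration, the selection logic is simple enough to sketch in a few lines of Python. Everything below is invented for the example – the station names, received-power figures, and thresholds are not drawn from the Ofcom proposal:

```python
# Toy model of on-board cell selection: the network control unit (NCU)
# raises the cabin's effective noise floor, so distant terrestrial cells
# drop below the handset's detection threshold and the low-power pico
# cell becomes the only candidate left to camp on.
from dataclasses import dataclass

@dataclass
class Cell:
    name: str
    rx_power_dbm: float  # signal strength as received inside the cabin

def visible_cells(cells, noise_floor_dbm):
    """A handset only considers cells it can hear above the noise floor."""
    return [c for c in cells if c.rx_power_dbm > noise_floor_dbm]

def select_cell(cells, noise_floor_dbm):
    candidates = visible_cells(cells, noise_floor_dbm)
    return max(candidates, key=lambda c: c.rx_power_dbm, default=None)

cabin = [
    Cell("ground station A", -95.0),   # distant terrestrial cell
    Cell("ground station B", -100.0),
    Cell("on-board pico cell", -60.0), # low power, but very close
]

# Without the NCU, the pico cell already wins on signal strength alone...
print(select_cell(cabin, noise_floor_dbm=-110.0).name)  # on-board pico cell
# ...and with the NCU active, the ground stations are not visible at all,
# so handsets idle on the pico cell and never reach for the ground.
print([c.name for c in visible_cells(cabin, noise_floor_dbm=-90.0)])
```

The point of the sketch: once the noise floor rises, the pico cell is not merely the strongest candidate but the only one, which is what keeps handsets from transmitting at full power toward the ground.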

So much for the technical side of it. The social side of it is less clear-cut. One thing is an iPhone, but a skiPhone might just be the death-knell for (relative) silence on airplanes.

Wednesday, August 8, 2007

"No Boss Can Afford to Remain Clueless About Web 2.0"

"There is only one path: Marketing and technology in your company must work together to design and implement your Web 2.0 strategy," says George Colony, chairman and chief executive officer of Forrester Research – in the very same week that Web 2.0 University has announced that its unique one-day Web 2.0 education event is being made available to Greater New York Area executives for the first time, on October 2nd in midtown Manhattan.


As BusinessWeek wrote as early as March 2006: “For all its appeal to the young and the wired, Web 2.0 may end up making its greatest impact in business.”


In his "Perspective" piece published yesterday at CNET ("Web 2.0 and the CEO"), Colony contends that "Web 2.0 has forever changed the relationship between your company and your customer" and advises them to take on board four key principles of doing business in the age of the New Web:


  • Your company is inside-out in an outside-in world.

  • Your company has a bad Web site.

  • You should be asking your customer one question.

  • You don't own your customer; your customer owns you.

Knowing your customers better than the competition does is now crucial, Colony contends. He uses the example of Dell:
"This spring, it launched Ideastorm, where customers make suggestions and vote on them. The result: Dell's Net Promoter scores are back on the rise."

Inspired by Dell's example, Colony issues the following call to arms to Web 2.0 era bosses everywhere:

"They see the power of digital and its inherent flexibility. They know that it can do amazing things, and they could not care less about artificial, archaic restrictions that are designed to protect somebody's 50-year-old business model. So, Mr./Ms. CEO, wake up and face the brutal truths, and get on with inventing the future."

The full CNET article can be read here.

Sunday, July 15, 2007

Would These Web 2.0 Roses By Any Other Name Smell As Sweet?

One of the first things that anyone reading this recently compiled 'ABC of Social Software' will notice is that the Web 2.0 world – with gems like Blummy, Linkwalla, Qumana, Spurl, Yedda and Zotero – isn't close to losing its happy knack for coining jaunty names.

Will the inventiveness ever run out? An A-Z based on the recent Web 2.0 Expo in San Francisco would reveal further characteristic examples, including Fatdoor, JibJab, Jubble, Quagga, Tangler, and Zuzzle.

I can't help wondering: web-wide, what are the ten most pleasing – and/or exotic – Web 2.0 names to date?

Tuesday, July 10, 2007

Nielsen/NetRatings Elevates 'Time-Sink' To #1 Metric for New-Web Success

"Total minutes is the most accurate gauge to compare between two sites. If [Web] 1.0 is full page refreshes for content, Web 2.0 is, 'How do I minimize page views and deliver content more seamlessly?'" With that declaration, Scott Ross, director of product marketing at Nielsen/NetRatings, aligned his company with the New Web and the next-generation Internet.

While Nielsen/NetRatings will still report page views as a secondary metric, Ross continued, "For the foreseeable future, we will champion minutes if you are comparing two sites."
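To see how the two metrics can disagree about the very same pair of sites, here is a toy comparison in Python – the traffic figures are invented purely for illustration:

```python
# A full-page-refresh site racks up page views; an AJAX-style site updates
# in place, so it scores few views even when visitors stay longer.
sites = {
    "full-refresh site": {"page_views": 120, "minutes": 18},
    "ajax site":         {"page_views": 9,   "minutes": 26},
}

by_views   = max(sites, key=lambda s: sites[s]["page_views"])
by_minutes = max(sites, key=lambda s: sites[s]["minutes"])

print("winner by page views:   ", by_views)    # full-refresh site
print("winner by total minutes:", by_minutes)  # ajax site
```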

The news immediately made Slashdot, where commenters worried that this emphasis on viewing time would cause designers (and their bosses) to try anything they can think of to slow the user down.

Although today's longer user session lifetimes are here to stay, one Slashdot regular – SmurfButcher Bob – found the implications of the Nielsen/NetRatings announcement objectionable:


"Welcome to Web 2.0. What was the phrase? Oh yes... 'it's about the data, stupid.'

This "2.0" cr*p generally has nothing to do with data; it's generally related to bullsh*t, and that's why most of us don't "get it" as having a point. And in that context - page hits are an excellent metric for data; time-sink is an excellent metric for "feel-good" crud. ... The non-data crap has no point, so a metric that measures something pointless is... pointless.

Ya have to remember - "1.0" success is based on the merit of the data. "2.0" success is effectively based on users, and the data (if any) typically has no actual merit - so page hits have no meaning. It's all about "look at the monkey! look at the silly monkey!" - an area in which Nielsen has great expertise (Wackiness ensues).

The stupidity of "2.0" aside, Nielson is probably correct in their assertion about measuring it (not the stupidity... that's too big to be measured. But the time-sink aspect seems correct.)"

Ouch. If he's not already a member, it sounds like SmurfButcher Bob will soon be joining the Boycott "Web 2." group now active on Facebook, as reported earlier this week by Social Computing Magazine.

Wednesday, May 23, 2007

Is Web 2.0 Just Riding the Synchronicity Highway?

The Swiss psychologist Carl Jung believed that many experiences perceived as coincidence were due not merely to chance but instead suggested the manifestation of parallel events or circumstances reflecting a governing dynamic. He even gave a name to this meaningful co-occurrence of multiple events: he called it "synchronicity."

There is a school of thought that views "Web 2.0" as nothing more than a marketing term in search of a meaning. I do not agree. Clumsy as the term itself may be, it resonates with enough people using and harnessing the Web in their life and their work to make discussions about its merits as a lexicon item moot. Even Cisco's chairman and CEO, John Chambers, is happy now to give Web 2.0 the endorsement of the ruler of routers, the sultan of switches; and when Cisco Systems, with a market cap of about $165 billion, gives a phenomenon the business thumbs-up – emphasized by its recent purchase of WebEx and the select assets of Utah Street Networks, putting it firmly in the social networking marketplace – then you know it is definitely very, very real.

What other "multiple events" can we adduce, to bolster the sense that Web 2.0, in the words of Social Computing Magazine, is "Reaching into the Business World with Both Hands"? And can we reliably conclude that this is more than just happenstance?

Chambers, when Cisco bought WebEx for $3.2BN, said that Web 2.0 "is redefining how people, companies and countries collaborate in ways never before realized." In the press conference given to announce the purchase, he defined Web 2.0 as simply "the technologies that enable user collaboration," adding that "these technologies include web services, Unified Communications, TelePresence, blogs, wikis, peer-to-peer networks, podcasts, mashups" etc.

The business relevance of all this co-technology (a term I have coined to connote, somewhat more accurately than "Web 2.0," the essence of the next phase of the Internet) was spelled out by Chambers in a keynote this week at Interop Las Vegas: it will increase enterprise productivity.

The gains won't come fast (Cisco's own leadership team, Chambers revealed, took four years to adapt to a more collaborative work environment), but they will come – at a rate of about 3 percent or more per annum over the next several years, according to Chambers.

"What kids started with social networking," Chambers intoned, "will move into business."

In other words, as Dion Hinchcliffe has repeatedly pointed out:

"It’s usually the new arrivals and the technologically savvy, younger workers, who will be using those new tools. They are going to proliferate and spread, no matter what you do."

So Enterprise 2.0 tools are the spiderwort of the 21st Century work environment. Jan Wergin, executive vice-president of Jubii, agrees with Hinchcliffe:

"We know what we are doing at home. It gets us excited, it gets us involved. Now we want to do it in our workplace as well where we spend 8-10 hours a day. Why not do the same thing? It’s not the focus on technology and making it more sophisticated and making it more complex, but it’s making it very easy, making it intuitive, making it easy to understand and taking something that we know from consumer Web 2.0 into the business suite and using it there."

What are the barriers to co-technology in the workplace? A British PR executive and social media analyst by the name of Richard Stacy has written that the disruptive nature of social media is so significant that, in the consumer world, consumers will in some categories effectively have all the power, operating the category as a kind of virtual user franchise. The counterintuitive consequence of this is that, since best-of-breed options will replace the mere illusion of choice, consumers will – in Stacy's view – "know as they walk down the supermarket aisle that they won’t be facing a range of duplicates – each and every one of the products they put their hands on will be the best in a class of one."

Here's how Stacy unpacks this idea more fully:

"I know this sounds a little like the Soviet Union or China where there was only one choice in every category and a very limited range of categories. The difference however is that the provider is not a remote and inflexible state enterprise but will have to be an organization that can instantaneously respond to consumers’ needs or else run the risk of losing its franchise.

Such an organization probably won’t own the means of production – this will be contracted out – but in reality many branded product manufacturers already do this. Instead, the organization will be focused on quality of product and real product innovation, rather than spurious innovation designed to establish differentiation.

You could say that the current system of having many organizations offering many choices and all competing against each other already provides this or is the best possible way of doing this. Well no. We don’t actually live in a world where the consumer is king – we live in a world where the consumer is important but where collective corporate profitability is king. We should also not forget that consumers don’t necessarily like choice, consumers like options and they like to know they are getting the best in all of the options they select."


So Stacy, in short, believes that co-technologies are capable of transforming more than just the workplace: they're likely to turn upside down the world as we know it today:

"The implications of this spread beyond just marketing. The possibility of social media actually empowering real consumer (or citizen) choice could have quite profound consequences and might actually deliver the forms of more efficient or perfect markets that proponents of free market economics set up as a necessary condition, but supporters of free markets find conveniently impossible to deliver in the real world."

So, is Web 2.0 just riding the synchronicity highway, hanging on to the coat-tails of happenstance? Or is social computing, as I have been alleging for some time, genuinely turning the world – including the business world – upside down and inside out? What do you think?

Sunday, May 20, 2007

Making the Business Case for Enterprise Social Software Is Getting Easier

According to a report today by CRN's Heather Clancy, Forrester analyst Laura Ramos wowed the audience at the recent Forrester Research IT Forum 2007 by sketching a scenario – "Enterprise Software in 2017" – in which consumer expectations make business users more impatient and push enterprises toward service-oriented architectures that support myriad application combinations, especially those promised by Web 2.0 and the social networking movement.

This is precisely the same trend addressed by serial entrepreneur Mark Sigal in his recent article for Social Computing Magazine, "Social Media: It’s All About Breadcrumbs and Conversations."

Sigal asks whether social media is "just a consumer phenomenon or the tip of some larger iceberg that subsumes big brands and large enterprises" – then answers emphatically that it is the latter.

"Specifically because this stuff is so visceral and because it has proven to be so virally effective," Sigal writes, "its role in business, today a tiny heartbeat, is destined to grow into a walking and talking organism that some people call Enterprise 2.0."

Wikipedia already agrees that "enterprise social software" is now a real and distinct category, which is also why CMP Technology is launching a new conference next month devoted entirely to Enterprise 2.0.

And why Dion Hinchcliffe is running an Enterprise 2.0 Track on Tuesday at Interop Las Vegas, one of the biggest IT shows on this year's calendar.

Convincing upper management of the business benefits of social software in enterprise contexts is the next task. Besides the usual suspects like enterprise wikis, corporate blogs, and unified communications, what are the most interesting, productive, and profitable "edge cases" of social software being used right now in the corporate enterprise? I'd love to hear up-to-date reports from the trenches - jeremy at geelan dot com.

Sunday, May 6, 2007

Are Enterprise 2.0 Tools the "Wildflowers" of the 21st Century Workplace?

Tom Davenport, who the Harvard Business Review informs us holds the President’s Chair in Information Technology and Management at Babson College, contended recently that "Enterprise 2.0 Won't Transform Organizations."

Davenport is nothing if not straightforward:

"The primary proponent of this movement is HBS professor Andy McAfee, for whom I have a lot of respect. His are some of the most interesting thoughts on IT to come out of HBS in a long time, and he's a nice guy to boot. What he's trying to do is to bring Web 2.0 technologies into the enterprise, to understand and describe how blogs, wikis, tagging, and other participative tools will change large bureaucracies. He believes they will empower employees, decentralize decisions, free up knowledge, and generally make for better places to work. I share his goal of more democratic organizations and hope he is correct.

However, I fear he is not."

The reasons Davenport gives for his skepticism include "organizational hierarchy and politics," together with all the usual suspects whenever change is involved: "power differentials, lack of trust, missing incentives, unsupportive cultures, and the general busyness of employees today."

That last factor is uncannily reminiscent of the Oregon logger found sawing down a Pacific Silver Fir with a bread-knife. When interrupted by a neighbour who'd observed him hacking away for the past three days and who sought to lend him a chainsaw, the logger replied "I'd love to stop and chat but really I haven't the time – I've got this huge tree to fell."

Dion Hinchcliffe, as we have come to expect, has a radically different take. Davenport is missing the predominant reality of E2.0, Hinchcliffe retorts, which is that "the challenges of Enterprise 2.0 adoption will likely take care of themselves." By this he means that, as a generation of professionals enters the workplace who have been brought up on Wikipedia, MySpace, and Facebook, "it is inevitable that Web-based tools will simply appear, like wildflowers, in the fertile fields of our businesses and institutions."

If anyone be in any doubt, by the way, they needn't be. Enterprise 2.0 is a process that has already begun.

This is precisely why, in April, a global IT leader like Accenture launched its new global employee network, an Intranet application that borrows ideas from Facebook, del.icio.us, YouTube, Wikipedia and Second Life.

“The younger employees carry it,” explained Accenture's CTO Donald Rippert in a presentation at his company's recent Global Convergence Forum in Rome – meaning that they were the first to publish on wikis, to tag content so it can more easily be found by their Accenture colleagues worldwide and so on.

So when Tom Davenport rounds off his "Why Enterprise 2.0 Won't Transform Organizations" post with the remark that

"It's going to be very interesting to see what happens when the young bucks and buckettes of today's wired world hit the adult work force. Will they freely submit to such structured information environments as those provided by SAP and Oracle, content and knowledge management systems, and communication by email? Or will they overthrow the computational and communicational status quo with MySpace, MyBlog, and MyWiki?"

he is, well, basically...too late.

Hinchcliffe documents the process as follows:

"I now routinely collect stories of firms large and small encountering these tools sprouting up within their organization, both via internally installation of these platform to employees just putting their favorite externally hosted Enterprise 2.0 tool subscription on their corporate credit card. In other words, because they appear to so easily cross organizational boundaries, can be adopted so easily, require virtually no training, are highly social, and so on, Enterprise 2.0 apps appear to have their very own 'change agent' by their fundamental nature."

Now I am certain that neither Hinchcliffe nor I would claim that enterprise wikis and/or other collaboration platforms can turn around businesses by themselves. As the Burton Group's Mike Gotta has written:

"Change is a complex choreography and as new ways of doing things takes shape, new tools are one facet of that emergence. So tools can indeed help enable all types of transformation (expected and unexpected), but there is no silver bullet, you need to do more than deploy technology."

But when the likes of Accenture, Morgan Stanley, and Wells Fargo – all three blazing a trail in the enterprise-wide adoption of these kinds of tools – endorse something, you have to believe that the rest of the business world cannot be too far behind.


Don't forget to catch the first episode of The Enterprise 2.0 TV Show, by the way, in which 4 industry pioneers talk about their work in the trenches of Enterprise 2.0: Socialtext, Near-Time, Kapow Technologies, and Jubii. Disclaimer: Jeremy Geelan is the Web 2.0 Anchor of The Enterprise 2.0 TV Show, and Dion Hinchcliffe co-presents.

Tuesday, May 1, 2007

Are We Witnessing the Death of Personal Computing?

Death is an unusually severe and degrading punishment, but sometimes it serves a higher purpose – and the death of personal computing is a great example. Because the potential of its replacement and successor, Social Computing, is exponentially greater.

Much has been made, recently, of the amorphous term "Web 2.0" and even Jeremy Zawodny, a leading blogger, is so challenged by its elasticity that he's just asked his readers "What the heck is Web 2.0 anyway?"

Zawodny's question spawned the usual selection of knee-jerk responses, but one very nearly nailed it:

"I think Web 2.0 is really more of a philosophical concept than a technical one. People always refer to open source, AJAX, perpetual betas, etc. as cornerstones of Web 2.0, but really there's only one common thread between things as diverse as YouTube, Twitter, Flickr, blogs, and social networks:

They connect people and are driven by people.

I know I'm not the first to use the term, but I think Web 2.0 is 'The Living Web'. This includes 'old' technology like web forums, IRC, and even Usenet. They connected people and were driven by people...the original UGC in some ways!"


"Driven by people" though still doesn't quite, for me anyway, pinpoint the essence of social computing, which is the coalescence of collective intelligence (a tried and true concept) with the sudden (and much newer) maturing of various co-technologies such as feeds, tagging, trackbacks, photosharing, wikis, and the like.

Such "co-technologies" are to Social Computing what HTTP, TCP/IP, and View Source are to the World Wide Web – the technical glue that makes it cohere.

The result is that we are seeing the emergence of what Warren G. Bennis inspirationally called "Great Groups." Bennis was fascinated by what he called "The myth of the triumphant individual," which he felt was deeply ingrained into the American psyche.

"Yet we all know that cooperation and collaboration grow more important every day," Bennis wrote, continuing:
"The problems we face are too complex to be solved by any one person or any one discipline. Our only chance is to bring people together from a variety of backgrounds and disciplines who can refract a problem through the prism of complementary minds allied in common purpose."

He called such collections of talent "Great Groups" and went on to note:
"The genius of Great Groups is that they get remarkable people -- strong individual achievers -- to work together to get results. But these groups serve a second and equally important function: they provide psychic support and personal fellowship. They help generate courage. Without a sounding board for outrageous ideas, without personal encouragement and perspective when we hit a roadblock, we'd all lose our way."

Social Computing leverages, through technology, the genius of groups. It is as simple, and as wonderful, as that.

Does this truly equal the death of personal computing? Well now of course I don't believe for a moment that it does. Even though, as long ago as 1994, I had the good fortune to publish Groupware in the 21st Century – the first ever comprehensive essay collection devoted to the phenomenon that Peter & Trudy Johnson-Lenz recognized (and named) when they coined the term "groupware" back in 1978 – I don't actually contend that groupware supplants singleware.

But I do on the other hand truly, madly, deeply believe that their notion of groupware as 'a whole system of intentional group processes plus software to support them' is finally, through Social Computing, becoming a reality...but with the Web as the platform, rather than the PC.

Social Computing, though it is a computer-mediated culture, is a living system, shaping itself to support evolving group life, including that of Great Groups. Web-augmented dialogs among those holding diverse viewpoints on the critical questions and issues of our time represent, many of us believe, the hope of creating a better future for our children and our children's children.

"None of us," as the bumper sticker says, "is as smart as all of us."

Friday, April 13, 2007

What's the Significance of "The Mainstreaming of Web 2.0"?


During the Churchill Club's ninth annual Top 10 Tech Trends Debate last month in Silicon Valley, Kleiner Perkins Caufield & Byers legend John Doerr – best known for investing early in Compaq, Netscape, Symantec, Sun Microsystems, Amazon.com, and Google – observed that the venture capital industry funded more than twice as many Web 2.0-related deals in 2006, involving nearly twice as much money, as it did in 2005 – a trend that he did not expect to slow down any time soon.

At the same event Roger McNamee, cofounder and partner of Integral Capital Partners, called for "Web 2.0 applications that really move people's lives," which prompted Nexaweb's founder and CTO Coach Wei to blog that a company like his own provides an implementation of the fast-emerging Web 2.0 technology stack, which Wei defines as an Application Client Container | Internet Messaging Bus | Enterprise Mashup Server.

"I am lucky to be involved with quite a few Web 2.0 companies beyond Nexaweb (such as VisibleMeasures and HeyLetsGo)," wrote Wei. "I see tremendous untapped market/customer opportunities."

I was reminded of all this when reading VC Peter Rip's recent post on "the mainstreaming of Web 2.0" – a point he contends is reached when there is no longer a profitable point of friction between the Present and the Future:

"Profits accumulate in the gap between What Is and What Is Possible. Web 2.0 is now firmly in the category of What Is."

Rip goes on to map the next stage of building out of the New Web:
"The hard problems in the vision of a true web-as-platform involve all the usual hard computer science issues. How can we normalize information from disparate sources to make it interoperable? How do we get to a lingua franca without waiting for moribund standards (think CORBA and SOA)? How can we then manage the transition of legacy information and services into this world of interoperability?"

According to Chad Jackson too, Web 2.0 is "the current stepping stone in the evolutionary process to the future where media is completely integrated and user-interaction becomes effortless."

Tim O'Reilly, whose "Blogger's Code of Conduct" recently made it to the front page of The New York Times, flags up that the real issue, while we wait for that future, is that "there is still huge opportunity in bringing Web 2.0 principles to mainstream business."
"I was at Thomson (Westlaw etc.) last week, and they are studying Web 2.0 like there is no tomorrow, and getting a good understanding of how it will apply to their business. Ditto many other mainstream companies. Salesforce.com is working hard to build a business platform for network applications, and I'm sure that there are others."

"Where are the enterprise applications?" O'Reilly continues, before adding:
"What does the open, network-enabled supply chain look like? What does the Web 2.0 insurance company look like? What does the web 2.0 credit card company look like? (Especially when they realize that they are most likely a phone application.)"

Tim O'Reilly, whose Web 2.0 Expo with CMP Technology starts on Sunday in San Francisco, believes that Mashups – which he defines as "brute force web data access, manipulation, and display" – are going to be superseded by Meshups – "natural data access, manipulation, and display" – in lock-step with the gradual realization by the broader Web user base that the Semantic Web is really about a global data integration and data generation effort.

As O'Reilly puts it:
"The Web is moving beyond unstructured and semi-structured blurb, it is becoming a bona fide database."

This "web of data" (to use Tim Berners-Lee's own characterization) inevitably, is already being dubbed "Web 3.0" – or, to use O'Reilly's descriptor, as the "Data Web (Semantic Web Layer -1) frontier."

As Indus Khaitan, Sr. Manager of Enterprise Marketing at Symantec, notes: "Web 2.0 is just a stepping stone for the Semantic Web."

For anyone with an interest in the future of the future, it promises to be an intensely interesting week here in San Francisco.

Thursday, April 12, 2007

Are We Blogging Each Other To Death?
– A Part-Response to Nick Carr and Dan Farber

Thanks to GrokDotCom's Jeffrey Eisenberg, attention has re-focused on this post, originally published at 08:00 on November 24, 2005. Eisenberg asks his own readers "Do you think [Geelan] has a point? Do you agree with him, think he's nuts, or what?" so it will be interesting to see who says what, 18 months on...


"For a journalist, technologist, politician or anyone with a pulse and who doesn't know everything," wrote Dan Farber on Monday, "blogs matter." Then, in almost a textbook demonstration of why in fact they don't, Farber adds:

"Every morning I can wake up to lots of IQ ruminating, fulminating, arguing, evangelizing and even disapassionately reporting on the latest happenings in the areas that interest me, people from every corner of the globe."

That "even" says it all. Dispassionate reporting would certainly be the exception rather than the rule. So in what possible way, then, is this testimony to why and how blogs "matter"? Farber is mistaking energy for insight, prevalance for significance, and quantity for quality. He might almost have written that every morning he wakes up with a column to fill...and an abundance of free material with which to fill it, served right up onto his desktop by the RSS reader of his choice. Every lazy journalist's nirvana, in other words.

It is no wonder then that Nick Carr, he of the first Web- then world-famous "Does IT Matter?" essay, jumped on Farber's hymn to the wonder of it all and mused:

"Experiencing the blogosphere feels a lot like intellectual hydroplaning - skimming along the surface of many ideas, rarely going deep."

At the risk of being uncharitable to Carr (sorry, sir!), this is a prime example of what my old Cambridge University friends would call self-iteration. In other words, Carr himself skims along the surface in his blog, without going deep, in order to demonstrate that one of the perils of the blogosphere is intellectual hydroplaning.

Let us then instead don a snorkel and mask, or even a full-fledged scuba, and head down beneath the surface. For there is definitely more (and less) to blogging than meets the eye.

Farber's notion of the blogosphere as comprising "self assembling communities of bloggers" who "hold a kind of virtual Socratic court, sorting out the issues of the day in a public forum, open to anyone, including spammers" is wildly fanciful. Shades of Jerry Garcia, in fact – for don't all self-respecting Dead-heads subscribe to Garcia's fantasy that "Once in a while you can get shown the light/ In the strangest places if you look at it right"?

The blogosphere is not nearly as noble a place: mainly because, of course, it isn't a place (unlike Socrates' ancient Forum) and therefore isn't subject to some of the basic advantages of, for example, ID verification. Nor can anyone look anyone else in the eye, across the blogosphere.

Anonymity can muddy the waters of almost any debate – yet the blogosphere is full of it, from Groklaw's "PJ" to PC Magazine's "Robert X. Cringely." And as if that weren't enough to contend with, anonymity is compounded in six cases out of ten by the kind of vehemence more often associated with the bar-room than the Forum. Bloggers, it very often seems, are all legends in their own minds; they commit arson every day in their imagination, burning down the previous day's lies and distortions. Worse still, so many bloggers suffer from what Albert Camus called "the sign of a vulgar mind," namely the need to be right.

Why would anyone think that RSS, a wonderful enabling technology beyond a doubt, could somehow kiss the frog of human intolerance and ignorance and transform it into a prince of insight and wisdom? Beats me. "Groupthink," history shows us, can often in and of itself be worrisome. Just post to Groklaw that the emperor incorporated in Somers, NY, has no clothes and watch the brow-level of the replies/ripostes/flames sink...slowly at first, then faster. Or post to a Java user group that C# rocks...and watch the selfsame thing happen.

I would go so far as to say that, on a bad day anyway, there would seem to be an inverse ratio between an opinion's worth and the ease with which that same opinion can be expressed and disseminated. But it is better to light a candle than to curse the darkness, so I am going to end this brief entry with an upbeat thought about, not blogging itself, but the superset of which I believe it forms a (tiny) part...that of insight capture.

Insight capture merits the full weight of all our attention and expertise in the publishing industry, because it is only through trapping "the best of the rest" that we shall ever achieve the promise of the bumper sticker: 'None of us is smarter than all of us.' Unfortunately insight doesn't reside in blogs any more than wisdom resides in Fortune cookies. Insight is more chaordic: it occurs wherever opportunity meets preparation, at conferences, in airplanes, on trains, in private e-mail exchanges. Above all, it takes place in context. If there were a way of capturing such epiphanies, if one could but scale them up so that humanity could benefit from epiphany-en-masse, then that would be quite another pair of shoes. But waiting for the Epiphany Machine to come around makes waiting for Godot look reasonable by comparison; and anyone who thinks blogging is the light at the end of the tunnel of collective consciousness has failed to spot that it's much more likely to be the headlight of an oncoming train called The Techno-fad Express.

It's a medium, neither more nor less. An interesting one. A disintermediated one. But it is not any kind of hopeful message in and of itself. Blogging is to human insight as reading glasses are to human hyperopia. An enabler, a tool. It is a neat way of capturing disparate viewpoints, but not of synthesizing or critiquing them. For that we need other, still-emerging tools such as those that TBL is developing along with the supporters of the Semantic Web.

That – let us call it Web 3.0 – is still a long, long way away. Let us just hope, before such tools are ready to become mainstream, that we shall not already have blogged each other to death.


[Originally Posted 08:00 November 24, 2005]

Sunday, April 8, 2007

Social Computing Is Turning the Web Upside Down and Inside Out


Since most any two words can and will be put together in this world, what with us being Homo Loquens and all, it's easy just to shrug when you hear new collocations like "social software," "social networking," or "social computing" and dismiss them as just three more inevitable permutations in a world of whirling words and phrases.

But this time, trust me, things are different. "Social computing," far from being just a random word-combo along the lines of wannabe duos like "air walking," "base jumping," "text messaging" and suchlike, is that fabled "New New Thing" (a reference to Michael Lewis's invaluable book of that name, documenting Netscape's Jim Clark's serial Webpreneurship in the heady days of the Internet Boom 1.0).

In other words, Social Computing is turning the Web world upside down and inside out.

Before I explain how and why, let us just lay to rest one other ghost. There will be those who, out of nothing but the sheerest prejudice against computer geeks and geekdom, suggest that "social computing" is a blatant oxymoron, right up there with "benevolent despotism." I have no truck with such bigots. On the contrary, computing - it turns out - is one of the most social technological innovations in the last thousand years.

Think I'm exaggerating? Read on.

Social Computing has been defined as centered on "software that contributes to compelling and effective social interactions" (http://research.microsoft.com/scg/).

At IBM Research, where the premise of the Social Computing Group is that it is possible to design "digital systems that provide a social context for our activities," the group characterizes social computing thus:

The central hallmark of social computing is that it relies on the notion of social identity: that is, it is not just the data that matters, but who that data "belongs to," and how the identity of the "owner" of that data is related to other identities in the system. More generally, social computing systems are likely to contain components that support and represent social constructs such as identity, reputation, trust, accountability, presence, social roles, and ownership.
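A toy rendering of that idea in Python (my own sketch, emphatically not IBM's code) shows how identity, relationship, and ownership become first-class parts of the data model rather than afterthoughts:

```python
# In a social computing system, every item of data "belongs to" an identity,
# and decisions such as visibility hinge on how identities are related.
from dataclasses import dataclass, field

@dataclass
class Identity:
    name: str
    reputation: int = 0
    follows: set = field(default_factory=set)  # relations to other identities

@dataclass
class Item:
    content: str
    owner: Identity  # the data carries its owner

alice = Identity("alice")
bob = Identity("bob", follows={"alice"})
post = Item("my first post", owner=alice)

def visible_to(item, viewer):
    """Trust/visibility depends on the owner's relation to the viewer."""
    return viewer.name == item.owner.name or item.owner.name in viewer.follows

print(visible_to(post, bob))  # True: bob follows alice
alice.reputation += 1         # reputation accrues to the identity, not the data
```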

What's the big deal? Why am I claiming that Social Computing is right up there with Quantum Mechanics in terms of its likely impact on our modern world?

The answer to that question has already been hinted at by Forrester, which has published a slim, 24-page report on Social Computing subtitled "How Networks Erode Institutional Power, And What to Do About It." And it has been succinctly explicated by Dion Hinchcliffe.

Published in February of this year, the Forrester report notes:
To thrive in an era of Social Computing, companies must abandon top-down management and communication tactics, weave communities into their products and services, use employees and partners as marketers, and become part of a living fabric of brand loyalists.

Then, linking it directly with "Web 2.0," Forrester nails its colors to the mast by drawing a very telling analogy to help people wrap their minds around the raw disruptiveness of Social Computing:
"Web 2.0 is the building of the Interstate Highway System in the 1950s; Social Computing is everything that resulted next (for better or worse): suburban sprawl, energy dependency, efficient commerce, Americans' lust for cheap and easy travel."

Hinchcliffe reiterates this point, noting that one thing is clear: the technologies of the modern Web are indeed reshaping our society, particularly the younger generations who spend so much of their time there.

"The consequences could be dramatic," Hinchcliffe avers, "in the same way that the highway systems fundamentally disrupted the railroad industry."

Anyone wishing to explore further can click through on any of the links below.

Further Reading on Social Computing

Feedster: www.feedster.com/search.php?q=%2522social+computing%2522
Digg: www.digg.com/?s=social+computing&a=30
Technorati: www.technorati.com/search/%22social%20computing%22
Magazine: Social Computing Magazine

Sunday, March 11, 2007

What Is The Big Picture?


Nowhere in the preamble to the Declaration of Independence did Thomas Jefferson reference the Internet, eBay, Skype, or Flickr. But if he’d lived another 180 years, to 2006 instead of 1826, I feel certain he would at some point have said something like:

"We hold these truths to be self-evident,--that all
websites are created equal; that they are endowed by
their Creator with certain inalienable properties;
that among these are HTML, hyperlinks, and
the pursuit of AdSense clicks."

But what of the bigger picture? Beyond AdSense and AdWords and Mediabots and all that good stuff. Where, in short, is it all headed?

Well, one sure way of anticipating the future is to see what the professional anticipators are saying and thinking…and then to get ahead even of them. (As John Maynard Keynes used to say, “Successful investing is anticipating the anticipations of others.”) So let us turn for a moment to those who call themselves, or have been called by others, “futurists.”

Bill Joy’s Why the future doesn’t need us, for example, hypothesized that intelligent robots would replace humanity, at the very least in intellectual and social dominance, in the relatively near future. Now a partner in venture capital firm Kleiner, Perkins, Caufield & Byers, Joy certainly lives with one foot firmly in the future. But while Joy and his colleagues at Sun indisputably grasped earlier than most the enormous impact the Internet would have on both computing and entertainment, I’m not certain that he is still the best person to turn to today for a sense of where the Web is going. Designing and writing Berkeley UNIX is a remarkable achievement; but it’s not necessarily a qualification for designing and writing the future of the future.

Last year, for example, at the MIT Emerging Technologies conference, Joy actually retreated – for the organizing principle of his presentation – to the analytical framework used by those selfsame colleagues of his back in the 90s, in which Joy and his team described ‘Six Webs’:

  • the “far” web, as defined by the typical TV viewer experience

  • the “near” web, or desktop computing

  • the “here” web, or mobile devices with personal information one carried all the time

  • the “weird” web, characterized by voice recognition systems

  • the “B2B” web of business computers dealing exclusively with each other

  • the “D2D” web, of intelligent buildings and cities.


Java was created with all six of these webs in mind – a deliberate attempt to build a single platform for them all. But times they are a-changing: today, the key to anticipating the future is to concentrate on the “social” web and Enterprise 2.0, both built on what Professor Andrew McAfee calls an infrastructure of SLATES (search, linking, tagging, authoring, extensions, and signals).

In other words, instead of there being six webs there’s really only one; let us call it – for simplicity’s sake – the New Web. The overriding characteristic of the New Web is its multi-faceted yet converging nature. With its acquisition of YouTube, for example, Google has in a heartbeat ensured the eventual convergence of streaming video and search. With its acquisition of Skype, eBay has done the same for VoIP and real-time auctions. And there are plenty of convergences ahead: newspapers and blogging (witness News Corp’s $580M acquisition of MySpace). What alliances lie ahead – MTV and Technorati? Bank of America and Flickr?

Before you accuse me of being deliberately far-fetched, consider that the two co-founders of Digg, Jay Adelson and Kevin Rose, have set up the newly funded Revision3 Corp – not a social-bookmarking site but an Internet video company. Or how about Jimmy Wales, who with Angela Beesley and Wikia is clearly setting his sights on the as-yet-untried fusion between wikis and search? But let us go back to looking at what the ‘professional anticipators’ are anticipating.

eBay co-founder Pierre Omidyar, for example, has a pretty good track record, having staked Omidyar Network money on Digg.com – as did Netscape co-founder Marc Andreessen and Greylock partners. Omidyar, who launched eBay as “Auction Web” in 1995, was a multi-billionaire just three years later when it IPO’d. His current investment portfolio is informed by his firm conviction that “strangers connecting over shared interests” is the key to the Web’s future. He sits accordingly on the board of Meetup.

In Andreessen’s case he isn’t only trying to second-guess the future by investing in other people’s social news ventures; he too, like Omidyar, is proactively helping to create it. His Ning, which launched in October 2005, is an online platform – currently free as in beer – for creating social websites and social networks. The list goes on and on, of investors – financial futurists – who are investing in the building out of the New Web that is poised to subsume each and every web that has gone before.

As for so-called ‘professional futurists’ like Ray Kurzweil – most commonly associated nowadays with his views on AI, genetics, nanotech, robotics, and (echoes of Bill Joy) the rapidly changing definition of humanity – he too has written much that applies to those interested in the future of the Internet, including what has been called “The Law of Accelerating Returns”:

    "An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense 'intuitive linear' view. So we won't experience 100 years of progress in the twenty first century—it will be more like 20,000 years of progress (at today's rate). The 'returns,' such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth."


In volume sixteen of Patrick O'Brian's 20-part nautical series, The Wine-Dark Sea, Aubrey and Maturin and the HMS Surprise finish their adventures in the Pacific and land in Peru. There, Stephen Maturin gives a gratuity to a local who has helped him find his way, and the local bids him goodbye with a parting blessing, "May no new thing arise." Maturin solemnly replies, "May no new thing arise."

Whatever else you can or can’t say about the future of Internet technologies and the world they will take us to, you can safely say that it won’t be like Peru!

In other words, bring on the New Web, with all its convergences already happened, currently happening, and still to happen. It is what makes life in the software and Web applications world rich, wondrous, complex, challenging, rewarding, maddening, joyous and never boring. It's always something, and I for one wouldn't want it any other way.

I can only say that my own parting blessing, the next time anyone favors me with a gratuity, will be this: “May every new thing arise.”

Tuesday, February 27, 2007

Blogging Is Just the Tip of the "Co-Technology" Iceberg


Gartner says that the total number of bloggers will peak during the first half of this year at around 100 million, causing John R. Patrick to ask rhetorically whether Spring 2007 truly is "The Peak of Blogging."

It isn’t, he says in answer to his own question. "Blogging," he writes, "is just beginning!"

The significance of blogging is not the word 'blog' whether used as a verb or a noun, but its role as a harbinger of the game-changing Web-as-platform revolution. In particular, the migration of blogging from the individual toward the enterprise represents a massive validation of those like Professor Andy McAfee who argue that Enterprise Web 2.0 is already a reality.

Put crudely: by embracing blogging, corporate America gave a big wet kiss to Web 2.0 (if you prefer Dale Dougherty's handy term). To the "New Web" (if you prefer a somewhat simpler term, and one that no one owns).

One part of the Gartner report made me think of my earlier “Are We Blogging Each Other To Death?” post: the bit about how less than 2% of all Internet users are frequent contributors to content on the web. This is the kind of statistic that always exercises anyone involved with building Web-based applications, because it's a classic full-glass/empty-glass situation: do we rejoice at the fact that 98% of the marketplace may yet come round, or bemoan that only 2% have seen the light?

Wikipedia provides us with a good working example. According to the John Musser/Tim O'Reilly report "Web 2.0 Principles and Best Practices," we know what percentage of registered Wikipedia users are contributors: around 7%.

If like me you view Wikipedia as an application rather than as a destination site, then that means that 70,000 people or more are actively using the Wikipedia application. That's a lot of users for a fairly sophisticated app.

Kudos to EMC's Cornelia Davis for nailing one very simple way of realizing how significant a number that is:

    "that is more than twice the number of people working for my employer (EMC has around 31,000 employees)"

So the challenge for today’s software developers is to achieve for their apps the same kind of buy-in that already exists out there on the New Web. Because unless a company has the same kind of percentage of its intranet users actively contributing content, my contention is that it will swiftly be overtaken by those companies that do.

Small wonder then that Sam Palmisano (who famously has his own avatar) is rumored to have enjoined his fellow IBM execs to participate in Second Life. He is probably interested in seeing whether the 7% figure operates there, too: if it did, there would be over 25,000 Big Blue avatars by the end of this year!

As for the wider New Web itself, a 7% participation rate in any one application on a global scale would, beyond a shadow of a doubt, be the foundation of the next Microsoft. Total Internet users are estimated at 1.1 billion, so the 100 million people who blog (if we go by Gartner's figures) take us well above the 7% mark (77 million). But no one company owns blogging, any more than any one company owns Web mail, or Instant Messaging, or photo sharing.
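For what it's worth, the arithmetic above holds up; here it is as a small Python sanity check. The one-million figure for registered Wikipedia accounts is my own illustrative assumption; the other numbers are the ones quoted in this post:

```python
registered_users = 1_000_000          # assumed Wikipedia registered accounts
print(int(registered_users * 0.07))   # 70000 -- the "70,000 people or more"

internet_users = 1_100_000_000        # 1.1 billion, as quoted above
threshold = int(internet_users * 0.07)
print(threshold)                      # 77000000 -- the 77 million mark
print(100_000_000 > threshold)        # True: Gartner's 100M bloggers clear it
```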

What kind of app is missing from the New Web landscape? What kind of app would attract 77 million users?

One that adheres to all the tried and true principles of first-rate co-technology (my turn to coin a term now), adds functionality, and leverages emergent methodologies like automatic semantic video tagging, audio search, and social bookmarking. Above all, one that solves a problem not already being solved; or solves one already being solved, but ten times better.

    ("100% Spam Free E-mail" would be one obvious example of the latter; "Google for Memories" would be a less obvious example of the former.)

For those many Webpreneurs and innovators who believe they're on to exactly that, I say only this. Stay with it: the i-Technology world can do way, way better than mere blogging. I just know it.

Friday, February 9, 2007

What's In a Name?


    Optaros calls it "NGI" – for "Next Generation Internet" – and Nexaweb calls it "EW2.0" – for "Enterprise Web 2.0" – but one this is certain...whatever you call it, the next iteration of the World Wide Web is not just on its way: it's already here.

If you have any doubts about the arrival of the New Web, then you've either been asleep in a cave for the past two years – which is exactly how long it has been since Jesse James Garrett coined the term "AJAX" for the approach that freed users' experience of the Web from the tyranny of the page refresh – or you've never visited Google Maps, del.icio.us, 24sevenoffice.com, or Basecamp.

As the recipient of the Roundarch 2006 Interactive Experience Award, as well as Chairman of a group of companies pioneering an "all-media" approach to communicating the future, I feel it behooves me to do whatever I can, whenever I can, wherever I can – in the best Web 2.0 spirit, though avoiding that label – to help instil a sense of perspective about the New Web, a sense of excitement anchored in reality.

Please join me here, and my co-author Charles Fiesel, as often as you can.