19 July 2012

The Post-Tablet Era

We’ve all read the “Post-PC” hyperbole du jour spouted by various pundits, claiming that the ball-and-chain of the PC will be replaced by myriad task-based mobile devices, resulting in the explosive over-connection of users.

There are several problems with this “Tastes Great; Less Filling” ploy.

To date, every mobile device runs an advertising-driven operating system from Apple or Google (OK, the popular ones).  Their raison d’ĂȘtre is fine-tuning the delivery of ever more targeted ads, which in turn generate revenue for Apple, Google and app developers.  Because of this primary motive, advertising-driven operating systems are at odds with delivering productivity because, like phone sex services, they make money on time spent, not efficiency.

In addition, several recent studies have shown that click-through rates for online advertising are considerably lower than those of dusty old direct mail (better known as junk mail). Advertising may turn out to be a game of diminishing returns.  The more we’re bombarded, the more inured we become to ads’ effects; another boy too often crying “wolf.”  I’m not here to fix advertising, but I do wonder about the long-term fate of operating systems and apps whose revenue depends on fostering it.

The other problem with proclaiming the post-PC world is that currently there’s nothing to replace the PC.  Sheer computing power and size considerations aside, mobile operating systems do not offer a platform capable of running the same richness of applications as a PC.  Tablets offer snippets of larger data sets and limited tools for interacting with them (after all, most apps are tidbits of data teased into a specialized data container for those too lazy to type a URL).

We are also in a period of device explosion or, as the Italians say, throwing pasta (or “shit” if you’re in sales) against the wall to see what sticks.  In the developed world, people average three-point-something internet-connected devices, a number that will double in less than five years.  Device proliferation can partly be blamed for the rise of gargantuan purses, non-sporting backpacks, and the dreaded “murse.”  Even an IT geek’s cargo shorts have limits.

It’s safe to say the ultimate winner won’t be the maker of the most device types, but rather the maker with the fewest devices required for the highest level of productivity. For this reason, it’s even safer to say that the PC isn’t going anywhere, at least for a while. 

Today’s mobile devices are a tight balance between processing and ergonomics, a battle fought by laptops years ago.  The other wrinkle is the desire to intertwine computing with communications, each with its own ergonomic needs.  As processing dynamics improve, today’s ad-driven, immature tablets will die a relatively quick death, like the mobile-phone bricks before them.

For example, I got my first laptop in 1990. For sixteen years I owned both a desktop and a laptop before dropping the desktop PC for a lone laptop six years ago.  I got my first tablet nearly two years ago and never imagined it becoming my primary computer.  After getting a Windows 8 test-tablet a month ago, I guarantee it will not be sixteen years before I drop the laptop for a tablet-like device.  Speed, storage, screen size and resolution will be the key deciding factors, all of which will improve relatively quickly.

You may be thinking I’ve just negated my premise about the “post-PC era” but I haven’t.  Windows 8 extends the PC era in the same way laptops extended desktops. 

Microsoft Windows is the traditional (and bloated) operating system that’s slimmed down and become touch-sensitive in version 8.  Users can enjoy their apps and “real” PC applications without visiting the ad-driven whorehouses of Apple and Google.

Windows 8 marries the mobile with the stationary, most notably running the Office suite on either device, a first for a (presumably) mass-produced tablet.  It turns today’s plaything tablet into a new PC form factor (FYI, the most downloaded Apple app is an Office emulator).  For Apple to compete, the extra step of the iTunes interface must go away.  Apple will also have to merge its PC and mobile operating systems to create a unified experience.

Ease of use is an oft-cited hallmark of Apple’s iPhone/iPad and, to a lesser extent, Google’s.  While Microsoft’s traditional operating system may not be termed user-friendly, it is ingrained in the computer-literate public.  Unlike Apple’s intuitive interface, Microsoft’s usability comes from sheer repetition.  For those seeking a more tile-friendly and colorful world, there’s Microsoft’s Metro interface. Personally, it seems a little remedial to me.

Touching on Android, it’s an illustration of too many cooks spoiling the soup.  Versions have been flying out at such a pace that developers haven’t caught up.  Most smartphones run Android 2.3 while tablets are on 4.0, unless you have a tablet stranded on version 3.0.  Backing this up are Avaya and Cisco, whose first tablets (the Cius and Flare) were Android-based.  Both have been largely abandoned in favor of the iPad.

Finally, Windows 8 tablets have standard USB plugs!  Nearly every technology reviewer has lamented how Steve Jobs’ plug phobia resulted in some wonderful, albeit less useful, designs.

Am I giving the win to Microsoft?  Hardly.  This is just the latest skirmish in the continuing struggle to make computing more effective.  Microsoft has also been known for as many blunders as successes, especially in recent years.  Like Star Trek movies, every other release of Windows seems to be a bomb; with Windows 7 the latest success, Windows 8 is in a difficult position.

Microsoft has also made a fundamentally stupid move by not enabling an upgrade from Windows Phone 7 to Windows Phone 8. Not only does this strand users, it’s another reminder that you’re not dealing with Apple.  But while Windows 8 is a Hail Mary play that will succeed or fail for Microsoft in 2013, a unified, non-ad-driven operating system is the future.

Steve Jobs jumpstarted the tablet form factor, but in merging mobile and stationary operating systems, Microsoft has upped the ante.  Post-PC? No.  A new PC form factor? Yes.

24 February 2012

Electronic Brainstorming: Can Microsoft SharePoint, Cisco Quad and IBM Connections Reclaim Productivity Lost to Collaboration?

The message coming from all communication technologies, be they consumer, business or “bizumer,” is that they increase user productivity and problem-solving by keeping users connected to the larger group (herd).  Unfortunately, scientifically speaking, nothing could be further from the truth.

Behavioral study after study has shown that incessant connectivity robs people of the solitude required to solve problems while delivering the mediocre results of groupthink.  We instinctively know, and the Discovery Channel proves, that the wildebeest that strays from the herd gets eaten.  During meetings and conference calls (real-time) we agree to stupid things for fear of rejection by the group or being stoned by the boss; thus groupthink is born.  (It’s vestigial brain stuff from the Stone Age and middle school.)

Interestingly, artists, inventors and innovators tend to be loners.  Superman had the Fortress of Solitude, and hell, if you believe Creationist claptrap, God worked alone too.

Anecdotally, we see “Generation Clusterthink”: college students who should be learning to stand on their own calling their parents to sort out minutiae.  They are more comfortable living with their parents because they never became individuals; apron strings turned to steel.  This is the generation that sacrifices solitary individuality to simultaneously listen, watch and broadcast for fear of being left out.

This all began in the 1950s when an advertising executive (no surprise here) named Alex Osborn created the idea of group brainstorming, believing that groups led to a higher quantity and quality of ideas and creativity. The irony is that Osborn was probably alone when he came up with the thought, undermining his own premise (but advertising and history never let truth get in the way).

Brainwashing on brainstorming persists and has been translated literally into office environments.  In the 1970s the average U.S. worker had 46.5 m² (500 square feet) of working space.  By 2010, that had shrunk 60% to 18.5 m² (200 square feet), with 70% of workers in open-plan offices.  The subliminal belief was that proximity bred productivity.  Yes, it saved real estate costs, but remember your first open-plan office?  What was said? “We’re tearing down the walls to increase productivity by encouraging group collaboration” (unless you’re an executive).

As anyone who’s ever been in a group knows, groups are unproductive, stifle creativity and often deliver a lowest-common-denominator result.  In-meeting multitasking isn’t a sign of being busy; it equates to apathy about the topic, feelings of powerlessness, inevitability of outcome and avoidance of boat-rocking.

Therefore, promoting SharePoint, Quad and IBM Connections Next as productivity tools due to their connected and collaborative properties is a sham. (I have to say IBM Connections Next because no one knows what Connections is – Vulcan, Calgon, whatever.)

What these vendors need to do is stop marketing against science.  This application class should be positioned and used to free contributors from the endless banality of the group, letting them engage when needed while making the contributions and quantum leaps that only uninterrupted, uncompromising, individual problem-solving delivers.

In addition to proving that brainstorming is ineffective, research shows that “electronic brainstorming” can be effective.  Strip away the verbal and body-language components of communication and depersonalize it, and vestigial groupthink triggers weaken.  Extrapolating the concept, groupthink could be mitigated by groups using non-real-time communication (few meetings, fewer calls).

For Microsoft, Cisco and IBM this means that instead of promoting their applications as “the next level of group collaboration,” they should be promoting these applications for their abilities to unearth the best ideas and most effective work by managing and controlling over-collaboration.

This is not to say that real-time communication is not productive, but it must be deliberate, controlled and measured for productivity to improve.  In many ways, the realities of group and individual dynamics undermine the arguments for video communications.  The case for ubiquitous video resurrects AT&T’s “the next best thing to being there” slogan, and that may very well be the problem (a proposition Cisco might grit its teeth at).

18 January 2012

Twitter: What I Didn't Do Over the Summer

For those who lack reference, the TeleContrarian Twitter persona is an outgrowth of my “calling bullshit” as a commenter on several blog sites, most notably No Jitter.  When I started my Twitter experiment I devised a pair of questions I hoped to answer:

·    Could Twitter enhance and simplify my personal monitoring of the enterprise comms market?
·    Would anyone want to follow and interact with someone anonymous and snarky (but informed)? In other words, would my “personality” and point of view enhance others’ Twitter goals?

I set myself a goal of a year to find out and slightly overstayed.  Along the way I expanded the Contrarian “brand” by creating this blog.  In doing so, I could also begin to see the “six degrees of separation” brought about by social media (how many of my Twitter followers would re-tweet my blog link and where would readers come from?).  I thought it more interesting than releasing a sex tape (as so many have) and documenting its internet voyage.

Some may wonder why my first goal wasn’t obvious from my non-anonymous Twitter profile.  The answer is that my public profile has become so enmeshed in marketing, promotion and relationship-building that its goals couldn’t be completely separated for this experiment.  And then there’s the ribald personality of TeleContrarian which is not allowed in polite business.

Is Twitter measurably helpful?

I use several methods to monitor the UCOM market including RSS feeds, Google Alerts, push email services, vendor AR/PR and the like.  Could Twitter replace, unify and expand the reach of those services?  Would any additional information posted by my followers enhance my knowledge?

The short answer is “no,” Twitter did not do the job better.  Those articles not part of my traditional monitoring systems, while sometimes interesting, did not uncover a mother lode of unforeseen knowledge.

What Twitter did excel at was speed (items posted to Twitter in minutes) and the occasional sound bite of opinion (though usually gleanable by reading the linked document).  So, is the simple speed of knowledge helpful?  Again I say, “no.”  Knowing about Avaya’s IPO or Alcatel-Lucent’s Enterprise yard sale a few hours earlier doesn’t make the information more actionable.  Similarly, experiencing Avaya’s Flare launch live, while exciting, was meaningless. 

Another dimension of Twitter use in business is the fallacy that following opinion and vendor leaders somehow reveals useful, unvarnished thoughts.  Twitter for business is a marketing channel where meaningful thoughts are guarded and usually either enticement or corporate puffery.  Unfortunately, candid thoughts limit themselves to the minutiae of daily life.  For example, I now find myself requesting hotel rooms Charlie Isaacs hasn’t slept in (harder than you’d imagine!).

This brings me to an unfortunate side effect of following some: bearing witness to their oversharing.  To the untrained eye, Vanessa Alvarez may appear to be a confident woman swanning her way from jet to beach swaddled in Gucci and shod in Blahnik. I, however, see shallow trappings and self-importance.  I’ve always thought her work lacked depth and now I understand more completely why.  I also more rapidly seek better company at cocktail parties.  Given the private messages sent to me, I am not alone in my opinions, just in expressing them.

The bottom line is that Twitter is helpful for those needing to be breathlessly first (and my public persona needs this perception for many reasons).  It may also help those with a more casual interest stay abreast of the market.

Goal 2: Did this persona help you?

I have no idea.  Certainly my following has slowly increased over time, so I can assume that some of what I said was interesting, useful or at least entertaining.  My primary goal began with a desire to unveil truths obfuscated by corporate marketing or those dependent on it; to ask the questions others only thought but couldn’t diplomatically ask (including my public persona).  Most have understood that I am not necessarily malicious, just frank with a sprinkle of bitch.

The bitchery may be the most attractive part of TeleContrarian.  In a world of buttoned-down business-speak banality, someone should prick the balloon.  As proof of this, several of my tweets were the result of private messages sent asking me to tweet something the sender couldn’t.

Some have engaged with me on this journey, others ignore my existence.  It has been interesting to have dual “relationships” with most of my followers.  I appreciate the fact that it’s unsettling to interact in an inherently unbalanced relationship and appreciate their mostly jovial participation.  Funniest have been the marketing-drone Twitter accounts that must grit their teeth to answer me.  But a few times I’ve been wrong and have apologized and corrected where I could.

While I expected some bemused speculation about my identity as TeleContrarian grew, some have made the quest their raison d'ĂȘtre, as though unmasking me would finally gain them a seat at the “popular” lunch table. This has been especially amusing when I have been asked if I know who “I” am. But this hunger has also limited my ability to tweet when seated near a roving-eyed colleague. For those still playing the game, no, I have never heard anyone correctly speculate on my identity. Jim Croce sang it best, “You don't pull the mask off the old Lone Ranger,” because when the mystery is solved, the game is over.  And it’s the game that makes it all worthwhile.

Lessons Learned

People are, and always will be, sheep.  Corporate marketing Twitter accounts often see a story and retweet it for wider distribution without reading the referenced article.  Because of this, I have on several occasions been retweeted by vendors I’ve been critical of.  I cringe when this happens, but it’s human nature to trust as much as it is to be lazy.  A dangerous combination.

TeleContrarian is addictive.  It’s a wild ride to be clever and have a forum to say what you think.  It’s also self-competitive as I found myself looking for ways to “out do” myself.  This persona has been largely silent since October 2011, and I didn’t miss it.  In fact, I enjoyed the rest.  What started as a vacation became more.  The future will bring a more focused voice with less bitchcraft.

TeleContrarian is a crutch.  While I shed light on some truths, those truths are there for anyone to find.  If you find yourself believing the “increased productivity and reduced spending paradigm of a cloud-based OpEx model” mantra of this week’s marketing campaign, it’s your job to take a step back and evaluate.  Like any religion or belief, if you follow unquestioningly, you can’t complain about the view or the destination.