Thursday, March 29, 2012

Creative discomfort

While reading this TUAW article on "how the iPad inspires new content creation," I stumbled on a related, more interesting thought.

(Isn't that always the way of things? The best meeting I've been in this week was boring and useless, and once I realized it was going nowhere and focused my mind instead on the other things on my to-do list, I was able to do some great work brainstorming and project-planning, all without leaving the meeting room. Sometimes the best ideas don't come from what others put in front of you; they're the ideas that are one or two steps to the side, just out of view until you allow your focus to shift.)

The author argues that the iPad inspires creativity because it's a direct interface, rather than one that's mediated through a keyboard and a mouse. There's no doubt some truth to that, but I would argue that the iPad's power to inspire comes from the combination of two qualities: its simplicity and its novelty.

Simplicity is what everyone notices about the iPad: it does away with a lot of the interface conventions that we're all used to when working with computers. Conventions are comfortable to experts, but sometimes they make very little sense in absolute terms. The world is typing on QWERTY keyboards because we're all used to it and trained on that interface, not because it's better. The iPad's simplicity wipes away all of that and allows interactions that are not predefined by convention.

That's a great thing, but what's more significant is that the iPad forces the user out of their comfort zone. If you present me with a conventional computer, complete with keyboard and mouse, I think I already know how I can and will use it. My preconceptions of the device constrain my use of it. The iPad, though, is a little uncomfortable. It forces me to use it in ways that initially seem awkward. In the process, new possibilities present themselves.

Seen in this light, iPad keyboards and styluses are a terrible idea. They're extensions of ideas that we're already comfortable with. The power of devices like the iPad lies in the ways that they can make us uncomfortable, and in so doing expand the possible.

Friday, March 23, 2012

Follow the leader

Every now and then, a publication like ESPN puts out a self-indulgent piece of pseudo-journalism called "Power Rankings," which purports to rank teams from best to worst. Today I saw that ESPN had issued Power Rankings for the NFL, which is even more pointless and self-indulgent than usual because the NFL is currently in its offseason, free agency is ongoing, and the draft is still a month away, so these sportswriters are ranking teams that a) are not playing, and b) have incomplete rosters. But no matter: there are bored sports fans like me who click through to this sort of thing, and so I did.

The best team in the NFL, according to this chart, is the NY Giants. In fact, four of the five reporters voted them first. That might seem only fair, since the Giants just won the Super Bowl, if it weren't for the fact that the Giants were generally considered the fourth or fifth best team in the NFC, looked for a while like they might miss the playoffs entirely, and then went on a hot streak at the end. The Giants were a good football team, no doubt about it, but they were also very lucky.

By any objective standard the Giants were quite obviously not the best team in the league last year, and they are unlikely to be the best team next year, so how do they end up at the top of power rankings? Because they won the big game. This is something we see in tech journalism as well: whoever wins is the best, by definition. Steve Jobs was the worst, so bad that he was thrown out of his own company, and then he was the best. Sergey Brin was the best, until Mark Zuckerberg became the best. According to the pundits, there is no such thing as luck in Silicon Valley, only talent, skill, and dedication. The cream rises to the top, and shows that it's the cream by virtue of rising.

It's a satisfying story, certainly more so than "they were in the right place at the right time," and far more inspiring to would-be entrepreneurs than the alternative: that they might have a great idea and solid skills and work really, really hard and still fail in the marketplace. Unfortunately, it's a fairy tale. The universe doesn't hand out ribbons to the best; it hands out ribbons to the ones who win. And sometimes they win by luck.

Wednesday, March 21, 2012

You only get one chance to be honest

What do Mike Daisey and Komen have in common? They both blew their one chance to be honest. Komen made a politically motivated move and then claimed it had nothing to do with politics. Daisey presented eyewitness testimony about things that he didn't personally witness.

There's no doubt that both Daisey and Komen executives feel that these slips have been overblown, that the baby is being thrown out with the bath water. But here's the thing: in both cases, people assumed that they were telling the truth. The general public gave Daisey and Komen the benefit of the doubt. In a world where it is impossible to verify everything that you hear, this is essential. You are required to extend your trust to certain people, because the only alternative is near-universal skepticism.

This is a benefit we extend to anyone who, like Daisey and Komen, appears on the surface to be well-intentioned, but it comes with a cost: when you opt for "truthiness" rather than truth, the backlash can be severe. When you lie to us, or even when you engage in half-truths and exaggerations, you make us question what else we should be skeptical about. When you show that you have a casual relationship with the truth, we learn that we can't necessarily trust you. We're not going to fact-check everything you say, and you've already proven that we can't take you at your word, so the only viable option remaining to us is skepticism. In the process some babies will end up out with the bath water, but that's not our fault; it's yours.

In corporate communications and in life, you only have one chance to be honest. Don't blow it.

Wednesday, March 14, 2012

Stop the presses

The New York Times reports on the death of the print edition at Encyclopedia Britannica, which brings back memories -- I worked at Britannica for a few years in the late nineties.

The article states that peak sales for the print encyclopedia came in 1990; I started work there seven years later, just as I was finishing my Ph.D. in religious studies. By then I had realized what a tough job market academia was becoming. Getting a job was very difficult; getting a job that would not suck all the joy from your life was nearly impossible. Out of the blue I got a call from Britannica saying they needed a religion editor. It was kismet; it was fate. I signed on the dotted line and took my first full-time job.

My time at Britannica came in three phases. The first and third phases were with Britannica.com -- the digital arm of the company -- and were great. I liked the people, I liked the product, and I really enjoyed my time there. The second period was downstairs with the print people, and that's where I witnessed first-hand the death throes of a proud institution.

By 1997 the world had begun to shift to digital in a big way, and Britannica was reluctantly following suit. Microsoft was bundling CD editions of Encarta with every PC sold, and Britannica knew that it had to come up with a digital version of the encyclopedia if it was going to compete. Unfortunately, the first efforts were half-hearted at best: the first edition of Britannica on CD was text-only (not a single image) and cost $1,500. The company was afraid that the CD would cannibalize sales of the print edition, so it priced the disc in such a way as to severely punish anyone who thought to buy it. Needless to say, that was a disaster, and by the time I came on board the company was trying again.

The print staff was not taking this well. Many of the editors and managers in the print department had been with the company for decades, and they had always assumed that they'd be there for life, tweaking articles and corresponding with authors and puttering about in the intellectual garden they had so painstakingly constructed. By 1997, though, things were changing far more rapidly than they liked, and they were afraid for their jobs. As it turns out, they had reason to be, but in the meantime that department was the angriest, most toxic environment I've ever worked in. After 18 months of passive-aggressiveness and back-stabbing, I escaped back to the digital side of the company with a profound sense of relief. I was finally back among my own kind: people who like their work and treat their colleagues with respect. I felt liberated.

In retrospect, Britannica's struggles with culture and process were severe to the point of absurdity. These were the days when Wikipedia was just getting off the ground, but the potential of crowd-sourcing was so far off the radar that we never seriously discussed it. Instead, we were busy with what should have been a much smaller task: fighting with the print editors to get them to update the web version of the encyclopedia at a rate faster than the five-year turnaround that they were used to in print. When a celebrity dies or a candidate gets elected, Wikipedia is updated that same day, often within the hour; Britannica always wanted to take its time to consider its words, run drafts past area experts, review their changes, possibly direct the copy to another expert for review, and then consider its words one final time before putting anything new in front of the reader. The world doesn't run at that pace anymore, and though there were plenty of smart and insightful people at Britannica, in the end there weren't enough to change the company into something it wasn't.

By the summer of 2000 our dot-com business plan was in a shambles and the writing was on the wall, and I made the decision to move to Seattle with my then-girlfriend, now-wife. Three months later most of my friends and former colleagues working on Britannica.com were laid off. At that point the layoffs in the print division had already begun, so in a way this was only fair; the company was sinking and everyone was going down with it. The rats were not the only ones to get their feet wet.

Now, at last, the print edition has been deep-sixed, and mostly I'm surprised that it managed to hold on for so long. The company is under new ownership and management, and company president Jorge Cauz has this to say about its current efforts:
The Web site is continuously updated, it’s much more expansive and it has multimedia.
That's good to hear, but it would have been better if I'd heard it in 1997, when it might have made all the difference in the world.

When 20% becomes half-assed

A former development director at Google decided to return to Microsoft, and posted the reasons why in a blog post that has received a ton of attention. For me, the most meaningful section comes toward the close, where he talks about what happened after Larry Page decided that Google's primary mission was to catch up with Facebook in social:
Suddenly, 20% meant half-assed. Google Labs was shut down. App Engine fees were raised. APIs that had been free for years were deprecated or provided for a fee. As the trappings of entrepreneurship were dismantled, derisive talk of the “old Google” and its feeble attempts at competing with Facebook surfaced to justify a “new Google” that promised “more wood behind fewer arrows.”
This is very discouraging news for anyone who's committed to promoting innovation within their workplace; for years, Google's 20% time has been the gold standard for allowing employees to innovate officially (rather than when no one is watching), and the argument has been that if a company as successful as Google provides room in which employees can tinker and experiment, then we should do that too, right? If Google ends up scrapping the 20% time, that could have repercussions throughout the business landscape, as bean-counting managers use it as justification for shutting down the more speculative endeavors of their own employees.

Unfortunately it's a fact of life that innovation and unconventional thinking have a very hard time surviving the bean-counting. The real test of a company's commitment to innovation comes when times are tough and competitors appear to be winning: will you stay the course and continue to invest in your employees, or will 20% time and bottom-up thinking be like free massages and Bring Your Dog To Work Day, eliminated in the interest of pursuing greater discipline?

In Google's case, I wonder if Larry Page is still smarting from Eric Schmidt's crack about being brought on as Google's CEO to provide "adult supervision." Is Page pursuing focus and discipline to prove that he's all grown up now? If so, it may come at the expense of his company.

Thursday, March 8, 2012

Only one shoe has dropped

TechCrunch has a smart post on how the new iPad announcement was only half the story; the other half will likely come during WWDC in June, when iOS 6 is unveiled.

There's a mistake that tech journalists continue to make with respect to Apple products: they compare them to their competitors on the basis of features. Whether it's screen resolution, the number of processors, the speed of those processors, how much RAM the device has, or whatever it is, they operate on the assumption that Apple's customers are looking for the product that has the best features. And so they triumphantly announce that competitors have equivalent or superior features, and predict that this foretells Apple's imminent downfall. It never happens that way, but the journalists don't learn the lesson.

The lesson is simple: Apple's formula is to combine hardware with software. Neither exists in isolation from the other; they are a seamless unit, and that seamless experience is the whole point of Apple's famous walled garden. Steve Jobs' philosophy was to achieve differentiation in the marketplace by doing the best job in the world of unifying the two, and under his guidance that's been Apple's approach to the market for most of its history.

This is the way forward for the industry as a whole. Obsessive focus on technical features is a characteristic of an industry in its infancy, when most customers are early adopters and hobbyists. As an industry matures, its products become relevant and attractive to a broader segment of users, and the demands of the market shift. There will always be an enthusiast tech market that obsesses about features, but increasingly the average tech consumer doesn't care about any of that; they just want their stuff to work and to empower them to do something they couldn't do before. Apple is an industry behemoth today because it understood that fact far earlier than its competitors. Eventually other companies will learn the same lesson; the ones that don't will cease to exist.

Hardware matters, but hardware + software matters far more.

Friday, March 2, 2012

Precisely!

The New York Post is speculating that Apple is working with content providers to present television channels as apps.

What an intriguing idea.

Inscrutable

Wired has an interesting story on how dolphins say "hello." Apparently there's a "signature whistle" they use in social situations, when meeting up with dolphins from other pods. There also appear to be rules governing the interaction, most of which we understand poorly, if at all.

I've been a science fiction fan for most of my life, so between books, movies, and television I've probably read or watched hundreds of scenes in which humans come into contact with aliens for the first time. Usually the process is completely seamless; as in "Close Encounters," there may be a bit of coordination at first, but soon the universal translator is working properly and communication proceeds apace.

This dolphin study reveals how hopelessly naive and optimistic that assumption is. We've lived alongside dolphins for tens of thousands of years, and we've been studying them pretty intensively for decades now. We've also had the remarkable opportunity to study entire generations of dolphins that lived in captivity. On top of that, dolphins are mammals, and so they share a certain degree of genetic heritage with human beings. With that wealth of information, insight, and reflection as a backdrop, we've now been able to reach the conclusion that dolphins maybe, possibly say "hello." Otherwise, we know basically nothing.

If it's that hard to understand dolphins, with whom we have so much in common, how are we ever going to understand the attempts at communication by a species that we've just encountered, that evolved in completely different circumstances, and with which we have absolutely nothing in common? It's no wonder that science fiction makes this process so much easier -- those books would have been pretty boring if they had all featured long chapters of mutual incomprehension.