Monday, November 26, 2012

Five to Seven

A few weeks ago I went through a change management course. If you haven't heard of it before, change management is one of the more recent business terms to achieve buzzword status: it describes the process of strategically managing change within an organization, rather than simply inflicting it on employees.

The meat of the course was a painstaking process for managing change that my employer is teaching throughout the organization, but as a communicator a few things stood out for me. First was the importance of managers -- studies show that they're the most resistant to change in any organization, and they also exert a great deal of influence over whether their subordinates will be open or resistant to change (or to any communication, for that matter).

Even more striking than that fact, though, was my other takeaway: you have to communicate something five to seven times before people will remember it. Think about that: five to seven times, or your message will be forgotten.

The implications are profound. If you take that number seriously, it means that the core task of communications is not informing; it's reminding. If there's something you really need your readers / followers / customers / employees to remember, your communications must be circular rather than linear: communicate, then remind, then touch on it again, then connect to it again, and then offer one last reminder -- and that's at a minimum. When was the last time you authored a communication strategy with that many touch points?

Do you think this point is worth remembering? Put it on a Post-It and stick it to your screen. You'll need to see it another four to six times before it's yours to keep.

Tuesday, September 25, 2012

The critical kairos

Wired has a great article on Corning, makers of Gorilla Glass -- the remarkably strong glass that famously forms the screen on iPhones and iPads (as well as products from a number of other companies). The whole piece is well worth your time, but I took special pleasure in this line, toward the middle of the article:
Innovation at Corning is largely about being willing and able to take failed ideas and apply them elsewhere.
This is a great point that often gets overlooked in discussions of innovation. Back in the day I studied classical Greek, and one of my favorite words from that language is kairos, which refers to the right time for something. In my experience kairos is critical to the success of an idea -- it's not enough for it to be a good idea, and it's not enough for it to receive the necessary backing; it also needs the elusive element of kairos (a.k.a. good timing), or it won't have the impact it deserves.

Functionally, of course, this can seem like a difficult requirement to satisfy, because you can work to make your ideas better but, unless you're Dr. Who, you have no control over time. The bottom-line lesson, though, is that you should keep failed ideas in your back pocket. When you put something out there and it fails, either it was a bad idea or it was a good idea at the wrong time. Put the good but failed ideas to the side, out of sight where they're safe, until the circumstances have changed and it's time to try again.

Sometimes innovation is a function of memory. Remember your failures, because tomorrow they might be the seeds of your success.

Tuesday, September 11, 2012

Follower fallacies

Wired has an article arguing that "focusing on hardware is the wrong way to compete with the iPhone." It's an indisputable point. The Apple aesthetic has always been a combination of hardware and software; the hardware on its own conveys a simplicity and elegance that has value in its own right, but Apple's devices also show that a lot of thought has gone into how the hardware supports the software and the software supports the hardware. Imitate one and not the other, and you might get a good device but you're not going to get a great one. (Which is pretty much how I'd sum up the entire Android ecosystem.)

So the point in the article is valid, even important. It's not new, however -- people have been making the same point for years now. The question, then, is why it never seems to stick. In part, I think this is a function of engineering culture. For an engineer, flexibility and power are synonymous with quality, and given two devices that are otherwise similar, the engineer will choose the one with the superior processor and other quantifiable specs.

However, the primary reason, I suspect, is that followers are simply not in a position to do anything other than tweak the leader's designs. Apple personnel have testified that the iPhone was five years in development before it was released; meanwhile, other smartphone designers were so far removed from what Apple was thinking that RIM's leaders responded to their first views of the iPhone by refusing to believe that the phone could be as good as what Steve Jobs demonstrated. Since then everything has flipped; nearly every phone that comes out is directly inspired by the iPhone. The designs we're seeing on the market today are rushed copies rather than concerted design efforts. They're not the product of five years' worth of iteration and refinement; they're me-too efforts in which the only chance for distinction is to say, "We're just like that other phone, only better!" When you're in a hurry, software+hardware is too big a problem to solve. Instead you make the screen slightly larger, or the camera slightly better, and hope that consumers will see that as a compelling difference.

My hope for the market is that, behind the scenes, R&D efforts are going on at these various companies in which they pursue singular visions and are willing to tweak and fuss and polish until everything is just right. The first iPhone came out just about five years ago, which means that companies like Motorola and Samsung have had enough time to begin with iPhone inspirations and develop their own designs that marry hardware and software in unique and compelling ways. I have my doubts, but we'll see how it goes and hope for the best.

Thursday, August 23, 2012

Preaching to the unconverted

Via Daring Fireball I came across this scathing review of Windows 8. I've seen two very distinct takes on the Windows 8 experience. The more positive reviews come from those who've tried it out on tablets and are excited about the experience. The negative takes tend to come from those who are more or less happy with the traditional Windows experience and, by and large, hate Microsoft's combination of two distinct interfaces. This is the risk that MS is running: by "refusing to compromise" and combining a tablet interface with a traditional Windows desktop, they're providing a one-size-fits-all experience that, in the end, doesn't fit all.

If you want customers to learn a new interface, you need to make the value proposition very clear. Apple had a relatively easy scenario, because the iPhone/iPad experience was marketed only to people who wanted a phone or a tablet and so could easily see the benefit of a touch-optimized experience. Microsoft, on the other hand, is confronting a very large body of customers who already use Windows, are already comfortable with a certain way of doing things, and may have no intention of buying a tablet in the near future. All they want is to use the interface conventions they've already learned, and the new design gets in their way without offering anything very compelling in return.

Touch interfaces and mouse/keyboard interfaces are very different, both functionally and conceptually. At work I use an iPad and a Dell laptop, and with the magic of screen-sharing software I'm able to load my Windows desktop on my iPad screen. To some extent I can get it to work using multitouch and pinch-to-zoom, but it's an awkward experience and it's very much slower than using the tools that are specifically designed to work in that scenario. I do it when I need to -- when doing so solves a problem -- but otherwise I prefer to wait until I'm back at my desk.

Microsoft's design decision removes that choice. The touch-optimized interface presents itself wherever you are, even if you don't have a touch-sensitive screen or trackpad to work with. That's probably fine -- great, even -- if you're using a tablet device, and maybe Microsoft's true bet is that the desktop is going the way of the microcomputer and soon we'll be laughing about the days when we used two-button mice to select things on our screens. If they're wrong, though, and the traditional desktop interface has some life in it yet, then Windows 8 could be a latter-day Vista: a version of Windows that customers choose to skip, in the hope that the next version will be better conceived.

Tuesday, July 31, 2012

First principles

Wired.co.uk has a brief interview with Jonathan Ive, in which he reiterates a point first made by Steve Jobs:
"We are really pleased with our revenues but our goal isn't to make money. It sounds a little flippant, but it's the truth. Our goal and what makes us excited is to make great products. If we are successful people will like them and if we are operationally competent, we will make money."
I suspect that there will be two reactions to this statement. Those who are inclined to hate Apple will think, "What a bunch of self-serving bull$%&^." Those who are sympathetic to Apple will think, "Yes, exactly -- and that's what makes all the difference."

My own orientation is that Ive is on the level, and for one reason: it explains something that is otherwise inexplicable. The puzzle with Apple isn't how it managed to get so big -- other companies have been big before, and others will become big in the future. The puzzle with Apple is how it manages to overcome the famous innovator's dilemma. Companies are not supposed to be able to disrupt themselves, as Apple did by naming the iPhone the best music player it ever built at a time when the iPod was critical to its bottom line, and by releasing the iPad in the full knowledge that it would cannibalize sales of the highly profitable Apple laptops. The principles of sound business management are supposed to make it impossible to choose an uncertain market over a certain one. And yet Apple has done this repeatedly.

The key, I believe, is in Ive's statement. The innovator's dilemma describes a manager's inability to abandon a profitable position in order to develop a market that might become profitable later. A manager whose professional well-being depends first and foremost on profits will experience this problem. But take the same manager and focus him instead on making the best product, tell him not to worry about profits, and that changes everything. Then the company disrupts itself as a matter of course. With that one change in perspective, the innovator's dilemma becomes almost irrelevant.

If this orientation accurately describes what it's like to work at Apple, building that culture within the organization was Steve Jobs' greatest achievement, and the biggest test of his successors at Apple will be how long they can maintain that focus on product over profits. It's the key to everything.

Friday, July 27, 2012

Razor thin

There are some headlines this morning on Amazon's quarterly earnings, in which their razor-thin margins got even more so. MG Siegler opines thusly:
Yes, I realize Amazon is viewed as a growth business (forgoing short-term profits for long-term gains). But these numbers keep going the wrong way. At some point, they have to start going the right way, right?
I suspect that the answer to that question is "wrong" -- at least for the foreseeable future. Jeff Bezos plays one card consistently, and that's the "trade profits for market share" card. If you look at Amazon's historical earnings, they have a remarkable ability to stay just above (or below) the break-even line. Bezos is in it for the long term, and his long-term play is to keep prices as low as possible until his competitors either go out of business or leave the market in search of profits.

Lately I've heard Amazon compared to Apple, but that is a fundamental error. Amazon is not the new Apple, and has no intention of becoming so. Amazon is the new Dell.

Sunday, July 1, 2012

Antisocial

On Gizmodo, Sam Biddle asks the provocative question: "Does Google Have Any Social Skills At All?" Biddle questions whether, despite all the money that Google has poured into social media lately, the company really understands social at all.
We've had privacy concerns before, but could it be more? Could it be that Google just doesn't get real people?
Of course there's an unsavory normative quality to that question: Biddle presumes that his vision of "real people" is more valid than Google's. But at its core the article exposes a real quandary that Google must address as it attempts to make its products more social: not everyone shares the company's values.

It's a problem for any tech company. When you design, if you want to design well, you must design for yourself first. Begin by trying to make something that you want to use, and then maybe you'll design something that lots of other people want, too. Steve Jobs famously scoffed at the need for focus groups; when Apple designed the iPhone, he said, they were merely building the phone that they themselves wanted to use.

That may well have been true, but it would be a mistake to think that the iPhone was a design that began and ended with the sensibilities of engineers. Anyone who's spent time around developers knows that they love power and functionality. An engineer-designed product tends to have 20 buttons on the front, dials on the side, and an easy-access hatch on the back that allows you to swap out the motherboard and install custom cooling systems. It may be ugly as hell, but it does a lot of cool things (at least when it's working properly, which is sometimes) and it's fun to tinker with.

That is not the iPhone. The iPhone is a device that may have begun on an engineer's screen, but along the way it was filtered through the values and desires of Jobs and Jonathan Ive and what resulted was an embodiment of that image Jobs liked: a product at the intersection of technology and the humanities. It's the combination of those two qualities that makes the iPhone and iPad so distinct, and so controversial within the technology community. Many engineers hate exactly those things that consumers love about both devices. So what are the odds that engineers, left to their own devices, can develop anything with such broad appeal?

This is Google's quandary: it's a company of engineers run by other engineers. There are no significant clashes of culture internal to the building, so those clashes occur on the outside, when product (or product vision) encounters the marketplace filled with people like Biddle who do not share Google's values. Maybe Google's ahead of the game and everyone else just needs to catch up, or maybe they're blinded by an excess of engineers.

Monday, June 25, 2012

Fear the middle manager

Today while pondering Microsoft's pending acquisition of Yammer, I came across another corporate acquisition story: how WinAmp went from the cusp of domination in the digital music market to where it is now -- basically forgotten in the continental U.S. -- because of interference and indifference from its corporate overlords at AOL.

Whenever a story like this comes along, it's tempting to rant and rave at just how stupid corporate leaders can be. I mean, they thought highly enough of the company to buy it, but not enough to give it any air to breathe. That explanation doesn't really hold water, though; the managers at AOL didn't wander in off the street, and neither did the ones at Yahoo, Microsoft, HP, or any other major corporation. They got where they are because they're hard-working and reasonably astute. So how is it that, time after time, small companies with momentum get sucked up by larger units only to be destroyed from within?

As I was reading the article, I kept thinking back to a post that Seth Godin made some time ago, on the "hierarchy of business-to-business needs." Seth makes the point that, with the exception of the CEO and anyone who holds a lot of stock, the people who work at a company are not immediately focused on growth or profits. Instead, their highest needs are largely personal: the reduction of hassle and the avoidance of risk.

I could be wrong, but this is what I think tends to happen: startup companies are purchased at the instigation (or at least with the approval) of company leaders, who see the startup's potential and want to add it to their own company's bottom line. Once in the door, though, those companies become subject to mid-level managers whose primary drive is avoiding risk and trouble. Growth and risk avoidance are completely irreconcilable goals: you can't grow or disrupt or evolve without creating trouble for somebody, and now you're reporting to someone whose primary intention is to stop that from happening in his neck of the woods. From that perspective, it's not surprising that fast-rising companies sometimes fail when they're bought out. It's surprising that sometimes they succeed.

The way out of this scenario would seem to be the one that Amazon took when they bought Zappos: they left it running as a quasi-independent company, and it seems to be working out OK. If Yammer was able to negotiate a similar relationship with their new insect corporate overlords, things might go similarly well. You do have to wonder, though -- considering that Microsoft bought the company because they were beginning to disrupt the market dominated by SharePoint, how much room for disruption is there going to be now?

Xobni Apocalypse

I just uninstalled Xobni. Allow me a moment to explain why you should care.

I used to think Xobni was cool. Working under a freemium model, it offered an extension to Outlook that dramatically improved the email search experience. It also had some social elements that I never really used, but I thought there was some potential there.

Somewhere along the way Xobni disappeared from my machine, and by the time I noticed and went back to reinstall it, something had changed. Now Xobni was annoying. It was a serious drag on my day, because Xobni actively interfered with the creation of email -- every time I opened a new message window and started to type an address, Xobni would try to upsell me to the Pro edition by showing off how much more effective its address book was than Outlook's. The window dropped down far enough to cover the CC and Subject lines, and, worst of all, if Outlook recognized the name and auto-filled it, the Xobni box would enter the name a second time when I tabbed out of the address field. I had to learn an entirely new action -- click to exit the field, rather than using the tab key -- to fix a problem created by an overzealous marketing effort.

Presumably Xobni's freemium model isn't working out too well, but I don't care. I wasn't inclined to pay for the Pro account before, because I didn't see enough value there. Now that I'm expanding my definition of "four-letter word" to include the occasional five-letter exception like "xobni," they have lost me for good.

Marketing is a delicate effort. On the business side, it's driven by the urgent need to make money, pay the bills, and satisfy the shareholders. The mandate is on the business, though, to put that aside and look at things from the customer's point of view. If your marketing effort makes things worse for your potential customers, even if you manage to convert some of them in the process, you are destroying your brand and sentencing your company to a long and lingering death. This is the road that Real Networks went down, and the outcome is not pretty.

Marketing is not about what you want, it's about what your customers want. Forget that at your peril.

Tuesday, June 19, 2012

Surfacing

Yesterday Microsoft announced their new tablet, the Surface. It comes in two configurations, of course, because if Microsoft can't name a product "Personal Productivity Home Edition RT Preview," they don't feel like they've done a good day's work.

There are some strong ideas here. The keyboard is an interesting concept, and a lot of people are really excited about the design of Windows 8. Still, the announcement fell well short of what it should have been:

  • There's no availability date yet, and the rumors state that -- at the earliest -- the tablets will be available in October. That means that MS announced their product at least four months in advance, and possibly more. Feeling excited now? Well, sit on that feeling for four months, and then let's see how you feel. Apple makes billions by getting you excited within the same moment that you can whip out your credit card and make a purchase. Microsoft still struggles with this, so many years later.
  • The keyboards didn't work at the unveiling event. Those who got their hands on display devices found that the hardware wasn't connected, which means that Microsoft unveiled hardware that doesn't work yet. Everyone who's worked on a project is familiar with the experience of assembling the airplane while it's speeding down the runway, but announcing what you've got before you've got it is foolish at best, disastrous at worst.
  • The full-Windows-capable version comes with a stylus. Not to put too fine a point on it, but styluses suck. If you've got a touchscreen and still feel compelled to include a stylus, you're doing something wrong with your interface.
  • It's complicated. The two versions have different specs, run on different hardware, and will have different software capabilities. It's a little confusing even if you know what you're talking about, and a lot of customers in the consumer space won't be so savvy. The strongest marketing messages are based on a simple, clear message, and Microsoft makes that difficult with its complicated product portfolios.
  • No word on pricing. The enterprise market might not care about price so much, but consumers absolutely do. If the Surface costs more than $499, it will face severe challenges, and statements that the price will be competitive with Intel Ultrabooks are not a good sign: ultrabooks go for $700 and up.
  • It's a conventional design. The interface is innovative, sure, but everything else you get -- a rugged case, a keyboard, a mess of ports -- spells "scaled-down laptop." The iPad is successful in part because it redefined what a mobile device could and should be. By comparison, the Surface looks like a conservative reaction, rather than a bold step forward.
This device could succeed. I think it will be a pretty easy sell in the enterprise market, pitched as a lighter and more mobile solution than the clunky Dell laptop that employees would otherwise be using. If it makes significant inroads into the consumer market, though, I will be surprised. This is a device designed to sustain Microsoft's profits in an existing market, rather than an attempt to disrupt its way into new markets.

Friday, May 25, 2012

The eternal present

John Lilly has a post that has been doing the rounds lately, in part because of this money quote:
I picked up a phrase some time ago that I think applies: “The next big thing is always beneath contempt.” Implication being that it is, of course, until it isn’t. Until it’s too big to ignore. This has happened over and over again in our society. In the middle ages, people assumed that no serious discussion could happen in anything but Latin — the so-called “vulgar” languages had no merit. And writers assumed that nothing interesting or lasting would come from this new medium of television. And, I think, people assume right now that nothing important will be created from a 10” touch screen without a keyboard (let alone a tiny 3.5” screen).
With the benefit of hindsight, we're tempted to point at history and laugh. "How could they ever imagine that an idea expressed in Latin was worth more than one expressed in German, French, English, or Italian?" But they were stuck in what they perceived to be an eternal present. At the time, all sophisticated discourse was composed in Latin, and they assumed that this would continue forever.

This error -- the assumption that present conditions will continue indefinitely -- is a fundamental weakness of human cognition, and you'd do well to become skilled at spotting it when it's in front of you. I once heard a television sportscaster confidently predict that women would soon be running marathons faster than men, because their times were improving so rapidly. That was 30 years ago. It didn't happen, because the current slope of a curve does not define its future slope.

Extrapolating current conditions into the future is the thinking that leads otherwise sensible people to pay $1 billion for an iPhone app with niche appeal -- it is growing in popularity today, and so it will certainly be overwhelmingly popular tomorrow. It's the thinking that left Apple for dead, crowned Microsoft the champion for all time, and says Google has won search and Facebook has won social and nothing will ever change that. It's a style of thinking that's wrong every single time.

Tomorrow will be different from today. The people who change the world are the ones who can escape the eternal present.

Tuesday, May 15, 2012

Number games

Fast Company reports on a new study of Google+'s numbers, which (again) suggests that Google's social product is essentially a ghost town, with low engagement and very weak repeat-traffic numbers.

Google, of course, disputes the point. They consistently report that Google+ is booming, with numbers going through the roof. The numbers they choose to report, however, are suspiciously vague.
The company has been asked repeatedly for monthly active users, and it's repeatedly denied such requests, essentially calling them irrelevant. The closest we've seen of active usership was when the company explained how many Google+ users were engaging with Google Plus-enhanced or -related products. The problem is that Google Plus-enhanced products include YouTube and Google.com, meaning if you are engaging with basically any Google property (there are 120 Google+ integrations thus far) while signed up with Google+, Google is basically counting this as engagement with Google+, which is incredibly misleading. 
Google is, essentially, playing the same game that Amazon plays with Kindle sales numbers. Amazon constantly brags about how great the Kindle is doing, and yet to date has refused to disclose how many, precisely, have been sold. They do so in the knowledge that some lazy reporters will pass on the headline without reflection: "Kindle sales going up, up, up!" But those who are paying attention ask: if the product is really doing well, wouldn't you want to be specific about how well it's doing?


Google is undeniably playing games with its numbers for Google+. Counting engagement with search, Gmail, and YouTube as if it were intentional engagement with a social network is disingenuous, which raises the question: what motivates that behavior? Success tells its own story; if the numbers are really good, you'll share those really good numbers with anyone who will listen. When you brag about your success while playing games with the numbers, it makes you look like a liar.
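
To see why that counting method inflates the numbers, here's a minimal sketch -- the event log and field names are hypothetical, purely for illustration -- contrasting "signed in to Google+ while using any Google property" with "actually used Google+":

```typescript
// Hypothetical activity log; the event shape is illustrative, not Google's actual instrumentation.
interface ActivityEvent {
  userId: string;
  product: string;         // e.g. "search", "gmail", "youtube", "plus"
  signedIntoPlus: boolean; // the user has a Google+ account and is signed in
}

// The generous count: anyone signed in to Google+ while touching any Google property.
function engagedUsersGenerous(events: ActivityEvent[]): number {
  return new Set(events.filter(e => e.signedIntoPlus).map(e => e.userId)).size;
}

// The strict count: only users who actually visited Google+ itself.
function engagedUsersStrict(events: ActivityEvent[]): number {
  return new Set(events.filter(e => e.product === "plus").map(e => e.userId)).size;
}

// A user who only searches and watches YouTube while signed in counts in the first
// number but not the second -- which is how a ghost town can be made to look like a boom town.
```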

Monday, April 23, 2012

Success eludes us

Avinash Kaushik has a typically detailed and clear-headed take on website success metrics: namely, the Key Performance Indicators that you select can either save or destroy your business, because what you choose to measure (and pay attention to) defines you as either a customer-centric or customer-hostile organization.

It's a good point and well worth reading, but it exposes a larger truth that I find equally perplexing: that most organizations have a very hard time defining success in the first place.

Let's say you're running a site and carefully gathering metrics. Let's say that you're also lucky or unlucky enough to be running an internal site, so you can loop in Active Directory and know, not only how many people accessed a page, but what department they work for, which office they're in, and other metrics that allow you to translate clicks and page views into concrete business scenarios. You're swimming in quality metrics, so you must certainly have a good handle on site success, right?

Not necessarily. Let's say there are two employees who sit next to each other: Bob and Sally. Bob thinks that, for any good intranet, the employee is the customer. Bob knows that employees come to the intranet to get their jobs done, so his success metrics focus on site paths to critical resources; the shorter the path, the better the metric, since an employee who's clicking around looking for something is not a happy, productive employee. Sally, though, knows that the #1 customer in her world is the CCO, and the CCO's top priority is that employees understand what a great job the company is doing. Sally's success metrics will relate, first, to the number of articles that were published to the home page (getting the message out) and, second, to the number of page views those articles receive. The more page views, the better the metric, since in Sally's world that translates into an audience actively engaged with her messaging.

Two definitions of success, both defensible, result in two success metrics that could not be more different.
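
To make the contrast concrete, here's a minimal sketch -- the data model and field names are hypothetical, not any particular analytics product -- of how Bob and Sally might compute their metrics from the same set of page-view records:

```typescript
// Hypothetical page-view record; the field names are illustrative, not a real analytics schema.
interface PageView {
  path: string;           // e.g. "/tools/timesheet" or "/news/ceo-update"
  clicksFromHome: number; // how many clicks it took to reach this page from the home page
}

// Bob's metric: average number of clicks needed to reach a critical resource.
// Lower is better -- employees are finding what they need quickly.
function avgClicksToCriticalResources(views: PageView[], criticalPaths: Set<string>): number {
  const hits = views.filter(v => criticalPaths.has(v.path));
  if (hits.length === 0) return NaN;
  return hits.reduce((sum, v) => sum + v.clicksFromHome, 0) / hits.length;
}

// Sally's metric: total views of articles published to the home page.
// Higher is better -- the messaging is reaching its audience.
function totalArticleViews(views: PageView[], articlePaths: Set<string>): number {
  return views.filter(v => articlePaths.has(v.path)).length;
}
```

The same week of traffic can make Bob's dashboard look dismal and Sally's look wonderful, which is why the conversation about which number matters has to happen first.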

Avinash underscores the need to choose performance indicators that orient your business in the proper direction, but in my experience a company will only rarely have a single definition of site success. There are many versions of success, some of which can vary from employee to employee, and that complicates the success scenario tremendously.

One truth is undeniable, though: you cannot pretend to build or run a site successfully unless you pull up your socks and engage in the hard work of defining your success metrics. What is your site home page trying to accomplish? What measurable action counts as a success within that scenario? If you get everyone in the room and lead them through that discussion, the various definitions of success have a chance to bubble up to the surface, where they can be reconciled or otherwise dealt with. Without the success metric conversation, the best you're doing is punching the clock and cashing your paycheck, which is good enough for some people but (I trust) not enough for you.

Friday, April 20, 2012

Tasteless

Marco Arment highlights the two things that most of Apple's competitors will never have in their battle against iPhones and iPads: time and taste.

Their lack of time is simply a question of arithmetic: unlike Apple, they have not invested years in developing, tweaking, and refining their products; instead they rush to market with a me-too device that most customers will recognize as cynical and inferior.

Their lack of taste is the more hopeless scenario, though. A patient CEO can eventually invest enough time to get his company into the same ballpark as Apple. Taste, though, is not something you develop through will or patience. It's a quality that almost everyone thinks they have, but relatively few people actually possess. Worse, recognizing good taste when you see it is far, far easier than creating tasteful objects yourself.

If you work in or on the periphery of a creative field, you've seen this happen: someone in your office sees something cool or well-designed and decides, "We should do that, too!" And so a project spins up that seeks to develop a copy of that cool/well-designed something. In the end, it shares many of the qualities of the original, but it's neither cool nor well-designed, because the people involved in the project believe that they possess good taste and many of them don't. Worse, they designed this product by committee, which allowed those with no taste to drag the product down to their level.

It's a problem with no easy solution. Everyone sincerely believes that they have taste, so you can't ask those with bad taste to remove themselves from the process. Everyone is convinced that they have what it takes to design something special, but of course very few of us actually do. So perhaps one approach is to challenge everyone in your organization to design one thing, using the tools and media that suit them best. Gather their products, and where you find beauty, elegance, or that certain indefinable something, grant its creator a chance to participate in the creation of more important things.

Otherwise, your best chance lies in hiring the next Jonathan Ive -- and then not allowing the committee to shout him down when he's trying to design something beautiful.

Thursday, April 19, 2012

Totalitarian technology

Wired has a downbeat article on what's become of Singapore, which twenty years ago looked like it was on the fast road to technology utopia. I'm old enough to remember the William Gibson article that bemoaned Singapore's combination of totalitarian control and seemingly visionary planning. Sadly -- or happily, depending on your perspective -- not much has come of those visionary plans. Singapore is still a centrally planned, repressive state, and those big dreams never made it much past the planning stages.

This article should be of interest to anyone who thinks China will be the next big thing in the technical world. We know enough about innovation to be pretty sure that two elements are required: lots of money, and a vibrant idea economy in which inspirations can build on top of one another. China has no shortage of money, and with their enormous population they certainly have plenty of people to dream up the next big thing, but the Communist government remains allergic to radical ideas and non-conformist styles of thinking. I've seen China's ultimate success celebrated in some quarters as if it's only a matter of time, but Singapore's counter-example raises very interesting questions. China may, in short, need to choose between control and innovation. If it comes to a choice, my money is on the government opting for power over prosperity.

On a wider front, is it realistically possible for a government -- any government -- to set up a tech hub like Silicon Valley by way of careful design and top-down planning? It's hard to imagine that a bureaucracy wouldn't screw it up in some way, through bad laws, bad plans, or simple inertia. If it were that easy to foster hotbeds of ideas and innovation, we'd have more and better examples of such communities to talk about.

Monday, April 16, 2012

Process trumps ideas

This morning brought the discouraging news that Netflix has never used the winning algorithm in its $1 million contest, and has no plans to do so. Apparently the cost of implementing the solution exceeded the expected benefit, and in any case the company had moved on to other priorities by the time the contest was completed.

I've always been one of those starry-eyed optimists who see the potential of crowdsourcing and X-Prize-style innovation. It is, however, a little too easy to announce a contest and whip up enthusiasm around the idea of boundless innovation. If you don't also follow through on the back end and make sure that your internal processes are lined up to accept and implement the solutions that emerge from the contest, it might all come to nothing in the end.

I know that Netflix is probably more or less happy with their contest. It brought a lot of publicity to the company. Maybe not $1 million worth of publicity, but still -- the contest made Netflix look like a cool, forward-thinking company, and the polish it applied to the brand can only be positive. Even so, that sort of thing only goes so far. I've worked in an office where there was a series of internal innovation efforts in which employees were gathered together and invited to propose solutions to business challenges. At first it was very motivating. It felt like every one of us had the chance to make a difference. Once the ideas were collected, though, there was no mechanism in place to turn them into products; we captured our ideas on Post-Its and wrote them on whiteboards, but in retrospect that's as far as it went. The outcome was toxic: employee cynicism ramped up significantly, to the point where it was difficult to generate any enthusiasm around new efforts.

Process trumps ideas. It doesn't matter how many smart people you have contributing great ideas that solve problems for your company if you don't have processes in place to turn those ideas into action. This is the sort of thankless work that's absolutely essential if your high-profile innovation efforts are not to crash and burn.

Thursday, April 12, 2012

To mobile, or not to mobile?

Net Magazine has an article by Josh Clark, in which he takes Jakob Nielsen to task for suggesting that companies need to provide distinct mobile websites, optimized for the mobile experience.
Nielsen is confusing device context with user intent. All that we can really know about mobile users is that they're on a small screen, and we can't divine user intent from that. Just because I'm on a small screen doesn't mean I'm interested in less content or want to do less.  
Stripping out content from a mobile website is like a book author stripping out chapters from a paperback just because it's smaller. We use our phones for everything now; there's no such thing as "this is mobile content, and this is not."
Of course there's some truth to that. Mobile web usage is skyrocketing, and it's increasingly difficult to say with confidence that you know much of anything about the mobile user. When mobile web browsers sucked, you could say that you were designing a mobile site just to alleviate the suffering of trying to view the regular website on that tiny screen. Now, though, iPad, iPhone, and Android users might be perfectly happy with the same view as desktop users.

However, there's a subtle risk to Clark's point of view, because it gives you a reason not to make a choice. Nielsen's approach is to choose a user scenario and optimize for that; Clark appears to be arguing that you should optimize for everyone, in every scenario. And, if that seems like a difficult task, his advice is to just do it better:
Responsive design, adaptive design, progressive enhancement, and progressive disclosure give us the technical tools we need to create a single website that works well on all sites. We're still learning to use those tools the right way. Just because it's a design challenge to use them correctly doesn't mean we shouldn't strive to do it right.
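For what it's worth, the tools Clark names do exist. Here's a minimal sketch -- browser environment assumed, and the breakpoint and class names are made up -- of the adaptive approach he's describing, in which a single page adjusts its layout to the screen rather than serving a separate mobile site:

```typescript
// A minimal sketch of one page adapting to the viewport instead of shipping a separate mobile site.
// The breakpoint, class name, and selector are purely illustrative.
const MOBILE_BREAKPOINT = "(max-width: 600px)";

function applyLayout(isSmallScreen: boolean): void {
  // Adaptive layout: switch to a compact arrangement on small screens.
  document.body.classList.toggle("compact-layout", isSmallScreen);
  // Progressive disclosure: tuck secondary panels away on small screens,
  // but the content is still there for anyone who asks for it.
  document.querySelectorAll<HTMLElement>(".secondary").forEach(el => {
    el.hidden = isSmallScreen;
  });
}

const mq = window.matchMedia(MOBILE_BREAKPOINT);
applyLayout(mq.matches);
mq.addEventListener("change", e => applyLayout(e.matches));
```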
Clark's is great advice for creative geniuses. For everyone else, it's a recipe for failure. I was once peripherally involved in a website redesign project in which the target audience was carefully mapped and defined and put into a series of boxes that, taken together, amounted to everyone who can read. Even people who almost certainly lack the physical capacity to access the website were included in the target audience. It was a complete planning disaster, because when you design for everyone, you design for no one.

That project eventually went off the rails, and it was no surprise to anyone, because one thing that had clearly been established was that the people in charge of the project were not making decisions. Design is about decisions. It's about cutting the cake into smaller pieces, and discarding what you're not going to use. It can be agonizing; the second-guessing can go on forever. But if you choose instead to make no decisions and build your site for everyone, you are headed down the path of creating a site that will delight no one.

Clark, in short, is both right and wrong. He's completely right that we know nothing about mobile users other than that they are using a mobile device, but he is completely wrong in arguing that it is a mistake to optimize for a mobile scenario. Even if the scenario you choose is only partially correct for your visitors, you still stand a better chance of creating a site that users will want to visit.

Tuesday, April 10, 2012

Words are so 1995

In the midst of an excellent article on Facebook's purchase of Instagram, Paul Ford writes two paragraphs that will strike fear into the heart of any seasoned web editor:
It used to be that web people "published websites" — like the site you’re reading now. But today people who work on the web “manage products.” I'm not sure when that changed, but clearly a memo went around. At one time, in the nineties, everyone was a “webmaster,” then for a while they were “site editors” or “site managers” and now they're “product managers.” A website — even one as simple as Twitter — is no longer a singular thing; it’s a multitude of things from all over the place. 
See what happened? On the web, “product” has gone meta. Companies once made sleds or dreamcatchers or software, but that’s all outsourced; an Internet product is very often a thing that lets other people make things — a kind of metaproduct — and you can get 30 million people working for you, for free, if you do a good job of it.
If you've been paying attention even a little bit, you've seen the same thing happening. When I worked at Microsoft Game Studios I used to joke that the sites we produced were "post word" because -- with great effort -- we had transitioned from boring, ineffective, text-and-image articles to videos and multimedia presentations introduced with a bare minimum of text. That was the market: if you're promoting video games, and you're using a lot of words to do so, you're doing it wrong.

I'm in a different job now, but the experience is the same: articles perform poorly, videos do somewhat better, while photo galleries do the best of all. The visitors to your site are ready to give you perhaps 30 seconds of their time; if your stock in trade is words, you don't have many good options.

The very nature of communications is shifting in response, from words -- in email or face-to-face -- to tools like Twitter, Yammer, and Facebook. The job of communications professionals is now to drive adoption of platforms and engage in real-time dialogue, leaving precious little time for the careful word-crafting that used to be our stock in trade. These can be bewildering times for old school communicators, but if you're comfortable with change it also can be exhilarating. The possibilities are endless, and there's no guarantee that everything will turn out well, but at least it won't be boring.

Monday, April 2, 2012

It's in the increments

Glenn Fleishman has a very smart article up on TidBITS arguing that Apple's competitors keep trying to match it with big-splash innovation -- LTE, bigger screens, 3D -- when what makes Apple different is its commitment to ongoing, incremental innovation within existing products. The consequence of this disparity is that computer and phone makers like Dell and Samsung hope you'll buy every new device when it comes out, while Apple assumes that you'll own your device for three years or more and continues to deliver updates throughout that lifespan. The outcome: customer loyalty.

In part this disparity is a function of different business plans, but I also suspect that companies other than Apple just don't respect incremental innovation. I've seen the same in my work. When leaders speak of innovation, they're generally thinking of big, disruptive, change-the-world thinking. That's the sort of product that Clayton Christensen wrote about in The Innovator's Dilemma, and it's also the sort of sexy change that gets people excited.

Unfortunately, it's a conception of innovation that leads businesses right over the cliff.

There's no doubt that big-splash innovation happens, and when it does the world truly does change. The smart phone, the personal computer, television, radio, the automobile, antibiotics; these had enormous effects. However, truly disruptive products only come around once in a while, and it's very difficult (perhaps impossible) to predict which drawing-board ideas will be disruptive and which will fail to find a market. Betting the company on big-splash innovation is like gambling with money you can't afford to lose: it's bold, it's exciting, you do have a chance to win big, but it's more than likely that you'll lose your shirt.

In the meantime, there's another style of innovation that anyone can pursue, at much less risk. It's not as sexy, and when you win the winnings may not be quite so big, but it has significant potential. That's incremental innovation: the sort of innovation that pursues the improvement of products, services, and processes that already exist.

I believe that innovation is the job of everyone in the company, but that's not to say that everyone should be working on the next big thing. Instead, everyone should look every day for ways to do what they do better: see their work in a new way, spot new possibilities and new ideas, find new ways of measuring outcomes, and apply what they've learned to the next version. In a well-run business, there should be a restless, ceaseless striving to do everything better. My experience, though, is that very few businesses take this approach. If innovation is a business goal, it is likely that one team -- or maybe even one person -- will be charged with being innovative, while everyone else is just supposed to do their job.

But imagine if incremental innovation was part of every worker's job description. What if you used the capacity for incremental innovation to guide your hiring? What if your managers were tasked with maximizing the innovative potential of their direct reports? You might be allowing a little bit of creative chaos to enter your business, it's true, but the bottom-line results could be incredible.

Thursday, March 29, 2012

Creative discomfort

While reading this TUAW article on "how the iPad inspires new content creation," I stumbled on a related, more interesting thought.

(Isn't that always the way of things? The best meeting I've been in this week was boring and useless, and once I realized it was going nowhere and focused my mind instead on the other things on my to-do list, I was able to do some great work brainstorming and project-planning, all without leaving the meeting room. Sometimes the best ideas don't come from what others put in front of you, they're the ideas that are one or two steps to the side, just out of view until you allow your focus to shift.)

The author argues that the iPad inspires creativity because it's a direct interface, rather than one that's mediated through a keyboard and a mouse. There's no doubt some truth to that, but I would argue that the iPad's greatest inspirations come from the combination of two qualities: its simplicity and its novelty.

Simplicity is what everyone notices about the iPad: it does away with a lot of the interface conventions that we're all used to when working with computers. Conventions are comfortable to experts, but sometimes they make very little sense in absolute terms. The world is typing on QWERTY keyboards because we're all used to it and trained on that interface, not because it's better. The iPad's simplicity wipes away all of that and allows interactions that are not predefined by convention.

That's a great thing, but what's more significant is that the iPad forces the user out of the comfort zone. If you present me with a conventional computer, complete with keyboard and mouse, I think I already know how I can and will use it. My preconceptions of the device constrain my use of it. The iPad, though, is a little uncomfortable. It forces you to use it in ways that initially can seem awkward. In the process, new possibilities present themselves.

In retrospect, iPad keyboards and styluses are a terrible idea. They're extensions of ideas that we're already comfortable with. The power of devices like the iPad lies in the ways that they can make us uncomfortable, and in so doing expand the possible.

Friday, March 23, 2012

Follow the leader

Every now and then, publications like ESPN publish a self-indulgent piece of pseudo-journalism called "Power Rankings," which purports to rank teams from best to worst. Today I saw that ESPN had issued Power Rankings for the NFL, which is even more pointless and self-indulgent than normal because the NFL is currently in its offseason: free agency is ongoing, the draft is still a month away, and so these sportswriters are ranking teams that a) are not playing, and b) have incomplete rosters. But no matter -- there are bored sports fans like me who click through to this sort of thing, and so I did.

The best team in the NFL, according to this chart, is the NY Giants. In fact, four of the five reporters voted them first. That might seem only fair, since the Giants just won the Super Bowl, if it weren't for the fact that the Giants were generally considered the fourth or fifth best team in the NFC, looked for a while like they might miss the playoffs entirely, and then went on a hot streak at the end. The Giants were a good football team, no doubt about it, but they were also very lucky.

By any objective standard the Giants were quite obviously not the best team in the league last year, and they are unlikely to be the best team next year, so how do they end up at the top of power rankings? Because they won the big game. This is something we see in tech journalism as well: whoever wins is the best, by definition. Steve Jobs was the worst, so bad that he was thrown out of his own company, and then he was the best. Sergey Brin was the best, until Mark Zuckerberg became the best. According to the pundits, there is no such thing as luck in Silicon Valley, only talent, skill, and dedication. The cream rises to the top, and shows that it's the cream by virtue of rising.

It's a satisfying story, more so than "they were in the right place at the right time." That narrative is far more inspiring to would-be entrepreneurs than the alternative, that they might have a great idea and solid skills and work really, really hard and still fail in the marketplace. Unfortunately, it's a fairy tale. The universe doesn't hand out ribbons to the best, it hands out ribbons to the ones who win. And sometimes they win by luck.

Wednesday, March 21, 2012

You only get one chance to be honest

What do Mike Daisey and Komen have in common? They both blew their one chance to be honest. Komen made a politically motivated move and then claimed it had nothing to do with politics. Daisey presented eyewitness testimony to things that he didn't personally witness.

There's no doubt that both Daisey and Komen executives feel that these slips have been overblown, that the baby is being thrown out with the bath water. But here's the thing: in both cases, people assumed that they were telling the truth. The general public gave Daisey and Komen the benefit of the doubt. In a world where it is impossible to verify everything that you hear, this is essential. You are required to extend your trust to certain people, because the only alternative is near-universal skepticism.

This is a benefit we extend to anyone like Daisey and Komen who appear on the surface to be well-intentioned, but it comes with a cost: when you opt for "truthiness" rather than truth, the backlash can be severe. When you lie to us, or even when you engage in half-truths and exaggerations, you make us question what else we should be skeptical about. When you show that you have a casual relationship with the truth, we learn that we can't necessarily trust you. We're not going to fact-check everything you say, and you've already proven that we can't take you at your word, so the only viable option remaining to us is skepticism. In the process some babies will end up out with the bath water, but that's not our fault, it's yours.

In corporate communications and in life, you only have one chance to be honest. Don't blow it.

Wednesday, March 14, 2012

Stop the presses

The New York Times reports on the death of the print edition at Encyclopedia Britannica, which brings back memories -- I worked at Britannica for a few years in the late nineties.

The article states that peak sales for the print encyclopedia came in 1990; I started work there seven years later, just as I was finishing my Ph.D. in religious studies. By then I had realized what a tough job market academia was becoming. Getting a job was very difficult; getting a job that would not suck all the joy from your life was nearly impossible. Out of the blue I got a call from Britannica saying they needed a religion editor. It was kismet; it was fate. I signed on the dotted line and took my first full-time job.

My time at Britannica came in three phases. The first and third phases were with Britannica.com -- the digital arm of the company -- and were great. I liked the people, I liked the product, and I really enjoyed my time there. The second period was downstairs with the print people, and that's where I witnessed first-hand the death throes of a proud institution.

By 1997 the world had begun to shift to digital in a big way, and Britannica was reluctantly following suit. Microsoft was bundling CD editions of Encarta with every PC sold, and Britannica knew that it had to come up with a digital version of the encyclopedia if it was going to compete. Unfortunately, the first efforts were half-hearted at best: the first edition of Britannica on CD was text-only (not a single image) and cost $1,500. They were afraid that the CD would cannibalize sales of the print edition, so they priced it in such a way as to severely punish anyone who thought to buy it. Needless to say, that was a disaster, and by the time I came on board the company was trying again.

The print staff was not taking this well. Many of the editors and managers in the print department had been with the company for decades, and they had always assumed that they'd be there for life, tweaking articles and corresponding with authors and puttering about in the intellectual garden they had so painstakingly constructed. By 1997, though, things were changing far more rapidly than they liked, and they were afraid for their jobs. As it turns out, they had reason to be, but in the meantime that department was the angriest, most toxic environment I've ever worked in. After 18 months of passive-aggressiveness and back-stabbing I escaped back to the digital section of the company, and it was with a profound sense of relief. I was finally back among my own kind: people who like their work and treat their colleagues with respect. I felt liberated.

In retrospect, Britannica's struggles with culture and process were severe to the point of absurdity. These were the days when Wikipedia was just getting off the ground, but the potential of crowd-sourcing was so far off the radar that we never seriously discussed it. Instead, we were busy with what should have been a much smaller task: fighting with the print editors to get them to update the web version of the encyclopedia at a rate faster than the five-year turnaround that they were used to in print. When a celebrity dies or a candidate gets elected, Wikipedia is updated that same day, often within the hour; Britannica always wanted to take its time to consider its words, run drafts past area experts, review their changes, possibly direct the copy to another expert for review, and then consider its words one final time before putting anything new in front of the reader. The world doesn't run at that pace anymore, and though there were plenty of smart and insightful people at Britannica, in the end there weren't enough to change the company into something it wasn't.

By the summer of 2000 our dot-com business plan was in a shambles and the writing was on the wall, and I made the decision to move to Seattle with my then-girlfriend, now-wife. Three months later most of my friends and former colleagues working on Britannica.com were laid off. At that point the layoffs in the print division had already begun, so in a way this was only fair; the company was sinking and everyone was going down with it. The rats were not the only ones to get their feet wet.

Now, at last, the print edition has been deep-sixed, and for the most part I'm surprised that it managed to hold on for so long. The company is under new ownership and management, and company president Jorge Cauz has this to say about their current efforts:
The Web site is continuously updated, it’s much more expansive and it has multimedia.
That's good to hear, but it would have been better if I'd heard it in 1997, when it might have made all the difference in the world.

When 20% becomes half-assed

A former development director at Google decided to return to Microsoft, and posted his reasons in a blog post that has received a ton of attention. For me, the most meaningful section comes towards the close, when he talks about the consequences of Larry Page's decision that Google's primary mission was to catch up with Facebook in social:
Suddenly, 20% meant half-assed. Google Labs was shut down. App Engine fees were raised. APIs that had been free for years were deprecated or provided for a fee. As the trappings of entrepreneurship were dismantled, derisive talk of the “old Google” and its feeble attempts at competing with Facebook surfaced to justify a “new Google” that promised “more wood behind fewer arrows.”
This is very discouraging news for anyone who's committed to promoting innovation within their workplace; for years, Google's 20% time has been the gold standard for allowing employees to innovate officially (rather than when no one is watching), and the argument has been that if a company as successful as Google provides room in which employees can tinker and experiment, then we should do that too, right? If Google ends up scrapping the 20% time, that could have repercussions throughout the business landscape, as bean-counting managers use it as justification for shutting down the more speculative endeavors of their own employees.

Unfortunately it's a fact of life that innovation and unconventional thinking have a very hard time surviving the bean-counting. The real test of a company's commitment to innovation comes when times are tough and competitors appear to be winning: will you stay the course and continue to invest in your employees, or will 20% time and bottom-up thinking be like free massages and Bring Your Dog To Work Day, eliminated in the interest of pursuing greater discipline?

In Google's case, I wonder whether Larry Page is still smarting from Eric Schmidt's old crack that Schmidt was brought on as CEO to provide "adult supervision." Is Page pursuing focus and discipline at Google to prove that he's all grown up now? If so, it may come at the expense of his company.

Thursday, March 8, 2012

Only one shoe has dropped

TechCrunch has a smart post on how the new iPad announcement was only half the story; the other half will likely come during WWDC in June, when iOS 6 is unveiled.

There's a mistake that tech journalists continue to make with respect to Apple products: they compare them to their competitors on the basis of features. Whether it's screen resolution, the number of processors, the speed of those processors, how much RAM the device has, or whatever it is, they operate on the assumption that Apple's customers are looking for the product that has the best features. And so they triumphantly announce that competitors have equivalent or superior features, and predict that this foretells Apple's imminent downfall. It never happens that way, but the journalists don't learn the lesson.

The lesson is simple: Apple's formula is to combine hardware with software. Neither exists in isolation from the other; they are a seamless unit, and that seamless experience is the whole point of Apple's famous walled garden. Steve Jobs' philosophy was to achieve differentiation in the marketplace by doing the best job in the world of unifying the two, and under his guidance that's been Apple's approach to the market for most of its history.

This is the way forward for the industry as a whole. Obsessive focus on technical features is a characteristic of an industry in its infancy, when most customers are early adopters and hobbyists. As an industry matures, its products become relevant and attractive to a broader segment of users, and the demands of the market shift. There will always be an enthusiast tech market that obsesses over features, but increasingly the average tech consumer doesn't care about any of that; they just want their stuff to work and to empower them to do things they couldn't do before. Apple is an industry behemoth today because they understood that fact far earlier than their competitors. Eventually other companies will learn the same lesson; the ones that don't will cease to exist.

Hardware matters, but hardware + software matters far more.

Friday, March 2, 2012

Precisely!

The New York Post is speculating that Apple is working with content providers to present television channels as apps.

What an intriguing idea.

Inscrutable

Wired has an interesting story on how dolphins say "hello." Apparently there's a "signature whistle" they use in social situations, when meeting up with dolphins from other pods. There also appear to be rules to the interaction, most of which are poorly understood, if they're understood at all.

I've been a science fiction fan for most of my life, so between books, movies, and television I've probably read or watched hundreds of scenes in which humans come into contact with aliens for the first time. Usually the process is completely seamless; as in "Close Encounters," there may be a bit of coordination at first, but soon the universal translator is working properly and communication proceeds apace.

This dolphin study reveals how hopelessly naive and optimistic that assumption is. We've lived alongside dolphins for tens of thousands of years, and we've been studying them pretty intensively for decades now. We've also had the remarkable opportunity to study entire generations of dolphins who lived in captivity. On top of that, dolphins are mammals, and so they share a certain degree of genetic heritage with human beings. With that wealth of information, insight, and reflection as backdrop, we've now been able to reach the conclusion that dolphins maybe, possibly say "hello." Otherwise, we know basically nothing.

If it's that hard to understand dolphins, with whom we have so much in common, how are we ever going to understand the attempts at communication by a species that we've just encountered, that evolved in completely different circumstances, and with which we have absolutely nothing in common? It's no wonder that science fiction makes this process so much easier -- those books would have been pretty boring if they had all featured long chapters of mutual incomprehension.

Tuesday, February 28, 2012

Ghost town

The Wall Street Journal reports that Google+ is essentially dying, with users spending on average three minutes a month there, while they're spending seven hours per month on Facebook. If those numbers are even close to accurate, they're brutal.

The WSJ spins this as Google's failure to differentiate Google+ from Facebook, but it's a mistake to think this is about features. Google+ has failed to date because of network effects. I've signed up for Google+, but I never go there, because no one I know goes there, either. My friends and family are on Facebook, and so that's where I go, too. It's quite simple. If you want to beat Facebook, steal its members. Features are largely irrelevant; what you need is people.
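
To put the network-effects point in back-of-the-envelope terms: the value of a social network to any one person scales with how many of that person's own contacts are already there. A minimal sketch (the friend count and adoption shares below are invented for illustration; only the minutes-per-month figures come from the WSJ report):

    # Back-of-the-envelope: a network's value to one user scales with how many
    # of their own contacts are already there, not with the feature list.

    def value_to_me(contacts: int, share_already_there: float,
                    value_per_contact: float = 1.0) -> float:
        return contacts * share_already_there * value_per_contact

    contacts = 150  # a Dunbar-ish circle of people I might actually want to reach

    print(value_to_me(contacts, 0.80))  # most of my circle is there: roughly 120
    print(value_to_me(contacts, 0.02))  # almost none of my circle is there: roughly 3

Features barely move that number; the share of your circle already on the service moves it enormously.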

Sooner or later someone will disrupt Facebook -- it's inevitable -- and that company will do it by being cooler than Facebook. The initial community will be high school and college students, who will like it in part because their mothers and grandmothers aren't already there; only later will the numbers grow so great as to attract the less cool elements. It's like real estate: at first the artists move into a neighborhood that no one else likes, and it works for them because it's cheap and filled with other artists. Then the yuppies and the wannabes follow the artists into the neighborhood, and the whole thing gets so gentrified that the artists move out and look for the next cool spot.

My guess is that Google made a fundamental mistake when rolling out Google+, in that its initial community (and, by consequence, its initial appeal) was among the tech-savvy crowd. Techies only appear cool to other techies.

Wednesday, February 22, 2012

The enemy of my enemy, take two

This week there's been a back-and-forth over screenshots that may or may not show Microsoft Office running on an iPad. Microsoft has denied it, but in carefully couched words that to some eyes read more like a non-denial denial.

MG Siegler thinks that Microsoft is putting Office on the iPad as the ultimate screw-you to Google: the iPad is already far more popular than any Android tablet out there (even the quasi-Android tablet known as the Kindle Fire), and with Office it would also be far more useful for business users. Android in the tablet space would be left with very few selling points -- and remember, this is a platform that's already struggling to find a strong connection to customers. Android on a smartphone benefits from the sales muscle of carriers and device manufacturers; Android on a tablet has to sell on its own merits, and so far it's mostly failing that test. If Office is exclusive to the iPad (and Win 8 tablets, when they come out), that might just about do it for Android in the tablet market.

FastCompany has a piece on why this move, as satisfying as it might be, would put Microsoft in a very difficult situation. Briefly stated, Microsoft has three primary options, none of which is appealing:

  • They could bring Office to the iPad and price it at levels competitive with comparable apps on the platform. Apple's own productivity suite costs about $10/app, so the Big Three of Office (Word, Excel, and PowerPoint) could logically be priced at a collective $30. This poses a serious problem, though: Microsoft Office for the desktop will run you $149.99 retail (or $123.49 if you buy it through Amazon).
  • Alternatively, they could protect the price structure of Office in one of two ways: either by offering a full-featured tablet version at $50/app (absurd on its face) or by presenting the tablet version as a stripped-down, "Lite" version of Office optimized for touch input but missing some core features. That, though, causes them problems down the line when Windows 8 comes out on tablets. If the tablet version of Office that comes pre-installed is not a true version of Office, why would you buy that tablet? 
  • The third option is no option at all: argue that Office requires a keyboard and mouse, and limit any tablet app to the sort of "view and annotate" versions we've seen on handheld devices before. This would almost certainly have the effect of convincing more and more consumers that they don't really need Office, as they try out tablet-based Office alternatives and find that they can actually get their work done that way, too.
This is a real problem for the company. Microsoft cannot hope to sit out the tablet revolution and still prosper going forward, but by the same token they can't sharply cut prices on the tablet version of the software with which they've printed money for decades now and hope that customers won't expect price cuts on the desktop side as well. A lower-priced desktop version of Office might sell like hotcakes, but it's very possible that even so Microsoft's bottom line would suffer.
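
To put rough numbers on that worry, here's the arithmetic on the first option, using only the prices quoted above (margins, volume licensing, and enterprise agreements are ignored in this sketch):

    # How much volume would a $30 Office need just to keep revenue flat?
    desktop_price = 149.99   # retail price quoted above
    tablet_price = 30.00     # three apps at roughly Apple's $10-per-app level

    breakeven_multiplier = desktop_price / tablet_price
    print(f"Unit sales would need to grow {breakeven_multiplier:.1f}x")  # ~5.0x

Selling five copies at the new price for every one they sell today, just to stand still, is a tall order -- which is why none of the three options looks appealing.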

In the end, this dilemma could prove to be Ballmer's true legacy. He's managed the company through a long, profitable period, but he also mocked the iPhone and iPad when they came out. Clear vision would have recognized strong competition for what it was; strong leadership would have had the company building for new markets already, rather than focusing on wringing every last cent out of the markets it already occupied. If Ballmer can find a way out of the mess he's at least partly created, I'll owe him an apology; in the meantime, this will be an interesting drama to watch unfold.

Monday, February 20, 2012

Who is your customer?

Amid reports of Google's latest ethical lapse, I found myself reflecting on an adjacent point: Google would never have thought to engineer ways to bypass users' browser security settings if those users were still their customers.

Way back when, in the days when Google was young, you and I were their customers: we were the people using their cool, new search engine and (I'm sure) they delighted in delighting us with the power of their tools. Subsequently, though, Google stumbled upon the fact that you can make a huge amount of money connecting advertising with search results, and that occasioned a shift of outlook and intent. No longer were Google search users the customers; now they were the product, and advertisers were the new customers.

Any business that wants to be around for a while seeks to delight its customers. But the product? That's just the product, to be packaged and marketed in the most effective manner. Google has had more than its share of ethical stumbles this year, but in the end what some people are angry about is that Google has stopped treating ordinary people like you and me as their customers. That has been the case for years, but it's only now that we're seeing the full implications of that switch.

It's a critical question: who are your customers? I've found that this can vary widely even within an organization. Since I was hired to manage an intranet, I've always taken it for granted that my customers are the company's employees. Five feet away from me, however, sits a woman whose primary client is the company CEO. Next to her sits a woman whose clients are certain divisions within the organization, and beside her sits a woman whose customer is the department head. We have many different customers whose interests do not always perfectly align, and yet this is a fact that we never seem to speak about.

Do you know who your customers are? Do the people you work with know that, too? If not, ask yourself what might happen if they stumble on that information themselves.

Friday, February 17, 2012

Tell me what I think I already know

There's a very useful concept in economics and behavioral research that you're hopefully already aware of: confirmation bias. In a nutshell, confirmation bias is your tendency (and mine, I'm not pointing fingers here) to selectively accept information that confirms opinions that you already hold. Confirmation bias is why global warming seems so obvious to Democrats and so obviously wrong to Republicans. When new evidence comes out confirming global warming trends, those who already accept that global warming is a fact will nod in agreement, while those who think it is a fraud will think that this is just the latest deception. Confirmation bias is one of the primary reasons why we're not as rational and objective as we think we are, because we have a strong tendency to listen only to what we want to hear.

Enter this morning's big story: Google (and others), the WSJ reports, has been using a hack to trick mobile Safari into circumventing the iPhone's default privacy settings, so that they can track user info in ways that the user has not authorized. When the WSJ contacted Google about this practice, Google abruptly stopped the tracking, and it's no wonder. What Google was doing was (probably) not illegal, but it was certainly sneaky, underhanded, and borderline unethical.

Except if you're John Battelle, in which case it's all Apple's fault.

The Apple-Google rivalry is one of the most polarizing issues in technology today. Pretty much everyone who's paying attention has already picked a side. In short, this is a situation ripe for confirmation bias, and you couldn't ask for a better example than Battelle's take. It's all Apple's fault, he says: the privacy settings on the iPhone are so strict that they break web standards; it's arrogant of Apple to assume that users want that level of privacy (Battelle even argues, somehow, that assuming you want your information kept private is the opposite of privacy); and Google is the victim, because it's been forced to trick users into handing over information to which Google -- and others -- should be entitled.

This article, in a nutshell, is why tech journalism is in the toilet today: everyone has picked sides in the world's largest pissing match, and they're all preaching the gospel from their various pulpits. We readers selectively listen by subscribing to certain feeds, and the confirmation bias just gets deeper and deeper, until we arrive at the point of absurdity where someone will argue in a public forum that strict privacy settings are a violation of privacy.

Reporters were never as objective as they claimed to be, but if this is where journalism is headed, we're all screwed.

Monday, February 13, 2012

Meet the strawman

The Globe and Mail has an interview with Roger Martin, a Canadian business professor who is on RIM's board of directors. Martin angrily dismisses criticisms of the company and the way it's been handled, and offers this rebuttal to the idea that RIM should seek to be more like Apple:
“They ask ‘Why can’t you be more like Apple?’ So we should go bankrupt and fire our founders and bring in a moron? That’s what we should do?” Mr. Martin says.
Perhaps from his business school days, Martin appears to be familiar with the strawman argument: the style of argument in which you pretend that your opponents are making a ludicrous point, and then debunk that point instead of the one they actually made.

In this example, those who point to Apple as a model for RIM are probably suggesting that RIM should have paid more attention to the consumer market, and that it should not have released a tablet so unpolished that it lacked a native email client. In the smartphone space, Apple has developed a clear and focused strategy -- build the best, most user-friendly smartphone in the world -- and pursued it aggressively, while RIM has alternately looked arrogant, complacent, and half-assed. There are Apple characteristics that RIM would do well to emulate.

Martin, however, doesn't respond to that argument. Instead he pretends that critics are arguing the absurd point that RIM should emulate the worst period in Apple's history, rather than the best. The fact that Martin offers a strawman argument in a public forum speaks poorly of him; the fact that the Globe and Mail reporter apparently didn't challenge him on the point speaks poorly of the publication, as well.

Strawman arguments accomplish nothing; they don't move the conversation forward, they don't demonstrate the strength of your position, and only morons would think you can win an argument that way. This is one more example of what has gone wrong at RIM: the company responds to outside challenges with scorn and contempt while the market passes it by.

Friday, February 10, 2012

Arbeit macht frei

There's a nice post on the 37 Signals blog on the distressing -- and damaging -- perception that you must work punishing hours if you work for a startup. I haven't worked for a startup in several years, but you can't follow tech culture without seeing this ideology on a daily basis.

I've never been a long-hours guy, though I work today in an org where the always-in-the-office people are celebrated and people look at me a little funny when I mention that I usually go home before 5:30. Personally, I've never seen much of a benefit to working impossible hours. In college, when I tried to pull all-nighters, I realized that I was essentially useless after midnight; I was so tired, and my mind became so clumsy and slow, that I would have been better off going to bed and hitting the books fresh the next morning. Since then I've never had a job that I couldn't excel in within the confines of 40 hours per week.

Could I have done a little better if I had forgotten about my wife and my life and lived in the office? Maybe I could have found another 10% or so at the bottom of the barrel, but the cost would have been very high, and sooner or later I would have burned out and quit. On the other side of the coin, the people I've known who work evenings and weekends also call in sick a lot more often than I do, they disappear for long lunches, and some of them take cigarette breaks that last for hours. When I'm here, I'm here.

Overtime is overrated. It's too easy to show your commitment by staying late or coming in on Saturdays. You don't need to be talented to work long hours; you just need to be willing. The more significant achievement is to do excellent work during business hours, go home at the end of the day, and come back the next day ready to go.

Thursday, February 9, 2012

Silence is golden

As I type this, the internet is abuzz (well, moderately so) over two announcements that Apple may or may not soon be making: the iPad 3 (a pretty good bet for March) and the long-rumored Apple television (which might never be more than vaporware).

It's often been remarked how Apple's strategy differs from Microsoft's in this area. Microsoft announces technical roadmaps well in advance; their commitment to enterprise IT departments requires them (or so they believe) to announce their intentions far enough in advance for IT admins to run security tests and develop their upgrade plans. As a result, Microsoft frequently talks about the features of a product before it is fully developed, and the product that ultimately ships is often missing one or more core features included in the original vision.

Apple, on the other hand, don't say nothin' to nobody. Depending on the source, this has been interpreted either as symptomatic of Steve Jobs' maniacal paranoia and control-freakishness or as part of a masterful advertising strategy. I fall into the latter camp: here we are, weeks ahead of an Apple event that hasn't even been announced yet, and we're already seeing daily articles in the technical press about what Apple might or might not reveal. Apple's ads are masterful, but the company receives untold benefit from the free advertising handed to it by reporters and bloggers who can't help but speculate about what might be coming next.

All well and good, but I believe there is more to it than that. Researchers have long known that humans experience gains and losses differently. You will be pleased if I come to your house and give you $10 for no reason, but you will be much more displeased if, instead, I steal $10 from your wallet. What's more, if I give you $10 and then suddenly take $5 back, you may well experience the loss of that $5 more acutely than you experienced the $10 gain. That's why investors hold onto stocks that are in the toilet and why it's so hard to accept that the used car you bought is a lemon: dumping the stock, or giving up on the car, would require you to accept a loss, and we humans really, really hate to experience a loss.

So with that in mind, consider Microsoft's strategy. They come out at a tech event and announce an amazing new product with 10 exciting features. One year later the product that ships contains five of those features, and two of them don't work nearly as well as you hoped. Are you going to be happy about the three good features that you got? Maybe, but not nearly enough to compensate for the seven features that you "lost."

Meanwhile, Apple says nothing until they have a polished product, where all three features work like a dream. There's no loss, because they never promised us anything in the first place; all we experience is gain. If you ever find yourself wondering why Apple products are so well-received by the marketplace, consider human psychology, and how Apple has managed to erase loss from the equation.
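
One toy way to see the asymmetry is to run the feature counts from the example above through the standard prospect-theory value function. The parameters below (gains and losses raised to the 0.88 power, with losses weighted 2.25 times as heavily) are the textbook Tversky-Kahneman estimates; treating "features" as a currency is obviously a cartoon, not a claim about how Apple or Microsoft model their customers:

    # Prospect theory in one function: gains are discounted, losses sting harder.
    def felt_value(x: float, alpha: float = 0.88, loss_weight: float = 2.25) -> float:
        return x ** alpha if x >= 0 else -loss_weight * (-x) ** alpha

    # Microsoft: promised 10 features, delivered 3 that actually delight.
    # Measured against the promise, the customer feels a loss of 7.
    print(felt_value(-7))   # about -12.5

    # Apple: promised nothing, delivered 3 features that work.
    # Measured against a reference point of zero, it's a pure gain of 3.
    print(felt_value(3))    # about +2.6

On those cartoonish numbers, the broken promise stings several times more than the unannounced delight pleases.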

Friday, February 3, 2012

If in doubt, start with honesty

One of the biggest stories this week has been the absolute train wreck that ensued when the Komen Foundation announced that it would cease funding of Planned Parenthood, which uses Komen funds to offer breast exams to lower-income women.

This is not a political blog, and I won't weigh in on either side of the abortion debate. Instead we're going to talk about messaging, particularly how Komen could not have made a worse mess of their communications if they had tried. Komen is a private foundation, and they can fund whomever they want. But when they drafted a new policy with the specific intent of targeting Planned Parenthood, and then went on record saying that they were not bowing to pressure from anti-abortion groups but merely responding to a Congressional investigation ... that was instigated by anti-abortion groups, their cause was lost. When their clumsy, disingenuous explanation of the change resulted in public outcry, Komen crucially had no response.

Communications are different today. Controlling the message involves active engagement across a variety of media from day one, and doing so requires that you have your messaging down and tightly written, and that you are prepared to respond to skepticism and criticism. Komen did none of these things, but those were secondary mistakes. Their first and fatal mistake was to build the initial message on a lie: that they were not doing what they were quite obviously doing. After that first, transparent falsehood, everything else that came out of their offices was viewed with suspicion -- and rightly so.

If in doubt, start with honesty. Respect your audience well enough to level with them. If Komen had come out and said, "Planned Parenthood is a valued partner, and we respect the work that they do, but unfortunately the time has come when our support of their efforts is costing us too much and distracting us from our other operations," things might have been very different. There still would have been some damage control, but it would have been a lot less than what we've seen develop.

The Komen brand was severely damaged this week, and it all started with the failure to respect the audience.

The last 16%

The official Yammer blog has an interesting post asking "how much adoption is enough" for enterprise social networks. The nifty charts define the last 16% of the workforce as "laggards"/"naysayers" who might never adopt a social network, and so pouring resources into getting them into the network is likely a futile task that drags down the ROI for your strategy as a whole.
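
The post doesn't say where that 16% comes from, but it matches the classic Rogers diffusion-of-innovations breakdown, which simply slices a bell curve of adoption timing at one and two standard deviations from the mean. A quick sketch, assuming that's the model behind the charts:

    # Slice a normal distribution of adoption times into Rogers' five segments.
    from math import erf, sqrt

    def normal_cdf(x: float) -> float:
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    cuts = [-2.0, -1.0, 0.0, 1.0]   # boundaries in standard deviations from the mean
    labels = ["innovators", "early adopters", "early majority",
              "late majority", "laggards"]

    edges = [0.0] + [normal_cdf(c) for c in cuts] + [1.0]
    for label, lo, hi in zip(labels, edges, edges[1:]):
        print(f"{label:>15}: {100 * (hi - lo):.1f}%")

    # innovators 2.3%, early adopters 13.6%, early majority 34.1%,
    # late majority 34.1%, laggards 15.9% -- the "last 16%" in the charts.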

The most valuable insight comes about halfway down the page: "Keep focusing on the people who drive value out of the solution and the others will eventually catch on." Indeed, it is crucial to focus on value when trying to convert users. They won't sign up -- and come to habitually use -- something like Yammer because of the features or potential that the tool has. Many of them will sign up if an executive urges them to do so, but they won't become steady users of the service for that reason alone. If they see value in using Yammer -- value that exceeds the potential value of the time spent doing so -- they'll use it. If they don't, they won't.

That being said, be careful not to focus too tightly on one particular form of value. Value varies across users. Innovators find value in the latest thing simply because it's new and cool. Highly social users find value in social media because it allows them to socialize without leaving their desks. Utility-focused users find value in Yammer because it's a quick and effective way of asking questions and getting answers. Others will find value in the ability to keep better informed than they would be otherwise, or to tune into certain people and topics (and tune out others).

Others will see no value at all, and they may be right. Even if they're wrong, it's too much trouble trying to convince them to change their mind. Better to focus on supporting and enhancing the value that regular users do find in the service; their success is ultimately the best argument that you can make.

Thursday, February 2, 2012

Think outside yourself

AAPL Orchard makes the point that "not everyone copies Apple" by highlighting the keywords in two recent statements by the CEOs of Apple and Sony: Tim Cook's statement featured the words "best," "world," "delight," and "proud," while Sony's emphasized "growth," "businesses," "accelerate," and "domains."

In part this is a communications problem: Cook has the advantage of working for a company that -- largely due to the influence of Steve Jobs -- has its messaging down cold. Whatever Apple employees may say internally, when they address the public they know the script: "We're about making the best products in the world and delighting our customers." Kazuo Hirai doesn't have that advantage at Sony.

Partly, though, this is a challenging problem for any business: how to get outside your own head far enough to see beyond your own wants and needs and understand things from the viewpoint of your customers. The need is clear: your customers don't care what you want. They don't care about what your shareholders require, they don't care about your profit margins, and they don't care about your market share, unless that share gets so low that it starts to affect their experience of your product. The businesses that succeed magnificently, the ones that inspire brand loyalty, are the ones that understand what their customers (current and future) want and need, and put those wants and needs at the top of their priority list.

Apple gets it. Sony does not get it, and never has -- a company that puts its customers first would never force as many proprietary, second-rate technologies on those customers as Sony has. Kaz Hirai might prove to be a brilliant CEO, and he's certainly done well with the PlayStation, but it's a troubling sign that one of his very first public statements as CEO was to define what Sony wants and what Sony needs, as if anyone other than the company's most ardent fanboys cared.

Monday, January 30, 2012

Things that money can't buy: (1) love; and (2) innovation

Horace Dediu posts a typically informative, data-heavy blog on the theme: "You cannot buy innovation."

The new data from Apple is interesting, but it's a well-established fact: companies that spend a ton on R&D often see little to no return on that investment, in terms of world-changing products. (Though, that being said, it may be that the true ROI on R&D funding comes not from products, but instead from patent licensing fees.)

Dediu defines innovation as "disruptive growth," and disruption is exactly the sort of thing that resists R&D budgets. Research and development is top-down; it's the deliberate, methodical investigation of possibilities that have already been identified and embraced by company management. Disruption, on the other hand, tends to come via products that almost never gain the necessary support within large, established corporations; disruptive products are the ones that management tends to kill because they're too risky, they're not aligned with current customer needs, and the upside is not well understood.

Even when we adopt a wider definition of "innovation," though, the problem remains: it's not an easy thing to plan or manage. There's more to innovation than a good idea -- there's also a lot of hard, painstaking labor involved -- but the good idea is where the ball starts rolling, and ideas can and do come from any part of the organization, top or bottom. Ideas that start at the top have an excellent chance of making it through to execution, but ideas that start at the bottom face a much tougher road. Someone needs to believe in the idea strongly enough to take it to his manager, that manager needs to believe in it enough to take it to her manager, and so on, until it reaches a high enough point in the food chain that the necessary resources can be put together. Every step along the way the idea can be killed -- because it's poorly understood, because it's competing for resources with another idea, because someone fears failure, because someone doesn't respect the person who supports the idea -- and so it's only the occasional idea that survives the gauntlet and makes it through. Even then there are likely to be so many design reviews and approval loops that what's built barely resembles the original inspiration. How likely is it, then, that the company follows through on its very best ideas?
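
A crude way to see how quickly that gauntlet compounds: assume each layer of management independently green-lights a bottom-up idea with some fixed probability. The 70% figure below is invented purely for illustration:

    # If each layer of management independently approves a bottom-up idea with
    # some probability, the odds of surviving the whole climb decay fast.
    def survival_odds(per_layer_approval: float, layers: int) -> float:
        return per_layer_approval ** layers

    for layers in (1, 3, 5, 8):
        print(f"{layers} layers: {100 * survival_odds(0.70, layers):.0f}% survive")

    # 1 layer: 70%, 3 layers: 34%, 5 layers: 17%, 8 layers: 6%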

Hierarchy varies inversely with innovation; the more layers to an organization, the less likely new ideas are to flourish. This, in fact, is the primary reason that small companies and startups tend to be more innovative: when you're only five people working out of a single room, there's more opportunity for ideas to be heard, discussed, weighed, and approved or discarded. Occasionally, large organizations try to boost their innovation by pushing decision-making powers farther down the org chart, or by creating small, independent units within the larger organization, but these efforts tend to be short-lived because, sooner or later, hierarchy will reassert itself.

So what is there to do? Some large organizations manage to innovate despite themselves. At Apple, the special sauce appears to be a fanatical devotion to quality; the focus throughout the organization is on building great products, not on profit. That focus allowed them to develop and promote the iPad in the full expectation that it would disrupt the profitable Macintosh product line.

Other businesses have innovated via rebellion. The team behind the very successful Forza Motorsport video game at Microsoft built the game in secret until they had a product so good that no one would say "no" to it. The fact that they had to do this in secret speaks poorly of Microsoft Games, but it was a successful strategy. The risk to individual employees in this scenario is so high, however, that it can never be a standard approach to the problem.

Ultimately the best approach might be to hire really good people. The primary idea-killers -- fear, uncertainty, and rivalry -- are qualities you see in Grade B, mediocre employees; that sort of person was never in it to change the world in the first place. The best employees are the ones who can't help themselves; they solve problems and develop solutions because they can't stand the sight of failure. Fill your organization with people who are compelled to go beyond the bare requirement, and innovation is almost certain to follow.