Monday, January 30, 2012

Things that money can't buy: (1) love; and (2) innovation

Horace Dediu posts a typically informative, data-heavy piece on the theme: "You cannot buy innovation."

The new data from Apple is interesting, but it's a well-established fact: companies that spend a ton on R&D often see little to no return on that investment, in terms of world-changing products. (That said, it may be that the true ROI on R&D spending comes not from products but from patent licensing fees.)

Dediu defines innovation as "disruptive growth," and disruption is exactly the sort of thing that resists R&D budgets. Research and development is top-down; it's the deliberate, methodical investigation of possibilities that have already been identified and embraced by company management. Disruption, on the other hand, tends to come via products that almost never gain the necessary support within large, established corporations; disruptive products are the ones that management tends to kill because they're too risky, they're not aligned with current customer needs, and the upside is not well understood.

Even when we adopt a wider definition of "innovation," though, the problem remains: it's not an easy thing to plan or manage. There's more to innovation than a good idea -- there's also a lot of hard, painstaking labor involved -- but the good idea is where the ball starts rolling, and ideas can and do come from any part of the organization, top or bottom. An idea that starts at the top has an excellent chance of making it through to execution, but an idea that starts at the bottom faces a much tougher road. Someone needs to believe in it strongly enough to take it to his manager, and that manager needs to believe in it enough to take it to her manager, and so on, until it reaches a high enough point in the food chain that the necessary resources can be put together. At every step along the way the idea can be killed -- because it's poorly understood, because it's competing for resources with another idea, because someone fears failure, because someone doesn't respect the person who supports the idea -- and so it's only the occasional idea that survives the gauntlet. Even then there are likely to be so many design reviews and approval loops that what's built barely resembles the original inspiration. How likely is it, then, that the company follows through on its very best ideas?

Hierarchy varies inversely with innovation; the more layers to an organization, the less likely new ideas are to flourish. This, in fact, is the primary reason that small companies and startups tend to be more innovative: when you're only five people working out of a single room, there's more opportunity for ideas to be heard, discussed, weighed, and approved or discarded. Occasionally, large organizations try to boost their innovation by pushing decision-making powers farther down the org chart, or by creating small, independent units within the larger organization, but these efforts tend to be short-lived because, sooner or later, hierarchy will reassert itself.

So what is there to do? Some large organizations manage to innovate despite themselves. At Apple, the special sauce appears to be a fanatical devotion to quality; the focus throughout the organization is on building great products, not on profit. That focus allowed them to develop and promote the iPad in the full expectation that it would disrupt the profitable Macintosh product line.

Other businesses have innovated via rebellion. The team behind Microsoft's very successful Forza Motorsport video game built it in secret until they had a product so good that no one would say "no" to it. The fact that they had to work in secret speaks poorly of Microsoft Games, but it was a successful strategy. The risk to individual employees in this scenario is so high, however, that it can never be a standard approach to the problem.

Ultimately the best approach might be to hire really good people. The impulses that kill ideas -- fear, uncertainty, and rivalry -- are the marks of Grade B, mediocre employees. That sort of person was never in it to change the world in the first place. The best employees are the ones who can't help themselves; they solve problems and develop solutions because they can't stand the sight of failure. Fill your organization with people who are compulsive about going beyond the bare requirements, and innovation is almost certain to follow.

Friday, January 27, 2012

Us vs. them

Google: "Ads Are Just More Answers"

There's a challenge that every business has to struggle with: the need to see things from the customer's point of view. Even in communications, I run into this all the time -- my colleagues are asking themselves what Communications needs, what serves the ends of Communications as a department, and they have to be reminded to ask whether the actions they're contemplating also serve the needs, desires, and aspirations of the people they're addressing. 

Google seems to be really struggling with this of late. They started out as a company with a clear and consistent customer focus, and people loved them for it. Lately, though, Google's decisions are about what Google needs and wants, and not so much about what the people who use Google's services want and need. 

Of course, in Google's case the situation is complicated by the fact that, over time, it's become clear that people like you and me are no longer Google's customers (if we ever were). Google's customers today are the businesses that buy advertising across Google's services, and to a very real extent Google's users are the product being sold to advertisers. This, however, is a situation in tension with itself, for if Google's users become alienated from Google's services, Google loses its product.

It's not going to happen quickly. It may not happen at all. But whereas it once seemed that search was a solved problem and that Google had won once and for all, it now seems increasingly possible that someone will come along and steal Google's users (and, with them, Google's customers) out from under them. The more Google sees things only from its own perspective -- only in terms of what Google wants and needs -- the more likely it is that this will happen.

The best ideas have to win

There's a Steve Jobs quote that's been making the rounds lately:
"If you want to hire great people and have them stay working for you, you have to let them make a lot of decisions and you have to, you have to be run by ideas, not hierarchy. The best ideas have to win, otherwise good people don't stay."
(Found here.)

It's a great bit of wisdom and business insight, and certainly describes one of the differences between great work environments and places that just think they're great work environments. I've learned over the years, though, that the devil is in the details; more precisely, every boss thinks that s/he is creating an environment in which ideas triumph over hierarchy, but the basics of human psychology almost always get in the way.

It's kind of like favoritism. If you work in an organization where the boss appears to be playing favorites, it can be infuriating. The entire office is full of people who are busting their asses to get the job done, but the same few people somehow always end up with the rewards and recognition. Nothing can kill employee morale faster than favoritism, so why do bosses do it? Because, to them, it's not favoritism -- they're just recognizing excellence. From their point of view the "favorites" are simply the office's highest achievers.

And so it is for ideas. Show me an office in which ideas are stifled by hierarchy, and I'll show you an office in which the boss is honestly convinced that s/he is only saying "no" to bad ideas. It could be that s/he genuinely believes that the most talented people are also the highest-ranking, and so the higher you go in the org chart the better the ideas become. It may be that the boss listens to ideas from all corners but unconsciously gives the most credit to ideas that come from the people s/he trusts -- which is to say, the same people s/he's already promoted into leadership positions.

Despite what generations of bad movies have told you, no one is a villain in their own mind. There are plenty of business leaders who build organizations in which hierarchy, not ideas, triumphs, but they do so in the full belief that they're doing the right thing every step of the way. To realize Steve Jobs's ideal, therefore, it isn't enough to just tell yourself that ideas should win. You need the self-awareness, the perspective, and the capacity for self-criticism to consider ideas even when they come from the least promising people.

Jobs was famous for criticizing his employees harshly; he had a "Bozo bit" that would flip in his mind and convince him that a valued employee had suddenly become an idiot. Jobs was also famous, though, for coming around to accept and advocate ideas that he initially criticized. Maybe his real gift was the ability to separate the idea from the person who suggested it, and place it in a mental space in which he could see its full potential.

Thursday, January 26, 2012

The year of the iPad

Apple's financials are in and, as anyone who cares is certainly aware, they were off the charts. One particularly interesting aspect of the recent quarter is noted in a post on Asymco: namely, that if the only thing Apple sold was the iPad, it would still be the largest PC vendor by unit volume. This lends credence to Apple CEO Tim Cook, who said on the earnings call that he and others expect tablets to become a bigger market than PCs.

My own experience with the iPad is that it tends to take over your computing life from the bottom up. You start out using it for specific tasks -- reading on the couch, browsing your favorite blogs, watching movies -- but over time more and more of your computing tasks end up on the iPad. This is despite the fact that the iPad is obviously and unapologetically limited in certain respects; typing on its on-screen keyboard is not as pleasant as typing on a regular keyboard (and, in my experience, more prone to error), and getting files onto and off of an iPad can be a bit of a pain in the ass.

This, however, is the standard path of disruption, called out in "The Innovator's Dilemma" and noted repeatedly in the years since. Disruption does not come from more powerful or more capable products; it comes from products that are sorely lacking in one respect but make up for that lack in other areas. The iPad isn't disrupting PC sales because you can do more on it; it's disrupting sales because its advantages -- much lighter, far more mobile, much longer battery life, cooler running, with a pleasant and intuitive interface -- are so strong that the customer comes over time not to care about the other things (or, perhaps, s/he cares less about what the iPad can't do than about what it can do). Moreover, as is the case with other disruptive products, there are ways that Apple can tweak and refine the iPad to significantly address its weaknesses, but the Dell laptop I use at work is pretty much what it is. The next Dell I use will probably be faster, and maybe the battery life will be a little better, but those are iterative upgrades, and iterative upgrades are what disruptive products eat for lunch.

Note that this does not mean that Apple owns the future, necessarily. The arguments I've heard for why the Kindle Fire won't disrupt iPad sales all pertain to how much less you can do with a Kindle Fire, and as we've seen lesser capabilities are not a barrier to disruption. An iPad killer doesn't need to be better than an iPad, it just needs to be good enough at what it does well that consumers don't care so much about what it can't do, and it needs to have an upgrade path that will allow it to match and exceed the iPad over time. Does the Kindle Fire fit the bill? Time will tell; in the meantime, Apple is -- and, it seems, will remain -- king of the world.

Wednesday, January 25, 2012

Retroactive confirmation

Watching Apple win the world

I bought my first Macintosh in the late '80s. I was a college student writing my senior thesis and it seemed like the thing to do. My roommate -- a future MBA -- suggested I buy a Windows machine, but I went with a Mac because something about the look and feel appealed to me.

In the decades since, I've stuck with that decision. It wasn't always easy. There was a period in the '90s when I practically had to hide the fact that I was a Mac user at work; mention that you had a Mac at home and you opened yourself up to smirks and open disrespect. Even my father referred to Macs as "toys." (He's a committed Windows user to this day.) There were many long years in which it was nearly impossible to find Mac shareware for certain tasks; the developer community was supporting Windows and basically nothing else.

There are a number of people like me, people who owned Macs when that seemed ludicrously stubborn in the face of market inevitability, people who smile a little when they hear the latest news of Apple's world-beating financial returns. Back in the day we knew our "toys" were the best computers in the world, and now the rest of the world knows it, too.

Design excellent products, focus on the user experience, and you can make absurd amounts of money; I'm happy to live in a world where that's still true.

Tuesday, January 24, 2012

His way or the highway

Larry Page to Googlers: If You Don’t Get SPYW, Work Somewhere Else

Word this afternoon is of an email that Larry Page sent to Google staffers, telling them to get on board or get out; if they don't understand that Google is all-in on the integration of products -- one effect of which is the promotion of Google social properties to the exclusion of competitors in the social space -- then they "should probably work somewhere else."

It's a delicate thing, internal communications. On the one hand, company founders rightly feel that the company is theirs to do with as they choose; if they make a strategic decision, employees are required to support that decision. On the other hand, though, Google has in the past sought to inspire its employees with a sense of values and higher purpose. That inspiration served a business purpose: high-morale employees can be expected to work harder, stay with the company longer, and do it for less than top dollar. They are a genuine business asset. So the question: in the wake of that email, how is that morale trending?

Founders forget this at their peril: they can treat their employees as cogs in the machine, as temporary work-for-hire laborers who won't be around long enough to be worth caring about, and in return they can expect a minimum level of effort and quality from those employees. When founders expect more than the minimum, it is incumbent on them to invest more: respect employees more, value them more, listen to their opinions, and give their perspective on business-critical issues enough weight to influence important decisions.

Those are two business paths -- the low road and the high road -- and either can be profitable. The worst path is the one that Larry Page chose, to start down the high road and then double back with internal communications more appropriate to the low road. Employee disillusionment is not an easy thing to fix.

Monday, January 23, 2012

Change the story

It's not surprising that so many companies are so bad at corporate communications. While most people will allow that communication is an art, and that some people are better communicators than others, it's a rare CEO who will allow him- or herself to be overruled by Communications when there's something s/he wants to get off his/her chest. Still, it's remarkable that RIM could present their new CEO in a fashion so uninspiring as to prompt a Wall Street sell-off.

Just for fun, let's look at what RIM should have done, and compare it to what RIM actually did.

What RIM should have done:

  1. Accept that the company is facing severe challenges.
  2. Accept that management is at least partly at fault for these challenges.
  3. Accept that no one outside the building is inspired by the products they're making now or have announced for the near future.
  4. Break with the past: introduce a new leader with strong credentials, recruited from a rival, who comes in with a bold plan for the future.
  5. Continue to change the story by talking about how future products will be different, radically different, in a way that holds out the prospect of leap-frogging the competition and competing with iOS and Android in the next generation.
What RIM actually did:
  1. Continue with the unconvincing story that there's nothing wrong with the BlackBerry product line and that they just need to improve on execution.
  2. Move the co-CEOs into new positions where they will be able to continue to exert a great deal of influence on company direction.
  3. Bring in a CEO from just down the hall -- he was formerly the COO -- and allow him to talk with no visible passion about how he's going to maintain the company's direction.
  4. Offer extremely unconvincing arguments that the company's problems can be solved with better marketing. 
Ultimately a company's communications come from the C-level heart; even if you have a strong story to tell, you can't keep your CEO on message if he really, truly believes something different. And this is why RIM is doomed: they've lost 75% of their market value over the last four years, but they still think that they're right and Apple is wrong. 

Don't be evil

The word this morning is about a "don't be evil" bookmarklet, developed by engineers at Facebook, Twitter, and MySpace, that removes Google's hard-coded Google+ social results from search and replaces them with the social links that Google's own algorithm recommends.
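
For anyone curious about the mechanics: a bookmarklet is just a snippet of JavaScript saved as a browser bookmark, so it runs against whatever page you happen to be viewing. The snippet below is only a minimal sketch of the general technique, not the actual "don't be evil" code; the ".gplus-promoted" class name is a hypothetical stand-in for whatever markup Google actually uses to flag its hard-coded Google+ boxes.

    // Minimal sketch of a result-filtering bookmarklet.
    // ".gplus-promoted" is a made-up selector, used purely for illustration;
    // the real tool goes further and re-inserts the social links that
    // Google's own relevance algorithm would otherwise surface.
    document.querySelectorAll('.gplus-promoted').forEach((box) => {
      box.remove(); // strip the promoted block from the live results page
    });

Wrapped in a javascript: URL and saved as a bookmark, a few lines like these run with a single click; the cleverness of the real bookmarklet isn't the code so much as the point it makes.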

It's a brilliant satire of Google's specious claim that they feature Google+ results only because they lack the data to show Twitter or Facebook results -- they have the data, and their own engine demonstrates that fact -- but there's another interesting twist to this story, encapsulated in the phrase "Don't be evil."

That was, of course, Google's original motto. The original idea was that Google would be a company apart, one focused more on the coolness of the product than on crushing, killing, and destroying a la Microsoft in its most monopolistic days. They conceived the vision and they put the statement out there, and now it's become self-parody. "Don't be evil" is a slogan that Google haters can throw in the face of today's Google, a company far more concerned with profit and market share than with being on its best behavior.

There's an interesting dynamic when you wear your values on your sleeve: if your company becomes prominent, that value statement becomes public property. As your business matures you may feel that the values need to shift (or maybe it's more accurate to say that you'll stop caring so much about what seemed important when you were just starting out), but the value statement will not shift along with them. Eventually you can find yourself in the position of Google, acting in a way that seems to manifestly contradict your own stated values.

So is it a mistake to put those values out there? To my knowledge, Microsoft never articulated a particularly idealistic vision for the company; they were about selling software, and there was nothing about the destruction of Netscape that was obviously in contradiction to that self-identity. Did Microsoft make things easier for itself by keeping its collective mouth shut?

It may be so, and it may be that Larry and Sergey rue the day when they looked into the future and saw no reason why their company should ever resemble Microsoft. (Or, more likely, they're too busy being billionaires to care.) But I would like more small companies to wear their values on their sleeves. When some of them become big companies, their boardrooms will begin to fill with MBAs and marketing consultants, and by then it may help to remember that they once thought that their company stood for something good, and pure, and honest.

Update: Sarah Lacy weighs in with a similar thought: "Quibbling and asterisks aren’t going to work, because Google is the one who made the unequivocal statements to begin with."

Saturday, January 21, 2012

The enemy of my enemy

The enemy of my enemy

Watts Martin points out a simple truth often forgotten on the web: when the people you hate target someone, that target is not necessarily worthy of your love. Loving WikiLeaks should not require you to love alleged rapist (and demonstrable egotist) Julian Assange, but that fact appears to be lost on thousands of online protestors. Likewise Megaupload: it was a company designed to profit off of the illegal activities of its users. It's not a huge surprise to discover that the company's founders were pretty unethical.

These are important distinctions to make. Hating the actions of a political dictatorship should not mean that you embrace the actions and methods of the terrorists fighting that government. Hating it when government and the police overreach does not, and should not, require that you celebrate criminals. The world is full of important, complex issues, and you do them a disservice by reducing them to white-hat/black-hat binaries. Look for the truth in the middle; that's almost always where you'll find it.

Friday, January 20, 2012

Playing into their hands

Anonymous Reacts to Megaupload Takedown With “Largest Attack Ever”

Anonymous reacts to the feds' shutdown of Megaupload.com by taking down a suite of government and entertainment-industry sites, and they could not have done a bigger favor for their opponents. Most people out there don't know about the subtleties of this case, and most of them probably don't care. What they do know, or think they know, is that the internet is full of people who do whatever they feel like -- lie, cheat, or steal. To the majority of the public, Anonymous' actions look like nothing more than mean-spirited vandalism, confirming the preconceptions of their ideological opponents.

Anonymous is a gift to every politician out there who would like to shut down the internet. They could not ask for better rhetorical material. Anonymous is playing right into their hands.

Thursday, January 19, 2012

Stupid social tricks

Facebook promotion of Timothy’s coffee brews social media backlash for deluged Toronto company

Click through on the link to read the sad story of a coffee shop that screwed up a social media freebie offer about as badly as can be imagined. There are many layers of stupid here: creating a contest without allowing for the possibility that the response might exceed expectations, failing to communicate promptly (or, really, at all) when things went awry, and offering a material reward in return for an online action as frictionless as a Facebook "like."

The last mistake was the biggest, and is the latest example of the strange value that some companies place on "likes." Liking a product or cause on Facebook is the online equivalent of smiling at someone on the street -- it is an action, it does represent some sort of connection (though possibly a very weak one), and there are people out there who will refuse to do it, but ultimately it is not a meaningful form of engagement. The bar is simply too low. It's too easy, and therefore signifies very little.

If you want to know who your true fans are, ask them to do something for you that requires a little investment. You'll get a much smaller response, but it will come from the people who really care, and they're the ones who should get the free coffee.

Wednesday, January 18, 2012

Fail fast

The conditions for survival and prosperity

Horace Dediu charts the longevity of technology companies, and notes toward the close that the longest period in which any hardware company has remained in the spotlight -- Apple's 37 years -- is shorter than the working career of a typical employee. In short, if you work for a company that makes computers, your employer will almost certainly fall on hard times long before you contemplate retirement.

I don't doubt it, but I think the point goes farther than that. Since I left graduate school in 1997, I have been more or less steadily employed (with the exception of some hard times following the bursting of the first tech bubble), and yet I've never worked for any company longer than 3.5 years. For the last ten years or so, it's been explicitly in my mind that I want to put down roots and focus on the long term in my job, but still: I have a hard time staying in one place for more than two or three years.

When I look around me, I find that the story is similar for the people I work with. At my current job only a few of my colleagues have been there for more than three years. It is now normal to leave a position -- to jump or be pushed -- after less than five years. It wasn't so long ago that this was rare; I worked with some people at Britannica who had been there for more than 30 years. But today? Not at all the case.

Everything is faster today. Entire product segments rise and fall in a handful of years. Businesses pop up, fight the good fight, and then disappear. As a worker, it means the only constant is change: you're always working on your resume, always getting to know new co-workers, always keeping an eye out for the next bit of disruptive change. It's the modern way; even the best jobs are temporary.

Change is not just inevitable, it's mandatory. That can be exhausting, but it can also be exhilarating.

Tuesday, January 17, 2012

The artist vs. the engineer

The artist cuts. S/he carves away whatever doesn't look right, like Michelangelo apocryphally cutting away anything that doesn't look like a horse. The artist seeks focus and elegance.

The engineer adds. S/he looks at a product with ten features and imagines what it might be like with fifteen. The engineer experiments, iterates, and augments.

Both have their place. Which one are you?

Friday, January 13, 2012

"A Team of People Working on a Google Project"

Facing Another PR Disaster: Google Accused Of Fraudulently Undermining A Kenyan Startup

Click through on the link to read about the latest example of brazen misbehavior by Google. Predatory business practices by a company once founded on the pledge "don't be evil": nothing new there. There's one aspect of the story, though, that I find particularly irksome, speaking as a communications professional. Namely, Google's confession, which included the phrase: "We are mortified to learn that a team of people working on a Google project improperly used Mocality's data and misrepresented our relationship with Mocality to encourage customers to create new websites."

Of course any intelligent reader understands who this "team of people" is -- they're Google employees. So why did Google not simply say, "We are mortified to learn that some of our employees" did these things? Because someone at Google thinks that they can partially cover their asses by phrasing the sentence in such a way as to suggest that maybe, possibly, these were people who wandered in off the street, and not employees on Google's payroll, managed by Google managers, on projects approved by Google executives.

This is an egregious example of something that is, unfortunately, widespread. If you screw up, don't admit it -- say instead that "mistakes were made." Use the passive voice wherever possible, and for God's sake turn all your verbs into nouns; you didn't crash that car, "a collision occurred." Maybe no one will notice that you don't appear within the grammar of your own confession. Maybe your deliberate vagueness will be so compelling that everyone will think you're innocent.

Except we're not that stupid. We read Google's weaselly prose and know exactly what they're doing. So now, in addition to knowing that Google's Kenyan unit really needs some remedial work in business ethics, we also know that Google U.S.A. would rather play games than admit, in the spirit of honesty and integrity, that it made a mistake. We know what they've done in the past, and we know their hopes for the future: that next time, they won't get caught.

Good communications begin with honesty. If you approach the project with the intent to lie, to obscure, to avoid repercussions, no amount of word-smithing can save your ass.

Tuesday, January 10, 2012

Expect the Unexpected

OnLive Desktop brings Windows 7, Office apps to your iPad

I was going to write about how this is the latest sign that we're all inevitably going to be sucked into the cloud, and how we're at last seeing the realization of that five- or ten-year-old vision of network devices accessing our data and programs via the interwebs -- and then it occurred to me that this is actually the least interesting angle of the story.

What's truly interesting about this story is that it illustrates how innovation works. Namely, it's largely unpredictable. No one can tell me that, when they were drawing up the plans for the original iPad, anyone in the room ever looked up and said, "And then someone will start streaming application data directly to your iPad, and you won't even need to have Office installed!" No one saw this happening, because services like OnLive don't make any commercial sense until there's already a ready installed base of iPad-like devices waiting to be served. It's a chicken waiting for an egg.

There's a concept that Steven Johnson writes about in Where Good Ideas Come From, called "the adjacent possible." This is the idea that the path of innovation from A to B to C to D, which in retrospect looks completely inevitable and self-evident, is in fact quite unpredictable, because A made B possible, and B made C possible, and C made D possible. The pathway from A to D did not become evident until each intervening step was worked out, and all those business professors writing articles about how corporate leaders failed because they didn't see D coming are benefiting from hindsight; in the moment, we're all equally blind.

In short, we couldn't go straight from the idea of network-capable devices to "Office in the cloud," because putting Office in the cloud depends on the network effects of an already-large installed base of people who might want that sort of service, and that didn't exist until the iPad became the first mass-market tablet. Apple didn't see this coming; they were too busy making the iPad something that you and I might want to buy on its own merits. OnLive, I'm pretty sure, didn't see this coming; they were too busy building a video game streaming service, then porting it to iOS devices, and then someone looked up from the table and realized that there was another market out there. A lot of adjacent possibles needed to be worked out before this became a viable market opportunity.

It still might fail, mind you, but it's undeniably cool to see the ecosystem evolve in this way. We live in strange and wonderful times.

Monday, January 9, 2012

Enterprise Through the Back Door

Enterprise Will Spend $19 Billion on Apple Hardware in 2012

Things have changed greatly in the time I've been in the job market. When I first sat down at a corporate desk, it was a serious offense to bring your own hardware into the workplace. I'm not sure that anyone ever actually got fired over it, but the warnings coming from IT were strongly worded. In essence, if you brought your Mac to work you could expect to be mocked in the short term and out of work in the long term.

What changed this was the mobile revolution: corporate IT can control what you put on your desk, but they can't mandate what you carry in your pocket. iPhones showed up even in Windows-heavy environments, and when executives caught the wave and started wondering out loud why they couldn't get their business email on their smartphones, the genie was out of the bottle.

Never say never, but I'll say it anyway -- we're never going back. Partly this is due to corporate cheapness; most people I know have better hardware at home (and in their pocket) than the cheapo HP/Dell they use on the job, and unless businesses want to step up and improve the quality of the computers (and phones) they hand out, they are never going to convince the workforce to take a step back in terms of speed, stability, and quality of user experience. Better instead to bow to the inevitable and find a way to support what employees will be doing anyway: using their technology of choice, in the ways they choose to use it.

Friday, January 6, 2012

The Specific Visionary

There are two types of visionary in the world: the specific and the general. Bill Gates is a general visionary. Years ago he predicted that the next big thing in computing would be tablets, and he was right. But being right about tablets didn't allow Microsoft to create a form of tablet computing that anyone wanted to use.

Steve Jobs was a specific visionary. He famously said of tablet computing, "If you see a stylus, they blew it." His vision was of specific things: not the tablet, but how the tablet would work and how you would use it.

Each has its place. "Someday polio will be nothing more than a memory" is a general vision, but in the right hands it's a vision that can and will change the world for the better. Specific visions, though, are the ones that help you build great products. They focus your attention; they tell you what not to do.

General visions give you the Gates Foundation. Specific visions give you the iPad.

Thursday, January 5, 2012

After A Breakout 2011, Yammer Is Working On A Big New Funding Round

After A Breakout 2011, Yammer Is Working On A Big New Funding Round

We've been using Yammer for the last couple of years, so I have some perspective here. (And time for a disclaimer: all opinions stated are my own. I don't speak on behalf of my employer.)

Yammer's kind of a good news/bad news scenario. On the one hand, the basic functionality is great, and it really scratches an itch for organizations that have grown past the point where you can keep track of your coworkers via face-to-face interaction. It's relatively easy to use, basically reliable, and adds a new way of communicating that can be very effective in breaking down silos.

On the other hand, Yammer can really break down when you start asking it to perform more grown-up tasks. To cite one example, how about analytics: a set of charts and graphs that show usage? Yammer doesn't really have that; there are some built-in statistics, but certain core stats (for instance, most-followed users) aren't included, you can't export data in Excel or CSV format, and so on. The Yammer guys know that their customers need this (badly -- try demonstrating ROI without data), and they're "working on it," but there's no timetable for improvements. Likewise with the mobile products, which are sometimes great and sometimes a little buggy. I love the product and have advocated for it strenuously, but I've also spent more time than I'd have liked explaining Yammer's limitations to my colleagues.

My sense in dealing with the guys at the company is that they're young, smart, enthusiastic, and maybe a little overwhelmed. Hopefully they'll take this money and apply it to beefing up personnel and building out the product where it's still weak. (The linked article suggests the money might go instead into marketing and international expansion, which isn't going to solve anything.)

(Update: looks like I'm not the only one with reservations.)

Deloitte: 9% have cut cable, another 11% are considering it

Deloitte: 9% have cut cable, another 11% are considering it

We cut the cable at home a couple of years ago, and haven't looked back. I can't say there's nothing I miss -- ESPN would be very nice, particularly during the football playoffs and bowl season. But that's just the thing: I don't miss cable TV and the 99% of stations that never had anything worth watching. I miss channels. ESPN. HBO. Showtime. I'd happily pay for those; I just can't justify paying for all the crap that comes with them.

This is how you disrupt television: give consumers direct access to channels. In a nutshell, make an ESPN app and put it up for sale in iTunes and the Android Market. Give me the opportunity to pay to watch the television that I love. I'd be on that like white on rice. If Apple -- or anyone -- wants to reinvent the television, that's all they need to do; the rest falls into place behind it.