Home Button

Let’s discuss, for a moment, the Home button. It’s the quintessential smartphone interface.

[Image: the iPhone home button]

As pioneered by the iPhone, the home button’s behavior arguably defined the user experience of the rest of the operating system. It suggested apps were a modal experience, and that a press of the button would always exit the app and take you home. In-app navigation would have to happen elsewhere. The home button conveyed a simplicity of interface that immediately resonated. Despite doing only a single thing, it was worth spending an entire physical button on it.

Since then, the button has taken on many new behaviors. First came the double-click, which you could initially tie to a custom shortcut; eventually it would open the multi-tasking tray. We got a long-press that would fire up Siri. When the form factor grew beyond 4 inches, we received an optional “reachability” feature triggered by a double-tap (not a click). Triple-click could be mapped to an accessibility feature.

Android

Before we go into the feasibility of adding more gestures to the previously single-purpose iPhone home button, let’s look at what Android did. In the early days, it wasn’t pretty. There were physical buttons for Back, Menu, Home, and Search. Home and Back worked (pretty much) like you’d expect, but Menu and Search were problematic. The former, in particular, meant that apps had no visible UI for extra features. You’d open an app, and some aspects would be buried in an invisible menu you had to press the Menu button to see. It was classic mystery meat navigation, and Search was almost as bad. Search would open an app’s contextual search feature, if it existed; otherwise it wouldn’t do anything. On the home screen, Search would fire up Google. In the browser it would set focus on the address bar. I may be misremembering bits here, can you blame me?
Oh, and most buttons had longpress actions. Longpress Back, I believe, would open up history in the browser. Longpress Home would access voice navigation, I think. Longpress Search, for whatever reason, would invoke the multitasking menu.

There were a lot of qualifiers there, suggesting I might be misremembering. That’s because invisible features are almost always a bad idea. Hiding possibly critical app features in an invisible menu limited their discoverability to those willing to play whack-a-mole. Same with Search, and frankly, longpress.
We’ll get back to that. First, though: Android 4.0 Ice Cream Sandwich (technically Android 3.0 Honeycomb, though I’m not sure anyone will recognize that version) improved things a fair bit, and the Android systembar has been fairly stable ever since. Here’s the latest iteration:

[Image: the Android systembar]

For the purposes of this post, we’ll be discussing the Android “spec” version of the systembar, which features these three onscreen buttons in “back, home, overview” order. Some OEMs, like Samsung or HTC, would make the buttons physical, or flip around their order.

Android has a Back button, a Home button, and an “Overview” (multitasking) button. Initially, there were no longpress actions tied to any of the buttons. Instead, a vertical swipe would invoke Google Now or Search, whichever your phone shipped with.

Redesigning an interface is like dancing a waltz: it’s usually two steps forward, one step back. In the case of the Android systembar, the swipe-up-to-enter-Google-Now gesture was crazy-making. I would accidentally invoke it every now and again, but my daughter playing with drawing apps would invoke it constantly and lose her place.
Eventually (5.0 Lollipop, I believe) the swipe gesture was replaced by a longpress, and in 6.0 Marshmallow you could even disable the longpress.

The Goldilocks Principle

The elephant in the room is the burning question: which is better? What’s the right number of system buttons? Should buttons be physical or on-screen?

In the end I think that’s mostly a question of personal taste, preference, who’s using it, and what one is used to. However, if the purpose of interface design is finding a beautiful balance of usability and discoverability, we can probably extrapolate a few general guidelines.

First of all, basic usability favors visibility. Anything you bury under non-standard gestures such as double-click or longpress is going to go unnoticed by most people, and only utilized once the gesture has been learned as necessary. Ever see someone double-clicking a hyperlink in a web browser? That’s arguably bad user experience design teaching the wrong lesson: double-click to make sure it works. Online stores are still paying the price for this, having to disable the buy button after a single press, or verbosely spelling out: “only press once”.
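(A quick aside for the web folks: the usual defensive fix is a few lines of script that disable the button the moment it’s first pressed. Here’s a minimal sketch in TypeScript; the “#buy” id and the submitOrder function are made-up placeholders.)

    // Defensive fix for the learned "double-click to be sure it works" habit:
    // disable the buy button on its first press so any second click is ignored.
    function submitOrder(): void {
      // Stand-in for a real checkout call.
      console.log("Order submitted exactly once.");
    }

    const buyButton = document.querySelector<HTMLButtonElement>("#buy");
    if (buyButton) {
      buyButton.addEventListener("click", () => {
        buyButton.disabled = true;            // further clicks do nothing
        buyButton.textContent = "Only press once…";
        submitOrder();
      });
    }

It’s the programmatic version of verbosely spelling out “only press once”.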

The swipe is a powerful gesture when used right. In older Androids it invoked search, which was a terrible decision — as mentioned, it was too easy to invoke, and just not useful enough to warrant such an action. When used to swipe between homescreens, or to scroll a page, however, there are few better gestures.

Which brings us to the Back action. On Android it’s a permanent systembutton. It mostly does what you expect it to do, except for the few cases where it doesn’t. It’s supposed to always take you back to the previous screen, but what if you just launched an app and accidentally hit Back? Should you exit out to the homescreen?
On iPhone, there isn’t a system back button. Instead, back-buttons are added by the developer to the top-most toolbar of an app, ensuring they are contextual and pretty predictable. They’re even labelled (most of the time), which is hard to beat from a usability perspective.
On the flip side, they can be far away from your thumb, especially on large-screen phones like the Plus models. To alleviate that, there’s the side-swipe, for lack of a better term: you swipe right from the left edge of the screen to go back to the previous screen you were on. Not quite as predictable, perhaps, as swiping left and right inside a photo gallery, but certainly convenient. The main downside of the side-swipe is that it can limit what apps can do with the same gesture. Once a gesture is reserved by the system, you should probably be mindful not to interfere with it.

Incidentally, the side-swipe gesture suggests when it might be appropriate to add multiple actions to systembuttons: poweruser features. If a feature is complementary but not required learning in order to use an interface, it can add a lot of value to the experience. The side-swipe seems to do that — you don’t have to learn it in order to use an iPhone; Back buttons are still there. Android 7.0 Nougat has a similar power-user feature that is absolutely not required learning: double-tap the multitasking button to quickly switch to your last-used app. It’s like alt-tab for your smartphone.
I’m not finding it easy to defend mapping the multitasking feature to double-click on the iPhone, however. It seems like a feature so valuable you’d want to make it easily discoverable.

Siri vs. Google Assistant

We come to it at last: the long-press features. On the iPhone, holding the home button opens Siri. Same with Android 5.0+, where it fires up Google Now, and, on the upcoming Pixel phones, Google Assistant.

Pixel phones feature a little colored-dot animation as you press and hold, which seems to be an attempt at adding discoverability to the feature. I don’t buy it. The onscreen buttons are small already, so you’re likely to cover the animation with your thumb, and nothing beats a label regardless.
The Google Assistant is available through other means, though. The homescreen — the launcher as it’s called — has a big Google logo tab in the top left corner, which also invokes it. You can also use the “OK Google” hotword. So while the home-button longpress isn’t very discoverable per se, the Assistant itself should be. That embraces the poweruser nature of adding longpress features to systembuttons.

Siri is less discoverable. You have to enable it first (otherwise you just get “voice control”). Then it’s there on the longpress. You can also enable hotword detection, “Hey Siri”, to invoke it. There isn’t a widget you can add, or an app icon you can tap.
Perhaps that’s okay, though. Perhaps the assistants don’t have to be discoverable, yet. They pass the litmus test of “can you use and enjoy the phone without it?”, and in both cases I’ve never used them for much other than setting timers and reminders.

I can’t help but feel like that’s about to change. Assistants, neural networks and AI seem to be encroaching ever more on the smart devices we use. Today it’s smart replies in Google Inbox and on smartwatches. But what do the interfaces of tomorrow look like?

It feels like most of the design of the original iPhone sprung from the concept of having that singular home button. What would a phone look like if that button, instead of taking you home, was your assistant?

Cardboard Coffee

One thing that’s always boggled me is the increasing fascination with drinking coffee out of cardboard cups. Part of my fascination stems from the fact that it seems like a huge waste of resources — you could be making virtual reality goggles from that cardboard! But most of my curiosity has to do with the indisputable fact that coffee tastes worse out of cardboard.

I consider that truth to be self-evident. The delicious nectar that is coffee deserves better than to be carried in laminated paper and guzzled through a tiny hole in a plastic lid.

I get it, I get it: you can take a cardboard cup with you on your commute, or as you’re walking down the street, or as you’re waiting in line, and coffee through a plastic lid is better than no coffee at all.

It’s true.

But I’ve seen things you people wouldn’t believe. I’ve seen people in non-commute non-transit situations pick cardboard over glorious porcelain — intentionally and of their own volition and not while under duress (I checked to make sure).

Usually I can even all I need to. But when it comes to coffee, and porcelain is an option, and you still choose cardboard… I’m all out of evens. At that point, I can’t even.

One time I stopped a good friend as he was doing it, and like a concerned parent I asked him why he would pick the cardboard over the porcelain when both options were right there in front of him, in plain sight, literally on the table. He answered:

I like the idea that if I need to go somewhere, I can take the coffee with me.

Okay, that’s actually a fair point.

I mean, I’d just gulp down the coffee from the porcelain and then go ahead and grab another one in cardboard to bring along, but sure, the above is a cogent argument.

Still, I can’t help but feel like cardboard coffee is becoming a status symbol outside of just mobility: “Look at me, I’m on the move!” And that makes me sad. It makes me even sadder than it does when people add sugar to their coffee.

What is wrong with you people‽


This was originally posted elsewhere.

A Culture of Antagonism

It was a day like any other, and it was an innocent-looking article like any other. But on this one day, it was this one particular article, and this one paragraph in particular, that missed my good side entirely:

Jim Carrey brought us our first live-action taste of Count Olaf from A Series of Unfortunate Events, but Netflix's upcoming TV series adaptation is (thankfully) going in a different direction.

It's an article from The Verge. It's nothing out of the ordinary, and I like the publication perfectly fine. I hold no grudges against the author either — this could've been published in any fine recent publication. All's good on that front. I'm also not a particular Jim Carrey fan these days, but that doesn't change a thing. It's just that, on this one day, that last sentence got to me.

but Netflix's upcoming TV series adaptation is (thankfully) going in a different direction

On this day — it'd been snowing, by the way, it was rather pretty outside — this one sentence reminded me of everything I loathe about modern online discourse. I read this sentence — and I invite you to correct me, gosh I hope so much that I'm wrong about this — as an off-hand dismissive critique of the older film Lemony Snicket's A Series of Unfortunate Events, starring Jim Carrey, Emily Browning and Liam Aiken. The sentence seems to suggest that this older film is so atrociously bad that the new Netflix series (which I welcome) thankfully goes in a different direction. THANKFULLY! THANKFULLY!!

It's fun how bright the day can look when snow covers the ground. Yet inside of me, my heart held only darkness.

The 2004 movie is one of my favorite movies of all time. I dare you to watch the following deleted scene, and not at least have a tiny appreciation for the music and the visuals. Gorge on those trees.

This scene ends on a simple note: There's always something. And this film has just that: something. I heartily recommend it to you. Watch it tonight, I'm sure you can stream it.

While I mourn the lack of a sequel, I'm not objecting to a Netflix remake, I welcome it. It's a wonderful story, I'd love more. What I mourn is that we can't respect creative work for what it is. Today we apparently have to hate something that wasn't a runaway box office success and the beginning of an endless franchise.

In fact, Lemony Snicket did reasonably well at the box office and got a solid 72% at Rotten Tomatoes. It featured amazing performances by two of my favorite actresses, Meryl Streep and Jennifer Coolidge, not to mention the protagonist kids themselves. The soundtrack is amazing, and the end titles… oh, the end titles. Take it in:

There are no levels on which I don't adore this film. It's okay if you don't. 

Lately it just feels like everyone hates everything. Because it's easier to dismiss something than to like it. Because if you like something, you put yourself out there. You reveal to the world what makes you happy, what makes you cry, what makes you reflect on, cope with, or just enjoy life. Someone might make fun of you for loving something, so it's easier to hate it. It breeds a culture of antagonism, and while it might protect you from occasional ridicule by people not worth your time, it also insulates you from possibly discovering something amazing.

If only we could see past arbitrary notions of what's cool to like, and judge movies and music and books on their own merits. Because there's always something.

Things I Learned About My Little Pony, By Watching My Daughter Watch My Little Pony

It happened. My 4 year old has found a franchise to latch on to. It’s not ideal: the one thing I’m the most allergic to in the world is horses. But if she’s into ponies she’s into ponies and there’s nothing I can do about that except embrace it. She’s got the toys, she’s got the bed-blanket, she’s got the t-shirt, and her favorite pony is Rainbow Dash. It’s a thing.

As an overprotective curling-dad, I consider it my solemn duty to learn about this thing that’s absorbing her attention. So I have been watching the show with her, trying to soak up the pony lore, learning the details that make up this equestrian construct.

The show follows Twilight Sparkle, a purple unicorn, as she visits “Ponyville” — the shining gem of the land of Equestria. You know… from equus in Latin? Horse-land? Get it?

Moving on.

Twilight makes friends in Ponyville. Several of them. And she’s taught that though they are all different in appearance, interests, personality and even race, their friendship is the most important thing there is. When they’re all together, their friendship is literally magic. It’s in the tagline.

Sounds good right? It’s perfectly fine that my daughter watches such a diverse, female-positive and all-embracing show, right?

One of my favorite episodes of Lost — bear with me — is the one where wheelchair-bound John Locke cries “Don’t tell me what I can’t do!” and then goes on a walkabout. This is at the core of the values I want my daughter to learn: if she can dream it, she can do it. For that reason I already know the answer to questions she might one day pose to me: “Can I be an astronaut, dad?” YES. “Can I be at the Olympics, dad?” YES. She’ll learn eventually that it might not be a walk in the park, but there’s no reason she should have some sort of arbitrary mental block put in place by me, preventing her from even trying.

Which brings me to Equestria. In Ponyville, there are three races of ponies: the regular ponies you know, unicorns, who have magical powers, and pegasi, who can fly and make it rain. They all live and work together, seemingly in perfect glittering harmony.

How does this even work? How aren’t the regular ponies perpetually jealous of the other two races?

Ponies are literally born with predisposed skills. Unicorns have magic powers, one of them being that they can write. Pegasi can fly. Sorry, Applejack, I suppose you have to manually pluck those apples to sell at the market and make ends meet. If only you were a unicorn you could just use magic, but hey, life’s tough, right? Applejack is basically caste-blocked from ever advancing beyond her racially defined place in society.

The fact that only unicorns can write has its own problems. History is written by those who can, well, write… right? I hope everyone trusts the unicorns to be truthful. Better not upset them.

Ever noticed how My Little Ponies have back-tattoos? Applejack has apples, Pinkie Pie has balloons. Those are literal coming-of-age tattoos. Puberty isn’t mentioned, but it’s implied that once a pony reaches that age, whatever “talent” they have is stamped on their back. Forever. A visual indicator of what you are.

The stamps are called cutie marks.

Back-tattoos aside (some of those are really lovely, I’m sure), I don’t know that I appreciate the idea that you even can have a talent as such — how about those 10,000 hours? What about multiple “talents”: which one gets stamped on you? And why does your one talent need to be permanently advertised to the world? What if your talent is not showering? If you’ll indulge me as I recall a history lesson about mechanical vs. organic societies, this “know your place” undercurrent that permeates Ponyville is a trait I do not find attractive. Also, if I am ever to get a back-tattoo, I want it to be something I choose to get. Probably a Japanese glyph I think means “fire” but in fact means “toast”. Something I can laugh at years down the line, not something that forever defines my place in the world.

Another observation: every single pony in Ponyville is either beautifully styled and coiffed at all times, or an unsightly donkey dragging a cart with a grumpy look on its face. In fact, I don’t think I’ve seen a single handsome donkey on the show. They’re like Morlocks.

One of the dude-ponies was called “Shining Armor”. A bit on the nose, eh, Lauren Faust? Also, why weren’t there any girl knights? My daughter happens to love playing knights and princesses. She’s the knight, I’m the princess.

I don’t know what the lesson is. I think I wanted to vet the show, but having now watched one too many episodes with my daughter on the couch, I’m not sure there’s really a lesson to learn here.

Selma likes ponies, she likes watching them on the television with me. Perhaps she doesn’t have to learn about societal norms and expectations and caste systems and harmful stereotypes through a kids show about magical ponies, at age 4. She likes Rainbow Dash, and I think it’ll start and end with that.

As you were.


This post originally appeared on Medium, but is reposted here so I can laugh at it in 10 years. 

“Because You’re Worth It”

A popular brand uses this as their tagline, and it’s always annoyed me terribly.

I was brought up to know that as a human I have inherent value. I try to raise my daughter the same way, so I keep reminding her how much she means to me, bolster her heart to protect her against inevitable douchebags. In that vein, everyone is worth it.

Is worth what, exactly?

This brand sells … perfume? Face cream? I can’t even recall, and I don’t even care. The point is, their tagline is pointing out that you deserve to spend your money on their product. Well what if I can’t afford the product? Does that mean I’m not worth it?

We’ve been over this. The human condition is tough. Things don’t always go as planned. Some people get a particularly short end of the stick of life. There’s no justice to it, just wanton cosmic random chance. Whether you end up able to afford the face cream you’re worth is entirely up to a unique combination of the absence of bad luck, decades of hard work, and growing up in a place where such hard work pays off as it should.

I don’t usually watch TV, so I’m mostly spared channel-flipping past beautiful models parroting the tagline in a bubbly tenor. Thankfully so, because I think I’d go insane. In a world with people who would take medicine, antibiotics or clean water over a goddamn face cream, the phrase cuts me like a knife on a blackboard.

Everyone is worth it. I believe it’s in a charter somewhere.

All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.

How’s that for a face cream tagline?

Ethical Adblocking

Apple released iOS 9 yesterday, and with it allowed adblockers into the App Store. Since the mobile web is increasingly a Big Deal, this heralds a sea change for the web.

An article about adblocking made the rounds a few weeks ago. Here's a pullquote:

If blocking becomes widespread, the ad industry will be pushed to produce ads that are simpler, less invasive and far more transparent about the way they’re handling our data — or risk getting blocked forever if they fail.

That's a load of manure.

A big part of the problem is how slow the ad industry itself has been to adapt. To this day most ads are still medium rectangles (300×250) or giant skyscrapers (120×600). They're not hi-DPI, they're not responsive, and they're usually ugly blinking GIFs. With all the technology available to us today, you'd think we'd be seeing better ads by now.

Ads don't offend me. Well some specific ads do, but the idea of exchanging my attention for a free service such as reading news on the web, that doesn't offend me. I'm an adult, I can make an informed decision as to which services I will leave my data with, whether those services are free through ads or are entirely paid.

The problem crept up on us slowly: the more attention you could sell, the more money you could get. Ads became bigger and more plentiful. First came popups; then they were blocked. Now we're dealing with full take-over ads, interstitials, lightbox ads, and if you dare browse the mobile web, you'll be looking through blinds in the form of social sharing links at the top and "dismiss" buttons that don't actually work. It's pretty bad, and it makes browsing websites slower.

In the end, it only takes a few horrible ads to poison the well, and adblocking would eventually become inevitable. It's like television: Ghostery is the TiVo of the web. With iOS 9 content blockers, adblocking is going to go mainstream fast, and this is where the pullquote above falls apart: ad networks aren't going to get better, probably the opposite.
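A quick technical aside: a Safari content blocker is, at its core, an app extension that hands Safari a static JSON list of trigger/action rules. Here's a minimal sketch of such a rule list; ads.example.com is a placeholder domain, not a real network:

    [
      {
        "trigger": {
          "url-filter": "https?://ads\\.example\\.com/.*",
          "resource-type": ["script", "image"]
        },
        "action": { "type": "block" }
      }
    ]

Note how blunt the vocabulary is: a rule can block a load, block cookies, or hide an element. There is no rule type for "block only the obnoxious ads", which is rather the point.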

Today it's possible to make a living running a site that's free to read, solely because of ad revenue. Some can even make a good living. As adblocking grows more widespread, ads are going to get more intrusive to get around it, more guerrilla, and even bigger, all in a fight to make the same income off the dwindling flock that still isn't blocking ads. It'll happen to good people who run these sites. Despite their best intentions, their staff have families to feed, and if they just use this slightly larger ad and add an interstitial, things can stay the same for a while and no one has to be fired.

It would be unfair to blame them. It's human nature: millions and millions of sites aren't suddenly going to see the light at the same time and change their ways all at once. Even if they did, it's unlikely everyone would suddenly stop using adblockers because of this. Once the adblocker is installed, once web-ads have been poisoned by years of bad practices, ads aren't coming back.

John Gruber tweets:

I feel your pain, John. It's the same pain GigaOm felt when they died this year. It's not pretty. And I like Deck ads. They're nice. I agree they shouldn't be blocked. But they're still ads, and adblockers block ads. It's not your fault, it was that monkey ad, remember? Shoulder to shoulder, we stand. Love is a battlefield.

There is no ethical adblocker that blocks only the bad ads and leaves the "good" ones. I'd like to feel like an activist fighting for pure content when I install Marco Arment's $2.99 "Peace" ad blocker. I want to believe that by blocking ads, I help force positive change on the advertising companies (and the livelihoods that depend on them), force them to adapt.

But that's a beautiful illusion. What's more likely is that web ads are going to get way worse, adblocking is going to go way up, and at some point in this arms race, after the death of many a media company, some will indeed have adapted. The big question is whether you'll like the alternatives. It can be apps. It can be inside Apple News (featuring unblockable ads). It can be inside Facebook's Instant Articles. It can be subversive native ads. It can be paywalls. Think in-app purchases: "Pay $1 for this article, or pay by watching a video."

Nature will find a way. But we aren't suddenly going to wake up to rainbows and unicorns. No matter how cool that would be.


A version of this post originally appeared on Google+. Yes, that ghost town you may have heard of. Bring chains and white blankets, let's haunt things.

Apparently I Like Bad Movies

I watched Jupiter Ascending yesterday, and from the moment I saw flying roller blades, I was in love. The film is saturated with color, culture, style and fashion and detail. It has layers and layers and layers, it's creativity all the way down! Did you notice the design of the wooden bill the robot servitor bribed the bureaucrat with? It had the slickest little design and it was on screen for barely two seconds. The amount of work that went into this film was astounding, and apparently Rotten Tomatoes doesn't care, and that makes me sad.

It's not that I'd prefer everyone like the things I like. I'm routinely made fun of for thinking Timecop is a good movie, and for ranking Sky Captain and the World of Tomorrow close to Raiders of the Lost Ark on my list of all-time favorites. It's fine, we don't all have to agree; I'm comfortable with my taste in movies.

What gets me is that we'll probably never see another movie like Jupiter Ascending. We'll certainly never get a sequel. Neither did Serenity, or John Carter, or A Series of Unfortunate Events. Or Ninja Assassin. Yet they made Transformers 2, Transformers 3, Transformers 4, and they're making Transformers 5. That seems so wrong to me.

I understand how it works. The movies I mentioned did badly either at the box office, or critically, or both. Transformers 2, on the other hand, pulled in $836m on a $200m budget. Little did it matter that it is almost universally deemed bad. I did see the full thing and to this day regret not staring at drywall for two and a half hours instead. I don't often criticize things — okay, actually I do that a lot — but Transformers 2 deserves it. You could cut it down to a 30-minute short, and not only would the film be better, but there might actually be enough story to warrant its runtime.

Jupiter Ascending really didn't deserve the criticism it got. Even if the film wasn't for you, it had so many other things going for it: the elaborate space backdrops, the castle-like spaceships, the dresses, the sets, hell, even the spacesuits that looked like they were ornately carved from wood. Did I mention the flying roller-blades? Jupiter Ascending oozed creativity and worked on so many levels. I still can't think of a single level Transformers 2 worked on, and I played 100+ levels' worth of Desert Golfing.

Successful movies get sequels, and the Transformers franchise is like a piñata filled with money and shame. It's only natural that studio execs want to keep wailing on it with 2-by-4s. It's just so unfair.

Windows 11

I'm not sure Microsoft Windows will be around in a decade, and that makes me sad.

I used to pick Windows computers. I used to like the operating system and feel more productive on it. I'm sure the price point helped. I still miss full-size arrow keys and having a functional text-selection model, but today I'm decidedly a Mac user. I like that the terminal is a Unix terminal, and I like that I can uninstall an app by throwing it in the trash. My phone runs Android, and I like how sharing information between apps works, enough that I'm willing to put up with phones that are too big and cameras that aren't great. But there's no longer a place in my life for Windows. Sure, I run it in a virtual machine to test things, but that hardly counts.

Although Windows 8 was a nightmare hellride to actually use, I really liked how starkly new it felt compared to how operating systems have looked and functioned for decades. The Swiss design style [1] is something I never thought we'd see in computer interfaces. Going all in with it on Windows 8 was a ballsy and rather courageous move, even though it obviously didn't pan out. Turns out you can't just throw out decades of interface paradigms between versions, who knew? Windows 8 was a glorious failure, but it did include a new application runtime that's shared with Windows Phone, and it looks like Windows 10 will be fixing the UI wonkiness. I'm still left wondering if it'll be enough to turn things around.

I've been a big fan of new CEO Satya Nadella's work in the past year. He seems to be thinking what we've all been thinking for decades: it's weird that Microsoft hasn't been putting their apps on iOS and Android. Windows RT was stupid. No one is using Windows Phone.

But that last one is disconcerting to me. While I'm a happy Android user and a fan of iOS, a duopoly in smartphone platforms isn't good for anyone. I would prefer Microsoft to have a semi-successful presence in the mobile space, if only to keep Google and Apple on their toes. Most developers aren't going to voluntarily maintain an app for a platform that has only 3% of the market, and without apps, no one will adopt the platform. Recent news suggests Nadella understands this, and is giving their mobile efforts one final shot. The hope is that by making Windows 10 a free upgrade, app developers might have more incentive to use the new app runtime so their apps will run on desktop and mobile alike. I would think that if this strategy fails, Microsoft will more or less concede the smartphone form factor entirely.

On the one hand this seems like exactly the kind of tough choice a forward-looking CEO needs to make in order to ensure Microsoft has a future at all, but on the other hand it raises an even bigger question of where that leaves Windows for PCs if Microsoft concedes defeat on smartphones. While in the near term Windows for desktops and laptops is probably safe, in the longer term there are growing threats from Chrome OS, a potential Android on laptops, and apps running in the cloud. Even if Windows market share survives these challenges, the price, and therefore the revenue, of selling operating systems has been converging on zero for a while now. It's only a matter of time.

So what's Nadella's plan? When Windows revenue eventually drops to zero, and Microsoft has no platform (and therefore no app store with a revenue cut) on smartphones, what will be their livelihood? In order for Microsoft to stay in the consumer space and not become the next dull IBM, they'll need a source of income that is not Windows, and it's probably not hardware either, no matter how good the Surface Pro 3 was.

So what remains of Microsoft must be what Nadella bets on as the next source of income: Office, Xbox, various cloud services, and new things.

Microsoft has always been good at new things, but bad at productizing them. It seems Nadella has some skills in that area, so this will be an exciting space to watch in the next few years. Like all new ideas, though, it's a lottery ticket: you increase your chance of winning by buying one, but you might still not win.

The rest is tricky. The problem is that without owning the platform, it'll be orders of magnitude harder for Microsoft to sell their services. Unlike Google, Microsoft has to broker deals in order to have their apps preinstalled on Android phones, and though Android is pretty open, since they don't own the platform they'll always be subject to changing terms and APIs. Apple is a closed country entirely: you'll have to seek out and install Microsoft's apps if you want them, and even if you do, Microsoft's digital assistant will never be accessible from the home button. It's a steep uphill battle, but I really hope Microsoft finds new footing. Because, like birds, if life in one ecosystem turns miserable, I want to be able to migrate to another one, ideally a flourishing one. Oh, and I want to see how Windows looks when Microsoft turns it up to eleven.


  1. I refuse to call it Flat Design™ because that's a stupid term that suggests a flat sheet of color is somehow a recent invention.