Windows 11

I'm not sure Microsoft Windows will be around in a decade, and that makes me sad.

I used to pick Windows computers. I used to like the operating system and feel more productive on it. I'm sure the price point helped. I still miss full-size arrow keys and having a functional text-selection model, but today I'm decidedly a Mac user. I like that the terminal is a Unix terminal, and I like that I can uninstall an app by throwing it in the trash. My phone runs Android, and I like how sharing information between apps works, enough that I'm willing to put up with phones that are too big and cameras that aren't great. But there's no longer a place in my life for Windows. Sure, I run it in a virtual machine to test things, but that hardly counts.

Although Windows 8 was a nightmare hellride to actually use, I really liked how starkly new it felt compared to how operating systems have looked and functioned for decades. The Swiss design style[1] is something I never thought we'd see in computer interfaces. Going all in with this on Windows 8 was a ballsy and rather courageous move, even though it obviously didn't pan out. Turns out you can't just throw out decades of interface paradigms between versions, who knew? Windows 8 was a glorious failure, but it did include a new application runtime that's shared with Windows Phone, and it looks like Windows 10 will be fixing the UI wonkiness. I'm still left wondering if it'll be enough to turn things around.

I've been a big fan of new CEO Satya Nadella's work in the past year. He seems to be thinking what we've all been thinking for decades: it's weird that Microsoft hasn't been putting their apps on iOS and Android. Windows RT was stupid. No-one is using Windows Phone.

But that last one is disconcerting to me. While I'm a happy Android user and fan of iOS, a duopoly in smartphone platforms isn't good for anyone. I would prefer Microsoft to have a semi-successful presence in the mobile space, if only to keep Google and Apple on their toes. Most developers aren't going to voluntarily maintain an app for a platform that only has 3% of the market, and without apps, no-one will adopt the platform. Recent news suggests Nadella understands this, and is giving their mobile efforts one final shot. The hope is that by making Windows 10 a free upgrade, app developers might have more incentive to use the new app runtime so their apps will run on desktop and mobile alike. I would think if this strategy fails, it's likely Microsoft will more or less be conceding the smartphone form factor entirely.

On the one hand this seems like exactly the kind of tough choice a forward-looking CEO needs to make in order to ensure Microsoft has a future at all, but on the other hand it raises an even bigger question: where does that leave Windows for PCs if Microsoft concedes defeat on smartphones? While in the near term Windows for desktops and laptops is probably safe, in the longer term there are growing threats from Chrome OS, a potential Android on laptops, and apps running in the cloud. Even if Windows market share survives these challenges, the price and therefore revenue of selling operating systems has been converging on zero for a while now. It's only a matter of time.

So what's Nadella's plan? When Windows revenue eventually drops to zero, and Microsoft has no platform (and therefore app store with a revenue cut) on smartphones, what will be their livelihood? In order for Microsoft to stay in the consumer space and not become the next dull IBM, they'll need a source of income that is not Windows, and it's probably not hardware either, no matter how good the Surface Pro 3 was.

What remains of Microsoft must be what Nadella bets on as the next source of income: Office, Xbox, various cloud services and new things.

Microsoft has always been good at new things, but bad at productizing them. It seems Nadella has some skills in that area, so this will be an exciting space to watch in the next few years, but like all new ideas, it's a lottery ticket: you increase your chance of winning by buying one, but you might still not win.

The rest is tricky. The problem is that without owning the platform it'll be orders of magnitude harder for Microsoft to sell their services. Unlike Google, Microsoft has to broker deals in order to have their apps preinstalled on Android phones, and though Android is pretty open, since they don't own the platform they'll always be subject to changing terms and APIs. Apple is a closed country entirely: you'll have to seek out and install their apps if you want them, and even if you do, Microsoft's digital assistant will never be accessible from the home button. It's a steep, uphill battle, but I really hope Microsoft finds new footing. Because, like the birds, if life in one ecosystem turns miserable, I want to be able to migrate to another one, ideally a flourishing one. Oh, and I want to see how Windows looks when Microsoft turns it up to eleven.


  1. I refuse to call it Flat Design™ because that's a stupid term that suggests a flat sheet of color is somehow a recent invention.  

Archive, Don't Delete

I'm one of the lucky … actually I have no idea how many or few have Google Inbox. In any case, I was graciously sent an invite, and have been using it on the web and on my Android phone since then. I love almost everything about it. I particularly love the fact that Inbox seems to be able to divine what archetype an email has. Is it spam? Don't show it to me. Is it travel-related? Bundle it up. Same with purchases, social network notifications, promos, etc. It even does a good job of prioritizing each bundle, and only showing notifications when it thinks something is urgent — configurable of course. It's pretty great.

I don't love how hard it is to delete an item. You have to dive down deeply into an overflow menu on a particular email to find the "Trash" button. I wish it were more easily accessible — I don't know man, I guess I'm a deleter. I remember buying a 320 MB hard drive called "Bigfoot" because it was so humongous, but even then I had to manage my space in order to fit everything. So I can't help but feel like this is a generational issue, and I'm now a relic of the past. It had to happen eventually, and I'm getting a really strong vibe that the ceremonial burial of the trash button was very much intentional. It's behaviorism: teaching you not to delete, because archiving is faster and safer.

The crux of the Inbox app is its embrace of the idea that an email is a task. This runs contrary to the popular notion that you should keep those two paradigms as separate as you can, so it's very interesting to see Google leaning into it. Combined with their concept of "bundles", I think it works.

Let's walk through it: it's Monday morning and you just arrived at the office to open up your email. You received a couple of promos from Spotify and Amazon in one bundle, an unbundled email from mom, 9 bundled Facebook notifications, and two shipping notifications in a bundle. The one email worth looking at is immediately obvious, so you can either tap "Done" on the "Promos", "Purchases" and "Social" bundles to end up with only the one email, or you can pin mom's email and tap the "Sweep" button. Everything but the email that needs your attention is archived and marked "Done", and it took seconds.

This is how Inbox is supposed to work. You archive tasks you're done with, you don't delete. If something important did happen to be in one of the tasks you quickly marked done, it's still there, accessible via a quick search. If you get a lot of email, I really do believe that embracing Inbox will take away stress from your daily life. All it asks is that you let go of your desire to manage your archive. You have to accept that there are hundreds of useless Facebook notification emails in your archive, emails you'd previously delete. It's okay, they're out of sight, out of mind, and no you won't run out of space because of them. Checking 9 boxes and then picking the delete button, as opposed to simply clicking one "Done" button — the time you spend adds up, and you need to let go.

I know this. I understand this. As a web designer myself, I think there are profound reasons for hiding the delete button. It's about letting machines do the work for you, so you can do more important things instead, like spending time with your family. It's the right thing to do. And I'm not quite ready for it yet. Can I have the trash button be a primary action again, please, Google?

Atheism

Every once in a while, the topic of religion (or the lack thereof) comes up in discussion among my friends and me. I often try to explain what atheism is, or actually what it isn't, and almost like clockwork it comes up: sounds a lot like religion. It's an innocent statement, but it also means my explanation failed yet again. It's a rousing topic full of nuance and passion, no matter the religion, agnosticism or atheism of the participants in the discussion. And it fascinates me so because it's supposed to be simple! After all, it's just semantics:

atheism (noun): disbelief or lack of belief in the existence of God or gods

religion (noun): the belief in and worship of a superhuman controlling power, especially a personal God or gods.

Clearly just by looking at the dictionary, one seems incompatible with the other. All the delicious nuance stems from the fact that the term "god" is part of both definitions.

Quick intermezzo before we get into the weeds: I have many friends with a multitude of different religions, people whom I love and respect deeply. I'm not here to take anyone's faith away. This is not about whether religion is a force for good or not; there are far more intelligent debates to be had elsewhere. I just like discussing semantic nuance.

What makes it so difficult to pin down is the fact that atheism is really just a word in the dictionary. We're not even very protective of such words, so we change their meanings from time to time. New information comes to light! The term evolves and mutates and comes to include more meaning still. Looking broadly, though, the definition of atheism forks in two general directions. One direction has it defined mainly as a disbelief in a god or gods, while the other considers it a lack of belief in a god or gods. Did you catch the difference between the two? It's quite subtle, yet substantial.

Disbelief means you believe there are no gods. You've put your two and two together, and decided hey — it just doesn't make sense to me. This is unlike religion in a couple of obvious ways, first of all the fact that there's no holy text that describes what you must or must not believe. There's no promise of an afterlife or lack thereof if you don't, err, not believe in god. There's no codex of laws you have to follow to be a "true" atheist. And there are no places you can go to meet other atheists to, uh, not pray with. (Actually you can still say a prayer if you want to, it's not like The Atheist Police comes knocking on your door if you do).

The absence of belief, on the other hand, is a bit trickier to pin down. If for whatever reason you never learned about god, well, then you are without belief in god. How could you believe in something you never heard of? Take my daughter for instance. She's 3, and she's only been talking for the past year or so. I don't think anyone has told her about religion, not that I know of at least. So she is, by definition, without belief in god. Literally atheos — Greek for "without god(s)". It wasn't her choice, how could she even make one? I'm not even sure she'd understand what I was talking about if I tried — she'd probably ask for her juicebox and crayons. From this perspective, being an atheist is, in many ways, a default position. It's what you're born as. Even if you later in life find solace and happiness in religion, until you found that religion you were, for all intents and purposes, an atheist. There's no shame in that, it's just a word.

I half expect some readers (thanks for reading 737 words discussing semantics by the way) to ask me: why so defensive, are you sure you're not describing a religion? Sure, once in a while you'll encounter someone who takes their atheism so seriously it borders on being a religious experience for them. But that's fine, they can call themselves atheists too. It's not like you get a badge at the door. Atheism isn't organized behind a hashtag, and it's not about ethics in games journalism.

You are an atheist until you choose not to be, and there's room for all of us.

The Old World Display

Maybe a decade ago, a web designer and friend of mine told me a classic "client from hell" story. The details have since become fuzzy, but the crux of the story revolved around a particular design the client wouldn't approve. There was this one detail that was off, a particular element that just wouldn't center properly in the layout (or so the client insisted). Thankfully the client had come up with a seemingly simple fix: just draw half a pixel! Who would've guessed that just a decade later, "The Apple Retina Display" would herald the arrival of just that: the halfpixel?

While the term "retina" is mainly marketing chatter meant to imply that you can't see the pixels on the screen, it's not just about making the display arbitrarily high-resolution. The trick is pixel-doubling. The 1024×768 iPad doubled to 2048×1536 when going retina, and while the implicit goal of this was clarity and crispness, the exact doubling of screen dimensions means UI elements scale perfectly: 1 pixel simply becomes 4 pixels. It's quite brilliant, and there's only one pitfall.
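To make the arithmetic concrete, here's a minimal sketch (in browser TypeScript, not anything Apple ships) of how a web page might pick between a 1x and a 2x asset. The "@2x" naming convention and the file names are assumptions for illustration only.

```typescript
// Sketch: choose a 1x or 2x asset based on the device's pixel ratio.
// window.devicePixelRatio is a standard browser property; the naming
// scheme ("icon.png" vs. "icon@2x.png") is an assumed convention.
function assetUrl(baseName: string, extension: string): string {
  const ratio = window.devicePixelRatio || 1;
  // On a pixel-doubled ("retina") display, one layout pixel maps to a
  // 2×2 block of device pixels, so the image needs twice the width and
  // height (four times the pixels) to stay crisp.
  const suffix = ratio >= 2 ? "@2x" : "";
  return `${baseName}${suffix}.${extension}`;
}

const icon = new Image();
icon.src = assetUrl("icon", "png"); // "icon@2x.png" on a retina screen
```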

For a designer it's more than easy to get carried away with quadruple the canvas. In the plebeian resolution of yore, tiny UI elements had little room for error: icons had to be perfectly aligned to even pixels, and ideally their stem weight would be a whole pixel-value. The atom of the display — the pixel — was the tiniest unit of information possible, and if it wasn't mindfully placed it would blur and artifact under heavy antialiasing. In The Old World we had a term for this: pixel-perfect.

However inside the retina display lives the halfpixel siren, and her song is alluring. She'll make you forget about The Old World. She'll draw tiny tiny pixels for you and she'll make your designs feel pixel-perfect, even when they're not. It's an irresistible New World.

David Pierce reviewing the new iMac 5K for The Verge:

I drove an Audi and never looked at my Saturn the same way again. Remember the first time you used a capacitive touchscreen, threw your 56k modem out the window and switched to broadband, or switched from standard-def TV to 1080p?

It only took about ten minutes of using Apple’s new iMac with Retina display to make me wonder how I’m ever supposed to go back. Back to a world where pixels are visible on any screen, even one this big.

It's a good life in The New World. It's a good life here in the first world. It's so true: no-one should have to endure the pin-pricking misery of looking at old-world 1x screens! (Actually, my 35-year-old eyes aren't good enough to see pixels on my daughter's Etch A Sketch, but I can still totally empathize with how completely unbearable the pain must be).

It gets worse — are you sitting down? Here it comes: most screens aren't retina. One wild guess puts the population of non-retina desktop users (or old-worlders as I call them) at 98.9%.

That's not good enough, and it's time to fight for the halfpixel. We're at a fateful crucible in history, and I can see only two possible paths going forward. Either we start a "retina displays for oil" program to bring proper high resolutions to the overwhelming majority of people who even have computers, or we just have to live with ourselves knowing that these old-worlders will have to endure not only disgustingly low resolutions, but indeed all of the added extra blur and artifacts that will result from future computer graphics being designed on retina screens and not even tested for crispness on 1x screens[1].

Oh well, I suppose there's a third option. I suppose we can all wake up from this retina-induced bong haze and maybe just once in a while take one damn look at our graphics on an old world display.


  1. Hey Medium… We need to talk.  

The One Platform Is Dead

I used to strongly believe the future of apps would be rooted in web-technologies such as HTML5. Born cross-platform, they'd be really easy to build, and bold new possibilities were just around the corner. I still believe webapps will be part of the future, but recently I've started to think it's going to be a bit more muddled than that. If you'll indulge me, the explanation will be somewhat roundabout.

The mobile era in computing, more than anything, helped propel interface design patterns ahead much faster than decades of desktop operating systems did. We used to discuss whether your app should use native interface widgets or if it was okay to style them. While keeping them unstyled is often still a good idea, dwelling on it would be navel-gazing, as it's no longer a day-and-night indicator of whether an app is good or not. In fact we're starting to see per-app design languages that cross not only platforms, but codebases too. Most interestingly, these apps don't suck! You see it with Google rolling out Material Design across Android and web-apps. Microsoft under Satya Nadella is rolling out their flatter-than-flat visual language across not only their own Windows platforms, but iOS and Android as well. Apple just redesigned OS X to look like iOS.

It feels like we're at a point where traditional usability guidelines should be digested and analyzed for their intent, rather than taken at dogmatic face value. If it looks like a button, acts like a button, or both, it's probably a button. What we're left with is a far simpler arbiter for success: there are good designs and there are bad designs. It's as liberatingly simple as not wearing pants.

dogma (noun): a principle or set of principles laid down by an authority as incontrovertibly true

The dogma of interface design has been left by the wayside. Hired to take its place is a sense of good taste. Build what works for you and keep testing, iterating and responding to feedback. Remembering good design patterns will help you take shortcuts, but once in a while we have to invent something. It either works or it doesn't, and then you can fix it.

It's a bold new frontier, and we already have multiple tools to build amazing things. No one single technology or platform will ever "win", because there is no winning the platform game. The operating system is increasingly taking a back seat to the success of ecosystems that live in the cloud. Platform install numbers will soon become a mostly useless metric for divining who's #winning this made-up war of black vs. white. The ecosystem is the new platform, and because of that, it's easier than ever to switch from Android to iOS.

It's a good time to build apps. Come up with a great idea, then pick an ecosystem. You'll be better equipped to decide what type of code you'll want to write: does your app need only one platform, several, or should it be cross-platform? It's only going to become easier: in a war of ecosystems, the one that's the most open and spans the most platforms will be the most successful. It'll be in the interest of platform vendors to run as many apps as possible, whether through multiple runtimes or just simplified porting. It won't matter if you wrote your app in HTML5, Java, or C#: on a good platform it'll just work. Walled gardens will stick around, of course, but it'll be a strategy that fewer and fewer companies can support.

No, dear reader, I have not forgotten about Jobs' Thoughts on Flash. Jobs was right: apps built on Flash were bad. That's why today is such an exciting time. People don't care about the code behind the curtain.

If it's good, it's good.

Mobile

The future of computing is mobile, they say, and they're not referring to that lovely city in Alabama. Being a fan of smartphones and their UI design, I've considered myself mostly in the loop with where things were going, but recently it's dawned on me I might as well have been lodged in a cave for a couple of years, only to just emerge and see the light. The future of computing is more mobile than I thought, and it's not actually the future. (It's now — I hate cliffhangers).

I had a baby a few years ago. That does things to you. As my friend put it, parenthood is not inaccurately emulated by sitting down and getting up again immediately. It's a mostly constant state of activity. Either you're busy playing with the child, caring for it, planning for how you're going to care for the child next, or you're worrying. There's very little downtime, and so that becomes a valuable commodity in itself.

One thing parents do — or at least this parent — is use the smartphone. I put things in my calendar and to-do list because if it's not in my calendar or to-do list I won't remember it. I don't email very much, because we don't do that, but I keep Slack in my pocket. I take notes constantly. I listen to podcasts and music on the device, I control what's on my television with it, and now I'm also doing my grocery shopping online. (I'd argue that last part isn't laziness if you have your priorities straight, and having kids comes with an overwhelming sense that you do — it's powerful chemistry, man.)

So what do I need my laptop for? Well I'm an interface designer, so I need a laptop for work. But when I'm not working I barely use it at all. To put it differently, when I'm not working, I don't need a laptop at all, and if I were in a different business I'd probably never pick one up. There's almost nothing important I can't do on my phone instead, and oftentimes the mobile experience is better, faster, simpler. By virtue of there being less real estate, there's just not room for clutter. It requires the use of design patterns and a focus on usability like never before. Like a sculptor chipping away every little piece that doesn't belong, a good mobile experience has to simplify until only what's important remains.

It's only in the past couple of years that the scope of this shift has become clear to me, and it's not just about making sure your website works well on a small screen. Computers have always been doing what they were told, but the interaction has been shackled by lack of portability and obtuse interfacing methods. Not only can mobile devices physically unshackle us from desks, but their limitations in size and input have required the industry to think of new ways to divine intent and translate your thoughts into bits. Speaking, waving at, swiping, tapping, throwing and winking all save us from repetitive injuries, all the while being available to us on our own terms.

I'm in. The desktop will be an afterthought for me from now on — and the result will probably be better for it. I joked on Twitter the other day about watch-first design. I've now slept on it, and I'm rescinding the joke part. My approach from now on is tiny-screen first and then graceful scaling. Mobile patterns have already standardized a plethora of useful and recognizable layouts, icons and interactions that can benefit us beyond just the small screens. The dogma of the desktop interface is now a thing of the past, and mobile is heralding a future of drastically simpler and better UI. The net result is not only more instagrams browsed, it's more knowledge shared and learned. The fact that I can use my voice to tell my television to play more Knight Rider is just a really, really awesome side-effect.
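For what it's worth, here's a minimal sketch of what "tiny-screen first and then graceful scaling" can look like in code, using the browser's standard matchMedia API. The 768px breakpoint and the class names are made up for illustration; the point is only that the compact layout is the default and the wide one is the enhancement.

```typescript
// Sketch: the small screen is the baseline; wider layouts are the upgrade.
// The 768px breakpoint and the class names are arbitrary assumptions.
const wideScreen = window.matchMedia("(min-width: 768px)");

function applyLayout(isWide: boolean): void {
  // Toggle between a compact, mobile-first layout and an expanded one.
  document.body.classList.toggle("layout-wide", isWide);
  document.body.classList.toggle("layout-compact", !isWide);
}

applyLayout(wideScreen.matches);
wideScreen.addEventListener("change", (event) => applyLayout(event.matches));
```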

The Plot To Kill The Desktop

As a fan of interface design, operating systems — Android, iOS, Windows — have always been a tremendous point of fascination for me. We spend hours in them every day, whether we're cognizant of that fact or not. And so any paradigm shifts in this field intrigue me to no end. One such paradigm shift that appears to be happening is the phasing out of the desktop metaphor, the screen you put a wallpaper and shortcuts on.

Windows 8 was Microsoft's bold attempt to phase out the desktop. Instead of the traditional desktop being the bottom of it all — the screen that was beneath all of your apps which you would get to if you closed or minimized them — there's now the Start screen, a colorful bunch of tiles. Aside from the stark visual difference, the main difference between the traditional desktop and the Start screen is that you can't litter it with files. You'll have to either organize your documents or adopt the mobile pattern of not worrying about where files are stored at all.

Apple created iOS without a desktop. The bottom screen here was Springboard, a sort of desktop-in-looks-only, basically an app-launcher with rudimentary folder-support. Born this way, iOS has had pretty much universal appeal among adopters. There was no desktop to get used to, so there was no lollipop to take away. While sharing files between apps on iOS is sort of a pain, it hasn't stopped people from appreciating the otherwise complete lack of file-management. I suppose if you take away the need to manage files, you don't really need a desktop to clutter up. You'd think this was the plan all along. (Italic text means wink wink, nudge nudge, pointing at the nose, and so on.)

For the longest time, Android seems to have tried to offer the best of both worlds. The bottom screen of Android is a place to see your wallpaper and apps pinned to your dock. You can also put app shortcuts and even widgets here. Through an extra tap (so not quite the bottom of the hierarchy) you can access all of your installed apps, which, unlike on iOS, have to be manually put on your homescreen if so desired. You can actually pin document shortcuts here as well, though it's a cumbersome process and like with iOS you can't save a file there. Though not elegant, the Android homescreen works reasonably well and certainly appeals to power-users with its many customization options.

Microsoft and Apple both appear to consider the desktop (and file-management as a subset) an interface relic to be phased out. Microsoft tried and mostly failed to do so, while Apple is taking baby-steps with iOS. If recent Android leaks are to be believed, and if I'm right in my interpretation of said leaks, Android is about to take it a step beyond even homescreens/app-launchers.

One such leak suggests Google is about to bridge the gap between native apps and web-apps, in a project dubbed "Hera" (after the mythological goddess of marriage). The mockups posted suggest apps are about to be treated more like cards than ever. Fans of WebOS[1] will quickly recognize this concept fondly.

The card metaphor that Android is aggressively pushing is all about units of information, ideally contextual. The metaphor, by virtue of its physical counterpart, suggests it holds a finite amount of information, after which you're done with the card and can swipe it away. Like a menu at a restaurant, it stops being relevant the moment you know what to order. Similarly, business cards convey contact information and can then be filed away. Cards as an interface design metaphor are about divining what the user wants to do and grouping the answers together.

We've seen parts of this vision with Android Wear. The watch can't run apps and instead relies on rich, interactive notification cards. Android phones have similar (though less rich) notifications, but are currently designed around traditional desktop patterns. There's a homescreen at the bottom of the hierarchy, then you tap in and out of apps: home button, open Gmail, open email, delete, homescreen.

I think it's safe to assume Google wants you to be able to do the same (and more) on an Android phone as you can on an Android smartwatch, and not have them use two widely different interaction mechanisms. So on the phone side, something has to give. The homescreen/desktop, perchance?

The more recent leak suggests just that. Supposedly Google is working to put "OK Google" everywhere. The little red circle button you can see in the Android Wear videos will, when invoked, scale down the app you're in and show it as a card you can apply voice actions to. Presumably the already expansive list of Google Now commands would also be available; "OK Google, play some music" to start up an instant mix.

The key pattern I take note of here is the attempt to de-emphasize individual apps and instead focus on app-agnostic actions. Matias Duarte recently suggested that mobile is dead and that we should approach design by thinking about problems to solve on a range of different screen sizes. That notion plays exactly into this. Most users probably approach their phone with particular tasks in mind: send an email, take a photo. Having to tap a home button, then an app drawer, then an app icon in order to do this seems almost antiquated compared to the slick Android Wear approach of no desktop/homescreen, no apps. Supposedly Google may remove the home button, relegating the homescreen to be simply another card in your multi-tasking list. Perhaps the bottom card?

I'll be waiting with bated breath to see how successful Google can be in this endeavour. The homescreen/desktop metaphor represents, to many people, a comforting starting point. A 0,0,0 coordinate in a stressful universe. A place I can pin a photo of my baby girl, so I can at least smile when pulling out the smartphone to confirm that, in fact, nothing happened since last I checked 5 minutes ago.


  1. Matias Duarte, current Android designer, used to work on WebOS  

Penicillin

My baby has an inner ear infection. Oftentimes these ailments disappear on their own. Other times they get real bad. Thankfully we have Penicillin, which fixes it right up.

For now.

One day in 1928 — it was a Friday — the Scotsman Alexander Fleming went about his daily business at St. Mary's Hospital in London. He was working in his laboratory when he discovered he'd forgotten to close up a petri dish of bacteria from the night before. What he noticed would change the world: a mould had grown in that petri dish, and in a halo around that mould the bacteria had stopped growing. What Alexander Fleming had discovered would save tens of millions of lives in the century to come: this natural mould exuded a substance that had antibiotic properties. Little more than a decade later we had Penicillin, and on this Friday in 2014, Penicillin is helping cure my baby girl. Thank you, Alexander Fleming.

There's a problem, though. Penicillin is a wonderful drug, but bacteria — just like humans — evolve and grow stronger. Put a drop of Penicillin in a petri dish of bacteria and the bacteria will die. Probably. There's a tiny chance some of those bacteria will survive due to a random Penicillin-resistant mutation. Those lucky few survivors might reproduce and migrate. Repeat this process for a century and you're bound to have a couple of strains of bacteria to which even the strongest of Penicillins are useless.

We knew this would happen. Yet still to this day, Penicillin is used on a grand scale in meat production, of all things. When cattle have particularly bad living conditions, when too many cows are huddled up in too little space, they'll inflict little scratches on each other, wounds that might heal naturally on a green field of grass. But if your living quarters are also where you go to the toilet, no such luck. Hey, thought the meat industry, we can just pump the cattle full of Penicillin and no bacteria will grow in those wounds!

The way we treat our cattle is troublesome enough, but the inevitable consequences should be alarming. Those dirty farms and cattle transports are evolutionary crucibles for resistant bacteria. The strong bacteria will survive and require stronger Penicillins. It's an evolutionary arms race and we're losing. We always knew bacteria would evolve to be Penicillin-resistant eventually, but if we'd been smart about our Penicillin usage, we might've had enough time to research functional alternatives. As it stands, I'm worried about a future dad and his daughter battling an infection maybe just ten years from now. I hope she'll be alright, man.

So I guess here's another reason you should eat organic meat. Or no meat, that works too.

Chromecast

Ordered the Google Chromecast the other day. It's a little HDMI dongle you put into your TV to make it smarter. Amazing gadget, I must say, it's been a while since I was this excited about a piece of electronics. It's not that it's that full-featured — right now it's only actually useful if all you need is YouTube and Netflix (which happens to be the case for me) — rather, it's the implications of the device that excites me.

It doesn't have a remote control, and the device does nothing on its own. The remote is your phone or your tablet or your desktop. All the device does is receive streams from the internet, and you "suggest" those streams from your handheld. In essence it downgrades your "smart-TV" (or in my case, upgrades my dumb-TV) into being simply a display capable of receiving input. It removes every single bit of UI and interaction from the television itself, and propels it onto that thing you have in your pocket regardless.
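To illustrate that division of labor (and this is purely a hypothetical sketch, not the actual Cast protocol or SDK), the messages going from your handheld to the dongle could be as small as this:

```typescript
// Conceptual sketch only, not the real Cast SDK. It illustrates the split:
// the handheld sends a tiny "play this" message, and the dongle fetches
// the stream from the internet by itself.
type CastCommand =
  | { action: "load"; contentUrl: string }
  | { action: "pause" }
  | { action: "stop" };

// Hypothetical sender side (phone, tablet, or laptop).
function makeRemote(sendToDongle: (cmd: CastCommand) => void) {
  return {
    // The handheld never relays video, just a suggestion of what to fetch.
    suggest: (url: string) => sendToDongle({ action: "load", contentUrl: url }),
    pause: () => sendToDongle({ action: "pause" }),
  };
}

// Hypothetical receiver side (the dongle), where the actual streaming happens.
function handleCommand(cmd: CastCommand): void {
  switch (cmd.action) {
    case "load":
      console.log(`Fetching ${cmd.contentUrl} straight from the internet…`);
      break;
    case "pause":
    case "stop":
      console.log(`Received ${cmd.action} command`);
      break;
  }
}

// Wire the two together locally just to show the shape of the message flow.
const remote = makeRemote(handleCommand);
remote.suggest("https://example.com/some-show.m3u8");
remote.pause();
```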

The concept alone blew my mind when the implications sank in. I doubt it's controversial to say that television UIs have sucked for decades. Just pick up your remote control and look at it, chances are you'll find more than twenty buttons, 90% of which you've used only once. Alright, maybe you picked up an Apple TV remote — vast improvement over most other remotes, but why is that? Right: fewer buttons. Which is why requiring that all interaction happen on your smartphone is such a novel idea: by virtue of being a sheet of capacitive glass, your television remote now has only the buttons necessary, and only when you need them.

It's just great.

What's even better is not having to switch on your television and change to the "HDMI" channel. The Chromecast is always listening for input, so if you tell it to play Netflix, it'll turn on your TV for you, on the right channel no less. When you turn off the television again (alright, I suppose you do need your remote for that — and for volume), your Netflix app will pause the show you were watching. 

This is how television is supposed to work. They've cracked it.

Yeah sure, it's early. Most people will need set-top boxes for a while still. For a 1.0, however, the Chromecast is remarkable. If only Netflix would auto-play the next episode in a TV show, if only Pocket Casts were Chromecast-enabled… But hey, this dongle auto-updates transparently in the background. Who knows, maybe next time I turn on the television, there it is. It is Chrome-based, after all.