The Old World Display

Maybe a decade ago, a web designer and friend of mine told me a classic "client from hell" story. The details have since become fuzzy, but the crux of it was a particular design the client wouldn't approve. One detail was off, a particular element that just wouldn't center properly in the layout (or so the client insisted). Thankfully the client had come up with a seemingly simple fix: just draw half a pixel! Who would've guessed that just a decade later, the Apple Retina display would herald the arrival of exactly that: the halfpixel?

While the term "retina" is mainly marketing chatter meant to imply that you can't see the pixels on the screen, it's not just an arbitrarily high resolution. What retina actually means is pixel-doubling. The 1024×768 iPad doubled to 2048×1536 when going retina, and while the implicit goal of this was clarity and crispness, the exact doubling of screen dimensions means UI elements scale perfectly: 1 pixel simply becomes 4 pixels. It's quite brilliant, and there's only one pitfall.
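
To make that concrete, here's a minimal sketch of how a stylesheet can serve pixel-doubled artwork to retina screens (the file names and class name are hypothetical). The logo occupies the same 200×100 CSS pixels either way; on a 2x display the @2x file simply maps four device pixels onto every one of them:

    .logo {
        width: 200px;
        height: 100px;
        background-image: url("logo.png");        /* 200×100 pixels */
        background-size: 200px 100px;
    }

    /* On pixel-doubled screens, swap in artwork with twice the dimensions. */
    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
        .logo {
            background-image: url("logo@2x.png"); /* 400×200 pixels */
        }
    }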

For a designer it's more than easy to get carried away with quadruple the canvas. In the plebeian resolution of yore, tiny UI elements had little room for error: icons had to be perfectly aligned to whole pixels, and ideally their stem weight would be a whole pixel value. The atom of the display — the pixel — was the tiniest unit of information possible, and if it wasn't mindfully placed it would blur and artifact under heavy antialiasing. In The Old World we had a term for this: pixel-perfect.

However, inside the retina display lives the halfpixel siren, and her song is alluring. She'll make you forget about The Old World. She'll draw tiny, tiny pixels for you, and she'll make your designs feel pixel-perfect even when they're not. It's an irresistible New World.

David Pierce reviewing the new iMac 5K for The Verge:

I drove an Audi and never looked at my Saturn the same way again. Remember the first time you used a capacitive touchscreen, threw your 56k modem out the window and switched to broadband, or switched from standard-def TV to 1080p?

It only took about ten minutes of using Apple’s new iMac with Retina display to make me wonder how I’m ever supposed to go back. Back to a world where pixels are visible on any screen, even one this big.

It's a good life in The New World. It's a good life here in the first world. It's so true: no one should have to endure the pin-pricking misery of looking at old-world 1x screens! (Actually, my 35-year-old eyes aren't good enough to see pixels on my daughter's Etch A Sketch, but I can still totally empathize with how completely unbearable the pain must be.)

It gets worse — are you sitting down? Here it comes: most screens aren't retina. One wild guess puts the population of non-retina desktop users (or old-worlders, as I call them) at 98.9%.

That's not good enough, and it's time to fight for the halfpixel. We're at a fateful crucible in history, and I can see only two possible paths going forward. Either we start a "retina displays for oil" program to bring proper high resolutions to the overwhelming majority of people who even have computers, or we have to live with ourselves knowing that these old-worlders will endure not only disgustingly low resolutions, but all the extra blur and artifacts that result from future computer graphics being designed on retina screens and never tested for crispness on 1x screens¹.

Oh well, I suppose there's a third option. I suppose we can all wake up from this retina-induced bong haze and maybe just once in a while take one damn look at our graphics on an old world display.


  1. Hey Medium… We need to talk.  

Mobile

The future of computing is mobile, they say, and they're not referring to that lovely city in Alabama. Being a fan of smartphones and their UI design, I've considered myself mostly in the loop with where things were going, but recently it's dawned on me I might as well have been lodged in a cave for a couple of years, only to just emerge and see the light. The future of computing is more mobile than I thought, and it's not actually the future. (It's now — I hate cliffhangers).

I had a baby a few years ago. That does things to you. As my friend put it, parenthood is not inaccurately emulated by sitting down and getting up again immediately. It's a mostly constant state of activity. Either you're busy playing with the child, caring for it, planning for how you're going to care for the child next, or you're worrying. There's very little downtime, and so that becomes a valuable commodity in itself.

One thing parents do — or at least this parent — is use the smartphone. I put things in my calendar and to-do list because if it's not in my calendar or to-do list I won't remember it. I don't email very much, because we don't do that, but I keep Slack in my pocket. I take notes constantly. I listen to podcasts and music on the device, I control what's on my television with it, and now I'm also doing my grocery shopping online. (I'd argue that last part isn't laziness if you have your priorities straight, and having kids comes with an overwhelming sense that you do — it's powerful chemistry, man.)

So what do I need my laptop for? Well, I'm an interface designer, so I need a laptop for work. But when I'm not working I barely use it at all. To put it differently: when I'm not working, I don't need a laptop at all, and if I were in a different business I'd probably never pick one up. There's almost nothing important I can't do on my phone instead, and oftentimes the mobile experience is better, faster, simpler. By virtue of there being less real estate, there's just no room for clutter. It requires the use of design patterns and a focus on usability like never before. Like a sculptor chipping away every little piece that doesn't resemble the statue, a good mobile experience has to simplify until only what's important remains.

It's only in the past couple of years that the scope of this shift has become clear to me, and it's not just about making sure your website works well on a small screen. Computers have always done what they were told, but the interaction has been shackled by a lack of portability and obtuse interfacing methods. Not only can mobile devices physically unshackle us from desks, but their limitations in size and input have required the industry to think of new ways to divine intent and translate your thoughts into bits. Speaking to, waving at, swiping, tapping, throwing and winking at our devices all save us from repetitive strain injuries, all the while being available to us on our own terms.

I'm in. The desktop will be an afterthought for me from now on — and the result will probably be better for it. I joked on Twitter the other day about watch-first design. I've now slept on it, and I'm rescinding the joke part. My approach from now on is tiny-screen first, then graceful scaling. Mobile patterns have already standardized a plethora of useful and recognizable layouts, icons and interactions that can benefit us beyond just the small screens. The dogma of the desktop interface is now a thing of the past, and mobile is heralding a future of drastically simpler and better UI. The net result is not only more instagrams browsed, it's more knowledge shared and learned. The fact that I can use my voice to tell my television to play more Knight Rider is just a really, really awesome side-effect.

Good Decisions, Else Options

There's a mantra in the WordPress development community: decisions, not options. It's meant to be a standard to which you hold any interface design decision: if you make a decision for users it'll ultimately be better than forcing them to make decisions themselves. It's a decent mantra — if you're not mindful you'll end up with feature creep and UI complexity, and it's orders of magnitude more difficult to remove an old option than it is to add one in the first place. Adding an option instead of making a decision for the user is almost always bad UI design.

Except when it's not.

The problem with a mantra like this is that it quickly gets elevated to almost biblical status. In the hands of a disgruntled developer it can shoot down just about any initiative. It's like Godwin's law for WordPress: once you drop the "decisions, not options" bomb, rational discussion comes to a halt.

The thing about open source development is that it's much like natural evolution: the software adapts to changes in its environment. Unfortunately that also means features once useful can become vestigial when the problems they solved become obsolete. Baggage like this piles up over the years, and maintaining backwards compatibility means it can be very difficult to get rid of. Sure, "decisions, not options" can prevent some future baggage from accumulating, but it's not a blanket solution either.

The problem is: sometimes the right decision is unfeasible, and an option beckons in its absence. WordPress is many things to many people. Some people use it for blogging, others use it for restaurants, portfolios, photo galleries, intranets, you name it. Every use case has its own set of needs and workflows, and it's virtually impossible to make a stock experience that's optimal for everyone. Most bloggers would arguably have a better experience with a slew of WordPress features hidden or removed, whereas site owners might depend on those very same features for dear life. By catering to many use cases at once, user experiences across the board are bound to be unfocused in some way or other.

The "Screen Options" tab is an example of a feature that would probably not exist were "decisions not options" taken at face value. Screen Options exists on the idea that not everyone needs to see the "Custom Fields" panel on their Add New Post page, yet acknowledges that some users will find that feature invaluable. It is an option added in absence of a strong decision, for example, to limit WordPress to be a blogging-only tool. I consider it an example of an exception to the mantra for the good of the user. Sure, the UI could use some improvement, let's work on that, but I really appreciate the ability to hide the "Send Trackbacks" panel.

I'm a fan of WordPress. I'm a fan of good decisions, and I'm a fan of good UI design. I believe that if we relieve ourselves of arbitrary straitjackets and approach each design objective with a sense of good taste and balance, we can make excellent open source software. Cauterizing entire avenues of UI simply because they add options, however, ignores the fact that sometimes those options exist in the absence of a higher-up decision that just can't be made, for whatever reason.

Font Smoothing

If you're really into icon fonts, which I have recently become, you may have noticed a tiny storm brewing in the suburbs of the internet. It's about CSS-specified font smoothing. Quite a niche topic, one you can live a perfectly good life without ever knowing about. You may in fact sleep better by not reading on.

Still here? Alright, here's the deal. WebKit — born of Safari, engine of Chrome — allows web developers to specify how the edges of fonts are smoothed. The modern default font smoothing method is called subpixel antialiasing. It smoothes font edges by addressing the red, green and blue subpixels within each LCD pixel individually, and in nearly all cases it drastically improves the rendering of letters. If you look at the text through a magnifying glass, though, you'll notice a nearly imperceptible blue haze on the left side of each letter, and a red haze on the right side. WebKit lets web developers pick which type of font smoothing is applied: subpixel-antialiased, antialiased, or none. Handy, right?
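
In CSS, that choice looks like this (the selectors are just for illustration):

    /* WebKit's non-standard font smoothing switch: */
    .default   { -webkit-font-smoothing: subpixel-antialiased; } /* the RGB-fringed default */
    .grayscale { -webkit-font-smoothing: antialiased; }          /* plain grayscale smoothing */
    .jagged    { -webkit-font-smoothing: none; }                 /* no smoothing at all */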

The controversy is that a number of people — smart people — feel that this CSS property is damaging to the readability of text on the web. There are very long articles on the topic. In fact, quite recently a Google employee removed the CSS property from Chrome, citing the notion that the browser should render text according to the operating system. There's just one problem: icon fonts.

Icon fonts are custom-made webfonts that contain no letters, only icons. The purpose is to have fast access to a bunch of icons in a very lightweight and easy way in your web designs. Other benefits include the fact that the icons are infinitely scalable because they're vector graphics, and that you can apply any color, drop-shadow or even a gradient to each icon using plain CSS. Sounds brilliant, doesn't it?
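
To illustrate, here's the kind of plain-CSS styling an icon glyph invites — a minimal sketch, with a hypothetical class name:

    .icon-save {
        color: #c0392b;                            /* recolor the glyph */
        font-size: 64px;                           /* scale it losslessly */
        text-shadow: 0 1px 2px rgba(0, 0, 0, .4);  /* drop-shadow, no extra assets */
    }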

The only downside is that an icon font is still technically a font, so the computer thinks each icon is actually a letter and will, by default, try to subpixel antialias it. While subpixel antialiasing does wonders for letters, it fuzzes up icons with color fringes and makes them look blurry. Which is why the -webkit-font-smoothing property was so welcome: seen side by side, an icon font with and without subpixel antialiasing is night and day.
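
The fix is a one-liner. Here's a minimal sketch of an icon font setup with the smoothing switched to grayscale (the font name and file are placeholders):

    @font-face {
        font-family: "MyIcons";
        src: url("myicons.woff") format("woff");
    }

    .icon {
        font-family: "MyIcons";
        /* Grayscale smoothing keeps glyph edges crisp instead of color-fringed: */
        -webkit-font-smoothing: antialiased;
    }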

As you can imagine, I'm strongly in favor of not only keeping the font-smoothing property, but in fact expanding it beyond WebKit to both Firefox and Internet Explorer. Icon fonts won't be a truly viable webdesign technique until every icon looks great on all the platforms.

"But SVG is the future of vector graphics on the web, surely you know that!" — Yes I do. But pragmatically speaking, that future is not here yet. SVG support is still lacklustre, especially when used as CSS backgrounds. More importantly, you can't easily change the color of an SVG icon using CSS only, or apply a drop-shadow. Yes, drop-shadows are on the road map for SVG, but the way it'll happen is not pretty. Icon fonts, on the other hand, provide a real-world solution today, which is both flexible and infinitely scalable. So next time you see someone bad-mouthing -webkit-font-smoothing, pat them on the head and mention icon fonts. The more you know.

Honestly Flat

The authors of LayerVault blog about the flat design era — a move towards simpler UI design with fewer bevels and textures and more flatness. I endorse this movement too much to let this pass me by without comment.

While one side of the mouth yells “good design is how it works,” the other side mumbles that great aesthetics mean realism. It doesn’t need to be this way. Designing honestly means recognizing that things you can do with screens and input devices can’t be done with physical objects — more importantly that we shouldn’t try copying them. It takes too much for granted. Can you imagine your pristine iPhone built into the body of an antique telephone handset? Is that beautiful design?

I really can't help but agree with these points, but I'm thinking perhaps they're overthinking it. Having been in this business for over a decade now (yep, I'm old), I feel like I've arrived at a couple of simple truths. One of them I arrived at reluctantly, and it took a while to accept: there's no such thing as good UI design, there's only bad UI design.

To elaborate: when you see a good design, chances are you don't notice it. Because it's a good design, it's already set you on your way to your next destination, offering a clean and simple path on your journey. On the flipside, a bad design has you stumbling in your tracks, wondering: where's the phone number for this restaurant? Whether the UI is full of linen does not necessarily matter¹; what matters is that you found what you were looking for with the least possible friction. The point is: good design is good design, no matter how you arrive at it.

Aside from good design and bad design, there's also the design that creates an emotional connection within you. Not only does it step into the background when you have a specific goal, but it will reach inside you and shake something when you don't know where you're going. I'm not sure textures, bevels, or even glorious flatness can do this. In my experience, only tone can do this. The tone of your wording, your phrasing, or even your kooky layout may elicit a smile in your viewer, and your large personal photos may tug at a heartstring somewhere. So it's about being personal, and the best way I know to do this is to let the content shine.

Because of these two lessons — design to help users along on their journey, and let content shine — my love affair with gloriously flat UI design is not so much a matter of being honest as it is a matter of getting to the point where you're feeling the design you're working on, as fast as possible. Any big design project is born of agony and a feeling of insufficiency. The more pieces you put in play, the more functionality you sketch out, the faster those feelings subside, until at some point they turn into pride.

If I start my process with a canvas of linen, I'll have already limited my playing field to a very narrow path that's needlessly hard to find and follow.

For me, keeping things flat for as long as possible is like leaving your body and seeing your project from high above. Suddenly the answer to the question: "should I put a linen texture here?" becomes easy. That answer is that it doesn't matter. So I usually leave it out.


  1. I hate myself for actually saying that, because the linen texture is the worst  

Pretty Is Relative

In interface design, pretty is a secondary task. The primary task is to achieve a balance between form and function. In this article I would like to explain why this is important, and why it should be taken to heart by every interface designer.

The interface is what separates the user from the raw functionality of an application, game or website. In a car, the interface is the clutch, the accelerator, the brake and the steering wheel. It’s what makes the functionality accessible; it’s what makes it useful.

My primary objective in designing an interface is to empower the user and make as much of the functionality as possible accessible, transparently and easily. In a nutshell: my goal is to make sure that form follows function.

Essentially, the form/function mantra spells it out for me. It unveils before me a rather narrow path I have to tread in order for the end result to uphold and respect this principle.

For websites and applications, conventions established by countless previous interfaces dictate where most things go. If I stick to these conventions and make sure functionality works like users expect, I don’t have to teach them a thing. Every time user assumptions prove right, they will feel more comfortable and empowered. In game interfaces, things are slightly different as the user willingly enters a fictional realm created purely for entertainment purposes. For the sake of having fun, users are willing to learn new rules.

Common to all interfaces, though, is that there is a time and place for pretty, and that time and place is more or less strictly dictated by the function of our product. Knowing when to prettify and when not to is crucial in ensuring a good product.

Leaving things alone can be difficult for a designer. It’s in our nature to want to add our personal touch, to want to show how pretty things can be. Alas, making things pretty rarely means making them useful, so before breaking rules set out by precedents, we’d better make sure that there’s a damn good reason for doing so. If we do not, we run the risk of simply adding visual clutter, confusing and detracting from the result with every stroke of the brush. There’s a reason most spoons look alike; it’s a tested design and it works pretty well.

Keep in mind, though: respecting functionality doesn’t mean we should simply perpetuate a design that we know works. Sometimes the formula can be improved upon and sometimes there’s simply no precedent. When this happens we need to rely on experience, gut feeling and extensive testing.

In my experience a good interface design goes unnoticed. I jump in and know how things work, how to go about my business. The cogwheels of the engine beneath grind, creak and turn exactly the way I expect them to. A badly designed interface, on the other hand, immediately stings my eye. I get annoyed that I have to learn how a particular feature works simply because the designer chose to “spiff it up,” when I know things could’ve been different, simpler and more useful. In some cases I become reluctant or simply too annoyed to explore features. Even if the designer means well by touching things up, doing so might just tip the delicate balance of form and function.

In the end, what we think is pretty will fade or change given time. What was pretty twenty years ago might not be pretty today. Mullets come and go like the tides, so learn to spot the mullet and try to avoid it. As interface designers we should teach ourselves to let go of our pride and put less focus on pretty. Taking the back seat to function is not a cop-out; it’s taking the high road. Walking the thin line between adding to and detracting from the functionality is no easy challenge. Who ever said simplicity was simple? The real challenge is to make things as pretty as possible within the confines given to us. If we can’t do that, we should settle for functional. After all, pretty is relative.