The future of computing is mobile, they say, and they're not referring to that lovely city in Alabama. Being a fan of smartphones and their UI design, I've considered myself mostly in the loop with where things were going, but recently it's dawned on me I might as well have been lodged in a cave for a couple of years, only to just emerge and see the light. The future of computing is more mobile than I thought, and it's not actually the future. (It's now — I hate cliffhangers).
I had a baby a few years ago. That does things to you. As my friend put it, parenthood is not inaccurately emulated by sitting down and getting up again immediately. It's a mostly constant state of activity. Either you're busy playing with the child, caring for it, planning for how you're going to care for the child next, or you're worrying. There's very little downtime, and so that becomes a valuable commodity in itself.
One thing parents do — or at least this parent — is use the smartphone. I put things in my calendar and to-do list because if it's not in my calendar or to-do list I won't remember it. I don't email very much, because we don't do that, but I keep Slack in my pocket. I take notes constantly. I listen to podcasts and music on the device, I control what's on my television with it, and now I'm also doing my grocery shopping online. (I'd argue that last part isn't laziness if you have your priorities straight, and having kids comes with an overwhelming sense that you do — it's powerful chemistry, man.)
So what do I need my laptop for? Well, I'm an interface designer, so I need a laptop for work. But when I'm not working I barely use it at all. To put it differently, when I'm not working I don't need a laptop at all, and if I were in a different business I'd probably never pick one up. There's almost nothing important I can't do on my phone instead, and oftentimes the mobile experience is better, faster, simpler. By virtue of there being less real estate, there's just not room for clutter. It requires the use of design patterns and a focus on usability like never before. Like a sculptor chipping away every little piece that doesn't resemble the statue, a good mobile experience has to simplify until only what's important remains.
It's only in the past couple of years that the scope of this shift has become clear to me, and it's not just about making sure your website works well on a small screen. Computers have always done what they were told, but the interaction has been shackled by lack of portability and obtuse interfacing methods. Not only can mobile devices physically unshackle us from desks, but their limitations in size and input have required the industry to think of new ways to divine intent and translate your thoughts into bits. Speaking, waving, swiping, tapping, throwing and winking all save us from repetitive injuries, all the while being available to us on our own terms.
I'm in. The desktop will be an afterthought for me from now on — and the result will probably be better for it. I joked on Twitter the other day about watch-first design. I've now slept on it, and I'm rescinding the joke part. My approach from now on is tiny-screen first and then graceful scaling. Mobile patterns have already standardized a plethora of useful and recognizable layouts, icons and interactions that can benefit us beyond the small screen. The dogma of the desktop interface is now a thing of the past, and mobile is heralding a future of drastically simpler and better UI. The net result is not only more Instagrams browsed, it's more knowledge shared and learned. The fact that I can use my voice to tell my television to play more Knight Rider is just a really, really awesome side-effect.
As a fan of interface design, operating systems — Android, iOS, Windows — have always been a tremendous point of fascination for me. We spend hours in them every day, whether cognizant of that fact or not. And so any paradigm shifts in this field intrigue me to no end. One such paradigm shift that appears to be happening is the phasing out of the desktop metaphor, the screen you put a wallpaper and shortcuts on.
Windows 8 was Microsoft's bold attempt to phase out the desktop. Instead of the traditional desktop being the bottom of it all — the screen that was beneath all of your apps, which you would get to if you closed or minimized them — there's now the Start screen, a colorful bunch of tiles. Aside from the stark visual difference, the main difference between the traditional desktop and the Start screen is that you can't litter it with files. You'll have to either organize your documents or adopt the mobile pattern of not worrying about where files are stored at all.
Apple created iOS without a desktop. The bottom screen here was Springboard, a sort of desktop-in-looks-only, basically an app launcher with rudimentary folder support. Born this way, iOS has had pretty much universal appeal among adopters. There was no desktop to get used to, so no lollipop to be taken away. While sharing files between apps on iOS is sort of a pain, it hasn't stopped people from appreciating the otherwise complete lack of file management. I suppose if you take away the need to manage files, you don't really need a desktop to clutter up. You'd think this was the plan all along. (Italic text means wink wink, nudge nudge, pointing at the nose, and so on.)
For the longest time, Android seems to have tried to offer the best of both worlds. The bottom screen of Android is a place to see your wallpaper and the apps pinned to your dock. You can also put app shortcuts and even widgets here. Through an extra tap (so not quite the bottom of the hierarchy) you can access all of your installed apps, which, unlike on iOS, have to be put on your homescreen manually if so desired. You can actually pin document shortcuts here as well, though it's a cumbersome process, and as with iOS you can't save a file there. Though not elegant, the Android homescreen works reasonably well and certainly appeals to power users with its many customization options.
Microsoft and Apple both appear to consider the desktop (and file-management as a subset) an interface relic to be phased out. Microsoft tried and mostly failed to do so, while Apple is taking baby-steps with iOS. If recent Android leaks are to be believed, and if I'm right in my interpretation of said leaks, Android is about to take it a step beyond even homescreens/app-launchers.
One such leak suggests Google is about to bridge the gap between native apps and web apps, in a project dubbed "Hera" (after the mythological goddess of marriage). The mockups posted suggest apps are about to be treated more like cards than ever. Fans of WebOS¹ will quickly recognize this concept fondly.
The card metaphor that Android is aggressively pushing is all about units of information, ideally contextual. The metaphor, by virtue of its physical counterpart, suggests that a card holds a finite amount of information, after which you're done with it and can swipe it away. Like a menu at a restaurant, it stops being relevant the moment you know what to order. Similarly, business cards convey contact information and can then be filed away. Cards as an interface design metaphor are about divining what the user wants to do and grouping the answers together.
We've seen parts of this vision with Android Wear. The watch can't run apps and instead relies on rich, interactive notification cards. Android phones have similar (though less rich) notifications, but are currently designed around traditional desktop patterns. There's a homescreen at the bottom of the hierarchy, then you tap in and out of apps: home button, open Gmail, open email, delete, homescreen.
I think it's safe to assume Google wants you to be able to do the same (and more) on an Android phone as you can on an Android smartwatch, and not have them use two widely different interaction mechanisms. So on the phone side, something has to give. The homescreen/desktop, perchance?
The more recent leak suggests just that. Supposedly Google is working to put "OK Google" everywhere. The little red circle button you can see in the Android Wear videos will, when invoked, scale down the app you're in and show it as a card you can apply voice actions to. Presumably the already expansive list of Google Now commands would also be available: "OK Google, play some music" to start up an instant mix.
The key pattern I take note of here is the attempt to de-emphasize individual apps and instead focus on app-agnostic actions. Matias Duarte recently suggested that mobile is dead and that we should approach design by thinking about problems to solve on a range of different screen sizes. That notion plays exactly into this. Most users probably approach their phone with particular tasks in mind: send an email, take a photo. Having to tap a home button, then an app drawer, then an app icon in order to do this seems almost antiquated compared to the slick Android Wear approach of no desktop/homescreen, no apps. Supposedly Google may remove the home button, relegating the homescreen to be simply another card in your multi-tasking list. Perhaps the bottom card?
I'll be waiting with bated breath to see how successful Google can be in this endeavour. The homescreen/desktop metaphor represents, to many people, a comforting starting point. A 0,0,0 coordinate in a stressful universe. A place I can pin a photo of my baby girl, so I can at least smile when pulling out the smartphone to confirm that, in fact, nothing happened since last I checked 5 minutes ago.
Matias Duarte, current Android designer, used to work on WebOS ↩
There's a mantra in the WordPress development community: decisions, not options. It's meant to be a standard to which you hold any interface design decision: making a decision for users will ultimately serve them better than forcing them to make decisions themselves. It's a decent mantra; without that mindfulness you'll end up with feature creep and UI complexity, and it's orders of magnitude more difficult to remove an old option than it is to add one in the first place. Adding an option instead of making a decision for the user is almost always bad UI design.
Except when it's not.
The problem with a mantra like this is that it quickly gets elevated to almost biblical status. In the hands of a disgruntled developer it can be used to shoot down just about any initiative. Like Godwin's law for WordPress: once you drop the "decisions not options" bomb, rational discussion comes to a halt.
The thing about open source development is that it's much like natural evolution: it adapts to changes in the environment. Unfortunately that also means features that were once useful can become vestigial when the problem they used to solve becomes obsolete. Baggage like this can pile up over years, and maintaining backwards compatibility means it can be very difficult to get rid of. Sure, "decisions not options" can help cauterize some future baggage before it happens, but it's not a blanket solution either.
The problem is that sometimes the right decision is unfeasible, and an option beckons in its absence. WordPress is many things to many people. Some people use it for blogging, others use it for restaurants, portfolios, photo galleries, intranets, you name it. Every use case has its own set of needs and workflows, and it's virtually impossible to make a stock experience that's optimal for everyone. Most bloggers would arguably have a better experience with a slew of WordPress features hidden or removed, whereas site owners might depend on those very same features for dear life. By catering to many use cases at once, user experiences across the board are bound to be unfocused in some way or other.
The "Screen Options" tab is an example of a feature that would probably not exist were "decisions not options" taken at face value. Screen Options rests on the idea that not everyone needs to see the "Custom Fields" panel on their Add New Post page, yet acknowledges that some users will find that feature invaluable. It is an option added in the absence of a strong decision (for example, to limit WordPress to being a blogging-only tool). I consider it an example of an exception to the mantra for the good of the user. Sure, the UI could use some improvement — let's work on that — but I really appreciate the ability to hide the "Send Trackbacks" panel.
I'm a fan of WordPress. I'm a fan of good decisions, and I'm a fan of good UI design. I believe that if we relieve ourselves of arbitrary straitjackets and approach each design objective with a sense of good taste and balance, we can make excellent open source software. Cauterizing entire avenues of UI simply because they add options, however, negates the fact that sometimes those options exist in the absence of a higher-up decision that just can't be made for whatever reason.
There's a lot to like about the new iOS 7. As a whole, the result looks mostly unique. There's a nice clean aesthetic going with the thin Helvetica, the white UI chrome, the sandblasted layers and the almost complete absence of gaudy textures. It's also colorful. Which is a good thing. Right?
Leading up to this there were jungle-drums touting how flat the new UI was going to look (as though every UI will suddenly be clean and uncluttered if you just run it over with a bulldozer). Fortunately that's not what happened. Don't get me wrong: I do like my UIs to be clean and simple, I just find the term "flat" to be mostly meaningless when applied to design. There are no magic bullets, there's only good design and bad design, and I think Jony Ive gets that. So instead of trumpeting flat, Apple trumpeted true simplicity. Oh, and grid-based icons:
Sure, there's certainly a grid there. I was mostly paying attention to the light source for those gradients, though: why does Phone look embossed while Mail looks inset? Also: Game Center? Again?
There will be no tears shed for the linen texture. I will not mourn the loss of green felt. Still, the new iconography alone makes iOS 7 such a departure that there's bound to be some learning curve, which raises the question: why didn't they go further now that they were at it?
They had a real opportunity here. Jony could've said to his team:
Team! We've dominated the smartphone market for the last 5 years with a grid of round-rect icons. How do we re-think it from the ground up for the next decade? How do we create something that'll make Samsung scramble to copy us again?
Perhaps they did just that. Conceivably they created giant mood-boards. Maybe they decorated hip little cubicles with smiling model faces and photos of subway signs and collages of differently colored post-it notes. Could be they brain-stormed all the places they see the mobile space go in the next ten years: creepy glasses, holographic watches, voice-controlled smart underwear. No doubt they considered the convergence of the cloud with all these new-fangled features. Perchance they arrived right back at a grid of icons: Eureka! We had it right all along!
I hope that's not the case. I hope they had grander ideas… post-smartphone ideas. I'm hoping they were just so laser-focused on shipping on time that they had to punt their ideas for replacing Springboard. I'm hoping Jony felt the most important thing was to uproot the old linen-clad ways and set out a strong new direction for all future Apple UIs. I want to believe.
I want to believe that maybe one day we'll have smartphones whose strongest visual cues aren't defined by the graphical prowess of 3rd party icon designers. I want to believe that maybe one day we'll look back at websites that use confirm() to alert us of their mobile apps as a dark age. I want to believe that maybe one day it'll be possible to avoid all social interaction in a manner more impressive than tapping in and out of apps. Is that so much to ask?
In interface design, pretty is a secondary task. The primary task is to achieve a balance between form and function. In this article I would like to explain why this is important, and why it should be taken to heart by every interface designer.
The interface is what separates the user from the raw functionality of an application, game or website. In a car, the interface is the clutch, the accelerator, the brake and the steering wheel. It’s what makes the functionality accessible; it’s what makes it useful.
My primary objective in designing an interface is to empower the user and make as much of the functionality as possible accessible, transparently and easily. In a nutshell: my goal is to make sure that the form follows the function.
Essentially, the form/function mantra spells it out for me. It unveils before me a rather narrow path I have to tread in order for the end-result to uphold and respect this principle.
For websites and applications, conventions established by countless previous interfaces dictate where most things go. If I stick to these conventions and make sure functionality works like users expect, I don’t have to teach them a thing. Every time user assumptions prove right, they will feel more comfortable and empowered. In game interfaces, things are slightly different as the user willingly enters a fictional realm created purely for entertainment purposes. For the sake of having fun, users are willing to learn new rules.
Common to all interfaces, though, is that there is a time and a place for pretty, and that is more or less strictly dictated by the function of our product. Knowing when to prettify and when not to is crucial in ensuring a good product.
Leaving things alone can be difficult for a designer. It’s in our nature to want to add our personal touch, to want to show how pretty things can be. Alas, making things pretty rarely means making them useful, so before breaking rules set out by precedents, we’d better make sure that there’s a damn good reason for doing so. If we do not, we run the risk of simply adding visual clutter, confusing and detracting from the result with every stroke of the brush. There’s a reason most spoons look alike; it’s a tested design and it works pretty well.
Keep in mind, though: respecting functionality doesn’t mean we should simply perpetuate a design that we know works. Sometimes the formula can be improved upon and sometimes there’s simply no precedent. When this happens we need to rely on experience, gut feeling and extensive testing.
In my experience a good interface design goes by unnoticed. I jump in and know how things work, how to go about my business. The cog-wheels of the engine beneath grind, creak and turn exactly the way I expect them to. The badly designed interface, on the other hand, immediately stings my eye. I get annoyed that I have to learn how a particular feature works simply because the designer chose to “spiff it up,” when I know things could’ve been different, simpler and more useful. In some cases I become reluctant or simply too annoyed to explore features. Even if the designer means well by touching things up, doing so might just tip the delicate balance of form and function.
In the end, what we think is pretty will fade or change given time. What was pretty twenty years ago might not be pretty today. Mullets come and go like the tides, so learn to spot the mullet and try to avoid it. As interface designers we should teach ourselves to let go of our pride and put less focus on pretty. Taking the back seat to function is not a cop-out; it’s taking the high road. Walking the thin line between adding to and detracting from the functionality is no easy challenge. Who ever said simplicity was simple? The real challenge is to make things as pretty as possible within the confines given to us. If we can’t do that, we should settle for functional. After all, pretty is relative.
Earlier this month, Opera released their new browser. While testing Opera 9, I noticed the main browsing interface was radically different from that of Firefox. Namely, the browsing tabs were above the address-bar and primary navigation buttons (Back, Forward, Stop).
This got me thinking: if one could completely redesign the current browsing interface from the ground up, what would be the most logical and intuitive configuration?
Armed with only screenshots of Firefox and gut-feeling, I got to work on a configuration.
Zen Photo is a web application that allows you to create online photo albums. It supports automatic thumbnails and individual image comments. It has grown out of a desire to create simplicity among a myriad of complex alternatives.
I’ve been looking for just such an application for quite a while. Until just recently the landscape for web photo albums was bleak. Hence, I couldn’t refuse an invitation to design the default Zen Photo template, and the administration section.
A long while back, I stumbled upon a snippet of wisdom. Fortunately, I wrote it down, because the website that held this info is down. I have managed to track down the source to a Mr. Edward de Bono. His book, “Simplicity“, is available at Amazon.
The snippet of wisdom is related to achieving simplicity in designs. I am storing it here as much for your convenience as for mine.