Incidentally, the new Pixel homescreen applies icon normalization, which resizes icons that do not follow guidelines:
Icon normalization isn’t perfect, though, and it doesn’t prevent app builders from creating wildly different icons. The move to round icons feels like a last-ditch attempt at reaching for some kind of consistency in app icons, even if the icons themselves lose a great deal of creativity in the translation:
The thing is, outside of doing basic icon normalization, the Pixel phones don’t enforce the round icons, so unless a developer actually makes one, you’ll still end up with a mix of styles.
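Google hasn’t published the exact normalization algorithm, but the gist — shrink any icon whose visible artwork spills outside a “keyline” area — can be sketched in a few lines. The canvas and keyline sizes below are illustrative assumptions, loosely modeled on Android’s 192 px launcher asset, not the real implementation:

```python
def normalization_scale(content_size, keyline_size=176):
    """Return the factor by which to shrink an icon so its visible
    content fits within the keyline area. Icons already inside the
    keyline are left untouched (factor 1.0)."""
    if content_size <= keyline_size:
        return 1.0
    return keyline_size / content_size


def normalized_size(content_size, keyline_size=176):
    """Apply the scale factor and round back to whole pixels."""
    return round(content_size * normalization_scale(content_size, keyline_size))
```

Under these made-up numbers, an icon whose artwork spans the full 192 px canvas gets shrunk to 176 px, while a modest 160 px icon is left alone — which matches the behavior in the screenshot: oversized icons get smaller, compliant ones don’t change.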
There Is No Single Style
Having consistent and beautiful app icons throughout an operating system is a wonderful dream, one that I’m sure many designers have had. But so long as app developers have to provide this icon themselves, the styles are going to vary as much as all human art does. It’s human nature: an expression of individuality, a difference in taste, and a result of varying degrees of design time invested.
It might even be done intentionally for differentiation. While having an icon that blends in, fits, looks right among the other icons on the platform might make users happy, the app developers might want their icon to stand out (even if like a sore thumb), scream to the sky: look at me, my developers have families to feed, launch me.
Many attempts have been made at producing consistency where none is found; icon normalization is just one of them. Samsung phones put all icons on a squircle badge, while Xiaomi phones offer multiple approaches, from badging and masking to huge icon packs that replace every icon on your homescreen.
Perhaps the most successful approach has been that of the iPhone, probably due to the rather strict limitations imposed: no transparency, therefore no custom silhouettes. Every icon gets cropped into a rounded rectangle. Even then, icons on the iPhone differ wildly in style.
The story is the same on Windows and macOS. Windows 8 and 10 courageously tried to retire the app icon in favor of live tiles that could even be resized, giving the user more control over the aesthetics, and recommended flat, white, fairly easily designed motifs:
Spot the odd one out
… but even then, no matter how specific your guidelines for app icon designs are, designers can choose to not follow them.
Even if the guidelines were enforced, I suppose, there’s no guarantee the results would be consistent.
The Why of App Icons
It’s always prudent to ask the question: what problems are we trying to solve? Why are there app icons in the first place? I suspect Microsoft asked the same question, leading them to try live tiles.
The obvious answer is that app icons exist because apps exist, and apps work in a specific way. Your operating system provides a platform on which an app runs, and it’s then the job of the app to solve problems, give information, make you productive. If you need to write text, you open a writing app. If you need to edit photos, you open a photo editing app. This is how it’s always been. But will it stay this way?
We have to go deeper!
Why did you want to write text? Why did you want to edit a photo? What happens to the text when you’re done writing, are you going to send it somewhere, publish it? What happens to the photo when you’re done editing?
What if the operating system was intent-based instead of just a platform for a wild west of apps? You want to call someone, use the operating system dialer, but use it with WhatsApp. Want to make a reminder? Use voice actions, but save it in Evernote. Want to listen to music? Press the play button, which you mapped to Spotify instead of Apple Music.
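To make that idea concrete, here’s a toy sketch of what such an intent registry might look like — the OS owns the verbs, and the user maps each verb to whichever app should handle it. Every name here (the verbs, the package strings, the function) is invented for illustration; nothing like this API exists today:

```python
# A toy intent registry: the OS defines the verbs ("call", "reminder",
# "music"), and the user remaps each one to a preferred handler.
# All handler names are illustrative, not a real API.

DEFAULT_HANDLERS = {
    "call": "system.dialer",
    "reminder": "system.notes",
    "music": "system.music",
}


def resolve_intent(intent, user_overrides=None):
    """Return the app that should handle an intent, preferring the
    user's own mapping over the system default."""
    handlers = dict(DEFAULT_HANDLERS, **(user_overrides or {}))
    if intent not in handlers:
        raise ValueError("no handler registered for intent: " + intent)
    return handlers[intent]


# The user from the example above: calls go through WhatsApp,
# reminders land in Evernote, the play button launches Spotify.
my_overrides = {"call": "whatsapp", "reminder": "evernote", "music": "spotify"}
```

With those overrides, `resolve_intent("music", my_overrides)` returns `"spotify"`, while any verb the user never touched quietly falls back to the system default — the app becomes a plugin behind an OS-level verb rather than a destination with an icon.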
We’re getting ahead of ourselves, and I still have that upcoming post about how apps might look in the future — Update: here’s that post. But the point is — perhaps app icons don’t really matter in the future. Perhaps the platforms of the future finally break the shackles of “a grid of icons” and relieve us of the menial tasks of jumping in and out of apps. In such a future, perhaps an app doesn’t even need an icon — perhaps the app is more like a plugin that hooks into touch points of the operating system, replacing or giving alternatives to what’s already there.
Keeping such a future in mind, it’s hard not to look at Google’s round-icon efforts and smile like a loving parent: aaw that’s so cute, bless your heart for trying!
I like round icons. I like silhouetted icons. I like really well-designed icons, and I love that Google and Microsoft are trying their best to foster consistency among app icons. But perhaps the battle won’t be relevant in a few years. There will still be good designs and bad designs, of course, but perhaps that battle won’t be fought on your homescreen or desktop.
Let’s discuss, for the moment, the Home button. It’s the quintessential interface on smartphones.
As pioneered by the iPhone, its behavior arguably defined the user experience of the rest of the operating system. It suggested apps were a modal experience, and that a press of the button would always exit the app and take you home. In-app navigation would have to happen elsewhere. The home button conveyed a simplicity of interface that immediately resonated. Despite doing only a single thing, it was worth spending an entire physical button on it.
Since then, the button has taken on many new behaviors. First off came the double-click, which you could tie to a custom shortcut. Eventually that would open the multi-tasking tray. We got a long-press that would fire up Siri. When the form factor grew beyond 4 inches, we received an optional “reachability” feature that would require a double-tap (not click). Triple click could be mapped to an accessibility feature.
Before we go into the feasibility of adding more gestures to the previously single-purpose iPhone home-button, let’s look at what Android did. In the early days, it wasn’t pretty. There would be physical buttons for Back, Menu, Home, and Search. Home and Back buttons worked (pretty much) like you’d expect, but Menu and Search were problematic. The former, in particular, meant that apps had no visible UI for extra features. You’d open an app, and some aspects would be buried in an invisible menu you had to click the menu button to see. It was classic mystery meat navigation, and Search was almost as bad. Search would open an app’s contextual search feature, if it existed. Otherwise it wouldn’t do anything. On the home screen, search would fire up Google. In the browser it would set focus on the addressbar. I may be misremembering bits here, can you blame me? Oh, and most buttons had longpress. Longpress back, I believe, would open up history in the browser. Longpress home would access voice navigation, I think. Longpress search, for whatever reason, would invoke the multitasking menu.
There were a lot of qualifiers there, suggesting I might be misremembering. That’s because invisible features are almost always a bad idea. Hiding possibly critical app features in an invisible menu limited their discoverability to those willing to play whack-a-mole. Same with search, and frankly, longpress. Before we get to that, though: Android 4.0 Ice Cream Sandwich (technically Android 3.0 Honeycomb, though I’m not sure anyone will recognize that version) improved things a fair bit, and the Android systembar has been fairly stable ever since. Here’s the latest iteration:
For the purposes of this post, we’ll be discussing the Android “spec” version of the systembar, which features these three onscreen buttons in “back, home, overview” order. Some OEMs, like Samsung or HTC, would make the buttons physical, or flip around their order.
Android has a Back button, a Home button, and an “Overview” (multitasking) button. Initially, there were no longpress actions tied to any of the buttons. Instead, a vertical swipe would invoke Google Now, or Search, whatever your phone came installed with.
Redesigning an interface is like dancing a waltz. It’s usually two steps forward, one step back. In the case of the Android systembar, the swipe-up-to-enter-Google-Now gesture was crazy-making. I would accidentally invoke it every now and again, but my daughter playing with drawing apps would invoke it constantly and lose her place. Eventually (5.0 Lollipop, I believe) the swipe gesture was replaced by a longpress, and in 6.0 Marshmallow you could even disable the longpress.
The Goldilocks Principle
The elephant in the room is the burning question: which is better? What’s the right number of system buttons? Should buttons be physical or on-screen?
In the end I think that’s mostly a question of personal taste, preference, who’s using it, and what one is used to. However if the purpose of an interface design is finding a beautiful balance of usability and discoverability, we can probably extrapolate a few general guidelines.
First of all, basic usability favors visibility. Anything you bury under non-standard gestures such as double-click or longpress is going to go unnoticed by most people, and only utilized once such a gesture is learned to be necessary. Ever see someone double-clicking a hyperlink in a web browser? That’s arguably bad user experience design resulting in the wrong lesson: double-click to make sure it works. Online stores are still paying the price for this, having to disable the buy button once it’s been pressed, or verbosely spelling out: “only press once”.
The swipe gesture is a powerful gesture when used right. In older Androids it invoked search, which was a terrible decision — as mentioned it was too easy to invoke, and just not useful enough to warrant such an action. When used to swipe between homescreens, or scroll a page, however, there are few better gestures.
Which brings us to the Back action. On Android it’s a permanent systembutton. It mostly does what you expect it to do, except for a few cases where it doesn’t. It’s supposed to always take you back to the previous screen, but what if you just launched an app and accidentally hit back, should you exit out to the homescreen? On iPhone, there isn’t a system back button. Instead, back-buttons are added by the developer to the top-most toolbar of an app, ensuring they are contextual and pretty predictable. They’re even labelled (most of the time), which is hard to beat from a usability perspective. On the flipside, they can be far away from your thumb, especially on large screen phones like the Plus models. To alleviate that, there’s the side-swipe, for lack of a better term: you swipe right from the left edge of the screen to go back to the previous screen you were on. Not quite as predictable, perhaps, as swiping left and right inside a photo gallery, but certainly convenient. The main downside of the side-swipe is that it can limit what apps can do with the same gesture. Once a gesture is reserved by the system, you should probably be mindful to not interfere with it.
Incidentally, the side-swipe gesture suggests when it might be appropriate to add multiple actions to systembuttons: poweruser features. If a feature is complementary but not required learning in order to use an interface, it can add a lot of value to the experience. The side-swipe seems to do that — you don’t have to learn it in order to use an iPhone; Back buttons are still there. Android 7.0 Nougat has a similar power-user feature that is absolutely not required learning: double-tap the multitasking button to quickly switch to your last used app. It’s like alt-tab for your smartphone. I’m not finding it easy to defend mapping the multitasking feature to double-click on the iPhone, however. It seems like a feature so valuable you’d want to make it easily discoverable.
Siri vs. Google Assistant
We come to it at last, the long-press features. On the iPhone, holding the home button opens Siri. Same with Android 5.0+, which would fire up Google Now, and with the upcoming Pixel phones, Google Assistant.
Pixel phones feature a little colored-dot animation as you press and hold, which seems to be an attempt at adding discoverability to the feature. I don’t buy it. The onscreen buttons are small already so you’re likely to cover the animation with your thumb, and nothing beats a label regardless. The Google Assistant is available through other means, though. The homescreen — the launcher as it’s called — has a big Google logo tab in the top left corner, which also invokes it. You can also use the “OK Google” hotword. So while the home-button longpress isn’t very discoverable per se, the Assistant itself should be. That embraces the poweruser nature of adding longpress features to systembuttons.
Siri is less discoverable. You have to enable it first (otherwise you just get “voice control”). Then it’s there on the longpress. You can also enable hotword detection, “Hey Siri”, to invoke it. There isn’t a widget you can add, or an app icon you can tap. Perhaps that’s okay, though. Perhaps the assistants don’t have to be discoverable, yet. They pass the litmus test of “can you use and enjoy the phone without it?”, and in both cases I’ve never used them for much other than setting timers and reminders.
I can’t help but feel like that’s about to change. Assistants, neural networks and AI seem to be encroaching ever more on the smart devices we use. Today it’s smart replies in Google Inbox and on smart watches. But what do the interfaces of tomorrow look like?
It feels like most of the design of the original iPhone sprung from the concept of having that singular home button. What would a phone look like if that button, instead of taking you home, was your assistant?
One thing that’s always boggled me is the increasing fascination with drinking coffee out of cardboard cups. Part of my fascination stems from the fact that it seems like a huge waste of resources — you could be making virtual reality goggles from that cardboard! But most of my curiosity has to do with the indisputable fact that coffee tastes worse out of cardboard.
I consider that truth to be self-evident. The delicious nectar that is coffee deserves better than to be carried in laminated paper and guzzled through a tiny hole in a plastic lid.
I get it, I get it, you can take a cardboard cup with you on your commute, or as you’re walking down the street, or as you’re waiting in line, and coffee through a plastic lid is better than no coffee at all.
But I’ve seen things you people wouldn’t believe. I’ve seen people in non-commute non-transit situations pick cardboard over glorious porcelain — intentionally and of their own volition and not while under duress (I checked to make sure).
Usually I can even all I need to. But when it comes to coffee, and porcelain is an option, and you still choose cardboard… I’m all out of evens. At that point, I can’t even.
One time I stopped a good friend as he was doing it, and like a concerned parent I asked him why he would pick the cardboard over the porcelain when both options were right there in front of him, in plain sight, literally on the table. He answered:
I like the idea that if I need to go somewhere, I can take the coffee with me.
Okay, that’s actually a fair point.
I mean, I’d just gulp down the coffee from the porcelain and then go ahead and grab another one in cardboard to bring along, but sure, the above is a cogent argument.
Still, I can’t help but feel like cardboard coffee is becoming a status symbol outside of just mobility: “Look at me, I’m on the move!” And that makes me sad. It makes me even sadder than it does when people add sugar to their coffee.
It was a day like any other, and it was an innocent-looking article like any other. But on this one day, it was this one particular article, and this one paragraph in particular, that missed my good side entirely:
Jim Carrey brought us our first live-action taste of Count Olaf from A Series of Unfortunate Events, but Netflix's upcoming TV series adaptation is (thankfully) going in a different direction.
It's an article from The Verge. It's nothing out of the ordinary, and I like this publication perfectly fine. I hold no grudges against the author either — this could've been published in any fine recent publication. All's good on that front. I'm also not a particular Jim Carrey fan these days, that doesn't change a thing. It's just, this one day, that last sentence got to me.
but Netflix's upcoming TV series adaptation is (thankfully) going in a different direction
On this day — it'd been snowing, by the way, it was rather pretty outside — this one sentence reminded me of everything I loathe about modern online discourse. I read this sentence — and I invite you to correct me, gosh I hope so much that I'm wrong about this — as an off-hand dismissive critique of the older film Lemony Snicket's A Series of Unfortunate Events, starring Jim Carrey, Emily Browning and Liam Aiken. The sentence seems to suggest that this older film is so atrociously bad that the new Netflix series (which I welcome) thankfully goes in a different direction. THANKFULLY! THANKFULLY!!
It's fun how bright the day can look when snow covers the ground. Yet inside of me, my heart held only darkness.
The 2004 movie is one of my favorite movies of all time. I dare you to watch the following deleted scene, and not at least have a tiny appreciation for the music and the visuals. Gorge on those trees.
This scene ends on a simple note: There's always something. And this film has just that: something. I heartily recommend it to you. Watch it tonight, I'm sure you can stream it.
While I mourn the lack of a sequel, I'm not objecting to a Netflix remake, I welcome it. It's a wonderful story, I'd love more. What I mourn is that we can't respect creative work for what it is. Today we apparently have to hate something that wasn't a runaway box office success and the beginning of an endless franchise.
There are no levels on which I don't adore this film. It's okay if you don't.
Lately it just feels like everyone hates everything. Because it’s easier to dismiss something than to like it. Because if you like something, you put yourself out there. You reveal to the world what makes you happy, what makes you cry, what makes you reflect on, cope with, or just enjoy life. Someone might make fun of you for loving something, so it’s easier to hate it. It breeds a culture of antagonism, and while it might protect you from occasional ridicule by people not worth your time, it also insulates you from possibly discovering something amazing.
If only we could see past arbitrary notions of what's cool to like, and judge movies and music and books on their own merits. Because there's always something.
It happened. My 4 year old has found a franchise to latch on to. It’s not ideal: the one thing I’m the most allergic to in the world is horses. But if she’s into ponies she’s into ponies and there’s nothing I can do about that except embrace it. She’s got the toys, she’s got the bed-blanket, she’s got the t-shirt, and her favorite pony is Rainbow Dash. It’s a thing.
As an overprotective curling-dad, I consider it my solemn duty to learn about this thing that’s absorbing her attention. So I have been watching the show with her, trying to soak up the pony lore, learn of the details that make up this equestrian construct.
The show follows Twilight Sparkle, a purple unicorn, as she visits “Ponyville” — the shining gem of the land of Equestria. You know… from equus in Latin? Horse-land? Get it?
Twilight makes friends in Ponyville. Several of them. And she’s taught that though they are all different in appearance, interests, personality and even race, their friendship is the most important thing there is. When they’re all together, their friendship is literally magic. It’s in the tagline.
Sounds good right? It’s perfectly fine that my daughter watches such a diverse, female-positive and all-embracing show, right?
One of my favorite episodes of Lost — bear with me — is the one where wheelchair-bound John Locke cries “Don’t tell me what I can’t do!” and then goes on a walkabout. This is at the core of the values I want my daughter to learn: if she can dream it, she can do it. For that reason I already know the answer to questions she might one day pose to me: “Can I be an astronaut, dad?” YES. “Can I be at the Olympics, dad?” YES. She’ll learn eventually that it might not be a walk in the park, but there’s no reason she should have some sort of arbitrary mental block put in place by me, preventing her from even trying.
Which brings me to Equestria. In Ponyville, there are three races of ponies. The ponies you know, unicorns who have magical powers, and pegasi who can fly and make it rain. They all live and work together seemingly in perfect glittering harmony.
How does this even work? How aren’t the only-ponies perpetually jealous of the other two races?
Ponies are literally born with predisposed skills. Unicorns have magic powers, one of them being that they can write. Pegasi can fly. Sorry Applejack, I suppose you have to manually pluck those apples for selling on the market to make ends meet. If only you were a unicorn you could just use magic, but hey, life’s tough right? Applejack is basically caste-blocked from ever advancing beyond her racially defined place in society.
The fact that only unicorns can write has its own problems. History is written by those who can, well, write… right? I hope everyone trusts the unicorns to be truthful. Better not upset them.
Ever noticed how My Little Ponies have back-tattoos? Applejack has apples, Pinkie Pie has balloons. Those are literal coming-of-age tattoos. Puberty isn’t mentioned, but it’s implied that once a pony reaches that age, whatever “talent” they have is stamped on their back. Forever. A visual indicator of what you are.
The stamps are called cutiemarks.
Back-tattoos aside (some of those are really lovely, I’m sure) I don’t know that I appreciate the idea that you even can have a talent as such—how about those 10,000 hours? What about multiple “talents”: which one gets stamped on you? And why does your one talent need to be permanently advertised to the world? What if your talent is not showering? If you’ll indulge me as I recall a history lesson about mechanical vs. organic societies, this “know your place” undercurrent that permeates Ponyville is a trait I do not find attractive. Also, if I am to ever get a back-tattoo I want it to be something I choose to get. Probably a Japanese glyph I think means “fire” but in fact means “toast”. Something I can laugh at years down the line, not something that forever defines my place in the world.
Another observation: every single pony in Ponyville is either beautifully styled and coiffed at all times, or an unsightly donkey dragging a cart with a grumpy look on its face. In fact, I don’t think I’ve seen a single handsome donkey on the show. They’re like Morlocks.
One of the dude-ponies was called “Shining Armor”. A bit on the nose, eh, Lauren Faust? Also, why weren’t there any girl knights? My daughter happens to love playing knights and princesses. She’s the knight, I’m the princess.
I don’t know what the lesson is. I think I wanted to vet the show, but having now watched one too many episodes with my daughter on the couch, I’m not sure there’s really a lesson to learn here.
Selma likes ponies, she likes watching them on the television with me. Perhaps she doesn’t have to learn about societal norms and expectations and caste systems and harmful stereotypes through a kids show about magical ponies, at age 4. She likes Rainbow Dash, and I think it’ll start and end with that.
A popular brand uses this as their tagline, and it’s always annoyed me terribly.
I was brought up to know that as a human I have inherent value. I try to raise my daughter the same way, so I keep reminding her how much she means to me, bolster her heart to protect her against inevitable douchebags. In that vein, everyone is worth it.
Is worth what, exactly?
This brand sells … perfume? Face cream? I can’t even recall, and I don’t even care. The point is, their tagline is pointing out that you deserve to spend your money on their product. Well what if I can’t afford the product? Does that mean I’m not worth it?
We’ve been over this. The human condition is tough. Things don’t always go as planned. Some people get a particularly short end of the stick of life. There’s no justice to it, just wanton cosmic random chance. Whether you end up able to afford the face cream you’re worth is entirely up to a unique combination of the absence of bad luck, decades of hard work, and growing up in a place where such hard work pays off as it should.
I don’t usually watch TV, so I’m mostly spared zapping past beautiful models parroting the tagline in a bubbly tenor. Thankfully, because I think I’d go insane. In a world with people who would take medicine, antibiotics or clean water over a goddamn face cream, the phrase cuts me like a knife on a blackboard.
Everyone is worth it. I believe it’s in a charter somewhere.
All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.
If blocking becomes widespread, the ad industry will be pushed to produce ads that are simpler, less invasive and far more transparent about the way they’re handling our data — or risk getting blocked forever if they fail.
That's a load of manure.
A big part of the problem is how slow the ad industry itself has been to adapt. To this day most ads are still big rectangles (300×250) or giant skyscrapers (120×600). They're not hi-DPI, they're not responsive, and they're usually ugly blinking GIFs. With all the technology we have available to us today, you'd think we'd be able to see better ads at this point.
Ads don't offend me. Well some specific ads do, but the idea of exchanging my attention for a free service such as reading news on the web, that doesn't offend me. I'm an adult, I can make an informed decision as to which services I will leave my data with, whether those services are free through ads or are entirely paid.
The problem crept up on us slowly: the more attention you could sell, the more money you could get. Ads became bigger and more plentiful. First came popups, then they were blocked. Now we're dealing with full take-over ads, interstitials, lightbox ads, and if you dare browse the mobile web, you'll be looking through blinds in the form of social sharing links at the top, and "dismiss" buttons that don't actually work. It's pretty bad, and it makes browsing websites slower.
In the end, it only takes a few horrible ads to poison the well, and adblocking eventually becomes inevitable. It's like television, and Ghostery is the TiVo of the web. With iOS 9 content blockers, adblocking is going to go mainstream fast, and this is where the pullquote above falls apart: ad networks aren't going to get better, probably the opposite.
Today it's possible to make a living running a site that's free to read, solely because of ad revenue. Some can even make a good living. As adblocking grows more widespread, ads are going to get more intrusive to get around this, more guerrilla, and even bigger, all in a fight to make the same income off the dwindling flock that still aren't blocking ads. It'll happen to good people that run these sites. Despite their best intentions, their staff have families to feed, and if they just use this slightly larger ad and add an interstitial, things can stay the same for a while and no-one has to be fired.
It would be unfair to blame them. It's human nature: millions and millions of sites aren't suddenly going to see the light at the same time and change their ways all at once. Even if they did, it's unlikely everyone would suddenly stop using adblockers because of this. Once the adblocker is installed, once web-ads have been poisoned by years of bad practices, ads aren't coming back.
John Gruber tweets:
I feel your pain, John. It's the same pain GigaOm felt when they died this year. It's not pretty. And I like Deck ads. They're nice. I agree they shouldn't be blocked. But they're still ads, and adblockers block ads. It's not your fault, it was that monkey ad, remember? Shoulder to shoulder, we stand. Love is a battlefield.
There is no ethical adblocker which blocks only the bad ads and leaves the "good" ads. I'd like to feel like an activist fighting for pure content when I install Marco Arment's $2.99 "Peace" ad blocker. I want to believe that by blocking ads, I help force positive change on the advertising companies (and the livelihoods that depend on them), force them to adapt.
But that's a beautiful illusion. What's more likely is that web ads are going to get way worse, adblocking is going to go way up, and at some point in this arms race, after the death of many a media company, eventually some will indeed have adapted. The big question is whether you'll like the alternatives. It can be apps. It can be inside Apple's Newsstand (featuring unblockable ads). It can be inside Facebook's Instant Articles. It can be subversive native ads. It can be paywalls. Think in-app purchases: "Pay $1 for this article, or pay by watching a video."
Nature will find a way. But we aren't suddenly going to wake up to rainbows and unicorns. No matter how cool that would be.
I watched Jupiter Ascending yesterday, and from the moment I saw flying roller blades, I was in love. The film is saturated with color, culture, style and fashion and detail. It has layers and layers and layers, it's creativity all the way down! Did you notice the design of the wooden bill the robot servitor bribed the bureaucrat with? It had the slickest little design and it was on screen for barely two seconds. The amount of work that went into this film was astounding, and apparently Rotten Tomatoes doesn't care, and that makes me sad.
It's not that I'd prefer everyone like the things I like. I'm routinely made fun of for thinking Timecop is a good movie, and for ranking Sky Captain and the World of Tomorrow close to Raiders of the Lost Ark on the list of my all time favorites. It's fine, we don't all have to agree, I'm comfortable with my taste in movies.
What gets me is that we'll probably never see another movie like Jupiter Ascending. We'll certainly never get a sequel. Neither did Serenity, or John Carter, or A Series of Unfortunate Events. Or Ninja Assassin. Yet they made Transformers 2, Transformers 3, Transformers 4, and they're making Transformers 5. That seems so wrong to me.
I understand how it works. The movies I mentioned either did badly at the box office, or critically, or both. Transformers 2 on the other hand pulled in $836m on a $200m budget. Little did it matter that it is almost universally deemed bad. I did see the full thing and to this day regret not staring at drywall for 2h30 instead. I don't often criticize things — okay actually I do that a lot — but Transformers 2 deserves it. You could cut it down to a 30 minute short, and not only would the film be better, but there might actually be enough story to warrant its runtime.
Jupiter Ascending really didn't deserve the critique it got. Even if the film wasn't for you, it had so many other things going for it: the elaborate space backdrops, the castle-like spaceships, the dresses, the sets, hell even the spacesuits that looked like they were ornately carved from wood. Did I mention the flying roller-blades? Jupiter Ascending oozed creativity and worked on so many levels. I still can't think of a single level Transformers 2 worked on, and I played 100+ levels worth of Desert Golfing.
Successful movies get sequels, and the Transformers franchise is like a pinata filled with money and shame. It's only natural that studio execs want to keep wailing on it with 2-by-4s. It's just so unfair.