Ethical Adblocking

Apple released iOS 9 yesterday, and with it allowed adblockers into the App Store. Since the mobile web is increasingly a Big Deal, this heralds a sea change for the web.

An article about adblocking made the rounds a few weeks ago. Here’s a pullquote:

If blocking becomes widespread, the ad industry will be pushed to produce ads that are simpler, less invasive and far more transparent about the way they’re handling our data — or risk getting blocked forever if they fail.

That’s a load of manure.

A big part of the problem is how slow the ad industry itself has been to adapt. To this day most ads are still boxy rectangles (300×250) or giant skyscrapers (120×600). They’re not hi-DPI, they’re not responsive, and they’re usually ugly blinking GIFs. With all the technology we have available to us today, you’d think we’d be seeing better ads by now.

Ads don’t offend me. Well, some specific ads do, but the idea of exchanging my attention for a free service, such as reading news on the web, doesn’t offend me. I’m an adult; I can make an informed decision about which services I leave my data with, whether those services are free through ads or entirely paid.

The problem crept up on us slowly: the more attention you could sell, the more money you could make. Ads became bigger and more plentiful. First came popups; then they were blocked. Now we’re dealing with full-takeover ads, interstitials, and lightbox ads, and if you dare browse the mobile web, you’ll be looking through blinds in the form of social sharing links at the top and “dismiss” buttons that don’t actually work. It’s pretty bad, and it makes browsing websites slower.

In the end, it only takes a few horrible ads to poison the well, and at that point adblocking becomes inevitable. It’s like television: Ghostery is the TiVo of the web. With iOS 9 content blockers, adblocking is going to go mainstream fast, and this is where the pullquote above falls apart: ad networks aren’t going to get better, probably the opposite.
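For context, the iOS 9 content blockers mentioned above are declarative: the extension hands Safari a JSON list of rules, and the browser does the blocking itself without the extension ever seeing your traffic. A minimal sketch of the rule format (the ad-server domain and CSS selector here are hypothetical placeholders):

```json
[
  {
    "trigger": { "url-filter": "ads\\.example\\.com" },
    "action": { "type": "block" }
  },
  {
    "trigger": { "url-filter": ".*", "if-domain": ["example.com"] },
    "action": { "type": "css-display-none", "selector": ".interstitial" }
  }
]
```

The first rule drops any request matching the `url-filter` regex; the second hides matching elements on a given domain. That declarative design is part of why these blockers are so fast compared to older script-based extensions.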

Today it’s possible to make a living running a site that’s free to read, solely off ad revenue. Some can even make a good living. As adblocking grows more widespread, ads are going to become more intrusive, more guerrilla, and even bigger, all in a fight to make the same income off the dwindling flock that still isn’t blocking ads. It’ll happen to good people who run these sites. Despite their best intentions, their staff have families to feed, and if they just use this slightly larger ad and add an interstitial, things can stay the same for a while and no one has to be fired.

It would be unfair to blame them. It’s human nature: millions and millions of sites aren’t suddenly going to see the light at the same time and change their ways all at once. Even if they did, it’s unlikely everyone would stop using adblockers because of it. Once the adblocker is installed, once web ads have been poisoned by years of bad practices, ads aren’t coming back.

John Gruber tweets:

I feel your pain, John. It’s the same pain GigaOm felt when they died this year. It’s not pretty. And I like Deck ads. They’re nice. I agree they shouldn’t be blocked. But they’re still ads, and adblockers block ads. It’s not your fault, it was that monkey ad, remember? Shoulder to shoulder, we stand. Love is a battlefield.

There is no ethical adblocker that blocks only the bad ads and leaves the “good” ones. I’d like to feel like an activist fighting for pure content when I install Marco Arment’s $2.99 “Peace” adblocker. I want to believe that by blocking ads, I help force positive change on the advertising companies (and the livelihoods that depend on them), force them to adapt.

But that’s a beautiful illusion. What’s more likely is that web ads are going to get far worse, adblocking is going to go way up, and at some point in this arms race, after the death of many a media company, some will indeed have adapted. The big question is whether you’ll like the alternatives. It can be apps. It can be inside Apple’s Newsstand (featuring unblockable ads). It can be inside Facebook’s Instant Articles. It can be subversive native ads. It can be paywalls. Think in-app purchases: “Pay $1 for this article, or pay by watching a video.”

Nature will find a way. But we aren’t suddenly going to wake up to rainbows and unicorns. No matter how cool that would be.

A version of this post originally appeared on Google+. Yes, that ghost town you may have heard of. Bring chains and white blankets, let’s haunt things.

iPhone switch observations, just a few days in

Just a few days ago, I made a temporary switch to an iPhone 5C as my daily driver, just to get it under my skin. Here are some observations I’ve made so far:

  • Man, there are a lot of passwords to type in on a new phone.
  • The fact that I have to type in my password in the App Store all the time, even for free apps, is driving me crazy. I know the fingerprint scanner makes this a non-issue, but it still seems sub-optimal that there’s not even an off-by-default option to stop asking for passwords.
  • The camera… Even on this two-year-old tech, it takes better photos than most Android cameras I’ve used.
  • The 3rd-party keyboard implementation is so janky it’s almost not worth using a sliding keyboard. And on the stock keyboard, the letters are capitalized even when what they output is not. That has to be the last vestigial skeuomorph in the ecosystem.
  • The physical mute switch is a stroke of genius, especially when the Settings > Sounds > Vibrate on Silent option is unchecked.
  • The decision not to let me pick a default browser to open links in feels completely arbitrary and archaic, especially since some apps, like Digg Reader, implement workarounds to give you the choice.
  • The app situation is good in this ecosystem.
  • Notifications aren’t great. Clearing them even less so.
  • I miss the permanent Android back button in the bottom left. It seems every back button in the system is different. Some screens, but crucially not all, allow you to swipe left to go back. I bet this is an issue on the 6+.
  • I’ve missed this small form factor. Imagine if they removed all the bezels to make the screen larger; I bet they could fit a 4.5-inch screen without an increase in size.

Switching to iPhone for a bit

I’ve been a fan of Google’s products ever since I switched from AltaVista. So it felt like a natural fit to get an Android device back when it was time for me to upgrade from my dumbphone, and I’ve been using one ever since. I wrote about ecosystems a while ago, and the ecosystem is exactly what’s kept me there: you sign in to your phone with your Google account, and mail, calendar, notes, contacts and photos sync automatically. Also, there’s a really great maps application.

In my day job I make web apps that have to work on mobile first, and iOS is an important platform for me to know. Now, I’ve used iOS for years — it’s the phone I bought for my wife and recommended to my dad. We also have an iPad, and I’ve used an iPhone for testing for years. I’m no stranger to how things work there. But I feel like something special happens when you make a conscious switch to the platform and make it your daily driver. Phones have become such utterly personal devices; they’re always with us and we invest ourselves in them. Unless I jump in fully, I have a feeling there’s some bit I’m missing.

So starting today I’m an iPhone user. No, I wouldn’t call this a switch — call it a “soak test”. I fully expect to switch back to Android — I’m actually eyeing a Moto X 2014. That is, unless the experience of investing myself fully in the iPhone is so compelling that I have no desire to go back, which is entirely possible. I won’t know unless I give it a proper test. Since I’m in the fortunate position to be able to make this switch, there’s no good reason not to. I’ll be using my white iPhone 5C testing device. I expect to be impressed by the camera. I expect to enjoy the jank-free fluidity of the OS, even if I expect to turn off extraneous animation. I’m curious how I’ll like the homescreen and its lack of customizability compared to Android, and I can’t wait to see if the sliding keyboards in the App Store are as good as they are on Android. I should have some experiences to share on this blog in a month or so. Let me know of any apps you want me to try!

The One Platform Is Dead

I used to strongly believe the future of apps would be rooted in web technologies such as HTML5. Born cross-platform, they’d be really easy to build, and bold new possibilities were just around the corner. I still believe web apps will be part of the future, but recently I’ve started to think it’s going to be a bit more muddled than that. If you’ll indulge me, the explanation will be somewhat roundabout.

The mobile era in computing, more than anything, propelled interface design patterns forward faster than decades of desktop operating systems did. We used to debate whether your app should use native interface widgets or whether it was okay to style them. While keeping them unstyled is often still a good idea, dwelling on it would be navel-gazing, as it’s no longer the day-and-night indicator of whether an app is good. In fact, we’re starting to see per-app design languages that cross not only platforms, but codebases too. Most interestingly, these apps don’t suck! You see it with Google rolling out Material Design across Android and web apps. Microsoft under Satya Nadella is rolling out their flatter-than-flat visual language across not only their own Windows platforms, but iOS and Android as well. Apple just redesigned OS X to look like iOS.

It feels like we’re at a point where traditional usability guidelines should be digested and analyzed for their intent, rather than taken at dogmatic face value. If it looks like a button, acts like a button, or both, it’s probably a button. What we’re left with is a far simpler arbiter for success: there are good designs and there are bad designs. It’s as liberatingly simple as not wearing pants.

dogma (noun)
a principle or set of principles laid down by an authority as incontrovertibly true

The dogma of interface design has been left by the wayside. Hired to take its place is a sense of good taste. Build what works for you and keep testing, iterating and responding to feedback. Remembering good design patterns will help you take shortcuts, but once in a while we have to invent something. It either works or it doesn’t, and then you can fix it.

It’s a bold new frontier, and we already have multiple tools to build amazing things. No single technology or platform will ever “win”, because there is no winning the platform game. The operating system is increasingly taking a back seat to the success of ecosystems that live in the cloud. Platform install numbers will soon become a mostly useless metric for divining who’s #winning this made-up war of black vs. white. The ecosystem is the new platform, and because of that, it’s easier than ever to switch from Android to iOS.

It’s a good time to build apps. Come up with a great idea, then pick an ecosystem. You’ll be better equipped to decide what type of code you want to write: does your app need only one platform, several, or should it be cross-platform? It’s only going to become easier: in a war of ecosystems, the one that’s the most open and spans the most platforms will be the most successful. It’ll be in the interest of platform vendors to run as many apps as possible, whether through multiple runtimes or just simplified porting. It won’t matter whether you wrote your app in HTML5, Java, or C#: on a good platform it’ll just work. Walled gardens will stick around, of course, but it’ll be a strategy fewer and fewer companies can support.

No, dear reader, I have not forgotten about Jobs’ Thoughts on Flash. Jobs was right: apps built on Flash were bad. That’s why today is such an exciting time. People don’t care about the code behind the curtain.

If it’s good, it’s good.

Erase and Sync


I don’t even want to try to explain what’s going on here. I mean, I understand it, but I don’t understand it. I don’t see how it’s in anyone’s interest for it to be flaming-hoops difficult to sync a device to a new Mac. Seriously, Apple, how did this pass your “it just works” razor?


For years my lunatic Apple friends have asked me: “When are you going to get a Mac?” When I finally did, they started asking: “When are you going to get an iPhone?” As iOS grows increasingly useful, with good notifications and over-the-air updates, my answer has been trimmed down to “when it has a Gmail app that’s as good as the Android one”. “Gmail with IMAP works great” is the usual knee-jerk reaction, and “what’s so special about the Gmail app?” the follow-up question. I’m thinking perhaps it’s time I change my stock answer. My new response will be: sync.

This morning on my way to work I was listening to MacBreak Weekly. A bunch of my heroes, including John Gruber, were talking about iCloud sync and the problems some of them were experiencing. Tonya had factory-reset her iPhone several times trying to get contacts to sync properly. Andy jokingly suggested the merging of contacts was painful and would sometimes merge 17 different versions of the same contact into a lean 12. Chris suggested it was a good idea to back up the contacts, calendar and email setup you considered “canonical” before embarking on your iCloud adventure. When the team started talking about the supposed iOS 5 battery drain, iCloud was almost universally assumed responsible.

Gruber’s level-headed take was that, while he apparently had no problems himself, he did believe Apple’s iCloud transition was going to be monumentally difficult; he compared it to stepping from solid ground onto a boat while carrying valuable trinkets. Transitioning MobileMe customers to a new free setup, making sure not only calendars, email and contacts sync, but documents too, was bound to generate some headaches, but they’ll pass in time, he suggested. I agree; I’m sure things’ll improve once Apple is on the boat.

Perhaps there is something to be said about Apple’s approach to sync. As much as they tout that “the truth is in the cloud”, as Yogi Berra would say: it’s only true when it’s true. It’s no secret Apple loves native apps. Native apps run faster, smoother, nicer than web apps. You’ll hear many chant this; they might even use allegories such as “being closer to the metal” when describing why a web app can never be as good as a native app. Let me tell you this: Yogi Berra doesn’t care. If it works, it works. If the app is good, it’s good. If things sync, things sync. And if they don’t sync properly, they don’t sync properly.

Google’s overarching approach to sync is to not sync: push the changes immediately. When you add a bookmark in your Chrome browser, a teensy signal is immediately sent to Google’s bookmark sync server pushing the change. When you finish typing a word in Google Docs, the change is saved. There is no sync, because there are no copies of files anywhere. There is only one file. There is only one email. There is only one contact. You’ll never have to worry about whether your Android phone, tablet, or MacBook has the most recently edited version of your document, or which one has the most complete contact, or which calendar you added an event to. Because everything is always in sync. It just works.
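The push-instead-of-sync idea above can be sketched in a few lines. This is a deliberately simplified illustration, not how Google’s servers actually work: the `Server` and `Client` classes here are hypothetical, and real systems add queuing and offline handling. The point is the shape of the design: clients hold no local copy that later needs merging, so there is nothing to reconcile.

```python
class Server:
    """The single source of truth: there is only ever one copy of each record."""

    def __init__(self):
        self.records = {}

    def push(self, key, value):
        # A change is applied the moment it arrives; no merge step exists.
        self.records[key] = value

    def get(self, key):
        return self.records.get(key)


class Client:
    """A device that edits records; it keeps no local copy to reconcile."""

    def __init__(self, server):
        self.server = server

    def edit_contact(self, name, phone):
        # The edit *is* the push -- there is no local write followed by
        # a later sync that could produce conflicting versions.
        self.server.push(name, phone)


server = Server()
phone = Client(server)
laptop = Client(server)

phone.edit_contact("Ada", "555-0100")
laptop.edit_contact("Ada", "555-0199")  # the later edit simply wins

print(server.get("Ada"))  # -> 555-0199
```

Contrast this with a sync model, where each device keeps its own full copy and a periodic merge has to decide which of several diverging versions is canonical; that merge step is exactly where the 17-contacts-into-12 failures described above can creep in.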

You’d think it would get muddy if you scratched the surface and peeked underneath. If you do, you’ll find that Android sync is actually asynchronous, and that if you use Google Docs’ offline editing capabilities, you’ll end up with some of the same sync challenges Apple is facing: which version is the right one? Somehow I’ve never once had a problem with this, though. I don’t know if it’s because Google started with the web apps and built native apps and offline sync later, but I have no trust issues with Google getting my sync right. I know that if I edit a contact, my changes will propagate to all my devices seamlessly. I never have to worry about losing contacts, appointments, or emails, getting corrupt data, or even backing up. While these may smell like famous last words, I wouldn’t even think of backing things up. I expect it to work, I trust that it will work, and so far it has.

Compared to the flaming hoops I had to jump through to get just calendars, contacts and Gmail syncing on my wife’s iPhone, using an Android device is a relief.

Prior Art

Do the tablets in Kubrick’s 2001 constitute “prior art” to the iPad?

This question recently incited much heated discussion on Twitter (( I feel I should apologize to those of you who happen to follow both me and Heilemann on Twitter for having polluted your streams. )). What made this pique my interest is my love for science fiction, and in particular the works of Arthur C. Clarke. Many of his ideas came to fruition decades later. In 1945, for example, Clarke inadvertently invented the communications satellite. He didn’t patent it; as he put it:

I’m often asked why I didn’t try to patent the idea of communications satellites. My answer is always, “A patent is really a license to be sued”.

Now, Clarke merely described what would later become satellites. He didn’t build one, nor did he design what such a thing would look like. And indeed, satellites today come in all manner of configurations and designs, yet they are still, clearly, satellites.

These days Apple is busy suing Samsung for infringing on Apple’s look-and-feel patents with their Galaxy line of phones and tablets. Put simply, Galaxy S phones are too much like the iPhone, and the Galaxy Tab 10.1 is too much like the iPad. While the comparison photos in the suit filing appear to have been doctored (( For example, scaling down the Tab and opening the app drawer for the photo op instead of comparing homescreen to homescreen. )), I’m not going to dispute that Samsung’s TouchWiz is inspired by Apple’s iOS (it clearly is) (( In fact I loathe Android skins in general and would like nothing more than Apple forcing Samsung to improve, or better yet rid the world of, TouchWiz. )).

Focusing on what sparked this discussion — could the tablet devices seen in the 2001 movie constitute prior art for the iPad? — I do think that’s fair to say, and I’ll get to why. Whether or not they’re merely portable televisions, they are electronic devices, and their form factor is strikingly similar to that of the iPad. But is it prior art?

Prior art:

Prior art […], in most systems of patent law, constitutes all information that has been made available to the public in any form before a given date that might be relevant to a patent’s claims of originality. If an invention has been described in prior art, a patent on that invention is not valid.

To be specific, Apple is suing Samsung over four patents. Two of them relate to the iPhone form factor. One relates to how iOS works. The fourth covers the tablet form factor; here’s the illustration from the patent application:


If you explore the patent application itself (beware: TIFF file), you’ll note that no specific size is given. The tablet illustrated doesn’t necessarily have a 10-inch screen.

Samsung is in a tight spot. While I find it surprising (and disappointing) that these four patents were granted in the first place, they clearly appear to have been infringed upon. Were I in Samsung’s shoes (and if I were, I’d never have released TouchWiz in the first place), I’d be doing everything I could to defend against this suit. I’d certainly look anywhere I could for prior art that invalidated any of the four patents in question, even in my old sci-fi DVD collection. In the case of the one patent Apple holds on the tablet form factor, I do see why Samsung would try to invoke prior art (though I’m surprised they didn’t pick Picard’s tablet instead). You see, if Samsung can convince the judge that patent #4 is invalid — because the slabs shown in 2001 anticipate the pencil sketch shown above — it would cut their woes by a fourth.

Samsung is not my favorite Android vendor. They’re not even my favorite hardware vendor. Perhaps it would be good for them to suffer a defeat at the hands of Apple.

But I do consider Arthur C. Clarke’s description of a satellite to be prior art. I consider Larry Niven’s description of a Ringworld to be prior art to the ring shown in the Halo video games. And so, hearing Samsung cite Kubrick’s tablets as prior art to the iPad is not the dumbest thing I’ve ever heard. Apple’s tablet is a wonderful combination of a well-designed user experience and durable, delicious hardware. Even so, the form factor described in their tablet patent is not a unique snowflake, as countless sci-fi authors would have you know.