plz 9

Mobile Apps

I want to tell you a story about your phone and the software that runs on it. I’m not talking about the operating system — I’m talking about the programs your phone runs so that you can stream music, order food, and read the news. You can tell this story a couple ways, but my version will focus on iOS because the Android ecosystem a) has minority market share in the U.S. (despite dominating globally); and b) is fragmented among many device manufacturers while the iOS experience is fairly uniform.

This story starts in 2007 with Steve Jobs announcing the iPhone. There wasn’t even a number then; it was just “The iPhone”. Do you remember? The early iPhones were thick and had round sides. You could watch YouTube on them! It was a magical time. The really killer thing about the iPhone was all the apps. You could find all sorts of apps in the App Store (which arrived a year later, in 2008) and lots of them were free! Apple always made nice physical products and iOS was sleek, but the apps really made the phone what it was. Without the apps you just had a very expensive touch-screen camera phone.

Let’s jump ahead ten years to 2017, which was around the time that iOS and Android reached a combined 90% of global mobile-device market share. The iPhone had come a long way in a decade: the phone was thinner, the screen was bigger, and the camera took beautiful HD photos. Cellular infrastructure had come a long way too: we went from 2G to 4G LTE, and you could reliably stream music and video on your cell connection in most populated areas of the U.S.

The combination of advances in phone hardware, cellular data speed, and smartphone market penetration caused demand for apps to explode. Apple (via the iOS App Store) and Google (via the Play Store) both managed thriving marketplaces where third-party developers could (with approval) distribute their applications to users around the world. If you were a developer making mobile apps you could make money by charging users to download them or (in some cases) by selling a subscription inside the app. The ‘microtransaction’ model, where developers give their apps away for free but offer optional in-app purchases, also gained popularity.

Opinions vary, but in hindsight I’d say this was a pretty good time for mobile app developers. The Apple and Google app stores provided access to a new market of phone users, and some industries (especially mobile gaming) discovered that you could make boatloads of money via paid apps and microtransactions. Of course, there were some downsides. Apple and Google took a 30% cut of any app-related transaction and exercised complete (and sometimes arbitrary) control over which apps could appear in their stores. The relationship between third-party developers and the app stores was mostly symbiotic: developers liked the distribution the stores provided, and the companies running the stores benefited from a thriving app ecosystem. But for any one developer there was always the risk that Apple or Google could lock you out of half the market overnight with little or no recourse. The 30% cut also dwarfed the ‘platform’ fees of other industries.

But if you were an app developer in 2017 there was not really anything you could do about it. Apps were the golden ticket: they got you real estate on users’ home screens (via the app icon), provided access to phone hardware like the camera and motion sensor, and allowed you to send native notifications to the phone’s lock screen. The app stores also provided a smooth experience for people to actually pay money for apps. Even with the 30% fee, those capabilities incentivized the creation of an entire mobile app industry.

The only other way to get any sort of content onto a phone was a mobile website, and mobile websites in 2017 were bad. They had improved over time, but the experience of using a website on your phone still wasn’t remotely comparable to using an app. With no ability to access phone hardware or send notifications, a mobile website was a non-starter from the developer side, too. The browser on your phone was mostly for Googling things, and once you found the link you wanted it would probably just redirect you to that company’s app. It seems odd to even compare the two, given the differences.

But fast-forward to 2021 and those differences are… fading? Certainly there is a class of mobile apps whose functionality you can’t (and possibly will never be able to) replicate in a mobile website. Maybe that’s because apps are more performant — they integrate better with the phone’s hardware and use faster languages while a website has to run inside a browser. Maybe that’s because an app has access to a suite of first-party phone APIs that can’t be accessed from the browser. Or maybe that’s because app developers know how valuable it is to be able to spam users with notifications to drive engagement.

There is another class of apps, however, that don’t need any of those things. Lots of mobile apps are just thin wrappers on a website, and those apps now have a serious alternative to the walled gardens that have ruled phone software for the last 14 years. Saving a website to your home screen has technically been possible since the iPhone’s early days, but starting with iOS 11.3 in 2018 Safari quietly gained support for the web app manifest and service workers, the plumbing that lets a saved website actually behave like an app. That means a website can live on your home screen between Spotify and YouTube, complete with its own app icon and name. All it takes is two taps in Safari[0]! When a website saved this way opens, it renders in ‘full screen’ the way an app does, with no browser bar or control buttons. I think the ability to capture home screen real estate is incredibly valuable and I don’t understand why every website isn’t begging mobile users to save their page as an ‘app’. Truly this is the most slept-on iOS feature of all time. So-called ‘installability’ is a core feature of progressive web apps, the going term for websites that feel and behave like native apps.
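
If you’re wondering what actually drives that behavior: the app icon, name, and ‘full screen’ rendering come from the site’s web app manifest (its `name`, `icons`, and `display: "standalone"` fields) plus a couple of Apple-specific meta tags. Here’s a minimal TypeScript sketch of how a page might check whether it is currently running as a saved ‘app’ versus inside the normal browser UI; the `isInstalled` helper is my own name for it, not a standard API.

```typescript
// Sketch: detect whether the page was launched from the home screen.
// `navigator.standalone` is a non-standard, iOS-Safari-only property,
// so we declare it here; the display-mode media query is the standard check.

declare global {
  interface Navigator {
    standalone?: boolean; // iOS Safari only
  }
}

export function isInstalled(): boolean {
  // True when the manifest's display: "standalone" is in effect,
  // i.e. the site was opened from a home screen icon.
  const standaloneDisplay = window.matchMedia("(display-mode: standalone)").matches;
  // iOS-specific fallback for sites saved via "Add to Home Screen".
  const iosStandalone = window.navigator.standalone === true;
  return standaloneDisplay || iosStandalone;
}

// Example use: only nudge people browsing in the regular browser UI.
if (!isInstalled()) {
  console.log("Tip: tap Share, then 'Add to Home Screen' to use this site like an app.");
}
```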

Other app-like functionality isn’t fully available to websites yet, but it seems clear it’s only a matter of time. Browser APIs evolve slowly, but in the last few years we’ve seen desktop sites gain the ability to request the user’s location and send ‘native’ notifications[1]. It seems logical that browser APIs will eventually allow interaction with many first-party phone APIs, if somewhat indirectly.
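
To make that concrete, here’s a rough TypeScript sketch of the two capabilities just mentioned, notifications and location, as they’re exposed to ordinary web pages today. The `#enable-features` button is a made-up element for the example; the rest is the standard Notification and Geolocation APIs, both of which are gated behind a browser permission prompt.

```typescript
// Sketch: requesting notification and location permissions from a plain web page.

async function requestNotificationsAndLocation(): Promise<void> {
  // Notifications: ask permission, then fire a 'native' notification.
  if ("Notification" in window) {
    const permission = await Notification.requestPermission();
    if (permission === "granted") {
      new Notification("Hello from a plain website", {
        body: "No app store involved.",
      });
    }
  }

  // Geolocation: ask for the user's current position.
  if ("geolocation" in navigator) {
    navigator.geolocation.getCurrentPosition(
      (pos) => console.log("lat/lon:", pos.coords.latitude, pos.coords.longitude),
      (err) => console.warn("location denied or unavailable:", err.message),
    );
  }
}

// Browsers increasingly require a user gesture before showing permission
// prompts, so this would typically be wired to a button click.
document.querySelector("#enable-features")?.addEventListener("click", () => {
  void requestNotificationsAndLocation();
});
```

Support still varies by browser, which is exactly why the feature checks are there.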

Web development tooling (and yes, JavaScript) has also continued to improve. This is less important than some of the other factors, but it still helps move the needle toward a world where you can deliver native-app-quality experiences via the phone’s browser.

For developers there are many incentives to deliver content via a website instead of an app, as long as they can do it without compromising the user’s experience. Avoiding the 30% fee is the obvious one (though for ostensibly unrelated reasons both Apple and Google have recently cut their take to 15% for developers earning less than $1 million per year). Also compelling is the ability to cut app store gatekeepers out of the update process: a website can be updated at the discretion of its operators, while apps must be re-approved before users can receive an update[2].

For most of the history of the iPhone the idea of a website delivering a comparable experience to an app was a non-starter. But due to a confluence of recent advances in programming languages, developer tooling, phone hardware, browser APIs, and iOS features, a website is now a viable replacement for some native apps. There are clear financial and technical incentives for businesses to escape the app stores and we should expect mobile website capabilities to continue advancing in the near future. The result will be that more and more businesses have a true alternative for delivering software to mobile devices.

Heuristics

By nature, humans are susceptible to many flawed thought patterns. My favorite is the availability heuristic, where our brains judge the importance or likelihood of a thing or event by how easily examples of it can be recalled. There’s no rational reason to be afraid of dying on a commercial U.S. airline flight (flying is extremely safe, far safer per mile than driving to the grocery store), but it’s easy for our brains to conjure examples of fatal plane crashes, so we may assess the risk as greater than it is. The systematic errors these mental shortcuts, or heuristics, produce are what we call cognitive biases.

Anyways, here’s something I’ve been thinking about recently: do people judge things or concepts as being more similar when those things or concepts are presented as a set of options for a decision[3]?

The example that best illustrates this for me is college majors. Students who enroll in traditional colleges and universities in the U.S. must almost always pick a major in order to graduate. Usually they must do this early in the (typically) four-year degree process, and the choice of major dictates a large share of the classes they will take. Students also generally have relative freedom to choose whatever major they like[4]. A larger university might have 150 majors for students to choose from.

In this context, a student’s major is like a slot. There is a major-shaped hole in each student’s life which must be appropriately filled by one of the available options. But although all the options are equally suitable for the task of filling that slot, the actual majors (and the varying effects they have on students’ lives) are vastly different! My conjecture is that it is very easy for our brains to let the first property of a thing (being an equally valid candidate for the selection) bleed into the second (how good or bad a choice it actually is). The effect is that the options are viewed as more similar than they actually are.

I don’t think students are generally fooled or confused; there are clichéd jokes about the worth or worthlessness of certain majors. But I do think the framing of the decision makes it natural to view majors as being identified by their variance from an underlying sameness rather than as the disparate set that they are.

This generalizes well beyond college majors: any decision framed as a set of options is where I think people are most vulnerable to this bias. The options tend to be implicitly viewed as having a similar fundamental nature, or as being of a similar class, regardless of whether that is true[5]. The bias-fighting question to ask yourself is something like, “are any of these options actually much better or much worse than the others?” They may be, and you may not have noticed!

My absolute favorite version of this is in the first Dark Souls game. When you create a character at the beginning of the game you get to choose a gift (an item that is immediately in your inventory when the game starts). There are nine options, including ‘none’: three of the options are common items that can be easily obtained early in the game; three are items of minor usefulness that are more complicated to obtain in-game; one is a powerful healing consumable; and one is a key that instantly unlocks giant swaths of the game, fundamentally transforming the paths the player can take through the world. The juxtaposition is almost comical. Imagine choosing the binoculars!

How’s Tether doing?

Well, a few weeks ago the CFTC fined them $41 million “for making untrue or misleading statements and omissions of material fact” about the way Tethers were backed (we talked about the tumultuous history of Tether’s “reserves” back in May). Also, Tether sometimes makes loans collateralized by Bitcoin, minting new Tethers in exchange for the collateral and destroying them when the loan is repaid. It’s anyone’s guess what percentage of the $69 billion of Tether in circulation was created this way.

Anyways, Bloomberg has the latest long-form account of the bizarre Tether situation. Here’s a nice teaser:

If the trolls are right, and Tether is a Ponzi scheme, it would be larger than Bernie Madoff’s. So earlier this year I set out to solve the mystery. The money trail led from Taiwan to Puerto Rico, the French Riviera, mainland China, and the Bahamas. One of Tether’s former bankers told me that its top executive had been putting its reserves at risk by investing them to earn potentially hundreds of millions of dollars of profit for himself. “It’s not a stablecoin, it’s a high-risk offshore hedge fund,” said John Betts, who ran a bank in Puerto Rico Tether used. “Even their own banking partners don’t know the extent of their holdings, or if they exist.”

Somehow I bet we’ll be talking about this again soon.

Bookmarks

More on fighting how your brain works. Rickrolling a school district. Humans are not automatically strategic. Why are new buildings often ugly?

[0] I read Hacker News via this viewer (which is just a website) saved to my home screen. Tap the share button and then ‘add to home screen’ to try it out. The site is well configured and renders just like a native app. Compare that to the same experience on news.ycombinator.com.

[1] Of course, some sites abuse these relentlessly, and the same capabilities create new avenues for phishing and other user-hostile behavior. Apple’s argument of ‘we need control to keep app quality high by setting and enforcing standards’ has some clear merit! If an iOS app spams too many notifications, Apple can just ban it from the store.

[2] There was a kerfuffle last year when Apple blocked updates to the Hey email client over issues related to its payment system. Apple took issue with Hey accepting payments outside of the App Store and denied its updates in apparent retaliation.

[3] If this is an established thing in psychology I’m not aware of it and I didn’t find anything that matched well when I went looking.

[4] Some schools have internal obstacles for choosing a major, like requiring students to apply into the relevant department. I think this is a minor factor relative to the question of ‘what majors do students want to have’, though.

[5] There is also possibly an inverse version of this: when generating options for a decision it is more natural to consider things that _are_ fundamentally similar to existing options.