The missing app economy

· Design, Product · Patrick Smith

In the late 90s Steve Jobs was faced with a dilemma. Apple’s charismatic CEO couldn’t convince Adobe and Microsoft, makers of Photoshop and Excel, to make software for Apple’s upcoming operating system. Missing a ‘killer app’ like Photoshop on the Mac meant many people would buy a Windows PC instead. Apple conceded by investing substantial engineering effort in ‘Carbon’, a backwards-compatible software layer added to both their old and new operating systems. This stopgap gave Apple’s most important third-party developers a bridge from one system to the next, and Apple a bridge from one century to the next. 

It worked. Apple dominated the first two decades of the 21st century, reinventing computers and upending markets. Now one of the most valuable companies ever with reliable quarterly profits in the tens of billions, it appears Apple has mastered the recipe for success. However, its key ingredients of user-friendly design, tight software/hardware integration, and ground-breaking technology are today all tightly controlled or obstructed by Apple. The same recipe is out of reach to many would-be competitors and collaborators, preventing the next wave of ‘killer apps’ from thriving. 

Apple began their journey into people’s pockets twenty years ago with the iPod. The music player succeeded not only due to its striking industrial design and memorable ad campaigns with colourful dancing silhouettes, but thanks to deals with record companies Jobs had helped broker himself. He told writer Steven Levy “we walked in and we said, ‘We want to sell songs a la carte’”. The iTunes Music Store sold over one million songs in its first week. 

This success was repeated on a grander scale with the iPhone and App Store, which saw ten million app downloads in its first weekend. Both were more capable versions of their predecessors: the iPhone had a larger screen and faster chips than the iPod, and apps could communicate with the internet and were far more interactive than a song. A wave of software accessible to everyday people followed — Instagram, TikTok, Uber, Airbnb — that took advantage of pocketable, always-online devices.

However, a recent court case between Apple and Epic Games revealed that 70% of Apple’s App Store revenue comes from games. That leaves only 30% for everything else, including the productivity software used by businesses, such as Excel and Photoshop. This makes Apple’s report that it has “paid out over $200bn to developers since 2008” much less compelling. According to Sensor Tower, Adobe makes an estimated $10m per month from the App Store, roughly 1% of its $1bn total monthly revenue. And Microsoft earns even less from the App Store, even though its total revenue is ten times Adobe’s.

New productivity tools like Notion, Jira, and GitHub are all web-first: while they may offer a companion app, the primary experience is designed for a web browser, like Gmail or Google Docs. Another is Figma, a design tool founded ten years ago, which made $75m in 2020 and forecast double that for 2021. GitHub, used to collaborate on the code behind software projects, was acquired by Microsoft in 2018 for $7.5bn, and was reported at the time to be making $200m–$300m annually. Microsoft’s own Office productivity suite, including Word and Excel, now works in web browsers (this article was written using it), and Adobe is working on the same for its flagship apps Photoshop and Illustrator.

The web has several key advantages over apps sold via the App Store. First, the web is an open standard purposefully designed not to be controlled by any single company. While the major web browsers (Google’s Chrome, Microsoft’s Edge, Mozilla’s Firefox, and Apple’s Safari) are all made by large corporations with their own interests, they must all work to a single standard that they collaborate on. iPhone apps, by contrast, are made using Apple-provided tools that are updated every year, sometimes breaking what previously worked, which adds maintenance overhead. The first web page, made in 1991, still works in the browsers of today.

This also means developers who build using web standards aren’t locked into a particular platform. Instead of writing one app for iPhone, another for Android, and yet more for tablet and desktop platforms, a website can be made once and adapt to whichever device a customer uses. Platform providers are incentivised to keep working with as many existing websites as possible, which puts much of the burden of maintenance and compatibility on them. And if a new device is introduced, adapting an existing website will likely take less work than providing yet another app.

Second, anyone can create and share a website provided they know how. Apple requires a $99 annual membership and submission to a board of reviewers before apps appear in their store. Membership can be revoked, as Epic Games found when their account was terminated in 2020, preventing them from being included in the App Store again. And reviewers are human and subject to the whims of Apple’s broader strategy. Apple Arcade is a game subscription service integrated into the iPhone, which helped contribute to the $54bn services revenue Apple made from its users in 2020. That same year Apple rejected the game streaming service xCloud. Its maker Microsoft responded “Apple stands alone as the only general purpose platform to deny consumers from cloud gaming”. xCloud is now available to iPhone users through the only alternative available: the web. 

Finally, web businesses can integrate whichever payment system they prefer, whether it be PayPal, Stripe, or even cryptocurrency. They can find the service that offers the lowest cost, or the one that makes refunds and recurring subscriptions easiest. On the App Store, Apple requires online payments go through its own system, taking a 15–30% cut. And refunds can’t be issued by a developer looking to please a dissatisfied customer; instead, users must submit a request themselves via an Apple support form.

Innovation requires trial and error, and on the iPhone app developers are prevented from properly trialling their ideas. Apple prescribes their business model: developers can’t charge users to upgrade from one app version to the next, and must instead either offer new features for free or use a subscription model unpopular with many customers. Novel app ideas often never find their way onto the store, their creators instead reading an opaque Apple rejection letter. This discourages original ideas from even being started, since Apple only accepts submissions for fully-functional apps and not early concepts, making new ventures as risky as betting on what the weather will be in 200 days’ time.

The web offers an escape hatch, but here Apple has leverage too. On the iPhone only a single implementation of the web standard is available, and that is Apple’s. Google, which primarily makes its money advertising on the web, is incentivised to make websites more like apps, and has been pushing web standards in that direction for a number of years. Apple is incentivised to keep the web more limited than the platform-specific apps that help sell its devices. (Google counterbalances this with a different approach: it is estimated to have paid Apple $15bn in 2021 to remain the default search engine on the iPhone & iPad.)

Steve Jobs once said “design is how it works”. If innovation is being prevented from working by the company he founded, then a new wave of design is surely being missed. 

Accessibility-First Tool Concepts

· Accessibility, Workflow · Patrick Smith

Twice now I’ve created desktop apps for designing UIs. Neither shipped, and I know I want to return to this space again.

My current thinking is that accessibility is a must, and is something tools today severely lack. They are visual-first and often visual-only. Why aren’t we thinking about accessibility at the early design stage?

Doing so would make implementation easier, as we aren’t just bolting accessibility on at the end. And faster too, if we get it right from the beginning.

I think getting developers (almost tricking them) to use accessibility affordances for their own needs (for example, writing tests) is an interesting way to get them to care more about it.

Here are some tooling ideas:

Emulate Screen Reader Output

  • Supports VoiceOver, JAWS, NVDA.
  • Turns HTML into what would be spoken by a screen reader.
  • Can validate what actual screen readers would interpret, without having to run all of them.
  • Can be used in snapshot tests to ensure implementations are accessible and don’t break due to changes (sketched below).
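
A minimal sketch of that snapshot test, assuming a hypothetical emulateScreenReader() function from an imagined screen-reader-emulator package (no such library exists yet):

const { emulateScreenReader } = require("screen-reader-emulator"); // hypothetical package

it("announces the sign-up form accessibly", () => {
  const html = `
    <form>
      <label for="email">Email</label>
      <input id="email" type="email" required />
      <button type="submit">Sign Up</button>
    </form>
  `;
  // Snapshot the spoken representation, so any markup change that
  // alters what a screen reader would say fails the test.
  expect(emulateScreenReader(html)).toMatchSnapshot();
});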

Screen reader emulator as CLI

  • Run with URL.
  • It streams back a screen reader representation of the page.
  • Actually might be useful for developers browsing a website. (A rough sketch follows.)
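
A rough sketch in Node, again assuming the hypothetical emulateScreenReader() returns the spoken text; the global fetch() requires Node 18 or later:

#!/usr/bin/env node
const { emulateScreenReader } = require("screen-reader-emulator"); // hypothetical package

// Usage: screen-reader-cli https://example.org
const url = process.argv[2];
fetch(url)
  .then((response) => response.text())
  .then((html) => {
    // Print the spoken representation line by line, like stepping
    // through the page with a real screen reader.
    for (const line of emulateScreenReader(html).split("\n")) {
      console.log(line);
    }
  });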

OCR automated testing with contrast / colour checking

  • Takes text and a role as input.
  • Renders using Playwright, and uses OCR to find the element visually on the page.
  • Only passes if the text meets expected contrast and colour requirements, e.g. “Can’t read button ‘Sign Up’ as it lacks contrast”. (See the sketch below.)
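
A rough sketch of the OCR half, rendering with Playwright and reading the screenshot with Tesseract.js. The contrast measurement itself is elided; a real version would sample pixel colours inside each word’s bounding box:

const { chromium } = require("playwright");
const Tesseract = require("tesseract.js");

async function findTextVisually(url, expectedText) {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);
  const screenshot = await page.screenshot({ fullPage: true });
  await browser.close();

  // OCR the screenshot; words the engine can't read back are already
  // a crude signal that the text may lack contrast for humans too.
  const { data } = await Tesseract.recognize(screenshot, "eng");
  return data.words.find((word) => word.text === expectedText);
}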

Accessibility-first prototyping tool

  • Content is usually what differentiates your brand: what the user reads, what matches users’ existing language.
  • Write the content first, and what fundamental accessible widgets you want used.
  • See a live preview without writing any code — for visual users and for screen reader users.
  • Export a set of automated tests to verify your actual implementation (sketched below).
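
To make that last point concrete, here is one imagined shape for such a spec and an exported test, verified with Testing Library’s role-based queries. The spec format is purely illustrative, and renderSignUpScreen() stands in for your real implementation:

// An accessibility-first spec: content and widget roles, no visuals.
const signUpScreen = [
  { role: "heading", text: "Create your account" },
  { role: "textbox", text: "Email" },
  { role: "button", text: "Sign Up" },
];

const { screen } = require("@testing-library/dom");

it("matches the accessible spec", () => {
  document.body.innerHTML = renderSignUpScreen(); // hypothetical implementation under test
  for (const widget of signUpScreen) {
    // getByRole throws if no element has this role and accessible name.
    screen.getByRole(widget.role, { name: widget.text });
  }
});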

Mocking window.location in Jest

· Concepts · Patrick Smith

It’s sometimes necessary to mock window.location in Jest tests. This could be because you want to know when window.location.reload() is called, or one of its other methods (.assign(), .replace()).

Here’s some code to achieve this:

// Keep a reference to the real location so it can be restored later.
const savedLocation = window.location;

beforeEach(() => {
  // jsdom won't let you reassign window.location directly, so remove it
  // first, then replace it with a URL instance carrying mocked methods.
  delete window.location;
  window.location = Object.assign(new URL("https://example.org"), {
    ancestorOrigins: "",
    assign: jest.fn(),
    reload: jest.fn(),
    replace: jest.fn()
  });
});
afterEach(() => {
  // Restore the real location so other tests aren't affected.
  window.location = savedLocation;
});

You could then detect whether your implementation called .reload():

it("reloads the page", () => {
  expect(window.location.reload).toHaveBeenCalled();
});

If you wanted to reuse this across multiple test suites, you could wrap it in a function:

function mockWindowLocation() {
  const savedLocation = window.location;

  beforeEach(() => {
    delete window.location;
    window.location = Object.assign(new URL("https://example.org"), {
      ancestorOrigins: "",
      assign: jest.fn(),
      reload: jest.fn(),
      replace: jest.fn()
    });
  });
  afterEach(() => {
    window.location = savedLocation;
  });
}
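
Since mockWindowLocation() registers its beforeEach and afterEach hooks when called, you can invoke it at the top of any describe block:

describe("page refresh behaviour", () => {
  mockWindowLocation();

  it("reloads the page", () => {
    // ...run the code under test that should trigger a reload...
    expect(window.location.reload).toHaveBeenCalled();
  });
});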

The Apple Experience Augmented — Part 2: Developer Experience

· Concepts, Product, UX · Patrick Smith

Apple has been iterating for years on a curious set of APIs called ARKit that has been largely ignored by the iPhone & iPad developer community. That persistence hints that more compelling hardware is coming to take full advantage. CEO Tim Cook has openly said AR is the ‘next big thing’.

Rumors suggest Apple is developing an augmented reality (AR) glasses product. This product would likely act as an iPhone peripheral, at least initially, similar to how the Apple Watch once relied on a host iPhone to provide the main computational grunt. Well-informed supply chain analyst Ming-Chi Kuo has said the AR glasses “will primarily take a display role offloading computing, networking, and positioning to the iPhone.”

Apple has recently introduced M1 Macs powered by Apple Silicon. These Macs are notable because they bring a marked improvement in battery life and performance. But they also finally bring Apple’s developer machines in line with the more capable hardware of its consumer devices.

Apple’s head of software engineering Craig Federighi talked with Ars Technica about the advantages of the M1’s unified memory architecture:

Where old-school GPUs would basically operate on the entire frame at once, we operate on tiles that we can move into extremely fast on-chip memory, and then perform a huge sequence of operations with all the different execution units on that tile. It’s incredibly bandwidth-efficient in a way that these discrete GPUs are not.

Ars Technica interview with Apple executives

Tiled rendering will be needed by the upcoming AR glasses product, with the increased throughput allowing both high resolutions and high frame rates. High frame rates are required as “frame rates below 90 frames per second (FPS) is likely to induce disorientation, nausea, and other negative user effects”. The iPad Pro can already achieve 120 frames per second, so it’s likely the AR glasses’ display would reach similar rates.

So how would you develop apps for such a device? Let’s look at how developing software for the iPhone works today.

Developers buy Macs and install Xcode which allows them to write, compile, and deploy iPhone (and iPad, Mac, Watch & TV) apps. To actually experience the user experience of their apps, developers either push the app to their own iPhone and launch it like any other app, or they run it directly on their Mac within the Simulator. They choose which device they want to simulate and then see an interactive representation of the device’s screen running their app within a window on their Mac.

Screenshot showing the run destination menu in the toolbar where you choose a real device, choose a simulated device, or create a custom simulator.
Could you one day choose AR Glasses from the list here?

To date this has worked by compiling the iPhone or iPad software for Intel chips, allowing the app to run ‘natively’ on the Mac. Macs are powerful enough to run several of these simulators at once; however, checking graphics-intensive experiences such as 3D or animation sometimes means avoiding the simulator and trying the app directly on the target device. On the whole the simulator does a capable enough job of previewing the experience of an app.

(An iPad version of Xcode has been speculated about for years, but even with improved keyboards and fancy trackpads, nothing has been released. The Mac maintains its role as the developer device for Apple’s platforms.)

How will this work for the AR glasses? Will Xcode provide an AR glasses Simulator for Mac? Would that appear as a window on screen with a preview for each eye? Or would you need to push the app to an actual device to preview?

If a simulator were provided, the pre-Apple-Silicon technology of an Intel chip and AMD GPU would not be able to reproduce the capabilities of a unified memory architecture, tiled rendering, and the neural engine. It would either run poorly, at low frame rates, or some capabilities might not be possible at all. An Intel Mac can simulate software but it cannot simulate hardware. A Mac with related Apple Silicon hardware would allow a much better simulation experience.

Instead of seeing a preview of the AR display on your Mac’s screen, consider if the product could pair directly to your Mac. The developer could see a live preview of their work. The Mac could act as a host device instead of the iPhone, providing the computation, powerful graphics, machine learning, and networking needs of the AR glasses.

With the same set of frameworks that allow iPhone & iPad apps to be installed and run on the Mac already brought over, both software and hardware will be ready to run AR-capable apps designed for the iPhone. The Mac is now a superset of the iPhone: what the iPhone can do, the Mac can also do. App makers now have a unified developer architecture.

Perhaps AR-capable apps from the iPhone App Store could even be installed by normal users directly on their Mac. With augmented reality, perhaps the glasses will augment whichever device you currently use, whether that’s the iPhone in your pocket or the Mac on your desk, allowing switching back and forth as easily as with a pair of AirPods (which would likely be used together with AR glasses).

There’s one last picture I want to leave you with. Swift Playgrounds works by showing a live preview of interactive UI alongside editable code. Change the code and your app immediately updates. The Simulator has been integrated into the app developer experience.

Now imagine Swift Playgrounds for AR — as I edit my code do my connected AR glasses instantly update?