Make the iPad more like the Mac

Or: What I learned using macOS on the iPad Pro

Radu Dutzan
Dec 9, 2018

A note from the future: This is from December, 2018. A lot of satisfactory progress has been made since then, but there’s still a long way to go. Godspeed.

Pretty much every reviewer has said the same thing: the iPad Pro is an amazing piece of hardware encumbered by clumsy software. iOS on iPad has lagged for years, getting a little attention every so often, but mostly just being a stretched-out iPhone OS. In the beginning, Apple’s iPad story was all about the apps—special, brand-new apps just for the big screen. But now that the app promise has been mostly fulfilled and even Real Photoshop™ is (almost) on the tablet, there’s just no excuse for the state of the system’s interface.

The Mac is a stable, mature operating system. It carries the baggage of having been in the market for 35 years, but also the freedom of precise and reliable input mechanisms. When Apple created the iPhone OS, they decided to break free from the Mac’s interface conventions and start from scratch. A menu bar and windows would be absurd on a tiny 3.5" screen, and mouse-sized targets are very hard to hit with fingers. Makes perfect sense: they’re completely different devices. One sits at a desk with a keyboard and mouse, the other lives in your pocket and has a multi-touch screen. When the iPad came along with a 9.7" screen and the same internals as the iPhone 4, it made total sense to make it run the same OS as the iPhone. So far, so good, right?

Fast forward to almost-2019: the iPad is now “Pro”, the screen goes up to 13", it has an optional keyboard and pointing device, and bests over half the MacBook line in benchmarks. Yet it still runs the iPhone’s OS. Yeah, they added a fancier multitasking UI and the ability to run up to 3 apps at once in a limited set of configurations, but it still behaves like it’s a pocket-sized device for use with your imprecise fingers as you walk down the street. The home screen is still just a sparse grid of apps, a useless mess left to the user to manage. Things like Spotlight, Siri, voice calls or interacting with notifications still take up the entire screen, and so do apps (except for the highly limited and sometimes confusing floating window mode). Undo is still a mess. And text cursor behaviors are a bureaucratic hassle, even when used with a Pencil.

Enter the macPad

Recently, I got a Luna Display. I was on the fence about getting it, but then my girlfriend pointed out that I could use it to run Xcode and Sketch on my iPad Pro. I’ve wanted that for so long, how come I didn’t think of it? It’s brilliant. I’m tired of holding my breath for Apple to release some sort of iPad Xcode, and the people at Sketch said back in 2015 that it just didn’t make financial sense for them to build a touch version. So I pulled the trigger, and got the Luna on Black Friday. It arrived yesterday, and I’ve been living my dream: I’m running macOS on my iPad. Well, not so much as running it on the iPad, more like streaming it from my Mac, but it’s pretty close.

My macPad, featuring Queso

It’s not perfect (even though it looks really good). Luna Display doesn’t have a software keyboard, so without the Keyboard Folio or some other keyboard, it’s useless, and even though you can scroll with two fingers on the screen, other trackpad gestures (like 3-finger swipes for Mission Control) just don’t work. Besides, things just look tiny—not because they’re being scaled (they’re not), but because everything on the Mac is simply smaller. The Mac’s mouse pointer is precise down to 1 screen point, and because the cursor responds to changes in tracking speed, it’s easy to control with precision, so there’s no need for the huge tap targets we find on iOS.

But even though the targets are tiny, the system is very much usable with an Apple Pencil. The Pencil is even more precise than a mouse, and the Mac’s targets on the macPad are about the size you’re used to hitting when filling out a paper form. There’s no support for hovering, because the iPad won’t register the Pencil hovering over its screen, which is a bit of a nuisance if your Dock is hidden, but you can definitely manage without it (though I had to turn off Magnification). Armed with the Pencil, plus Control-arrow keystrokes for Mission Control (or just your wireless trackpad or mouse, if you give up), you have a decent makeshift macPad.

The experience could definitely be better, but it still runs Xcode and Sketch, which are the two main apps I use to do “real work.” You can only imagine how good it could be if, for example, Luna had a software keyboard, or if it supported more trackpad gestures, or if instead of bringing up the iPad Dock, swiping up from the bottom edge brought up the macOS Dock, or if Command-tab actually worked for macOS instead of being intercepted by iOS. But it works, and it’s a revelation that makes you wonder.

The freeform iPad

What if the iPad had a system interface that more closely resembled a Mac instead of the stretched-out phone UI we have today? How might things be different? This hacked-together prototype offers some insights.

Having freely draggable and resizable windows contained in different spaces is a radically different, yet immediately familiar way of interacting with a device like the iPad. iOS apps are about to get support for arbitrary resizing thanks to Apple’s efforts to bring them to the Mac, so it could happen. We could still have fullscreen and split-screen apps—the Mac does too—and finding a window could be handled by some adapted version of Mission Control. Opening a new app could be done with the Dock, like today, but also with a modal layer of icons, the way Launchpad works on the Mac. In fact, I just opened Launchpad with Luna on my iPad, and it’s not even funny. It works exactly like SpringBoard (the iOS home screen), with a nice bonus: Mac icons look beautiful on the iPad display. A floating and draggable Spotlight window would be a delight on the iPad as well—Spotlight already works okay as an iOS app launcher. (It only needs a more ubiquitous entry point that doesn’t require a hardware keyboard.)
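To make “support for arbitrary resizing” concrete, here’s a minimal UIKit sketch (the class name and the 600 pt threshold are mine, purely illustrative) of a layout that adapts to whatever size the system hands it, going side-by-side when there’s room and stacking when there isn’t, instead of assuming one of today’s fixed split-screen configurations:

```swift
import UIKit

// Minimal sketch: a layout that tolerates any window size.
// The class and the 600 pt breakpoint are hypothetical, for illustration only.
final class FreeformViewController: UIViewController {
    private let sidebar = UIView()
    private let canvas = UIView()
    private let stack = UIStackView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sidebar.backgroundColor = .lightGray
        canvas.backgroundColor = .white

        stack.addArrangedSubview(sidebar)
        stack.addArrangedSubview(canvas)
        stack.distribution = .fillEqually
        stack.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(stack)
        NSLayoutConstraint.activate([
            stack.topAnchor.constraint(equalTo: view.topAnchor),
            stack.bottomAnchor.constraint(equalTo: view.bottomAnchor),
            stack.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            stack.trailingAnchor.constraint(equalTo: view.trailingAnchor),
        ])
    }

    // Runs on every size change: rotation, split-screen resizing, or,
    // in a freeform-window world, someone dragging a window edge.
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        stack.axis = view.bounds.width > 600 ? .horizontal : .vertical
    }
}
```

Nothing exotic: it’s the kind of layout code Mac developers have written against resizable windows forever, and with UIKit apps headed to the Mac, iPad apps will end up absorbing it anyway.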

Handling text is one of the most noticeable changes. On iOS, when tapping text fields, the cursor invariably lands at the beginning or at the end of a word, even if you’re touching the middle of that word with the Pencil—iOS assumes there’s just no way you could have intended to place the cursor exactly where your sub-pixel-precise pointing device landed. And to begin selecting text, you need to tap, tap, and tap again, because the system needs to be absolutely certain that selecting text is what you intended to do — even if you do it with a precise pointer like the Pencil. It’s excruciating.
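The frustrating part is that the precision is already exposed to apps. Here’s a tiny sketch of what trusting the pointer could look like (the helper function is hypothetical; the two UITextInput calls it makes are real): given the point where a Pencil tap landed, put the caret exactly there.

```swift
import UIKit

// Hypothetical policy, existing APIs: place the caret at the exact spot a
// Pencil tap landed instead of snapping to the nearest word boundary.
func placeCaretPrecisely(in textView: UITextView, at point: CGPoint) {
    // closestPosition(to:) maps a point to the nearest text position,
    // with no word-boundary rounding.
    guard let position = textView.closestPosition(to: point) else { return }
    // A zero-length range is just a caret at that position.
    textView.selectedTextRange = textView.textRange(from: position, to: position)
}
```

You’d wire it to a tap recognizer limited to Pencil input via allowedTouchTypes, and you’d still have to fight the system’s own word-boundary gestures, but the point stands: the precision is there, iOS just doesn’t use it.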

On the macPad, things are radically different. You need to keep in mind that touches are immediately interpreted as clicks by Luna, so scrolling works only with two fingers, and tapping and dragging with one finger (aka swiping) is usually interpreted as a click-drag gesture that triggers selection on the Mac. That keeps you on your toes, because swiping is a very natural gesture on touch devices—you don’t even think about it. But once you wrap your head around this, you see that tapping and dragging to select is actually a much more efficient interaction than whatever we’re doing today in iOS text fields, or in canvas-based apps like Keynote for iOS. Seriously, what are we doing with text and object selection on the iPad? Whatever it is, it’s kinda awful, especially after trying out tap-and-drag selection on the macPad.

There are so many places where the iPad could benefit from some adaptation of tap-and-drag selection. It’s such a better model that imagining the interaction is worth the effort: there’s already a heuristic somewhere in iOS that measures how long you’ve kept your finger still after a touch begins in order to decide whether to transition from a scrolling gesture to a drag. That same heuristic could be applied to iPad text fields and layout apps such as Keynote: after holding a touch still on a text field or on the canvas for a set amount of time, the gesture could become a selection drag, and moving your finger could then select the text or objects covered by the drag.
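As a rough illustration, assuming a plain UITextView (the class name, the 0.3-second delay, and the 8-point movement tolerance are all invented, and conflicts with the system’s own long-press gestures are ignored for brevity), a long-press recognizer can stand in for the “held still long enough” check; once it fires, further finger movement extends a selection instead of scrolling:

```swift
import UIKit

// Rough sketch of “hold still, then drag to select” on a text view.
final class HoldToSelectController: NSObject {
    private weak var textView: UITextView?
    private var anchor: UITextPosition?

    init(textView: UITextView) {
        self.textView = textView
        super.init()
        let hold = UILongPressGestureRecognizer(target: self, action: #selector(handleHold(_:)))
        hold.minimumPressDuration = 0.3 // “kept your finger still for a set amount of time”
        hold.allowableMovement = 8      // moving farther than this first means you wanted to scroll
        textView.addGestureRecognizer(hold)
    }

    @objc private func handleHold(_ gesture: UILongPressGestureRecognizer) {
        guard let textView = textView else { return }
        let point = gesture.location(in: textView)

        switch gesture.state {
        case .began:
            // The touch stayed put long enough: anchor the selection here.
            anchor = textView.closestPosition(to: point)
        case .changed:
            // From here on, dragging extends the selection instead of scrolling.
            guard let anchor = anchor,
                  let current = textView.closestPosition(to: point) else { return }
            let reversed = textView.compare(anchor, to: current) == .orderedDescending
            textView.selectedTextRange = textView.textRange(from: reversed ? current : anchor,
                                                            to: reversed ? anchor : current)
        default:
            anchor = nil
        }
    }
}
```

The same anchor-and-extend shape would work for objects on a Keynote-style canvas; only the hit-testing would change.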

What about the desktop? Well, what about it? The Mac has it because its fundamental organizational unit—its main metaphorical currency—is files, and since we keep files scattered around IRL, we have a digital equivalent on the Mac’s desktop. The iPad’s currency is apps, so if we have an iPad OS with windows, spaces, Mission Control, and a classic icon-based app launcher on a separate modal layer, then what should be on the ‘base’ layer? Well, what about widgets and a set of user-defined (or Siri-suggested) shortcuts? The current iOS widgets are pretty good, but they look weird in their sparse, centered space on the iPad today. We could combine widgets and shortcuts into a new home screen that offers glanceable and interactive information, plus instant access to whatever people might want to do next.

An iPad Pro OS

Most of these changes, if not all of them, would only make sense on an iPad—some would only make sense on an iPad Pro—which is why it’s important to think of iOS on the iPad as a separate OS. Separating the iPad from its phone-y legacy lets its interface evolve on its own, which makes sense: apps still need to be adapted specifically for the tablet, and the tablet can still host phone apps in a special mode, kind of like how the Mac will be able to run iOS apps while still being a different OS.

Today’s iPad might be good enough for casual browsing and video streaming, but it’s not enough for Mac people who are eager to truly live on it. And the Pro is far from being a viable primary platform for professionals in creative fields. A professional computing device needs interactions flexible and fast enough to keep up with the speed of thought. But iOS puts training wheels on every single interaction, and the macPad has to be used with a high degree of care to avoid screwing it up. The iPad Pro needs to find an adequate middle ground, where the benefits of touch interaction meet the flexible, speedy multitasking afforded by windows and spaces, while also making better use of the optional presence of a keyboard and a precise pointing device.

And though not all of these changes would necessarily work, the need for radical advances in the interaction model of professional touch devices is real. The “old world” might be a good place to look for inspiration. Traditional “PCs” carry a lot of baggage that touch-based devices might not need, but most of the Mac features I mentioned were introduced in Lion (10.7), which came out in 2011 (the same year as iOS 5 and the iPad 2), and were clearly inspired by the fluidity of iOS. They could work on a touch-based device, and in fact, they kinda already do, even in the currently crummy state of the macPad “prototype.”

We should be thinking of the iPad Pro as a truly new, hyper-flexible computing category that combines the best of what we’ve learned on the iPhone and the Mac into something new, and hopefully better. Only then can it fulfill its promise of being a “real” computer for professionals, and do some justice to its lovely hardware.

Thanks for reading this far. I know you wanted to see some mockups, but I already spent a day procrastinating on other stuff writing this, so maybe some other time? While I have you, I make a sweet offline music app for iPhone, check it out. Also, check out my website. Or don’t, I’m not your mom.
