
Touch Software Must Be Real

When I first heard the iPhone would have a touchscreen, I thought of bowling alley scoring machines and ATMs. Touch software had always been awful, and I didn’t want more of it. As it turned out, the iPhone worked very well, but it wasn’t the touchscreen that drew my attention. Browsing the web with my finger was nice, but carrying the web in my pocket was amazing.

Eight years later, apps on touchscreens are used by a significant fraction of humanity. I want to talk about what makes touchscreen software different, because it took me a long time to understand, and because I don’t want my apps to remind anyone of bowling alley scoring machines and ATMs.


I grew up with both command-line software and mouse software. To use the command line, you had to know the right words to type. And just to learn those words, you had to know how to ask for help. Mouse software changed that by showing options on the screen that you could choose from with a gesture. Learning commands wasn’t necessary because the computer would show you the options again next time. That discoverability made mouse software accessible to millions of people.

I believed touch software was only a minor step forward in comparison. But touch software largely omits something that mouse software depends on, and that is metaphor.

At some point in our lives, we each had to learn the mouse metaphor. It was “how you use a computer.” A motion of the mouse produces a motion of a pointer on the screen, which indicates the user’s interest. The real space on the desk translates to a virtual space, in which weightless windows float over one another in zero gravity. A window is so called because it looks into another virtual space, and users can adjust the view like reading a scroll. Users perform substantial actions with their fingertips, which is so unlike real life that we had to invent the term _clicking_.

This metaphor is intrinsically tied to hardware, and mouse hardware is a very thick layer between the user and software. Signals from the mouse are reflected on the screen only after traveling through cables and buses and Bluetooth radios. As a mouse user I can’t help but feel that hardware is substantial and real, while software is virtual.

On the other hand, touch hardware is nothing but a sheet of glass. Input and output happen in the exact same place, so there is no need for a complex metaphor. As a touch user I forget about the hardware. I feel that the things on the screen are real. I feel that the software is real.

Software IRL

Touch operating systems attempt to get rid of metaphor and operate literally. You don’t scroll a web page in a window, you touch the page and pull it across the glass. You don’t metaphorically use a button by clicking, you just press it. You press it IRL. And if the act of pressing a button is perceived as real, then so is the button. It happens to be flat and ephemeral and backlit. But it’s something you press to make something happen, and in your sensorimotor system, that’s what a button is.

So the metaphor is as thin as the hardware. That is what makes touch software fundamentally simpler than mouse software, and just as big an evolutionary step. If the touch metaphor has any substance, it’s that the objects on the screen are real. This is so natural to our animal brains that there are iPad games for cats.

Science fiction has often suggested that virtual worlds are our destination. Some of us thought we would go to the Grid or the Matrix and leave meatspace behind. It seemed like an obvious conclusion that we would end up in the world of software. After all, using mouse software is largely an out-of-body experience. But instead, we brought software into the real world.

When the metaphor is upheld, it may as well be true that touch software is made of physical objects. Of course, hardware hasn’t completely vanished, and the touch metaphor can be broken. There is still an event loop that collects touches and draws the screen. So this is our job:

Keep it real

First, omit needless metaphor. Skeuomorphism was the norm on iOS until version 7, but no user believes that their calendar app has real leather inside, especially because what they are touching is flat and rigid. The iOS Photos app is a fine example of letting objects look like what they are.

Since the objects in our software are supposed to be the real thing, they must also behave like the real thing. When you pan a view, use the Newtonian physics offered by the system SDK. Use spring physics if it makes sense. Move and rotate objects in the plane of the screen, not in and out of the screen.
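As a minimal UIKit sketch of that advice: after a pan gesture ends, the system’s spring timing curve can settle the object with physical motion. The view and resting position here are hypothetical; the spring is seeded with the gesture’s release velocity so the object keeps moving instead of restarting from rest.

```swift
import UIKit

// Settle a dragged card back to its resting position with spring physics.
// `card` and `restingCenter` are hypothetical; `velocity` would come from
// the pan gesture recognizer's velocity at release.
func settleCard(_ card: UIView, restingCenter: CGPoint, velocity: CGVector) {
    // Seeding the spring with the release velocity makes the motion
    // continuous, the way a real object coasts to a stop.
    let spring = UISpringTimingParameters(dampingRatio: 0.8,
                                          initialVelocity: velocity)
    let animator = UIViewPropertyAnimator(duration: 0.5,
                                          timingParameters: spring)
    animator.addAnimations {
        card.center = restingCenter
    }
    animator.startAnimation()
}
```

Note that the motion stays in the plane of the screen; only the position animates.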

If you need to scale objects, do it with a pinch gesture, and keep the origin of scale between the user’s fingers. If you need depth, consider using blur or parallax to communicate distance behind the glass, or treat the whole screen as a window to another place, like television.
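One way to keep the scale origin between the fingers, sketched with UIKit (assuming a `UIPinchGestureRecognizer` is already attached to the view): move the layer’s anchor point under the pinch, compensating the position so the view doesn’t jump, then scale about that anchor.

```swift
import UIKit

// Move the layer's anchor point to `anchorPoint` (in unit coordinates)
// without visibly moving the view, so later transforms pivot there.
func setAnchorPoint(_ anchorPoint: CGPoint, for view: UIView) {
    var newPoint = CGPoint(x: view.bounds.width * anchorPoint.x,
                           y: view.bounds.height * anchorPoint.y)
    var oldPoint = CGPoint(x: view.bounds.width * view.layer.anchorPoint.x,
                           y: view.bounds.height * view.layer.anchorPoint.y)
    newPoint = newPoint.applying(view.transform)
    oldPoint = oldPoint.applying(view.transform)
    // Shift the position by the same amount the anchor moved.
    view.layer.position.x += newPoint.x - oldPoint.x
    view.layer.position.y += newPoint.y - oldPoint.y
    view.layer.anchorPoint = anchorPoint
}

@objc func handlePinch(_ pinch: UIPinchGestureRecognizer) {
    guard let view = pinch.view else { return }
    if pinch.state == .began {
        // Put the scale origin between the user's fingers.
        let location = pinch.location(in: view)
        setAnchorPoint(CGPoint(x: location.x / view.bounds.width,
                               y: location.y / view.bounds.height),
                       for: view)
    }
    if pinch.state == .began || pinch.state == .changed {
        // Apply the incremental scale, then reset so each delta is small.
        view.transform = view.transform.scaledBy(x: pinch.scale, y: pinch.scale)
        pinch.scale = 1
    }
}
```

Resetting `pinch.scale` to 1 each time keeps the transform incremental, so the object tracks the fingers continuously rather than snapping.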

Don’t drop animation frames. When your app animates slowly compared to the rest of the system, users won’t notice every dropped frame, but they will notice their phone acting slow. In other words, they are reminded of the hardware that we wanted them to forget. And because animation and touch are synchronized, we need to match the hardware refresh rate as closely as possible.

It’s useful to think of framerates in terms of how many frames are dropped. On 60 Hz hardware, the screen updates roughly every 16 milliseconds. A dropped frame means the previous frame is shown for another 16 milliseconds, producing stutter over time. I used to think that 55 frames per second was only 10% better than 50 fps, so the difference was negligible. But 5 dropped frames per second means the animation is twice as smooth as 10 dropped frames per second.
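The arithmetic, as a quick sketch using the 60 Hz numbers above:

```swift
// Frame budget and dropped frames per second on 60 Hz hardware.
let refreshRate = 60.0
let frameBudgetMs = 1000.0 / refreshRate   // ≈ 16.7 ms to produce each frame

// Counting dropped frames instead of frames shown:
let droppedAt55fps = refreshRate - 55.0    // 5 dropped frames per second
let droppedAt50fps = refreshRate - 50.0    // 10 dropped frames per second

// Measured by stutter, 55 fps is twice as smooth as 50 fps,
// not 10% better.
let smoothnessRatio = droppedAt50fps / droppedAt55fps   // 2.0
```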

Do what you must to get the framerate all the way up and eliminate stutter, because real objects move smoothly. In particular, keep synchronous work off the main thread, use your profiler, and be skeptical of layout passes.
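The main-thread advice, sketched with Grand Central Dispatch (the thumbnail function and image view are hypothetical): do the slow, synchronous work on a background queue, and hop back to the main queue only to touch UIKit.

```swift
import UIKit

// Hypothetical slow work: decode and downscale an image from disk.
func renderThumbnail(at url: URL) -> UIImage? {
    // Stub: in a real app this would decode and resize off-screen.
    return UIImage(contentsOfFile: url.path)
}

// Keep the slow work off the main thread so the event loop can keep
// drawing frames; only update the UI back on the main queue.
func showThumbnail(for url: URL, in imageView: UIImageView) {
    DispatchQueue.global(qos: .userInitiated).async {
        let thumbnail = renderThumbnail(at: url)   // slow, synchronous work
        DispatchQueue.main.async {
            imageView.image = thumbnail            // UIKit: main thread only
        }
    }
}
```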

Our job is to make touch software act like real life. Sometimes we’ll have to break the metaphor, but we can still err on the side of realism. If the user’s sense of touch would disagree with what’s on the screen, maybe something else should be on the screen.
