February 2013 Scientific American Magazine

Apple Shouldn’t Make Software Look Like Real Objects

Why digital design doesn't have to imitate the physical world















Image: Ben Wiseman


Last fall Apple fired executive Scott Forstall, considered by many to be a Steve Jobs protégé. His departure prompted a flurry of discussion about a formerly obscure design-industry concept that he had championed: “skeuomorphism.”

In the physical world, a skeuomorph is an ornamental version of something that was, in an earlier product, a functional necessity. Fake shutter sounds in digital cameras. Fake candles in electric chandeliers. Fake grain in leatherette.

In software, skeuomorphs are everywhere. Desktop folders look like physical filing folders, the Trash can looks like a wastepaper basket and the Save button looks like a floppy disk.

Those shapes don't serve any technological function. But there is, of course, a human reason for software skeuomorphs. In the 1980s, to help the public make the transition to computers, Apple and Microsoft designers chose real-world shapes for on-screen icons to convey their meanings. Jobs, in particular, was a fan of skeuomorphic designs. And in the beginning, they played a big role in helping the graphical user interface catch on.

These days there's a growing backlash against skeuomorphic design—a rising sense that Apple has gone too far.

When you turn a page in an Apple e-book, the “paper” curls as you flip it over, even revealing the faint image of words printed on the other side. Apple's Contacts app looks like a physical address book, complete with fake “staples” in the “binding” between “pages.” The background screen of its Game Center app is made to look like green felt, as on a Vegas gaming table. And, perhaps most superfluous of all, torn-off paper scraps adorn the top of the Calendar program's “binding,” as though previous months' pages have been torn away.

These design features, critics argue, no longer help novices make a transition. You don't need unsightly paper remnants to understand that you are using a calendar. A curling-page animation just slows the reader down for the sake of showing off. Meanwhile slavish dependence on real-world visual metaphors could be holding back more creative, space-efficient or self-explanatory designs.

Sometimes Apple uses skeuomorphs that would not even make sense to modern-day customers. How many members of Generation Y have ever even used a Rolodex? In Apple's new Podcast app for the iPhone, the dominant visual element is a reel-to-reel tape—a technology that fell out of use 30 years ago.

Microsoft's latest operating systems—Windows Phone, for example—run full bore in the opposite direction. Their interfaces are all digital, with no references to the physical world. The designers are clearly saying, “It's 2013, people. We don't need fake wood grain and green felt to convey software functions.”

Many Apple designers would probably argue that helping novices recognize software functionality isn't the sole objective. They would probably point out that detailed photorealistic depictions of physical things also look cool. Yes, it's showing off, but it's part of making something pleasant to use. In truth, many of the complaints come from other designers. You don't hear the masses—people who are actually buying these products—griping about those little digital staples.

In any case, Apple's famous chief of hardware design, Jony Ive, is now in charge of software design as well, and he's not a fan of skeuomorphism in software. The days of iPhone apps that have fake wood grain, fake brushed metal and fake stitching in fake leather are probably numbered.

And that's fine. Skeuomorphism in software has its place when used well: it can put you at ease with a new program in a flash and convey functions with simple visual metaphors (camera apps will always have camera icons). As with any design concept, this one can be taken too far. The instant a skeuomorph makes software less pleasant to use, somebody should rein it in.

SCIENTIFIC AMERICAN ONLINE
Five of Apple's realism flops: ScientificAmerican.com/feb2013/pogue



This article was originally published with the title “Out with the Real.”




ABOUT THE AUTHOR(S)

David Pogue is the personal-technology columnist for the New York Times and an Emmy Award–winning correspondent for CBS News.


12 Comments

  1. drieck 12:23 PM 1/20/13

    Mr. Pogue, normally I always agree with your thinking, but I am just stunned that you think Windows Phone is a look into Apple's future. And as a longtime Apple investor I hope and pray you're wrong. Who uses Windows Phone, and do you know many, or any, who like it? I assume you do remember that we are not machine parts or modules in a big mechanism? Most of our waking hours are devoted to being human and pursuing human activities. When I open the fridge I have to navigate a fridge door! When I read an actual book I have to turn a page to continue. I find skeuomorphism is just about the only thing that doesn't render the everyday computer experience boring and depressing. If Jony Ive goes all cold and efficient he will lose me and millions of other Apple users who use Apple products because they are friendly to regular people. I think you need to get out into the real world. By that I mean no disrespect or insult, but rather that you should reconnect with the world that most of us inhabit. You know: get up, go to work, use computers and smartphones all day, but don't place them at the center of our existence. We use them because we have to, and sometimes like to, but we really don't want to work too hard to make them go. And we don't want to guess what's going to happen when we press a key or a mouse button because the function gives us no clue as to what will happen when we do.

  2. Joel454 09:30 AM 1/22/13

    Icons are a kind of hieroglyph. Their meanings have been standardized. So just as I am typing this message on a keyboard designed to slow me down for typesetting equipment gone for well over 60 years, we will be stuck with the icons.

  4. curmudgeon in reply to drieck 10:10 AM 1/22/13

    If you wish to protect your investment I would suggest that it is you who gets out into the real world a bit more. I remember dreaming when the first word processors arrived, taking up half a room, and costing the equivalent of a small house, of a day when I could finally throw away the ridiculous mechanical encumbrance that was the typewriter, never believing that a mere few years later I would be sitting in front of my very own Amstrad PCW churning out pages at an unprecedented rate and saving hours laboriously typing up or handwriting the good copy.

    If you think you represent a majority of computer users, even a majority of, shall we say more mature, users, you are seriously out of touch. From the very first moment that I laid my hand on that Amstrad keyboard I have used computers not because I have to but because I love to. And for the younger generation that goes double.

    It is quite absurd that programmers continue to use icons that were literally iconic in the early years of computers but are simply meaningless to those who are starting their computer 'education' now. The floppy disk as an icon for save, for example. Who, in a generation which uses a phone for just about everything apart from what it was invented for, has ever seen a floppy disk, let alone knows how it was used? Although by no means as completely pointless as those applications that sounded like a typewriter when you typed, or issued a 'kerching' when some kind of financial information was input (I worked in a local shop in 1974 and we were already way past the mechanical cash register!), the fact is that this infantilised iconography is way past its sell-by date.

    Computers have grown-up beyond imagining. It's way overdue that UI design and iconography in particular did the same!

  5. Derick D 10:25 AM 1/22/13

    It's called style, people, and if you think its days are numbered you've got a lot to learn about consumer behavior.

    Ultimately, how designers want the software to look (in a general sense) is of little consequence. Consumers will ask for what they want, and the developers who give it to them will make money. Most developers who insist on designing software the way THEY want it to look will fail to become commercially successful, and then either get a job at McD's or work for a developer with some business sense.

    In the end, it's consumers who will decide on the appearance of the software that they use. Some will want futuristic, flashy functionality, while others will prefer new stuff that looks like old stuff. And developers will deliver it all, for a tidy profit. The days of a few developers making software for masses of computer illiterates are over. As consumers continue to become more and more tech savvy, the technology market will continue to become more diverse, and ultimately consumers will find what they want.

  6. DaveG 12:19 PM 1/22/13

    This is not an exact quotation from René Magritte.

  7. DaveG in reply to Joel454 12:34 PM 1/22/13

    To build on your point: all of the letters in our alphabet are based on long-defunct pictographs. The letter "A," for example, is an ox (you have to turn it upside down).
    So where should we draw the line when eliminating anachronistic symbols? And what if these symbols are still doing some good because they are so deeply embedded in the visual language of our software (e.g., the floppy disk)?
    That said, Apple goes a step further when it makes the notepad app look like a real notepad. I can understand that this style is not to everyone's taste.

  8. gmitche8 in reply to Derick D 02:07 PM 1/22/13

    So what you are saying is that consumers shouldn't, and don't, embrace change? Well, that could be a problem for them in terms of technology. I'm sure it's not going to be a quick, drastic change, but it isn't an artist's job to re-create someone else's work. It's like telling van Gogh to repaint the Mona Lisa.

  9. Derick D in reply to gmitche8 02:24 PM 1/22/13

    That's not what I'm saying at all. I'm saying that consumers' needs and desires vary, and that it's consumers' needs and desires (rather than the preferences of people like Steve Jobs or Scott Forstall) that will determine what software interfaces look like.

    Personally, I like a little anachronistic styling on my super high-tech devices. It has nothing to do with comfort or ease of use - I'm actually more comfortable with mouse & keyboard than pen & paper. But that page turn animation on my iPhone's Kindle app (which I turn on or off, depending on my mood) and the torn calendar sheet at the top of my iCal are just a little bit of pizazz that makes using them a bit more fun for me.

    Incidentally, if consumers didn't embrace technological change we wouldn't all be walking around with smartphones and trying to figure out how to sync them with our laptops and tablets. Nor would we be having a debate about how software user interfaces should look in an age when an increasing number of consumers grew up with "new" technology ;)

  10. glinbear 05:23 PM 1/22/13

    Sounds like a tempest in a teapot to me.

  11. davidpla 07:23 PM 1/22/13

    An even better reason for limiting fancy superfluous effects is that they waste CPU power, which is not a limitless resource. One telling difference between Vista (a CPU hog) and Windows 7 is the clock gadget. Under Vista, if you activate the second hand, it quivers like a cheap mechanical clock as the hand goes from second to second. In Windows 7, a much more CPU-friendly OS, the second hand moves smoothly from second to second, saving resources and programming effort.

    Even Microsoft can learn!

  12. markcraig 09:18 AM 1/23/13

    I'm having trouble getting at the point of this article. When you say (quote):

    These design features, critics argue, no longer help novices make a transition. You don't need unsightly paper remnants to understand that you are using a calendar. A curling-page animation just slows the reader down for the sake of showing off.

    What in the world does this mean? A curling-page animation hardly slows down the reader; the milliseconds saved pale in comparison to increasing bootup speed (yes, even on an iPhone).

    I agree that the "instant a skeuomorph makes software less pleasant to use, somebody should rein it in," but you haven't given an example. As others have noted here, most people are using computers to do something, to complete tasks, not using technology for the sake of technology. Design elements that help are useful, and while it may be somewhat silly to have a file folder as an icon for a digital folder, it makes sense at this stage of our relationship to technology.

    Once we reach the point where my phone, tablet, or any computing device reads my mind and knows when I want to file something, then it's a different story. Likewise, once my tablet tracks my eyeballs and can adjust the flow of text from page to page automatically. But we're not there yet, that proverbial land of Oz. Perhaps in five years, when processor power reaches the point where this is doable.

    But suggesting that skeuomorphs somehow slow us down or are only for novice users just doesn't make sense. Yes, I concede, the fake wood grain is kind of silly, but so is simulated burnished metal as a background for our digital texts.

    In the end, the best design approach, which Pogue completely neglects to mention here, is to give users the choice of what they want. Good design should be a framework for customization. Give people options, which is really, when you think about it, what technology is all about.
