At Pearson, Brad Jones supported this project throughout, but the most credit goes to Chris Webb, whose tenacity, focus, and hard work really made The Inmates.

Business executives have let the inmates run the asylum! In his book The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy, Alan Cooper calls for revolution: we need technology to work in the same way average people think.
The recurring metaphor in The Inmates Are Running the Asylum is that of the dancing bear: the circus bear that shuffles clumsily for the amusement of the crowd.
The Inmates Are Running the Asylum introduced the use of personas as a practical interaction design tool.
Based on the single-chapter discussion in that book, personas rapidly gained popularity in the software industry due to their unusual power and effectiveness. Had personas been developed in the laboratory, the full story of how they came to be would have been published long ago, but since their use developed over many years in both my practice as a software inventor and architectural consultant and the consulting work of Cooper designers, that is not the case.
Since Inmates was published, many people have asked for the history of Cooper personas, and here it is.

The personas grew out of a project-management program I had designed. Like so much software of the time, it was terribly hard to use, and its real power was in demonstrating that making software easy to use was harder than everyone thought. In particular, I spoke at length with a woman named Kathy who worked at Carlick Advertising.
It seemed a classic project-management task. Kathy was the basis for my first, primitive persona. At that time, compared to what we use today, computers were very small, slow, and weak.
I usually performed a full compilation at least once a day around lunchtime. After eating, while my computer chugged away compiling the source code, I would walk the golf course. From my home near the ninth hole, I could traverse almost the entire course without attracting much attention from the clubhouse. During those walks I designed my program. As I walked, I would engage myself in a dialogue, play-acting a project manager, loosely based on Kathy, requesting functions and behavior from my program.
I often found myself deep in those dialogues, speaking aloud and gesturing with my arms. Eventually, instead of creating another program, I offered my software-design experience to my colleagues as a consultant for the first time. I quickly learned that consulting is quite different from entrepreneurial invention. Previously, I had just built what I thought was right. Now I found that I had to persuade my clients before they would follow my lead or see the benefits of my ideas.
This imperative for communication eventually impelled me to formalize the notion of personas. During discussions with one client regarding interaction design for their product, I found myself continually engaged in a circular dialogue.

Ironically, the thing that will likely make the least improvement in the ease of use of software-based products is new technology. There is little difference technically between a complicated, confusing program and a simple, fun, and powerful product.
The problem is one of culture, training, and attitude of the people who make them, more than it is one of chips and programming languages. We are deficient in our development process, not in our development tools. The high-tech industry has inadvertently put programmers and engineers in charge, so their hard-to-use engineering culture dominates. Despite appearances, business executives are simply not the ones in control of the high-tech industry. It is the engineers who are running the show.
In our rush to accept the many benefits of the silicon chip, we have abdicated our responsibilities. We have let the inmates run the asylum. When the inmates run the asylum, it is hard for them to see clearly the nature of the problems that bedevil them. When you look in the mirror, it is all too easy to single out your best features and overlook the warts. When the creators of software-based products examine their handiwork, they overlook how bad it is.
Instead, they see its awesome power and flexibility. They see how rich the product is in features and functions. They ignore how excruciatingly difficult it is to use, how many mind-numbing hours it takes to learn, or how it diminishes and degrades the people who must use it in their everyday lives.
The Origins of This Book

I have been inventing and developing software-based products for 25 years. This problem of hard-to-use software has puzzled and confounded me for years.
And a wonderful thing happened! I immediately discovered that after I freed myself from the demands of programming, I saw for the first time how powerful and compelling those demands were.
Programming is such a difficult and absorbing task that it dominates all other considerations, including the concerns of the user. I could only see this after I had extricated myself from its grip. Upon making this discovery, I began to see what influences drove software-based products to be so bad from the user's point of view.
I wrote a book about what I had learned, and it has had a significant effect on the way some software is designed today. Later, my coauthor Robert Reimann and I released a revised second edition. It was completely rewritten, including updated examples and seven brand-new chapters. It is called About Face 2.0.

To be a good programmer, one must be sympathetic to the nature and needs of the computer.
But the nature and needs of the computer are utterly alien from the nature and needs of the human being who will eventually use it. The creation of software is so intellectually demanding, so all-consuming, that programmers must completely immerse themselves in an equally alien thought process. In the programmer's mind, the demands of the programming process not only supersede any demands from the outside world of users, but the very languages of the two worlds are at odds with each other.
The process of programming subverts the process of making easy-to-use products for the simple reason that the goals of the programmer and the goals of the user are dramatically different. The programmer wants the construction process to be smooth and easy. The user wants the interaction with the program to be smooth and easy. These two objectives almost never result in the same program. In the computer industry today, the programmers are given the responsibility for creating interaction that makes the user happy, but in the unrelenting grip of this conflict of interest, they simply cannot do so.
In software, typically nothing is visible until it is done, meaning that any second-guessing by nonprogrammers is too late to be effective. Desktop-computer software is infamously hard to use because it is purely the product of programmers; nobody comes between them and the user. Objects such as phones and cameras have always had a hefty mechanical component that forces them into the open for review.
But as we've established, when you cross a computer with just about any product, the behavior of the computer dominates completely. The key to solving the problem is interaction design before programming. We need a new class of professional interaction designers who design the way software behaves.
Today, programmers consciously design the code inside programs but only inadvertently design the interaction with humans. They design what a program does but not how it behaves, communicates, or informs. Conversely, interaction designers focus directly on the way users see and interact with software-based products. This craft of interaction design is new and unfamiliar to programmers, so—when they admit it at all—they let it in only after their programming is already completed.
At that point, it is too late. The people who manage the creation of software-based products are typically either hostage to programmers because they are insufficiently technical, or they are all too sympathetic to programmers because they are programmers themselves. The people who use software-based products are simply unaware that those products can be as pleasurable to use and as powerful as any other well-designed tool.
Programmers aren't evil. They work hard to make their software easy to use. Unfortunately, their frame of reference is themselves, so they make it easy to use only for other software engineers, not for normal human beings.

The costs of badly designed software are incalculable.
The cost of Jane's and Sunil's time, the cost of offended air travelers, and the cost of the lives of passengers on Flight cannot easily be quantified. The greatest cost, though, is the opportunity we are squandering. While we let our products frustrate, cost, confuse, irritate, and kill us, we are not taking advantage of the real promise of software-based products: Because software truly is malleable far beyond any other medium, it has the potential to go well beyond the expectations of even the wildest dreamer.
All it requires is the judicious partnering of interaction design with programming. Here is a riddle for the information age: What do you get when you cross a computer with a camera? A computer! Thirty years ago, my first camera, a 35mm Pentax Model H, had a small battery in it that powered the light meter. I merely swapped in a new one every couple of years, as I would a wristwatch battery. Fifteen years ago, my first electronic camera, a 35mm Canon T70, used two AA batteries to power its rather simple exposure computer and its automatic film drive.
If I forgot to turn it off, it automatically shut down after one minute of inactivity. One year ago, my second-generation digital camera, a Panasonic PalmCam, had an even smarter computer chip inside it.
It had two modes: I had to put it into Rec mode to take pictures and Play mode to view them on its small video display.

My newest camera has a full-blown computer that displays a Windows-like hourglass while it "boots up." There is no "On" setting, and none of my friends can figure out how to turn it on without a lengthy explanation. The new camera is very power-hungry, and its engineers thoughtfully provided it with a sophisticated computer program that manages the consumption of battery power.
A typical scenario goes like this: I aim the camera and zoom in to properly frame the image. Just as I'm about to press the shutter button, the camera suddenly realizes that simultaneously running the zoom, charging the flash, and energizing the display has caused it to run out of power. In self-defense, it suspends its capability to actually take pictures. But I don't know that because I'm looking through the viewfinder, waving my arms, saying "smile," and pressing the shutter button.
The computer detects the button press, but it simply cannot obey. In a misguided effort to help out, the power-management program instantly takes over and makes an executive decision: Shed load.
It shuts down the power-greedy LCD video display. I look at the camera quizzically, wondering why it didn't take the picture, shrug my shoulders, and let my arm holding the camera drop to my side. But as soon as the LCD is turned off, more battery power is available for other systems.
The power-management program senses this increase and realizes that it now has enough electricity to take pictures. It returns control to the camera program, which is waiting patiently to process the command it received when I pressed the shutter button, and it takes a nicely auto-focused, well-exposed, high-resolution digital picture of my kneecap.
That old mechanical Pentax had manual focusing, manual exposure, and manual shutter speed, yet it was far less frustrating to use than the fully computerized, modern Nikon COOLPIX, which has automatic focusing, exposure, and shutter speed. The camera may still take pictures, but it behaves like a computer instead of a camera.

A frog that's slipped into a pot of cold water never recognizes the deadly rising temperature as the stove heats the pot.
Instead, the heat anesthetizes the frog's senses. I was unaware, like the frog, of my cameras' slow march from easy to hard to use as they slowly became computerized. We are all experiencing this same, slow, anesthetizing encroachment of computer behavior in our everyday lives.

My new clock radio has a very sophisticated computer brain and offers high-fidelity digital sound and lots of features.
It wakes me up at a preset time by playing a CD, and it has the delicacy and intelligence to slowly fade up the volume when it begins to play in the morning. This feature is really pleasant and quite unique, but it doesn't compensate for the fact that I want to hurl the infuriating machine out the window.
It's very hard to tell when the alarm is armed, so it occasionally fails to wake me up on a Monday and rousts me out of bed early on a Saturday. Sure, it has an indicator to show the alarm is set, but that doesn't mean it's useful. The clock has a sophisticated alphanumeric LCD that displays all of its many functions. The presence of a small clock symbol in the upper-left corner of the LCD indicates the alarm is armed, but in a dimly lit bedroom the clock symbol cannot be seen.
The LCD has a built-in backlight that makes the clock symbol visible, but the backlight only comes on when the CD or radio is explicitly turned on.
There's a gotcha, however: The alarm simply won't ever sound while the CD is explicitly left on, regardless of the setting of the alarm. It is this paradoxical operation that frequently catches me unawares.
It is simple to disarm the alarm: Simply press the "Alarm" button once, and the clock symbol disappears from the display.
However, to arm it, I must press the "Alarm" button exactly five times. The first time I press it, the display shows me the time of the alarm. On press two, it shows the time when it will turn the sound off. On press three, it shows me whether it will play the radio or the CD.
On press four, it shows me the preset volume. On press five, it returns to the normal view, but with the alarm now armed. But with just one additional press, it disarms the alarm. Sleepy, in a dark bedroom, I find it difficult to perform this little digital ballet correctly.
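The arming sequence reads like a little state machine, and sketching it as one makes the asymmetry plain: five presses to arm, one to disarm. The following Python model is purely illustrative; the class and state names are my invention, not the clock's actual firmware.

```python
class ClockRadio:
    # The display states the "Alarm" button cycles through while disarmed.
    ARMING_STEPS = ["alarm time", "shutoff time", "radio or CD", "preset volume"]

    def __init__(self):
        self.armed = False
        self.step = 0  # position within the five-press arming sequence

    def press_alarm(self):
        if self.armed:
            # One press while armed disarms immediately: the asymmetry
            # that catches sleepy users in a dark bedroom.
            self.armed = False
            self.step = 0
            return "disarmed"
        if self.step < len(self.ARMING_STEPS):
            shown = self.ARMING_STEPS[self.step]
            self.step += 1
            return f"showing {shown}"
        # The fifth press returns to the normal view with the alarm armed.
        self.armed = True
        self.step = 0
        return "armed"

clock = ClockRadio()
print([clock.press_alarm() for _ in range(6)])
# six presses: four informational displays, then "armed", then "disarmed"
```

The point of the sketch is that the same button means five different things on the way in and exactly one thing on the way out, with nothing but a dim icon to tell you which.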
Being a nerdy gizmologist, I continue to fiddle with the device in the hope that I will master it. My wife, however, long ago gave up on the diabolical machine. She loves the look of the sleek, modern design and the fidelity of the sound it produces, but it failed to pass the alarm-clock test weeks ago because it is simply too hard to make work.
The alarm clock may still wake me up, but it behaves like a computer. My old, noncomputerized alarm clock was simpler: when it was armed, a single red light glowed; when it was not armed, the red light was dark. I didn't like that old alarm clock for many reasons, but at least I could tell when it was going to wake me up.

Because it is far cheaper for manufacturers to use computers to control the internal functioning of devices than it is to use older, mechanical methods, it is economically inevitable that computers will insinuate themselves into every product and service in our lives.
This means all of our products will soon behave the same as most obnoxious computers, unless we try something different. This phenomenon is not restricted to consumer products. Just about every computerized device or service has more features and options than its manual counterpart. Yet, in practice, we often wield the manual devices with more flexibility, subtlety, and awareness than we do the modern versions driven by silicon-chip technology.
High-tech companies—in an effort to improve their products—are merely adding complicating and unwanted features to them. Because the broken process cannot solve the problem of bad products, but can only add new functions, that is what vendors do. Later in this book I'll show how a better development process makes users happier without the extra work of adding unwanted features.

Porsche's beautiful high-tech sports car, the Boxster, has seven computers in it to help manage its complex systems.
One of them is dedicated to managing the engine. It has special procedures built into it to deal with abnormal situations. Unfortunately, these sometimes backfire. In some early models, if the fuel level in the gas tank got very low—only a gallon or so remaining—the centrifugal force of a sharp turn could cause the fuel to collect in the side of the tank, allowing air to enter the fuel lines.
The computer sensed this as a dramatic change in the incoming fuel mixture and interpreted it as a catastrophic failure of the injection system. To prevent damage, the computer would shut down the ignition and stop the car.
Also to prevent damage, the computer wouldn't let the driver restart the engine until the car had been towed to a shop and serviced.
When owners of early Boxsters first discovered this problem, the only solution Porsche could devise was to tell them to open the engine compartment and disconnect the battery for at least five minutes, giving the computer time to forget all knowledge of the hiccup.
The sports car may still speed down those two-lane blacktop roads, but now, in those tight turns, it behaves like a computer. In a laudable effort to protect Boxster owners, the programmers turned them into humiliated victims.
Every performance-car aficionado knows that the Porsche company is dedicated to lavishing respect and privilege on its clientele.
That something like this slipped through shows that the software inside the car is not coming from the same Porsche that makes the rest of the car. It comes from a company within a company. Somehow, the introduction of a new technology surprised an older, well-established company into letting some of its core values slip away.
Acceptable levels of quality for software engineers are far lower than those for more traditional engineering disciplines.

Whenever I withdraw cash from an automatic teller machine (ATM), I encounter the same sullen and difficult behavior so universal with computers.
If I make the slightest mistake, it rejects the entire transaction and kicks me out of the process. I have to pull my card out, reinsert it, reenter my PIN code, and then reassert my request.
Typically, it wasn't my mistake, either, but the ATM computer finesses me into a misstep. It always asks me whether I want to withdraw money from my checking, savings, or money-market account, even though I have only a checking account. Subsequently, I always forget which type it is, and the question confuses me. About once a month I inadvertently select "savings," and the infernal machine summarily boots me out of the entire transaction to start over from the beginning. To reject "savings," the machine has to know that I don't have a savings account, yet it still offers it to me as a choice.
The only difference between me selecting "savings" and the pilot of Flight selecting "ROMEO" is the magnitude of the penalty.

The machine also limits how much cash I can withdraw at one time, but it doesn't tell me what that amount is, inform me how much money is in my account, or give me the opportunity to key in a new, lower amount. Instead, it spits out my card and leaves me to try the whole process again from scratch, no wiser than I was a moment ago, as the line of people growing behind me shifts, shuffles, and sighs.
The ATM is correct and factual, but it is no help whatsoever. The ATM has rules that must be followed, and I am quite willing to follow them, but it is unreasonably computer-like to fail to inform me of them, give me contradictory indications, and then summarily punish me for innocently transgressing them.
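The ATM's rudeness can be captured in a few lines. This sketch is hypothetical (no real ATM exposes functions like these); it contrasts offering every account type with offering only the accounts the bank already knows the customer holds.

```python
# Illustrative sketch, not any real ATM's code: the machine already knows
# which accounts the customer has, so there is no excuse for offering
# choices it will later reject.
def ask_account_type_badly(customer_accounts):
    # The behavior described above: offer all three choices regardless,
    # and abort the whole transaction if the customer guesses wrong.
    return ["checking", "savings", "money market"]

def ask_account_type_politely(customer_accounts):
    # Offer only valid choices; with a single account there is nothing
    # to ask, and no possible misstep.
    return customer_accounts

print(ask_account_type_badly(["checking"]))     # three options, two of them traps
print(ask_account_type_politely(["checking"]))  # just the one real option
```

The polite version is no harder to program; the difference is whether anyone thought about the interaction at all.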
This behavior—so typical of computers—is not intrinsic to them. Actually, nothing is intrinsic to computers: They merely act on behalf of their software, the program. And programs are as malleable as human speech. A person can speak rudely or politely, helpfully or sullenly. It is as simple for a computer to behave with respect and courtesy as it is for a human to speak that way. All it takes is for someone to describe how.
Unfortunately, programmers aren't very good at teaching that to computers.

Computers Make It Easy to Get into Trouble

Computers that sit on a desk simply behave in the same, irritating way computers always have, and they don't have to be crossed with anything. My friend Jane used to work in public relations as an account coordinator.
The core of Windows 95 is the hierarchical file system. All of Jane's documents were stored in little folders, which were stored in other little folders. Jane didn't understand this or see the advantage to storing things that way. Actually, Jane didn't give it a lot of thought but merely took the path of least resistance. Jane had just finished drafting the new PR contract for a Silicon Valley startup company.
She selected Close from the File menu. Instead of simply doing as she directed and closing the document, Word popped up a dialog box. She responded—as always—by pressing the Enter key. She responded this way so consistently and often that she no longer even looked at the dialog box. The first dialog box was followed immediately by another one, the equally familiar Save As box.
It presented Jane with lots of confusing buttons, icons, and text fields. The only one that Jane understood and used was the text-entry field for File Name.
She typed in a likely name and then clicked the Save button. The program then saved the PR contract in the My Documents folder. Jane was so used to this unnecessary drill that she gave it no thought.

At lunchtime, while Jane was out of her office, Sunil, the company's computer tech, installed a new version of VirusKiller 2.0 and used Word to view the program's Readme file. After viewing the file, Sunil closed it and returned Jane's computer to exactly the way it was before lunch. At least, he thought he did.
After lunch, Jane needed to reopen the PR contract and get a printout to show to her boss. Jane selected Open from the File menu, and the Open dialog box appeared. Jane expected the Open dialog box to show her, in neat alphabetic order, all of her contracts and documents. Instead, it showed her a bunch of filenames that she had never seen before and didn't recognize. One of them was named Readme. Of course, when Sunil used Word to view the Readme file, he instructed Jane's copy of Word to look in an obscure folder six levels deep and inadvertently steered it away from Jane's normal setting of My Documents.
Jane was now quite bewildered. Her first, unavoidable thought was that all of her hard work had somehow been erased, and she got very worried. Finally, in a state approaching panic, Jane telephoned Sunil to ask for his help. Sunil was not at his desk, and it wasn't until Monday morning that he had a chance to stop by and set things right.
Although computer operating systems need hierarchical file systems, the people who use them don't. It's not surprising that computer programmers like to see the underlying hierarchical file systems, but it is equally unremarkable that normal users like Jane don't.
Unremarkable to everyone, that is, except the programmers who create the software that we all use. Jane's frustration and inefficiency is blamed on Jane, and not on the programmers who torpedoed her.
At least Jane has a job. Many people are considered insufficiently "computer literate" and are thus not employable. As more and more jobs demand interaction with computers, the rift between the employable and the unemployable becomes wider and more difficult to cross. Politicians may demand jobs for the underprivileged, but if the underprivileged don't know how to use computers, no company can afford to let them put their untrained hands on the company's computers.
There is too much training involved, and too much exposure to the destruction of data and the bollixing up of priceless databases. The obnoxious behavior and obscure interaction that software-based products exhibit is institutionalizing what I call "software apartheid": Otherwise-normal people are forbidden from entering the job market and participating in society because they cannot use computers effectively. In our enlightened society, social activists are working hard to break down race and class barriers while technologists are hard at work inadvertently erecting new, bigger ones.
By purposefully designing our software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind.
Commercial Software Suffers, Too

Not only are computers taking over the cockpits of jet airliners, but they are also taking over the passenger cabin, behaving in that same obstinate, perverse way that is so easy to recognize and so hard to use. Modern jet planes have in-flight entertainment (IFE) systems that deliver movies and music to passengers.
Advanced IFE systems are generally installed only on larger airplanes flying transoceanic routes. One airline's IFE system was so frustrating for the flight attendants to use that many of them were bidding to fly shorter, local routes to avoid having to learn and use the difficult systems.
This is remarkable, considering that the time-honored airline route-bidding process is based on seniority, and that those same long-distance routes have always been considered the most desirable plums because of their lengthy layovers in exotic locales such as Singapore or Paris. For flight attendants to bid for unglamorous, unromantic yo-yo flights from Denver to Dallas or from Los Angeles to San Francisco just to avoid the IFE system indicated a serious morale problem.
Any airline that inflicted bad tools on its most prized employees—the ones who spent the most time with the customer—was making a foolish decision and profligately discarding money, customer loyalty, and staff loyalty. The computer IFE system that another large airline created was even worse.
It linked movie delivery with the cash-collection function. In a sealed jet airplane flying at 37,000 feet, cash-collection procedures had typically been quite laissez-faire; after all, nobody was going to sneak out the back door. Flight attendants delivered goods and services when it was convenient and collected later when their hands weren't full and other passengers weren't waiting for something.
This kept them from running unnecessarily up and down the narrow aisles. Sure, there were occasional errors, but never more than a few dollars were involved, and the system was quite human and forgiving; everyone was happy and the work was not oppressive. With cash collection connected to content delivery by computer, the flight attendant had to first get the cash from the passenger, then walk all the way to the head end of the cabin, where the attendant's console was, enter an attendant password, then perform a cash-register-like transaction.
Only when that transaction was completed could the passenger actually view a movie or listen to music. This inane product design forced the flight attendants to walk up and down those narrow aisles hundreds of extra times during a typical trip. Out of sheer frustration, the flight attendants would trip the circuit breaker on the IFE system at the beginning of each long flight, shortly after departure. The airline had spent millions of dollars constructing a system so obnoxious that its users deliberately turned it off to avoid interacting with it.
The thousands of bored passengers were merely innocent victims. And this happened on long, overseas trips typically packed with much-sought-after frequent flyers.
I cannot put a dollar figure on the expense this caused the airline, but I can say with conviction that it was catastrophically expensive. The software inside the IFE systems worked with flawless precision but was a resounding failure because it misbehaved with its human keepers.
How could a company fail to predict this sad result? How could it fail to see the connection? The goal of this book is to answer these questions and to show you how to avoid such high-tech debacles.

In September, while conducting fleet maneuvers in the Atlantic, the USS Yorktown, one of the Navy's new Aegis guided-missile cruisers, stopped dead in the water.
A Navy technician, while calibrating an on-board fuel valve, entered a zero into one of the shipboard management computers, a Pentium Pro running Windows NT. The program attempted to divide another number by that zero—a mathematically undefined operation—which resulted in a complete crash of the entire shipboard control system. Without the computers, the engine halted and the ship sat wallowing in the swells for two hours and 45 minutes until it could be towed into port.
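The failure is easy to reproduce in miniature. The sketch below is illustrative only; the function names and the guarded alternative are my assumptions, not the Yorktown's actual shipboard code. It shows an unvalidated operator entry propagating into a division, and the one-line check that would have contained it.

```python
def fuel_efficiency(distance_nm, fuel_used_gal):
    # The equivalent of what the shipboard program apparently did:
    # divide by an operator-entered value with no validation at all.
    return distance_nm / fuel_used_gal  # raises ZeroDivisionError on 0

def fuel_efficiency_guarded(distance_nm, fuel_used_gal):
    # A defensive alternative: reject the bad entry at the boundary
    # instead of letting the exception cascade through the system.
    if fuel_used_gal == 0:
        raise ValueError("fuel quantity cannot be zero; re-check the entry")
    return distance_nm / fuel_used_gal
```

The guarded version still refuses the bad value, but it fails locally and explicably, rather than taking the whole control system down with it.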
Good thing it wasn't in a war zone. What do you get when you cross a computer with a warship? Admiral Nimitz is rolling in his grave!
Despite this setback, the Navy is committed to computerizing all of its ships because of the manpower cost savings. To deflect criticism of this plan, it blamed the "incident" on human error. Because the software-creation process is out of control, the high-tech industry must bring its process to heel, or else it will continue to put the blame on ordinary users while ever-bigger machines sit dead in the water.
Techno-Rage

An article in the Wall Street Journal once described an anonymous video clip, circulated widely by email, that showed a "[m]ustachioed Everyman in a short-sleeved shirt hunched over a computer terminal, looking puzzled. Suddenly, he strikes the side of his monitor in frustration. As a curious co-worker peers over his cubicle, the man slams the keyboard into the monitor, knocking it to the floor.
Rising from his chair, he goes after the fallen monitor with a final, ferocious kick." The man in the video may well be an actor, but he touches a widespread, sympathetic chord in our business world.
The frustration that difficult and unpleasant software-based products are bringing to our lives is rising rapidly. Joke emails circulate on private email lists about "Computer Tourette's." The joke is that you can walk down the halls of most modern office buildings and hear otherwise-normal people sitting in front of their monitors, jaws clenched, swearing repeatedly in a rictus of tense fury. Who knows what triggered such an outburst? Maybe the program just blandly erased the user's only copy of a lengthy manuscript because he responded with a Yes to a confirmation dialog box, assuming that it had asked if he wanted to "save your changes" when it actually asked him if he wanted to "discard your work."
Cognitive Friction

It's one thing to see that a problem exists, but it's quite another to devise a solution. One key part of problem solving is the language we use. Over the years, I've developed many useful terms and mental models.
They have proven vital to framing the problem presented by hard-to-use software-based products. In this chapter I will introduce those terms and ideas, showing how they can help bring the benefits of interaction design to our troubled process.

Behavior Unconnected to Physical Forces

Having just left the industrial age behind, we are standing at the threshold of the information age with an obsolete set of tools.
In the industrial age, engineers were able to solve each new problem placed before them. Working in steel and concrete, they made bridges, cars, skyscrapers, and moon rockets that worked well and satisfied their human users. As we tiptoe into the information age, we are working increasingly in software, and we have once again brought our best engineers to the task.
But unlike in the past, things haven't turned out so well. The computer boxes are fast and powerful, and the programs are generally reliable, but we have encountered a previously unseen dimension of frustrated, dissatisfied, unhappy, and unproductive users. Today's engineers are no less capable than ever, so I must deduce from this that, for the first time, they have encountered a problem qualitatively different from any they confronted in the industrial age.
Otherwise, their old tools would work as well as they ever did. For lack of a better term, I have labeled this new problem substance cognitive friction. It is the resistance encountered by a human intellect when it engages with a complex system of rules that change as the problem changes. Software interaction is very high in cognitive friction. Interaction with physical devices, however complex, tends to be low in cognitive friction because mechanical devices tend to stay in a narrow range of states comparable to their inputs.
Playing a violin is extremely difficult but low in cognitive friction because—although a violinist manipulates it in very complex and sophisticated ways—the violin never enters a "meta" state in which various inputs make it sound like a tuba or a bell.
The violin's behavior is always predictable—though complex—and obeys physical laws, even while being quite difficult to control. In contrast, a microwave oven has a lot of cognitive friction, because the 10 number keys on the control panel can be put into one of two contexts, or modes. In one mode they control the intensity of the radiation, and in the other they control the duration. This dramatic change, along with the lack of sensory feedback about the oven's changed state, results in high cognitive friction.
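The modal keypad Cooper describes can be sketched as a tiny state machine. This is purely illustrative (the class and method names are hypothetical, not from the book): the same ten digit keys write either the power level or the cooking time, depending on a hidden mode the panel gives no sensory feedback about.

```python
# Hypothetical sketch of a two-mode microwave keypad: the same digit keys
# mean "power level" or "cooking time" depending on invisible state.

class MicrowavePanel:
    def __init__(self):
        self.mode = "time"       # hidden context: "time" or "power"
        self.time_digits = ""
        self.power = 10          # assume full power by default

    def press_power(self):
        # Silently changes the meaning of every digit key.
        self.mode = "power"

    def press_digit(self, d):
        if self.mode == "power":
            self.power = d if d > 0 else 10
            self.mode = "time"   # panel drops back to time entry, unannounced
        else:
            self.time_digits += str(d)

    def press_start(self):
        seconds = int(self.time_digits or 0)
        return f"power {self.power}, {seconds} s"

panel = MicrowavePanel()
panel.press_digit(3)
panel.press_digit(0)
print(panel.press_start())  # digits meant "30 seconds" only because mode was "time"
```

The cognitive friction lives in `self.mode`: the user must carry that state in his head, because nothing on the panel distinguishes a 5 that means "half power" from a 5 that means "five seconds."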
On a typewriter, when you press the E key, the letter E appears on the page. On a computer—depending on the context—you may also get a metafunction. The behavior of the machine no longer has a one-to-one correspondence to your manipulation. Cognitive friction—like friction in the physical world—is not necessarily a bad thing in small quantities, but as it builds up, its negative effects grow exponentially.
Of course, friction is a physical force and can be detected and measured, whereas cognitive friction is a forensic tool and cannot be taken literally. Don't forget, though, that such things as love, ambition, courage, fear, and truth—though real—cannot be detected and measured. They can't be addressed by engineering methods, either. The skilled engineers who manufacture microwave ovens typically consult with human-factors experts to design the buttons so they are easy to see and press.
But the human-factors experts are merely adapting the buttons to the user's eyes and fingers, not to their minds. Consequently, microwave ovens don't have much "friction" but have a lot of cognitive friction. It is easy to open and close the door and physically press the buttons but, compared to the simplicity of the task, setting the controls to achieve your goals is very difficult.
Getting the microwave to perform the work you intend for it is quite difficult, though our general familiarity with it makes us forget how hard it really is. How many of us have cooked something for one second or one hour instead of for one minute? How many of us have cooked something at a strength of 5 for 10 minutes instead of a strength of 10 for 5 minutes?
On the computer screen, everything is filled with cognitive friction. Even an interface as simple as the World Wide Web presents the user with more intense mental engagement than any physical machine, because each blue hyperlink is a doorway to some other place on the Web. All you can do is click on a hyperlink, but what the link points to can change independently of the pointer, without any outward indication.
Its sole function is pure metafunction; its very "hyper"-ness is what gives it cognitive friction.

How We React to Cognitive Friction

Most people, even apologists, react to cognitive friction in the same way.
They take the minimum they need from it and ignore the rest. Each user learns the smallest set of features that he needs to get his work done, and he abandons the rest. The apologists proudly point out that their wristwatches can synchronize with their desktop calendar systems, but they conveniently neglect to mention that it has been six months since they used that feature.