It’s not something most of us spend much time thinking about, but poor design truly is all around us. Norman starts off his book by relating how he finds it difficult at times to figure out how to open doors. He himself points out how it sounds silly, yet at the same time I think we’ve all had it happen at least once or twice. Further, he then describes an incident a friend of his had with a set of doors so poorly designed that honestly I’m not sure anyone could have been expected to open them without trial and error. This is where his argument really hooks you in, or at least it did for me personally. I find that offering an extreme example is a good way to lead into pointing out more mundane issues, and bringing up a set of doors with no conceivable way of knowing which way to push them open is certainly one way to start!
Of course, having already hooked the reader with the example of the doors, he’s careful to actually reel them in from there instead of just letting the line unwind. He immediately starts on a second example, pointing out how a husband and wife working in fields associated with incredibly high intelligence were both profoundly confused by the settings on their high-tech washing machine, the husband refusing to touch it and the wife simply using a single memorized setting. Again, I think most of us have that experience even with more mundane washing machines. I know I certainly default to remembering one pattern of buttons when I’ve chucked a given load of clothes into the machine.
From here he goes into the fact that every man-made object around us is actively and painstakingly designed. Now, this is something I had already learned and had my glowing epiphany about when taking another design course, so I suppose, unfortunately for him, the wow factor of this part of the chapter is a bit faded for me. Now that I’ve long since had it pointed out that every product we interact with is, at least ideally speaking, carefully designed to be both easy and enjoyable to use, that particular bit becomes a tad less novel. But it’s still a useful lesson for the uninitiated, albeit one that might be better introduced by an author more specifically focused on teaching it.
Of course, there’s still something of a point to the refresher, as he then uses it to lead into a new lesson, one born of an alien mindset but easily understandable nonetheless. Namely, that engineers often have a tendency to design things in a needlessly opaque fashion and then assume that since they know how to use them, surely everyone else can easily learn the same by simply reading the instructions. Having started from an engineering perspective himself, he somewhat amusingly goes on to recount how he had to gradually deprogram this blind spot out of himself as he switched to focusing on psychology. More soberingly, he also points out that the Three Mile Island incident was the result of exactly this sort of blind spot in design mentality. Thus he emphasizes the lesson that we must always remember that we’re designing the machine to be used by the human, not the other way around.
“You’re designing the machine to be used by the human, not the human to be used by the machine.” It’s a reflection worth repeating, as the next several pages of the chapter can be summed up in that single sentence. Humans are flawed, humans make mistakes, humans are trained to rely on specific cues. These are all things we have to be careful to never forget during the design process.
With this reminder cemented, he thus establishes the first of our four principles of interaction: affordances.
Affordance can be roughly defined as meaning “what can I do with this object?” If you can use an object a certain way, that’s an affordance. Norman illustrates this with the immediate example of a chair. A chair is intended to be sat on, so of course that’s an affordance it offers. But a chair can also be lifted and moved by some people, yet not others. In that case, moving the chair is an affordance offered to some people, but unavailable to others. He also points out that some objects are designed with anti-affordances, or rather, specific things the object is intended to prevent someone from doing. You can’t blow air through a pane of glass, hence the prevention of airflow is an anti-affordance of glass.
Let’s see if we can find a couple more examples of affordances on our own, shall we? A floor is a good example of something with a lot of affordances. You can walk on it, you can place things on it, you can roll something round across it if you feel like it. A floor is something easy to imagine the possibilities with, but it’s not all that useful for us as web design students. So let’s try another example. Buckle up, everyone, we’re about to get a little bit meta.
We all used this page to create blog posts, didn’t we? What we weren’t consciously thinking about at first, of course, is that the very fact that we can create blog posts with WordPress software is an affordance. We can create a blog with this page, hence we are afforded the ability to do so. Pretty neat, right?
Of course, Norman then goes on to point out that affordances can easily be confused with the next principle he introduces: signifiers. I personally think signifiers are much easier to separate mentally from affordances when you put each into layman’s terms. If affordances are “what can I do with this object,” then signifiers are “what tells me where I should interact with this object?” Again, Norman gives us several examples. A plate on a door tells us to push it. A knob tells us to turn it and then either push or pull. Slots tell us to insert things into them, balls tell us to throw them or bounce them. Signifiers are the visual cues that let us know where there are affordances we might have otherwise missed.
Again, let’s take a step back and think about where we see signifiers in the world around us. I think I’ll cheat a little here and give two examples that are technically called the same thing, even though they aren’t really quite the same: buttons. Think about a button on a controller. When you think of something in the world around you that actively communicates “you can do something with this object by touching it over here,” a button is probably what immediately comes to mind. Going back to our work page, then, this is also a signifier.
We call it a button, but it’s not really the same thing, is it? A button on a controller is something we physically press. The Add Media button on our blog page is something that we interact with virtually through code. But we still associate it with a real, tangible button because chances are there’s a physical button on a computer mouse or laptop that we’re pressing in the real world in order to take advantage of the affordance that the virtual “button” provides.
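To make that parallel concrete, here’s a hypothetical sketch of how a web “button” borrows its signifiers from its physical namesake. The class name, markup, and styling below are my own invention for illustration, not the actual code behind our blog page:

```html
<!-- A hypothetical "Add Media"-style button. The visible edge,
     drop shadow, and pointer cursor are signifiers borrowed from
     physical push-buttons: together they say "press me here." -->
<button class="media-btn">Add Media</button>

<style>
  .media-btn {
    border: 1px solid #999;                    /* visible edge suggests a pressable surface */
    border-radius: 3px;
    box-shadow: 0 1px 2px rgba(0, 0, 0, 0.3); /* slight "raised" look */
    padding: 6px 12px;
    cursor: pointer;                           /* pointer cursor signals clickability */
  }
</style>
```

Strip any one of those cues away and the element still affords clicking, but the signifier telling you so gets noticeably weaker.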
Now that we know what interacting with an object is called, and what we call the things that indicate where we should interact with it, the next concept we need a word for is the intuition that “if three objects are lined up next to three buttons, then the button in front of each object should be the one that turns that object on, right?” This is mapping: the intuitive feeling that if a signifier or set of signifiers is lined up a certain way, then the affordances they point to should follow a similar pattern. If you see a set of light switches in a room, you can figure out from their positioning which switch controls which set of lights. If you turn a steering wheel one way, you can assume that the car will go in the matching direction.
Now for our own examples! Let’s think about turning the page of a book. A book written in English uses mapping to indicate that the more you turn pages to the right, the further along you are in the narrative. However, if a book is written in Japanese, mapping indicates that the further left you turn the pages, the farther along you go. Mapping isn’t a modern invention; it’s something inherent in the way we as humans make things. Even something as ancient as language encourages mapping by its very nature. It’s intuitive, and that’s the point of designing with mapping in mind.
Going back to our blog editing page, the scroll bar is an example of mapping in web design. Much like turning a physical page, the scroll bar is also intended to navigate a digital “page.” The further you scroll down, the further along in a document you get. Physical or digital, it’s all about employing direction to clue people into how they’re supposed to interact with something.
So we’ve answered the “what,” “how,” and “where” of design. Now it’s time to cover our last principle, feedback. Otherwise known as, “how do I tell whether I actually just did something or not?” This is how the senses interact with affordances to let you know whether you succeeded or failed at using them. Seeing a target is feedback that lets you know where to throw a ball to hit it, Norman points out to us. But personally I think his best-communicated examples are the tactile ones: how we feel a glass and use that information to make sure we don’t drop it, break it, or spill it. Tactile feedback is just so easy and fun to understand.
I think that countless hours of video games may have made me a little bit of a nerd for controller design, because I’m going to go back to that for a second to offer my own example of physical feedback. In this case, however, I’m going to cover the concept of controller rumble. Most video game controllers these days, as well as cell phones, have a bit of machinery installed that makes the device shake when certain programmed events happen. In phones this is used to remind you that your device is ringing. In video game controllers, on the other hand, it’s used to put a bit of actual feeling behind hitting an enemy with an attack. If you swing a sword at a monster, there’s a little bit of kickback in your hands, letting you know that the attack landed. One of my favorite examples of feedback in video game controllers comes from the Nintendo Switch port of Skyrim. Now, for those of you familiar with Skyrim, you have the ability to pick locks in the game in order to open doors and treasure chests you might not otherwise be permitted to dig into. The Nintendo Switch has a particularly sensitive rumble system in its controllers, however, so when porting the game they slipped in a subtle upgrade to the lockpicking system. The controller “clicks” subtly as you turn your lockpick, and the closer you are to the exact spot you need to stick the implements in order to unlock the device, the stronger the “click” feeling gets. Once you’ve figured out how the feedback works, lockpicking becomes utterly trivial. Even the most complicated of locks are now mere playthings, all through the power of feedback!
Ahh, but I’m getting ahead of myself. Unfortunately web design doesn’t offer such tactile examples of feedback. But visual feedback is still very much a thing.
Look at the difference in the “Page Settings” tab on our blog editing page before and after clicking on it. Once you’ve clicked on it, the tab becomes outlined in blue. That’s the feedback. Without the outline, you might click on it and then forget that you’ve done so, assuming that the tab had simply always looked that way or that some other affordance activated it. But the blue outline lets you know that you “clicked” on the tab, and that’s what caused it to drop down. Visual cues like this are very subtle examples of good design. It’s like writing the word “said” in a book: the reader instinctively treats it as though it’s invisible. Yet if it were ever absent, the void left behind, and the bad design it implies, would suddenly become very obvious.
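For the curious, that kind of click feedback is easy to approximate in CSS. The snippet below is just an illustrative sketch; the selector, class name, and shade of blue are assumptions on my part, not the actual styles our blog page uses:

```html
<!-- Sketch of visual feedback like the blue outline on the
     "Page Settings" tab: the outline appears only once the
     element has been clicked/focused, confirming the action. -->
<button class="settings-tab">Page Settings</button>

<style>
  .settings-tab:focus {
    outline: 2px solid #2271b1;  /* blue outline = "yes, you clicked this" */
  }
  .settings-tab:active {
    transform: translateY(1px);  /* brief "pressed" nudge while the mouse is down */
  }
</style>
```

The point is the state change: the element looks different after you act on it, so you never have to wonder whether the click registered.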
At any rate, I’ve gone on for three times the required text length on this post. Hopefully some of you have survived long enough through this to tell me where I should tweak some things, because otherwise I might be in a bit of a pickle editing this draft all on my own! Then again, I suppose any lack of critique on this post is just evidence of my own poor design in this case. . .
I also have gone over…but I think that it was going to happen anyways. I think you did a great job of explaining things. I would have liked to see a different example at the end instead of another button type, but I think it is fine!
Hey Stephanie! So first off, I love your initial meme right at the top haha! Great way to start. I also like that you touched on video games, because, while obvious, my mind hadn’t put controller and vibration with the principles.
As for critical feedback: I think some of your wording is a little strange. Takes a couple read-overs to get. It’s probably your normal speaking rhythm, but for reading it’s a bit off-putting. “He himself points out how it sounds silly, yet at the same time I think we’ve all had it happen at least once or twice. ” Minimally, get rid of “himself” and it clears up a lot of the sentence.
Anyways, I’d say just re-read your post out loud and it’ll be easier to identify where the reader might stumble.
Hey Samantha,
Nice article! I really like how well you described everything and that you did it in such a way that if someone outside of our class were to read it, they would still have a good understanding of everything in the chapter. The pictures were also placed really nicely and I appreciate the meme for some humor. Awesome job!