The first chapter of Don Norman’s The Design of Everyday Things was a relatable read that captured my interest with funny stories and sadly accurate observations. His view on human-centered design stuck out to me the most. It was intriguing to read the story of his friend who got trapped in a post office entryway in Europe. The fact that the design was so pretty that it made the doors nearly impossible to figure out is what intrigued me the most.
As an aspiring graphic designer, I’ve run into a few of these situations myself. There is a point in the design process where you have to figure out whether a design is truly user-friendly. I’ve always looked at those situations through the lens of graphic design, so it was interesting to discover how often human-centered design is actually used, or not used, according to Norman’s experiences. I don’t read Norman as saying that simplicity alone is the key to good design. I believe there is almost always a way to make something visually appealing and self-explanatory at the same time. Take a phone app, for example. Twitter, Facebook, Instagram, and many other social media apps are appealing to the eye and very simple to navigate. Those apps are complex under the hood, but their designs and user interfaces are simple enough for most users. My only complaint about the apps I listed is that Facebook makes it very unclear how to view photos on a friend’s profile, or even your own. The Photos button is blended in right under where you would make a post, so every time I see it I think it’s the button for attaching a photo to a post. I’m getting a little off target here, but that is something that really annoys me about the Facebook app.
I recently took a web design class, and it taught me a lot about human-centered design. I had to create a flow chart that explained my website’s user interface and how I made it simple for people to navigate while also keeping it appealing to each user’s eye. While building my website, I set specific rules for myself to follow in order to achieve hassle-free usability: make headlines bold, make buttons stand out in an appealing way, make the footer small but readable, and several other little things that let my users scroll and click through the site in an orderly fashion. When it came time for peer feedback, I didn’t only rely on my classmates; I also had my mother test the site and tell me what confused her and what was easy. My classmates are fairly tech savvy, like the engineers Norman worked with, so I didn’t fully trust their opinions about what worked well. Having that extra opinion only helped, and it proves Norman’s point that people like us who have studied how technology works can be blind to the people who haven’t.
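To give a rough idea of what those rules looked like, here is a small TypeScript sketch. The selectors and values are hypothetical stand-ins, not my actual site’s code, and on the real site these rules lived in a stylesheet rather than a script.

```typescript
// A rough sketch of the usability rules I set for myself, expressed in TypeScript.
// The selectors and values are hypothetical stand-ins; in practice these would
// normally live in a CSS file.

function applyUsabilityRules(): void {
  // Rule 1: make headlines bold so they anchor the page.
  document.querySelectorAll<HTMLElement>("h1, h2, h3").forEach((heading) => {
    heading.style.fontWeight = "bold";
  });

  // Rule 2: make buttons stand out in an appealing way.
  document.querySelectorAll<HTMLButtonElement>("button").forEach((button) => {
    button.style.padding = "0.75rem 1.5rem";
    button.style.borderRadius = "6px";
    button.style.backgroundColor = "#1a73e8";
    button.style.color = "#ffffff";
  });

  // Rule 3: keep the footer small but still readable.
  const footer = document.querySelector<HTMLElement>("footer");
  if (footer) {
    footer.style.fontSize = "0.85rem";
  }
}

applyUsabilityRules();
```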
Norman lists and goes into much detail about the “Fundamental Principles of Interaction”. The four I will focus on here are affordance, signifier, mapping, and feedback.
Affordance:
This is the relationship between a person and an object. Norman says an affordance “is jointly determined by the qualities of the object and the abilities of the agent that is interacting”. An affordance isn’t really there if the user can’t operate the object the way it’s intended to be used. Affordances determine what actions are possible.
A physical example of an affordance is a light switch. A light switch is usually just a switch that flips up and down to turn a light on or off. The switch is very simple in design and easy to operate. Users who aren’t sure whether the switch is on or off can usually just look at it, since many switches have markings for each setting.

A digital example of an affordance is the scroll bar on a website. Most webpages have a scroll bar, whether it’s on the right side of the screen for scrolling up and down or at the bottom of the page for scrolling side to side. It’s a simple function, and the user almost always notices it and knows how to use it.
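As a small sketch (not any particular site’s code), you can even check in the browser whether a page actually affords scrolling; the affordance depends on the relationship between the content and the window, just like Norman says it depends on both the object and the person interacting with it.

```typescript
// Sketch: a scrollbar only appears when the content overflows the viewport,
// so the "affordance" of scrolling depends on the relationship between the
// document and the window, not on either one alone.

function affordsVerticalScroll(): boolean {
  const contentHeight = document.documentElement.scrollHeight;
  const viewportHeight = window.innerHeight;
  return contentHeight > viewportHeight;
}

if (affordsVerticalScroll()) {
  console.log("This page affords scrolling: the scroll bar will be visible.");
} else {
  console.log("Everything fits on screen, so there is nothing to scroll.");
}
```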

Signifier:
Signifiers go hand in hand with affordances. Affordances tell you what’s possible, while signifiers tell you what to do. To elaborate a little, signifiers are meant to signify: they tell you exactly where to click or tap. It really is that simple. Signifiers are the signals that tell you what you can do.
A physical example of a signifier is an exit sign. You usually see them hanging above a door, with red text that lights up in the dark. When you approach the doors of your local supermarket, one of the first things you wonder is, “Which one opens for me to exit through?” The answer is almost always right in front of you, with that sign dangling above the exit door. To be fair, the only reason exit signs look the way they do is that they are mostly meant to guide people in an emergency, but they still do the same job even when there isn’t one.

A digital example is pretty easy to come up with. When you look at the home screen on your phone, what do you see? If you have an iPhone, every time you unlock it you see a bunch of bubbles with images in them. It doesn’t take a genius to know that those bubbles are app icons. The icons signify that you can tap on them to open the app.
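The same idea carries over to the web. Here is a small sketch (the element id is a made-up placeholder, not any real app’s code) of how you can make an ordinary element signify that it is tappable:

```typescript
// Sketch: turning a plain element into something that *signifies* "tap me".
// The element id "app-tile" is a hypothetical placeholder.

const tile = document.getElementById("app-tile");

if (tile) {
  // Visual signifier: the pointer cursor tells mouse users this is clickable.
  tile.style.cursor = "pointer";

  // Semantic signifiers: role and label tell assistive technology the same thing.
  tile.setAttribute("role", "button");
  tile.setAttribute("aria-label", "Open the Photos app");

  // And the element actually does something when you follow the signifier.
  tile.addEventListener("click", () => {
    console.log("Opening the app...");
  });
}
```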

Mapping:
This one can seem a bit complicated, so bear with me here. Mapping is the relationship between controls and the things they control: how the layout of one corresponds to the layout of the other. Norman gives the example of a classroom or auditorium with rows of lights and a row of switches laid out to match, so you know which switch controls which row of lights.
A physical example of mapping is a game controller. The PlayStation 5 is a gaming system that uses a wireless controller to let the user press a button or sequence of buttons to perform an action. A new user almost always struggles to figure this out at first but picks it up quickly the more they play. If the controls don’t feel right, most games have a settings menu fittingly called button mapping, which lets the user choose what each button does.
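To make that concrete, here is a tiny TypeScript sketch of what button mapping looks like as a data structure. The action and button names are made-up examples, not any real game’s settings.

```typescript
// Sketch: button mapping as a plain lookup table from in-game actions to
// physical buttons. The names here are made-up examples, not a real game's API.

type Action = "jump" | "sprint" | "attack" | "interact";
type Button = "cross" | "circle" | "square" | "triangle" | "L1" | "R1";

// The default layout the game ships with.
const defaultMapping: Record<Action, Button> = {
  jump: "cross",
  sprint: "L1",
  attack: "square",
  interact: "triangle",
};

// "Button mapping" in a settings menu is just swapping entries in this table.
function remap(
  mapping: Record<Action, Button>,
  action: Action,
  button: Button
): Record<Action, Button> {
  const updated = { ...mapping };
  updated[action] = button;
  return updated;
}

const myMapping = remap(defaultMapping, "sprint", "R1");
console.log(`Sprint is now on ${myMapping.sprint}`); // "Sprint is now on R1"
```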

A digital example of mapping is swiping up on your iPhone to unlock it. Is it the best mapping out there? Probably not, but with the ever-evolving technology in phones, buttons just aren’t as necessary as they once were. Buttons also take up space, so being able to swipe up and keep that extra half inch or so of screen is definitely nice.
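On the web you can sketch the same kind of gesture-to-action mapping. This is a hypothetical illustration, not how Apple actually implements the unlock screen: track where a touch starts and ends, and if the finger moved far enough upward, treat it as an unlock.

```typescript
// Sketch: mapping an upward swipe gesture to an "unlock" action.
// A hypothetical illustration, not Apple's actual implementation.

let startY: number | null = null;
const SWIPE_THRESHOLD = 80; // pixels the finger must travel upward

window.addEventListener("touchstart", (event: TouchEvent) => {
  startY = event.touches[0].clientY;
});

window.addEventListener("touchend", (event: TouchEvent) => {
  if (startY === null) return;
  const endY = event.changedTouches[0].clientY;

  // Moving up on the screen means the Y coordinate gets smaller.
  if (startY - endY > SWIPE_THRESHOLD) {
    console.log("Swipe up detected: unlock.");
  }
  startY = null;
});
```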

Feedback:
This one is probably the easiest principle to understand. Feedback is the response you get from an action: a signal that gives the user information about whether something worked or not.
When a fire alarm goes off in a public place, you usually see flashing lights and hear a deafening noise. That’s feedback from the smoke detector saying that something is on fire and you should probably evacuate the building. This is a prime example of feedback in physical form.

A digital example of feedback is the error message that displays when you lose your Wi-Fi connection or your mobile data can’t connect. Sometimes the site or app will notify you before your phone or computer has even registered that it’s offline, but either way, that notification pops up.
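Browsers actually expose this kind of feedback directly. Here is a minimal sketch, assuming a placeholder “status” element for the banner, that shows a message the moment the connection drops and again when it comes back:

```typescript
// Sketch: giving the user immediate feedback when the connection drops.
// The "status" element is a hypothetical placeholder for a banner or toast.

function showConnectionStatus(message: string): void {
  const banner = document.getElementById("status");
  if (banner) {
    banner.textContent = message;
  } else {
    console.log(message);
  }
}

// Feedback when the connection is lost...
window.addEventListener("offline", () => {
  showConnectionStatus("You're offline. Check your Wi-Fi or mobile data.");
});

// ...and feedback again when it comes back.
window.addEventListener("online", () => {
  showConnectionStatus("You're back online.");
});

// Initial check when the page loads.
if (!navigator.onLine) {
  showConnectionStatus("You're offline. Check your Wi-Fi or mobile data.");
}
```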
