Are You Smart Enough to Feel?

There is growing evidence that emotion and cognition are deeply intertwined (Meyer & Turner, 2006). Emotion may play a key role in activating learning centers in the brain, influencing attention and the sophistication of thinking (Hu et al., 2007; Petty & Briñol, 2015). These studies suggest that emotion can fundamentally shape what we pay attention to, as well as our awareness and interpretation of events, and therefore what is remembered and learned (Immordino-Yang, 2015; Norman, 2004; Ortony, Norman, & Revelle, 2004). It would not be wrong, then, to posit that we need emotions to be smart, and vice versa. This assertion resonates with the arguments in my previous blog post, “Emotions, Cognition, and Social Discourse in Redefined Community of Inquiry: A Vision Created through the Lens of Nietzsche’s Genealogy,” which also addresses how emotions filter our attentional focus and ultimately change what we perceive from our interactions with our surroundings. According to the Vygotskian perspective, knowledge begins in the external world and is later internalized by the individual, and because of the deterministic nature of our perceptual structure, we can see only one set of those external structures (Schwartz & Heiser, 2006). Given our limited attentional capacity and our need to construct deterministic perceptuomotor patterns rather than vague ones, our cognition needs an instrument that directs that limited attention (Wilson, 2002). Can you guess what? YES! EMOTIONS! What is more, emotions are instrumental not only in perceptuomotor activities situated in real-life contexts, but also in off-line cognitive activities such as imagining, reflecting, and so forth (Wilson, 2002). So, we need our emotions to direct our cognitive processes.

Are you bored yet? Do not worry, I am done with the concept- and theory-throwing section of the post. Now, let’s discuss what is fun about these arguments! Exciting and concerning developments are happening in AI technologies. With their existing cognitive capacity, they can beat us at games, drive cars without our guidance, and even perform surgery (Vincent, 2017). How exciting and humiliating! AI technologies seem to perform much better than humans on tasks that require monotonic cognitive processing, and they seem to integrate strategies for embodied learning quite well. For example, self-driving cars can recognize signs and act upon them. However, does cognition include only such monotonic activities? Of course not! We imagine, we create, we feel… If we need to feel to be considered intelligent beings, as discussed in the first paragraph, then are AI technologies really smart, or are they just faster, more advanced processing systems that need human input all the time in order to function? Well, the current trend seems to focus on this question: can AI be imaginative, creative, curious, and emotional, just like us (Boden, 2015; Giles, 2018; Hall, 2017; Knight, 2017)? I would like to touch on this question in this post, as it has a lot to offer to online education.

Creative, Generative, Imaginative and Curious AI

Current technologies and models give AI combinational creativity: it can combine existing pieces in unfamiliar ways. However, despite this certain level of creativity, as Boden (2015) discusses, AI is not ready to supplant human artists, because it cannot create genuinely new combinations by filling in missing or vague pieces through imaginative and generative cognitive processes. The need for human input in this process is another limitation of AI creativity. As an attempt to solve this problem, Ian Goodfellow created a model called the GAN, or “generative adversarial network,” which enables AI to engage in unsupervised deep learning (Giles, 2018). In other words, using this model and only a limited amount of input, AI can “imagine” the missing or vague pieces and generate creative artifacts. This is similar to the off-line cognitive activities we use to eliminate vagueness in our conceptual patterns: AI becomes able to build deterministic conceptual patterns from small chunks of input by imagining and generating. So, AI systems are edging toward something like a humanlike consciousness, and they are becoming less and less dependent on human input.

Another model, aiming to make AI smart enough to generate and create solutions even for complex, ill-structured problems, was proposed by researchers at the University of California, Berkeley (Knight, 2017). They call it the “intrinsic curiosity model,” and it aims to equip AI with the ability to get curious about its environment in order to solve problems. More specifically, once introduced to an environment, the AI explores its surroundings and experiments with nearby objects instead of acting directly on the demanded task (Knight, 2017).
Through exploration of the environment and examination of the surrounding objects, AI may be able to build perceptuomotor patterns stored in episodic memory, and later use those patterns to learn and create more. Sound familiar? That is exactly what humans do according to embodied cognition theory, as briefly touched on in the first paragraph (Wilson, 2002).
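For the curious reader, the adversarial idea behind GANs can be sketched in a few lines of plain NumPy. This is a deliberately tiny illustration, not a real GAN: the “generator” is a one-parameter-per-weight linear map, the “discriminator” is logistic regression, the target data are a simple Gaussian, and the gradients are derived by hand. Every name and number here is an illustrative assumption of mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data the generator never sees directly: samples from N(4, 1).
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Generator: fake = w * z + b, with noise z ~ N(0, 1).
w, b = 1.0, 0.0
# Discriminator: D(x) = sigmoid(a * x + c), the probability that x is real.
a, c = 0.1, 0.0

lr = 0.01
for _ in range(5000):
    z = rng.normal(0.0, 1.0, 64)
    fake = w * z + b
    real = real_batch(64)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0
    # (hand-derived gradients of the binary cross-entropy loss).
    d_real, d_fake = sigmoid(a * real + c), sigmoid(a * fake + c)
    grad_a = np.mean((d_real - 1.0) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(d_real - 1.0) + np.mean(d_fake)
    a -= lr * grad_a
    c -= lr * grad_c

    # Generator update: push D(fake) toward 1, i.e. try to fool the critic.
    d_fake = sigmoid(a * fake + c)
    grad_w = np.mean((d_fake - 1.0) * a * z)
    grad_b = np.mean((d_fake - 1.0) * a)
    w -= lr * grad_w
    b -= lr * grad_b

# After training, the generator's samples should drift toward the real data.
gen_mean = float(np.mean(w * rng.normal(0.0, 1.0, 10000) + b))
print(f"mean of generated samples: {gen_mean:.2f} (real data mean: 4.0)")
```

The point of the sketch is the two-player structure: the discriminator learns to tell real from fake, and the generator learns from the discriminator’s feedback alone, without ever being shown the data directly — which is the sense in which it “imagines” what the data look like.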
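The curiosity idea can likewise be sketched with a toy stand-in. In the Berkeley work the intrinsic reward is the prediction error of a learned forward model; below I substitute a much cruder proxy — visit counts over a ten-state world — since rarely tried (state, action) pairs are exactly the ones a forward model would predict worst. The world, the bonus formula, and all the names are my illustrative assumptions, not the actual model.

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny 1-D world: states 0..9; action 0 moves left, action 1 moves right.
N_STATES = 10
MOVES = (-1, +1)

def step(state, move):
    return min(max(state + move, 0), N_STATES - 1)

# Count-based proxy for forward-model prediction error: rarely tried
# (state, action) pairs earn the largest intrinsic "curiosity" reward.
visits = np.zeros((N_STATES, len(MOVES)))

def curiosity_bonus(state, action):
    return 1.0 / np.sqrt(1.0 + visits[state, action])

state = 0
explored = set()
for _ in range(1000):
    bonuses = np.array([curiosity_bonus(state, a) for a in range(len(MOVES))])
    # Greedily pick the most "curious" action, breaking ties at random.
    action = int(rng.choice(np.flatnonzero(bonuses == bonuses.max())))
    visits[state, action] += 1
    state = step(state, MOVES[action])
    explored.add(state)

print(f"states reached with no external reward at all: {sorted(explored)}")
```

Notice that the agent is never told to reach anything: exploration of the whole environment falls out of chasing its own prediction uncertainty, which is the behavior the post describes — examining the surroundings before acting on any demanded task.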

Emotional AI

AI can imagine, create, generate, and engage in self-learning now! But can it feel? Or maybe we should ask: should it feel? A humanoid robot called Octavia was designed to fight fires on Navy ships, and she has mastered an impressive range of facial expressions as a way of expressing certain emotions (Hall, 2017). She demonstrates emotional responses to certain stimuli. For example, “She looks pleased…when she recognizes one of her teammates. She looks surprised when a teammate gives her a command she wasn’t expecting. She looks confused if someone says something she doesn’t understand” (Hall, 2017). Using sensorimotor tools, she can also engage in embodied learning, processing massive amounts of information about her environment to create an “embodied cognitive architecture” (Hall, 2017), which is quite similar to how humans act and think. Currently, she can produce emotional responses; however, that does not mean she actually feels. Even though the emotional reactions might make her more approachable to humans, those mechanical reactions, programmed to appear in certain situations, limit her cognitive abilities (given the strong interrelation between emotions and cognition discussed above). Wouldn’t it be great if she could actually feel empathy for a person and act upon it? Well, it might depend on why we would need a smart AI technology that can also feel. The current applications are mostly for military purposes. If we are going to ask a machine to risk its life destroying bombs on the battlefield, creating it in a way that it actually feels would be extremely cruel. So, there are ethical considerations to address. Plus, interactions between humans and AI machines might gain a socio-emotional dimension if AI machines are developed with the capacity to feel humanlike emotions. (Well, at least I hope. This would mean I still have a chance to avoid dying as a spinster.)
Can you imagine developing a bond with a robot and then, given your limited lifespan, dying and leaving her or him behind to mourn you? The image I provided in the introduction is actually a subtle nod to this potential ethical situation, since in that movie a man falls in love with an AI. Another ethical concern: if AI with humanlike emotions were used for non-humanistic tasks, would it develop sadistic characteristics that in turn destroy the human species? Many questions remain to be answered.

Implications for Online Education

What do all these arguments suggest for the future of online education? What kinds of implications can we expect to see within that realm? Well, if we could have someone who shows personalized emotional reactions and helps us quickly and efficiently whenever we need support in our online learning experiences, that would solve common problems such as the limited social presence of the instructor, the lack of personalized scaffolding and emotional support, and the scarcity of on-time interventions. Plus, if it learned on its own by observing how humans interact, it would drastically decrease the workload. Such an AI technology could also personalize the overall online learning experience and better address learners’ needs and demands by adjusting the content, learning goals, assessment strategies, course schedule, sequencing, and so forth to the individual learner. What do you all think?

 

References

The image is taken from the movie “Her.” For more information, please visit the film’s IMDb page: http://www.imdb.com/title/tt1798709/

Boden, M. A. (2015, October 20). Artificial Creativity: Why computers aren’t close to being ready to supplant human artists. Retrieved from https://www.technologyreview.com/s/542281/artificial-creativity/

Giles, M. (2018, February 21). The GANfather: The man who’s given machines the gift of imagination. Retrieved from https://www.technologyreview.com/s/610253/the-ganfather-the-man-whos-given-machines-the-gift-of-imagination/

Hall, L. (2017, October 24). How We Feel About Robots That Feel. Retrieved from https://www.technologyreview.com/s/609074/how-we-feel-about-robots-that-feel/

Hu, H., Real, E., Takamiya, K., Kang, M. G., Ledoux, J., Huganir, R. L., & Malinow, R. (2007). Emotion enhances learning via norepinephrine regulation of AMPA-receptor trafficking. Cell, 131(1), 160-173.

Immordino-Yang, M. H. (2015). Emotions, learning, and the brain: Exploring the educational implications of affective neuroscience. New York, NY: WW Norton & Co.

Knight, W. (2017, May 23). Curiosity May Be Vital for Truly Smart AI. Retrieved from https://www.technologyreview.com/s/607886/curiosity-may-be-vital-for-truly-smart-ai/

Meyer, D. K., & Turner, J. C. (2006). Re-conceptualizing emotion and motivation to learn in classroom contexts. Educational Psychology Review, 18(4), 377-390.

Norman, D. A. (2004). Emotional design: Why we love (or hate) everyday things. New York, NY: Basic Books.

Ortony, A., Norman, D. A., & Revelle, W. (2004). The role of affect and proto-affect in effective functioning. In J.-M. Fellous & M. A. Arbib (Eds.), Who needs emotions? The brain meets the machine (pp. 173-202). New York: Oxford University Press.

Petty, R. E., & Briñol, P. (2015). Emotion and persuasion: Cognitive and meta-cognitive processes impact attitudes. Cognition and Emotion, 29(1), 1-26.

Schwartz, D., & Heiser, J. (2006). Spatial representations and imagery in learning. In R. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 283-298). Cambridge: Cambridge University Press.

Vincent, J. (2017, October 18). DeepMind’s Go-playing AI doesn’t need human help to beat us anymore. Retrieved from https://www.theverge.com/2017/10/18/16495548/deepmind-ai-go-alphago-zero-self-taught

Wilson, M. (2002). Six views of embodied cognition. Psychonomic Bulletin & Review, 9(4), 625-636.
