Operant Conditioning

Shaping is a technique within operant conditioning used to develop a new behavior in an individual. Shaping works by reinforcing behaviors that come progressively closer to the desired behavior, a process known as successive approximation (Wede). Shaping was first used by B.F. Skinner, who invented the operant chamber to study animal behavior. Skinner used the chamber to train rats to press down on a lever in order to receive food, water, or another reinforcer. He trained the rats through successive approximations: whenever a rat came close to pressing the lever, it was rewarded, and he continued reinforcing closer and closer attempts until the rats learned to press the lever on their own.

Shaping is common in everyday life whenever someone learns how to do something. For example, I have used shaping to train my brother's cat. Much as one would train a dog, I shaped the cat's behavior by first giving him a treat, repeatedly telling him to give me his paw, and then giving him another treat once he did. The treat served as positive reinforcement, which increased the cat's behavior by presenting a pleasant stimulus. After many cycles of this, he eventually learned that when he wanted a treat, he should give me his paw first. It is important to understand that if the individual is not progressing, simpler steps should be reinforced; a new behavior will not be learned right away, so it takes time.

I think shaping is a really interesting principle because it is essentially how both humans and animals learn. Our parents raised us with techniques involving shaping. How do you think we learned to walk, talk, and become potty trained as infants? I am sure we would have learned eventually by observing others, but the only way for those behaviors to be built into us was for our parents to shape them.


Wede, J. (2019). Introduction to Psychology, lecture 15 notes [PowerPoint slides]. Retrieved from https://psu.instructure.com/courses/2006917/files/104343631?module_item_id=27881395