Friday, February 25, 2011

Biography & Overview of Skinner's Work


Biography

On March 20, 1904, Burrhus Frederic Skinner was born to a young lawyer and a housewife in the quaint rural town of Susquehanna, PA. Even as a child, he had a gift for invention (although some of his creations worked poorly, if at all): a cart that steered backwards, a perpetual motion machine, a system to separate ripe from green elderberries.

The young Skinner read Francis Bacon's works extensively while attending high school – and Bacon's advocacy of the inductive method of scientific research greatly influenced Skinner's later research and experimentation. After graduating from Hamilton College with a BA in English, Skinner hoped to become a writer of fiction, but his literary talents were not as significant as he'd hoped, and he soon abandoned the attempt.

In 1928, he enrolled in Harvard University's Psychology Department, where he remained as a researcher for several years after receiving his PhD in 1931. He was mentored by William Crozier, whose view of studying animal behavior closely matched Skinner's: "the animal as a whole," rather than the internal processes that were the focus of the prevailing philosophy of the time. With Crozier's encouragement to relate behavior to experimental conditions, Skinner's inventiveness came into play: he built a multitude of new devices geared towards his rats' evolving behaviors, as well as the now-famous cumulative recorder, a piece of equipment which records the rate of response as a function of the experiment's contingencies.

The idea of "operant conditioning" was born once Skinner realized that his rats' performance differed significantly from what the theories of John B. Watson and Ivan Pavlov predicted: the frequency of a given rat's response was independent of any prior stimulus and, instead, depended upon what happened after the response occurred. Skinner concluded that behavior, then, was a function of its consequences: it was shaped and maintained by its effects on the environment.

Skinner married at the age of 32; he and his wife, Yvonne (Eve), relocated soon after to Minneapolis for his first teaching position at the University of Minnesota. For several years, his work with behaviorism took a back seat to his family and career, but in 1944, he became intrigued by the idea of training pigeons to serve as the guidance system for missiles. His top-secret project involved training the birds to peck reliably at a target which represented the bomb's objective. Even though his work was never utilized by the military (they were working on another new idea – radar), it remained valuable as a demonstration that his behavioral analyses could be put to practical use.

One of Skinner's most famous – and controversial – inventions came about at the request of his wife, who believed that there could be a better alternative to the traditional baby crib. She was concerned about the safety of barred cribs, which posed dangers to infants who might entangle their limbs in the bars or suffocate in their blankets. So Skinner created an enclosed, heated crib, fronted with plexiglass, and called it the "baby tender." Designed only to be used as a bed for babies, it promised to simplify child care by greatly reducing laundry, cradle cap, diaper rash, and other annoyances, and to help babies feel more secure and comfortable, move around more easily, and ultimately be healthier. Unfortunately, Ladies' Home Journal magazine changed the title of his article about the tender to "Baby in a Box," which gave rise to accusations that the tender was cruel. Skinner fought these negative repercussions for the rest of his days.

He and his family returned to Harvard in 1948, where he remained as a tenured professor for the rest of his career.

In 1953, Skinner paid a visit to his youngest child's math class. Because Skinner's research had previously demonstrated that postponement of reinforcement hindered performance, he was astonished to realize that the teacher of the class, "through no fault of her own, was violating almost everything we knew about the learning process": students were expected to complete an entire page of math problems with no way of knowing if a given problem was correct before moving on to the next one. No feedback was offered – until perhaps the next day, when the papers were graded.

Yet how was the teacher to shape each student's behavior individually (i.e., give immediate reinforcement to a response) in such a large class? Returning home, Skinner created his first "teaching machine" – a device that presented the math questions to the students in random order and offered feedback on whether or not they had worked each problem correctly. At first, his machine taught no new behavior – it only allowed the students to practice skills already in their possession. Over time, however, the methodology of programmed instruction was developed, in which material to be learned is broken down into small increments, much as a tutor would do for an individual student. At the completion of a learning sequence, a student is able to solve problems they couldn't have worked out when the sequence began; learning bit by bit, they uncover answers and receive immediate "rewards."
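
To make the increment-and-feedback idea concrete, here is a minimal, hypothetical sketch in Python – not a reconstruction of Skinner's actual machine or its problem sets, and the sample problems are invented for illustration. The material is split into small "frames," the learner responds to each one, and the program confirms or corrects the answer immediately.

    # Hypothetical sketch of programmed instruction, not Skinner's machine:
    # the material is split into small "frames" and every response gets
    # immediate feedback before the learner moves on.

    frames = [
        ("2 + 3 = ?", "5"),
        ("5 + 4 = ?", "9"),
        ("9 - 6 = ?", "3"),
    ]

    def run_sequence(frames):
        """Present each frame, check the response, and reinforce immediately."""
        for prompt, correct in frames:
            while True:
                answer = input(prompt + " ").strip()
                if answer == correct:
                    print("Correct!")                # immediate reinforcement
                    break
                print("Not quite - try again.")      # immediate corrective feedback

    if __name__ == "__main__":
        run_sequence(frames)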

In later life, Skinner turned his attention to applying his behavioral science to society, specifically to moral and philosophical issues. His work in the late 1960s brought him more public attention and ultimately resulted in numerous television appearances to air his views.

Diagnosed with leukemia at the age of 85, he remained as active a lecturer as was possible until his death in August of 1990.

Theoretical Work

"A learning theory is a series of interlocking conjectures about how experience influences behavior. These conjectures necessarily involve processes or mechanisms that make the model for behavior work. But with inference comes the means to test if it is true. Without the ability to test an inference, a scientific theory can never be."

Skinner's theory is based on the concept that learning – in animals and humans – is a change in behavior. These changes result from a subject's responses to events that occur in their environment. A response produces a consequence, such as arriving at the answer to a math problem or receiving some other positive reinforcer. When a specific stimulus-response pattern is reinforced, the individual is conditioned to respond that way again.

Skinner's cumulative recorder provided a record of the frequency with which an animal performed an act to "work" for a food reward. He found that behaviors weren't dependent on prior stimuli, as Watson and Pavlov asserted, but, instead, were dependent upon what occurred after the response. He termed this phenomenon operant behavior. Skinner defined an operant as "any active behavior that operates upon the environment to generate consequences." To put it more plainly, his theory explains how humans acquire the wide variety of learned behaviors we display on a daily basis.
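
As a rough illustration, the following Python sketch is a toy model – the response probability, learning step, and trial count are arbitrary assumptions, not Skinner's data. An operant is emitted with some probability, reinforcement after the response nudges that probability upward, and the running total of responses is logged the way a cumulative recorder traces them over time.

    import random

    # Toy model of operant conditioning (illustrative assumptions only):
    # each time the emitted response is reinforced, the probability of
    # emitting it again rises a little, and the cumulative count of
    # responses is logged the way a cumulative recorder traces them.

    def simulate(trials=200, p_response=0.1, step=0.05, reinforce=True, seed=0):
        rng = random.Random(seed)
        cumulative = []            # running total of responses, one entry per trial
        total = 0
        for _ in range(trials):
            if rng.random() < p_response:       # the operant is emitted
                total += 1
                if reinforce:                   # the consequence follows the response
                    p_response = min(1.0, p_response + step)
            cumulative.append(total)
        return cumulative

    with_reinforcement = simulate(reinforce=True)
    without_reinforcement = simulate(reinforce=False)
    print("responses with reinforcement:   ", with_reinforcement[-1])
    print("responses without reinforcement:", without_reinforcement[-1])

In this sketch the simulated cumulative record climbs more steeply when responses are reinforced, which is the kind of pattern the real recorder made visible.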

Examples of his theory are all around us: children earn rewards from parents and/or teachers for completing their homework; adults receive promotions or other benefits for meeting project deadlines at work.

In the above instances, the possibility of a reward generates an increase in a specific behavior, but operant conditioning can decrease a behavior as well. Punishing a response, or taking away something the subject values, may diminish or even extinguish objectionable behaviors; e.g., a teacher tells a student that if they talk in class, they will not be allowed to go out to recess. The risk of losing that privilege leads to a decrease in disruptive behavior.
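
The same kind of toy model can run in the other direction – again, every number here is an arbitrary assumption for illustration. If each emission of the response now costs the learner something, standing in for a lost privilege, the response probability is decremented and the behavior tapers off.

    import random

    # Illustrative sketch of behavior decreasing under a response cost:
    # each emitted response is followed by a penalty (a stand-in for
    # losing a privilege), so the probability of responding again drops.

    def simulate_with_cost(trials=200, p_response=0.5, cost=0.05, seed=0):
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            if rng.random() < p_response:
                total += 1
                p_response = max(0.0, p_response - cost)   # the response is penalized
        return total

    print("responses emitted when each one is penalized:", simulate_with_cost())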

Fun Trivia Fact

In the Disney-Pixar movie "Ratatouille," the character of Skinner, head chef at Gusteau's Restaurant, is a nod to B.F. Skinner's renowned experiments with rats.
