What are unlearned automatic responses by an organism to a stimulus in the environment?


I. CLASSICAL CONDITIONING
1. Pavlov discovered the principle of classical conditioning by accident.

2. Pavlov wanted to understand how a dog's stomach prepares to digest food when something is placed in the dog's mouth.

3. He noticed that the mere sight or smell of food was enough to get the dog salivating, and he investigated how this worked.

4. Pavlov chose the tuning fork as a neutral stimulus - it initially had nothing to do with the response (salivation).

5. By pairing the neutral stimulus with the food, he found that the tone alone could elicit salivation.

6. Classical conditioning is generally thought of as involving responses the individual has little control over, such as autonomic nervous system responses - salivation, sucking in infants, heart rate, blood pressure.

III. Concepts Describing Classical Conditioning

1. Unconditioned Stimulus (UCS) - the environmental factor (stimulus) that naturally brings about a particular behavior (response).

2. Unconditioned Response (UCR) - the unlearned, automatically occurring reaction brought about by an unconditioned stimulus.

3. Conditioned Stimulus (CS) - a neutral stimulus that becomes capable of eliciting a particular response through being paired during training with a UCS.

4. Conditioned Response (CR) - a response aroused by some stimulus other than the one that automatically produces it.

IV. Other Principles

1. Extinction - Pavlov discovered that if he stopped presenting food after sounding the tuning fork, the sound gradually lost its effect on the dog - it no longer elicited salivation. The association between the two lessened. This is called extinction.

2. Spontaneous Recovery - He also found that if he gave the dog a rest from the apparatus after extinction and later struck the tuning fork again, the dog salivated again. This is called spontaneous recovery. If the pairing with the UCS is not renewed, however, the second extinction is quicker, and successive spontaneous recoveries are not as strong.

3. [Figure: strength of the conditioned response across trials - rising during acquisition, falling during extinction, partially returning after a rest (spontaneous recovery), and falling again during a second, quicker extinction.]

4. Stimulus Generalization - Not only the conditioned stimulus under which a response was first learned can elicit the response (CR), but also a range of similar stimuli. Example: for Pavlov, not only the tuning fork but perhaps also striking the edge of a glass.

5. Stimulus Discrimination - The ability to respond differently to two or more stimuli that are similar yet distinct.

6. Higher-Order Conditioning - After a secondary conditioned stimulus is paired with the primary conditioned stimulus a number of times, it may take on the functions of the primary stimulus. Example: in Pavlov's experiment, once the tone elicits salivation, pair the tone with a light - after a while the light alone will elicit salivation.

V. Examples

1. Fear - Give example.

Watson & Rayner (1920) - a demonstration of classical conditioning of emotional responses in humans. At the beginning of the experiment with a 9-month-old child, Little Albert, a white rat did not elicit fear, but a loud noise elicited crying and distress. They paired the loud noise with the rat, and after a while the rat elicited fear in Little Albert.

What is UCS? (loud noise)
What is UCR? (fear response to loud noise)
What is CS? (rat)
What is CR? (fear of rat)
What would it be if Albert also became afraid of cats? (stimulus generalization)
What if Albert was not afraid of gerbils? (stimulus discrimination)
This is one way phobias can develop.

Most Common Phobias

acrophobia - fear of high places
agoraphobia - fear of open places
claustrophobia - fear of closed places
gynephobia - fear of women
hydrophobia - fear of water
mysophobia - fear of dirt
ophidiophobia - fear of nonpoisonous snakes

Other Fairly Common Phobias

ailurophobia - fear of cats
algophobia - fear of pain
astrapophobia - fear of lightning
hematophobia - fear of blood
monophobia - fear of being alone
nyctophobia - fear of darkness
ochlophobia - fear of crowds
pathophobia - fear of disease
pyrophobia - fear of fire
syphilophobia - fear of syphilis
xenophobia - fear of strangers
zoophobia - fear of animals or a specific animal

More Exotic Phobias

emetophobia - fear of vomiting
enosiophobia - fear of committing an unpardonable sin
entomophobia - fear of insects
eosophobia - fear of dawn
ergasiophobia - fear of work
erythrophobia - fear of blushing in public
gamophobia - fear of marriage
gephyrophobia - fear of crossing a bridge
gymnophobia - fear of naked bodies
haphephobia - fear of being touched
heliophobia - fear of sunlight
homilophobia - fear of sermons
melissophobia - fear of bees
parthenophobia - fear of virgins
pnigophobia - fear of choking
taphophobia - fear of being buried alive

2. Taste Aversion - Let's say you go to a restaurant and try escargot for the first time. You then get sick - will you eat snails again? (probably not). What if you had eaten steak, baked potato, and snails? (the aversion would probably be just to the snails). Why? (the snails are the novel stimulus).

OPERANT CONDITIONING AND REINFORCEMENT

I. Background

1. Definition - A type of learning in which the consequences of a behavior influence whether the organism will act in the same way in the future - the animal learns the relationship between its own behavior and a reinforcing or punishing stimulus.

2. Reinforcement - An environmental stimulus that is contingent on a response and increases the probability of that response.

3. Punishment - An environmental stimulus that is contingent on a response and decreases the probability of that response.

II. The ABCs of Behavior (Functional Analysis)

1. A = Antecedent - Stimuli occurring before the behavior, such as instructions or gestures.

2. B = Behavior - The act itself; the individual's response.

3. C = Consequence - The event that follows the behavior.

4. Example - A = A teacher asks a student to answer a question. B = The student answers the question. C = The teacher tells the student that he did well (reinforcement).

III. Four Ways to Administer Contingent Consequences

1. Positive stimulus presented (positive reinforcement) - Presentation of the stimulus increases the probability of a response. Example? (give candy for sitting in seat)

2. Positive stimulus removed (response cost) - Should produce a suppression of the behavior. Example? (take away TV for talking back to mother)

3. Negative stimulus presented (punishment) - Decreases the probability of the behavior. Example? (spanking)

4. Negative stimulus removed (negative reinforcement) - Results in an increase in the probability of a behavior. Example? (a child wants a cookie and cries until his mother gives him one; giving the cookie removes the negative stimulus - the crying - so the mother is more likely to give a cookie again to escape the crying. The child, however, was positively reinforced for crying.)
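As a worked illustration of the 2x2 layout above (a minimal sketch in Python; the table and function name are mine, not from the lecture), whether the stimulus is pleasant or aversive, combined with whether it is presented or removed, determines both the name of the procedure and the expected effect on behavior:

```python
# Illustrative 2x2 table of contingent consequences (section III).
# "positive"/"negative" follow the notes: pleasant vs. aversive stimulus.

CONSEQUENCES = {
    # (stimulus, operation): (procedure, expected effect on behavior)
    ("positive", "presented"): ("positive reinforcement", "behavior increases"),
    ("positive", "removed"):   ("response cost",          "behavior decreases"),
    ("negative", "presented"): ("punishment",             "behavior decreases"),
    ("negative", "removed"):   ("negative reinforcement", "behavior increases"),
}

def classify(stimulus: str, operation: str) -> str:
    """Name the procedure and its expected effect for a contingent consequence."""
    procedure, effect = CONSEQUENCES[(stimulus, operation)]
    return f"{procedure}: {effect}"

if __name__ == "__main__":
    print(classify("positive", "presented"))  # candy for sitting in seat
    print(classify("positive", "removed"))    # TV taken away for talking back
    print(classify("negative", "presented"))  # spanking
    print(classify("negative", "removed"))    # cookie given to stop the crying
```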

IV. Reinforcement

1. Primary Reinforcement - Unconditioned. A stimulus that does not require an organism to learn its reinforcing qualities. Examples? (food, water, sex)

2. Secondary Reinforcement - Conditioned. A neutral stimulus that, through constant association with primary reinforcers, acquires its own reinforcing qualities. Examples? (money, grades)

3. Extinction - When we withhold a reinforcer, the behavior it was reinforcing should extinguish. We hope this is not always the case - when a behavior has been added through reinforcement, the hope is that other things will have taken over the reinforcing role or that the behavior has become intrinsically reinforcing.

4. Bribery vs. Reinforcement - Some say reinforcing someone for acting the way they should is bribery, but offering incentives is not necessarily bribery. Is a salary for working bribery?

V. Factors Influencing Effectiveness of Reinforcement

1. Potency of the Reinforcer - It should be enough to be reinforcing, but not too much or the subject will satiate quickly and the item will lose its reinforcing qualities. Example? (with an edible reinforcer such as M&Ms, one at a time is enough; given a large bag, the subject may get sick of them)

2. Immediacy of Reinforcement - The greater the delay in administering the reinforcer, the less the effect.

3. Verbalization - Verbalizing the connection between the behavior and the reinforcement helps strengthen the connection.

4. Shaping - If a behavior is not in the repertoire, there is never an opportunity to reinforce it. To get it into the repertoire, you reinforce successive approximations of the behavior. Example: getting a nonverbal child to ask for things instead of grabbing - first reinforce grunting and pointing, then attempting to say "cookie," then saying "cookie," then "cookie please," then "may I have a cookie?"

VI. Schedules of Reinforcement

1. Continuous vs. Partial Schedules

Continuous - A reward is given every single time the response is emitted. Generally used only to establish a behavior. Easy to extinguish.

Partial - The subject is only occasionally rewarded for the proper response. More resistant to extinction.

2. Partial Schedules: Either Ratio or Interval

Ratio - Based on the number of correct responses the organism makes between reinforcements.

Interval - Based on the amount of time that has elapsed between reinforcements.

3. Fixed or Variable Schedules

Fixed - a regular schedule

Variable - an irregular schedule

4. Four Possible Partial Schedules

A) Fixed-Ratio - Reinforcement depends on a certain amount of behavior being emitted; for example, every fifth response is reinforced. Example? (a typist who gets paid after a certain number of pages are typed). Pattern: people generally work hard on FR schedules, pausing briefly after each reward. Problem: if the number of responses required before the next reward is large, you will likely see low morale and few responses at the beginning of each cycle.

B) Variable-Ratio - The number of required responses varies around some average. Example? (a salesman who sells something to the second customer, then to the sixth, the eighth, etc.; also a gambler at a slot machine - the faster he puts money in the machine, the sooner he will hit the jackpot). Pattern: people tend to work at a high, steady rate, with no pausing, because reinforcement may come on the very next response.

C) Fixed-Interval - Reinforcement is given for the first response after a predetermined time, no matter how many responses have been emitted. Example? (not many naturally occurring examples, but studying just before a test may be one). Pattern: scallop effect - responding is slow at the beginning of the interval and increases as the interval progresses, in anticipation of the reward.

D) Variable-Interval - The time at which a reinforcer will be available varies around some average. Example? (dialing a number when the line is busy - you will get through, but you don't know when). Pattern: steady, moderately paced responses.

E) Which is Most Resistant to Extinction? - Variable interval and variable ratio are the most resistant to extinction, with variable ratio being the best. Why? (you never know when the next reinforcement is going to come). If you gradually increase the ratio or interval, you can make the behavior even more resistant to extinction.
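As a rough illustration of how these four schedules differ (a hypothetical simulation sketch in Python, not part of the lecture; all names and numbers are illustrative), the code below feeds a steady stream of responses through each schedule and counts the reinforcers earned. The toy numbers make the key contrast visible: ratio schedules tie reinforcement to the amount of behavior, while interval schedules tie it to elapsed time.

```python
import random

def simulate(schedule: str, n_responses: int = 1000, ratio: int = 5,
             interval: float = 10.0, responses_per_sec: float = 1.0) -> int:
    """Count reinforcers earned under one partial schedule of reinforcement.

    Assumes the subject responds at a constant rate - a simplification, since
    real response rates themselves change with the schedule (see the patterns above).
    """
    random.seed(0)
    reinforcers = 0
    responses_since_reward = 0      # used by the ratio schedules
    time_of_last_reward = 0.0       # used by the interval schedules (seconds)
    next_interval = interval

    for i in range(n_responses):
        t = i / responses_per_sec   # time at which this response occurs
        responses_since_reward += 1

        if schedule == "FR" and responses_since_reward >= ratio:
            reinforcers += 1                      # every 5th response pays off
            responses_since_reward = 0
        elif schedule == "VR" and random.random() < 1.0 / ratio:
            reinforcers += 1                      # pays off on average every 5th response
            responses_since_reward = 0
        elif schedule == "FI" and t - time_of_last_reward >= interval:
            reinforcers += 1                      # first response after a fixed 10 s
            time_of_last_reward = t
        elif schedule == "VI" and t - time_of_last_reward >= next_interval:
            reinforcers += 1                      # first response after a variable wait
            time_of_last_reward = t
            next_interval = random.uniform(0.5 * interval, 1.5 * interval)

    return reinforcers

if __name__ == "__main__":
    for schedule in ("FR", "VR", "FI", "VI"):
        print(schedule, simulate(schedule))
```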

VICARIOUS LEARNING
(Modeling)

I. Acquisition of Behavior

A. Can we explain the acquisition of all things through classical and operant conditioning? NO!!!

1. Classical - Some fears cannot be explained within a classical conditioning framework.

2. Operant - To a strict behaviorist, reinforcement is necessary for the acquisition of a behavior, yet people perform apparently novel behaviors in the absence of reinforcement.

3. Bandura - Because of these factors, Bandura offers a social learning framework.

B. Bandura, Ross, & Ross (1963)

1. Young children saw older children attack a large, inflated "Bobo" doll by sitting on it, punching it, kicking it, and hitting it with a wooden mallet while saying such things as "hit him!"
2. Some children saw the model receive punishment for his aggression, some saw him rewarded, and some saw no consequences at all.
3. Then the children were allowed to play for a while with a number of toys including a Bobo doll. Behavior was recorded.
4. Only in the condition where the model's behavior was punished did the subjects not imitate the aggressive behavior; however, they could still recall the behavior later.
5. This shows that reinforcement is not necessary for the acquisition of a novel response.

II. Acquisition/Performance Distinction

1. Acquiring v. Performing - A behavior can be acquired but not performed; people learn a behavior by observing someone else engage in it.

2. Reinforcement - Reinforcement is critical not for acquisition but for performance. When you expect reinforcement, you will engage in the behavior.

3. Example - Murder.

4. Timing - Acquisition and performance can occur almost simultaneously, but often there is a time delay between them because (1) the person is unable to engage in the modeled behavior; (2) the situation is different, so the outcome would be different, and the person waits for a more favorable situation; or (3) socially undesirable behaviors may carry severe inhibitions that prevent performance of the behavior.

What is the unlearned automatic response by an organism to a stimulus in the environment?

Reflex: An automatic, unlearned response to a stimulus.

What is an automatic unlearned response called?

Unconditioned responses are automatic and unlearned. They can be seen from the time we are born. Up until Ivan Pavlov's experiments that led to the discovery of classical conditioning, however, these innate responses were not yet defined.

What is an unlearned reaction to a given stimulus?

In classical conditioning, an unconditioned response is an unlearned response that occurs naturally in reaction to the unconditioned stimulus. For example, if the smell of food is the unconditioned stimulus, the feeling of hunger in response to the smell of food is the unconditioned response.

What is a type of stimulus that does not produce an automatic response?

A neutral stimulus is a stimulus that does not produce an automatic response.