Lev Vygotsky’s cultural-historical theory of human development, which places great emphasis on the role of culture in first defining and then transmitting the sign and symbol systems used in that culture, is a good example of a theory rooted in a contextualist worldview. Sign and symbol systems are the ways in which cultures note and code information. They are reflected in the nature of the language, in ways of quantifying information, in the expression of the arts, and more generally in the ways in which people establish, maintain, and transmit social institutions and relationships across generations.
THEORIES FROM THE MECHANISTIC WORLDVIEW
eventually comes to trigger a response similar to the UR; this response is called the CR. The CR and the UR are essentially the same response; the difference is that they follow different stimuli. When the response follows the unconditioned stimulus it is called the unconditioned response; the same response, when it follows the conditioned stimulus, is called the conditioned response.
The Process of Classical Conditioning: Classical conditioning has been demonstrated in numerous species using a variety of methodologies.
Classical conditioning starts with the teacher/therapist/investigator identifying the response to be achieved. The next step is to find out which stimulus elicits this response naturally. After this, the therapist identifies the stimulus that she/he wants the natural response to follow.
The US and the CS are paired for several trials, after which the response that initially followed the US begins to follow the CS as well. A connection, or association, is formed between the hitherto unconnected CS and CR.
Thus “unconditioned” means unlearned, untaught, pre-existing (already present before we got there), and “conditioning” means to associate, connect, bond, or link something new with the old relationship.
Initially (Pavlov’s experiment):
Unconditioned Stimulus (Food) → Unconditioned Response (Salivation)
Conditioned Stimulus (Bell) → No Response
Then:
Unconditioned Stimulus (Food) + Conditioned Stimulus (Bell) → Unconditioned Response (Salivation)
After several trials:
Conditioned Stimulus (Bell alone) → Conditioned Response (Salivation)
A stimulus that did not originally evoke a particular response, when presented in close temporal proximity with another stimulus that does elicit the response naturally, comes to elicit a similar response.
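The pairing process above can be pictured as a gradual strengthening of the CS–CR association over trials. The following is a minimal sketch, assuming a simple error-correcting update of associative strength (an illustrative assumption, not part of Pavlov’s original account); the function name and parameter values are hypothetical.

```python
# Illustrative sketch only: a toy simulation of acquisition in classical
# conditioning. The learning rule (a simple error-correcting update of
# associative strength) is an assumption chosen for illustration.

def simulate_acquisition(n_trials=10, learning_rate=0.3, us_strength=1.0):
    """Pair the CS (bell) with the US (food) and track how strongly
    the CS alone comes to elicit the response (salivation)."""
    cs_strength = 0.0          # associative strength of the bell before training
    history = []
    for trial in range(1, n_trials + 1):
        # On each paired trial, the CS gains strength in proportion to the
        # difference between the US and what the CS already predicts.
        cs_strength += learning_rate * (us_strength - cs_strength)
        history.append((trial, round(cs_strength, 3)))
    return history

for trial, strength in simulate_acquisition():
    print(f"Trial {trial}: CS -> CR strength = {strength}")
```

Run over ten paired trials, the CS–CR strength climbs from 0 toward the full US-elicited level, mirroring the “after several trials” step in the schematic above.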
Types of Classical Conditioning
1. Forward conditioning
The onset of the CS precedes the onset of the US. Three common forms of forward conditioning are: short-delay, long-delay, and trace.
2. Simultaneous conditioning
The CS and US are presented at the same time.
3. Backward conditioning
The onset of the US precedes the onset of the CS. In this method the CS actually serves as a signal that the US has ended rather than being a reliable predictor of an impending US (such as in forward conditioning).
4. Temporal conditioning
The US is presented at regular time intervals, and CR acquisition is dependent upon correct timing of the interval between US presentations.
5. Unpaired conditioning
The CS and US are not presented together. Rather, they are presented as independent trials separated by a variable, or pseudo-random, interval. This procedure is used to study non-associative behavioral responses, such as sensitization, which is a progressive amplification of a response following repeated administrations of a stimulus.
6. Response extinction
The CS is presented in the absence of the US and, eventually, the CR frequency is reduced to pretraining levels (see the sketch after this list).
7. Stimulus discrimination/reversal conditioning
In this procedure, two CSs (CS+ and CS–) are identified, which can be similar (different intensities of light) or different (auditory and visual). The US is paired only with the CS+ and not with the CS–. After a number of trials, the organism learns to discriminate CS+ trials from CS– trials such that CRs are observed only on CS+ trials.
During reversal training, the CS+ and CS– are reversed and subjects learn to suppress responding to the previous CS+ and show CRs to the previous CS–.
8. Stimulus generalization
This is the tendency for stimuli similar to the conditioned stimulus to evoke the conditioned response once the response has been conditioned.
9. Secondary conditioning
The CS takes on the role of the US and is paired with another stimulus, and the process of conditioning continues further. Attitudes, values, beliefs, and thinking patterns are quite often learned in this manner. Thus, to help bring about change in therapy, the same principle can be applied to unlearn what has been learned. Many behavior modification techniques, for example aversion therapy, flooding, systematic desensitization, and implosion therapy, owe their origin to classical conditioning theory.
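Following on from the acquisition sketch above, response extinction (item 6 in the list) can be pictured as the same toy associative strength decaying once the US is withheld. Again, this update rule is only an illustrative assumption; the function name and parameter values are hypothetical.

```python
# Illustrative sketch only: extinction modeled as decay of the toy associative
# strength once the CS is presented without the US (US strength = 0).

def simulate_extinction(initial_strength=0.97, n_trials=10, learning_rate=0.3):
    """Present the CS alone and track how the CR strength declines
    back toward its pretraining level."""
    cs_strength = initial_strength
    history = []
    for trial in range(1, n_trials + 1):
        # With no US, the prediction error is (0 - cs_strength), so the
        # association weakens on every unreinforced trial.
        cs_strength += learning_rate * (0.0 - cs_strength)
        history.append((trial, round(cs_strength, 3)))
    return history

for trial, strength in simulate_extinction():
    print(f"Extinction trial {trial}: CS -> CR strength = {strength}")
```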
Operant (Instrumental) Conditioning Model
This model examines the relationship between a behavior and its consequence. It was developed by Edward Thorndike, John Watson, and B.F. Skinner. This theory proposes that learning is the result of the consequences of behavior. Learners begin to connect certain responses with certain stimuli, causing the probability of the response to change (i.e., learning occurs). As a model of human development, it demonstrates how changes in the consequences of one’s behavior can in turn modify that behavior.
Responses are more likely to increase in frequency if followed by a positive consequence and less likely if followed by a negative one.
Thorndike labeled this type of learning instrumental; Skinner renamed it operant, i.e., in this learning one is “operating” on, and is influenced by, the environment. A stimulus follows a voluntary response and changes the probability that the response will occur again. The two types of consequences, positive (sometimes called pleasant) and negative (sometimes called aversive), can be added to or taken away from the environment in order to change the probability of a given response occurring again (Huitt & Hummel, 1997).
Classical conditioning illustrates S→R learning, whereas operant conditioning is often viewed as R→S learning. It is the consequence that follows the response that influences whether the response is likely or unlikely to occur again. Voluntary responses are learned through operant conditioning (Huitt & Hummel, 1997).
General principles: There are four major techniques or methods used in operant conditioning. They are the result of combining the two major purposes of operant conditioning (increasing or decreasing the probability that a specific behavior will occur in the future), the types of stimuli used (positive/pleasant or negative/aversive), and the action taken (adding or removing the stimulus).
There are five basic processes in operant conditioning: reinforcements (positive: pleasant, and negative: unpleasant or aversive) strengthen behavior; punishment, response cost, and extinction weaken behavior.
Positive reinforcement: A positive reinforcer is added after a response and increases the frequency of the response. For example, rewards or appreciation increase the probability that a particular behavior will occur again.
Negative reinforcement: After the response, the negative reinforcer is removed, which increases the frequency of the response. For example, scolding stops once the child apologizes. (Note: There are two types of negative reinforcement: escape and avoidance. In general, the learner must first learn to escape before he or she learns to avoid.)
Response cost or omission: Omission or response cost weakens behavior by subtracting a positive stimulus. After the response, the positive reinforcer is removed, which decreases the frequency of the response. Examples include the withdrawal of privileges such as watching TV, books, or lunch-hour play.
Punishment: It weakens a behavior by adding a negative stimulus. After a response, a negative or aversive stimulus is added, which decreases the frequency of the response.
Extinction: No longer reinforcing a previously reinforced response (using either positive or negative reinforcement) results in the weakening of the frequency of the response.
Outcome of Conditioning

|                   | Increase Behavior                                   | Decrease Behavior                        |
|-------------------|-----------------------------------------------------|------------------------------------------|
| Positive Stimulus | Positive Reinforcement [add (positive) stimulus]    | Response Cost [remove (positive) stimulus] |
| Negative Stimulus | Negative Reinforcement [remove (negative) stimulus] | Punishment [add (negative) stimulus]     |
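The 2×2 structure of the table above can be captured compactly in code. The sketch below is only a didactic lookup (the function name is hypothetical), mapping the type of stimulus and the action taken onto the technique and its effect on behavior.

```python
# Illustrative sketch only: the four operant techniques as a lookup keyed by
# stimulus type ('positive' or 'negative') and action ('add' or 'remove').

def operant_technique(stimulus, action):
    """Return (technique, effect on behavior) for a given stimulus/action pair."""
    table = {
        ("positive", "add"):    ("positive reinforcement", "increases behavior"),
        ("positive", "remove"): ("response cost",          "decreases behavior"),
        ("negative", "remove"): ("negative reinforcement", "increases behavior"),
        ("negative", "add"):    ("punishment",             "decreases behavior"),
    }
    return table[(stimulus, action)]

print(operant_technique("negative", "remove"))
# ('negative reinforcement', 'increases behavior')
```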
Schedules of Reinforcement
Skinner found that the timing of the contingent reinforcement is an equally significant variable. Continuous reinforcement is generally seen as being more effective in establishing a response; variable or intermittent reinforcement is seen as being more effective at maintaining a response at a high level once it has been established.
Continuous reinforcement simply means that the behavior is followed by a consequence each time it occurs.
Intermittent schedules are based either on the passage of time (interval schedules) or the number of correct responses emitted (ratio schedules). This results in four classes of intermittent schedules:
1. Fixed interval: The first correct response after a set amount of time has passed is reinforced (i.e., a consequence is delivered). The time period required is always the same.
2. Variable interval: The first correct response after a set amount of time has passed is reinforced. After the reinforcement, a new time period (shorter or longer) is set, with the average equaling a specific value over the sum total of trials.
3. Fixed ratio: A reinforcer is given after a specified number of correct responses. This schedule is best for learning a new behavior.
4. Variable ratio: A reinforcer is given after a varying number of correct responses; after each reinforcement, the number of correct responses required for the next reinforcement changes. This schedule is best for maintaining behavior.
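As a small illustration of how the two ratio schedules differ in practice, the sketch below uses toy generators to decide when a reinforcer is delivered; the function names and parameter values are arbitrary assumptions, not from the text.

```python
# Illustrative sketch only: toy decision rules for the two ratio schedules.
import random

def fixed_ratio(n):
    """Reinforce every n-th correct response."""
    count = 0
    while True:
        count += 1
        yield count % n == 0

def variable_ratio(mean_n):
    """Reinforce after a varying number of responses averaging mean_n."""
    target = random.randint(1, 2 * mean_n - 1)
    count = 0
    while True:
        count += 1
        reinforced = count >= target
        if reinforced:
            count, target = 0, random.randint(1, 2 * mean_n - 1)
        yield reinforced

# Example: the first 10 correct responses on a fixed-ratio-3 schedule.
fr3 = fixed_ratio(3)
print([next(fr3) for _ in range(10)])
# [False, False, True, False, False, True, False, False, True, False]
```

On the fixed-ratio schedule reinforcement arrives predictably every n responses, whereas on the variable-ratio schedule the learner cannot tell which response will be reinforced, which is why the latter maintains responding so persistently.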
Behavior Genetic Model or the Nature-Nurture Debate
A behavior genetic model tries to bring some understanding to the perennial nature-nurture debate, and it offers a different approach altogether. These debates concern the relative importance of an individual’s innate qualities versus personal experiences in determining or causing individual differences in physical and behavioral traits. Behavior genetic researchers attempt to determine, through elaborate statistical procedures, how much of these individual differences can be attributed to genetic factors and how much to environmental factors.
Behavior genetic researchers cannot, of course, do research involving selective breeding with humans, which is the preferred technique when working with animals, so they look for situations that they believe allow for “experiments in nature.”
The two most common research designs behavior genetic researchers employ for humans involve:
1. The comparison of individuals of different degrees of genetic relatedness, i.e., twin and family studies, and
2. The comparison of adopted children to both their biological and adopted parents (adoption studies).
Behavior genetic researchers report that many characteristics show a significant genetic contribution. That is, identical twins appear more similar than fraternal twins or siblings, who are in turn more similar than cousins, who are in turn more similar than unrelated individuals. Further, adopted children share many characteristics with their biological parents, even if they are adopted at birth.
Two different types of environmental effects are distinguished in these investigations: shared family factors (i.e., those shared by siblings, making them more similar) and non-shared factors (i.e., those that uniquely affect individuals, making siblings different). In order to express the portion of the variance that is due to the “nature” component, behavioral geneticists generally refer to the heritability of a trait: the extent to which variation among individuals in that trait is due to variation in the genes those individuals carry.
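As a rough illustration of how twin studies yield such estimates (a standard twin-study approximation, not given explicitly in the text), heritability can be estimated from the trait correlations of identical (MZ) and fraternal (DZ) twins:

\[
h^{2} \approx 2\,(r_{MZ} - r_{DZ}), \qquad
c^{2} \approx 2\,r_{DZ} - r_{MZ}, \qquad
e^{2} \approx 1 - r_{MZ}
\]

where \(h^{2}\) is the heritability, \(c^{2}\) the shared-environment component, and \(e^{2}\) the non-shared component. For example, if the MZ correlation is 0.8 and the DZ correlation is 0.5, this gives a heritability of about 0.6, a shared-environment share of about 0.2, and a non-shared share of about 0.2.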
Another component of the nature-nurture debate is the gene-environment interaction. Environmental inputs affect the expression of genes; that is, the environment influences the extent to which a genetic disposition will actually manifest. Individuals with certain genotypes are also more likely to find themselves in certain environments. Thus, it appears that genes can shape (the selection or creation of) environments.
Thus, there are predominantly environmental traits (specific language, religion), predominantly genetic traits (blood type, eye color), and interactional traits (height, weight, skin color).