A1: In SOP theory, the maximal state to which elements in a memory node can be activated when the corresponding conditional stimulus or unconditional stimulus is presented.
Acquisition: The phase in a learning experiment in which the subject is first learning a behavior or contingency.
After-image: The visual image seen after a stimulus is removed; typically it appears in a color complementary to that of the stimulus.
After-reaction: The reaction after a stimulus is removed; according to opponent-process theory, it is typically the opposite of the initial reaction to the stimulus.
Agoraphobia: An abnormal fear and avoidance of open or public places that often accompanies panic disorder.
Analogous: Two or more traits that are similar in function but not in structure or evolutionary origin.
Animal cognition: A subfield of learning theory that examines the cognitive (mental) processes and abilities of animals, often by using stimulus control techniques. Sometimes involves comparisons across species.
Antecedent: An event that precedes another one. Respondent behaviors are responses to antecedent events.
Artificial selection: When humans intervene in animal or plant reproduction to ensure that desirable traits are represented in successive generations. Individuals with less desirable traits are not allowed to reproduce.
Association: A connection or relation between two things, such as sense impressions, ideas, stimuli, or stimuli and responses.
Atomistic: Consisting or made up of many separate elements. The British empiricists were said to have an atomistic view of the mind because they believed that complex thoughts resulted from the accumulation of many different associations.
Attentional priming: The finding that recent exposures to a stimulus or to cues associated with that stimulus can decrease the time it takes to find the stimulus when it is presented among distractors.
Backward blocking: The finding (primarily in humans) that little or no conditioning occurs to a conditional stimulus if it is combined, during conditioning trials, with another conditional stimulus that is later paired with the unconditional stimulus. Backward blocking differs from ordinary blocking (i.e., “forward blocking”) in that conditioning with the other stimulus occurs after (rather than before) the compound conditioning.
Backward conditioning: A classical conditioning procedure in which the conditioned stimulus is presented after the unconditioned stimulus has occurred. Can lead to no conditioning, conditioned excitation, or conditioned inhibition, depending on the timing of the two stimuli.
Beacon: A cue that is close to a goal that can be detected from a distance and approached.
Behavior chain: A sequence of behaviors linked together by discriminative stimuli; each stimulus reinforces (as a conditioned reinforcer) the behavior that precedes it and sets the occasion for the behavior that follows.
Behavior systems theory: A type of theory that proposes that the behaviors that emerge in classical and instrumental conditioning situations originate in systems of behaviors that have evolved to optimize interactions with the unconditional stimulus (or reinforcer) in the natural environment.
Behavioral economics: An approach that incorporates economic principles in understanding operant behavior.
Behavioral theory of timing: A theory of interval timing that proposes that animals use changes in their own behaviors to measure the passage of time.
Bidirectional response system: An experimental setup where it is possible to measure both excitation and inhibition because response levels can go either above or below a starting baseline.
Bliss point: An organism’s preferred distribution of behavior.
Blocking: In classical conditioning, the finding that little or no conditioning occurs to a new stimulus if it is combined with a previously conditioned stimulus during conditioning trials. Suggests that information or surprise value is important in conditioning.
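The idea that surprise drives conditioning is formalized in the Rescorla-Wagner model (see feature theory), in which learning on each trial is proportional to the difference between the outcome and what all present stimuli already predict. A minimal sketch, with illustrative parameter values (learning rate, asymptote) chosen only for the demonstration:

```python
# Minimal Rescorla-Wagner sketch of blocking.
# alpha (learning rate) and lam (US asymptote) are illustrative values.

def rw_update(V, stimuli, lam, alpha=0.3):
    """One conditioning trial: update associative strengths of present stimuli."""
    surprise = lam - sum(V[s] for s in stimuli)  # outcome minus total prediction
    for s in stimuli:
        V[s] += alpha * surprise

V = {"A": 0.0, "B": 0.0}
for _ in range(50):              # Phase 1: A alone is paired with the US
    rw_update(V, ["A"], lam=1.0)
for _ in range(50):              # Phase 2: compound AB is paired with the US
    rw_update(V, ["A", "B"], lam=1.0)

# A fully predicts the US, so the US is no longer surprising and B gains
# almost no strength: B is "blocked."
print(round(V["A"], 2), round(V["B"], 2))  # → 1.0 0.0
```

Because stimulus A already brings the total prediction to the asymptote before Phase 2, the surprise term is near zero on compound trials, and B acquires essentially nothing.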
b-process: In opponent-process theory, the process underlying an emotional response that is opposite the one controlled by the a-process. The b-process compensates for the a-process and is relatively slow both to start and to decay.
British Empiricists (also British Associationists): British philosophers (including John Locke and David Hume) who proposed that the mind is built up from a person’s experiences.
Category learning: Learning to identify specific items as members, or not, of a larger group or set of items.
Causal learning: Learning about the causes of an event.
Chained schedule: A set of two or more reinforcement schedules, each signaled by its own discriminative stimulus, that must be completed in sequence before the primary reinforcer occurs.
Charles Darwin: (1809-1882) British biologist who proposed the theory of evolution in his 1859 book, On the Origin of Species.
Circadian rhythm: A daily activity cycle, based roughly on 24-hour intervals.
Classical conditioning: The procedure in which an initially neutral stimulus (the conditional stimulus, or CS) is repeatedly paired with an unconditional stimulus (or US). The result is that the conditional stimulus begins to elicit a conditional response (CR). Nowadays, classical conditioning is important as both a behavioral phenomenon and as a method used to study simple associative learning.
Comparator theory: A theory of classical conditioning which proposes that the strength of the response to a conditional stimulus depends on comparing the strength of that stimulus’s association with the unconditional stimulus to the strength of another stimulus’s (e.g., the context’s) association with the unconditional stimulus.
Complements: Two or more commodities or reinforcers that “go together” in the sense that increasing the price of one will decrease the demand for both of them. For example, chips and salsa; bagels and cream cheese.
Compound potentiation: In classical conditioning, the finding that there is more conditioning to a weak conditional stimulus if it is combined with a more salient conditional stimulus during conditioning. Mainly known in flavor aversion learning, where conditioning of a weak odor may be especially strong if it is combined with a salient taste during conditioning. The opposite of overshadowing.
Compound: In classical conditioning, the presentation of two or more conditional stimuli at about the same time. In a “simultaneous” compound, the conditional stimuli are presented at the same time; in a “serial” compound, the stimuli are presented in a sequence. Also called compound CS.
Concurrent measurement studies: Experiments in which Pavlovian responses and instrumental (or operant) responses are measured at the same time in order to investigate their relationship.
Concurrent schedule: A situation in which the organism can choose between two or more different operant behaviors; each behavior pays off according to its own schedule of reinforcement.
Conditional discrimination: A discrimination in which the correct response to a stimulus depends on (is conditional on) which other stimulus is present or was presented recently.
Conditional response (CR): The response that is elicited by the conditional stimulus after classical conditioning has taken place. The response is “conditional” in the sense that it depends on the conditioning experience.
Conditional stimulus (CS): An initially neutral stimulus (like a bell, light, or tone) that begins to elicit a conditional response after it has been paired with an unconditional stimulus.
Conditioned compensatory response: In classical conditioning, a conditional response that opposes, rather than being the same as, the unconditional response. It functions to reduce the strength of the unconditional response, as in drug tolerance.
Conditioned emotional response (CER): A method for studying classical conditioning in which the conditional stimulus is associated with a mild electric shock and the CS comes to suppress an ongoing behavior, such as lever-pressing reinforced by food. Also called conditioned suppression.
Conditioned inhibition: Inhibition that is learned through classical conditioning. The term also refers to a specific inhibitory conditioning procedure in which one conditional stimulus is always paired with an unconditional stimulus, except when the CS is combined with a second conditional stimulus. The second stimulus acquires inhibition. The procedure is also known as the feature-negative discrimination.
Conditioned inhibitor (CS-): A conditional stimulus that evokes inhibition; e.g., one that suppresses or reduces the size of the conditioned response that would otherwise be elicited by a second conditional stimulus. See retardation-of-acquisition test and summation test.
Conditioned reflex: Another name for a conditional response, i.e., the response that is elicited by a conditional stimulus after classical conditioning has taken place. The term “reflex” is used here to connect the concept with the tradition of studying reflexes in physiology.
Conditioned reinforcer or secondary reinforcer: A stimulus that has acquired the capacity to reinforce behavior through its association with a primary reinforcer.
Conditioning preparation: Any of several methods for studying classical conditioning.
Configural cue: The unique new stimulus that is present when two or more conditional stimuli are combined.
Configural theory: A theory that assumes that, when organisms receive classical conditioning with a compound conditional stimulus, they associate the entire compound with the unconditional stimulus rather than forming separate associations between each of its elements and the unconditional stimulus.
Connectionism: An approach in cognitive psychology and artificial intelligence in which knowledge is represented by a large number of connections between nodes or units in a network that bears a metaphorical resemblance to connections in the brain. Also called parallel distributed processing or neural networks.
Consequence: Something that follows from an action. Operant behaviors are actions that are controlled by their consequences (such as the reinforcers or punishers they might produce).
Context or contextual stimuli: External or internal stimuli that are in the background whenever learning or remembering occurs.
Contiguity theory: Guthrie’s idea that learning depends on a stimulus and response occurring together in time rather than depending on reinforcement.
Continuous reinforcement schedule: A schedule of reinforcement in which a reinforcer is delivered after each response.
Counterconditioning: A conditioning procedure that reverses the organism’s response to a stimulus. For example, by pairing the stimulus with a positive event, an organism may be conditioned to respond positively to a stimulus that would otherwise conditionally or unconditionally elicit fear.
Cumulative record: A graph in which the cumulative number of operant responses is plotted as a function of time. The slope of the line gives the rate of responding. Usually created by a cumulative recorder.
Cumulative recorder: A device used to analyze operant behavior in which a pen that rides on a slowly-moving piece of paper is deflected upward with each response (press of a lever, for example). This creates a graph or cumulative record which shows the cumulative number of responses as a function of time.
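The relation between the slope of a cumulative record and the rate of responding can be sketched numerically; the response timestamps below are hypothetical:

```python
# Hypothetical response timestamps (seconds). A steeper cumulative record
# over an interval means a higher response rate in that interval.
times = [2, 4, 6, 8, 20, 40, 60]

# Cumulative record: (time, cumulative response count) pairs.
record = [(t, i + 1) for i, t in enumerate(times)]

def rate(record, t0, t1):
    """Slope of the record between t0 and t1 (responses per second)."""
    counts = [n for t, n in record if t0 <= t <= t1]
    return (counts[-1] - counts[0]) / (t1 - t0)

print(rate(record, 2, 8))    # fast early responding → 0.5 responses/sec
print(rate(record, 20, 60))  # slower later responding → 0.05 responses/sec
```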
Dead reckoning: A method of navigation in which an animal travels to its goal by using an internal sense of direction and distance.
Declarative memory: Memory for facts and events, as opposed to memory for actual behavioral procedures (procedural memory).
Delay conditioning: A classical conditioning procedure in which the conditional stimulus commences on its own and then terminates with presentation of the unconditional stimulus.
Delayed matching-to-sample (DMTS): A procedure used to study working memory in which the organism is reinforced for responding to a test stimulus if it is the same as a “sample” stimulus presented earlier.
Demand curve: A graph showing the demand for a product at different prices. In behavioral economics, the amount of a commodity (or reinforcer) that is taken when the experimenter varies the amount of work that is required to earn it.
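A standard way to summarize a demand curve is its elasticity: the slope of the curve in log-log coordinates, where values below -1 mean consumption falls proportionally faster than price rises (elastic demand). A sketch with made-up consumption data:

```python
import math

prices = [1, 2, 4, 8, 16]          # unit price: responses required per reinforcer
consumed = [100, 95, 80, 50, 15]   # reinforcers taken at each price (made up)

# Elasticity between adjacent prices = slope in log-log coordinates.
elasticities = [
    (math.log(q1) - math.log(q0)) / (math.log(p1) - math.log(p0))
    for (p0, q0), (p1, q1) in zip(zip(prices, consumed),
                                  zip(prices[1:], consumed[1:]))
]
# Demand is inelastic (> -1) at low prices and elastic (< -1) at high prices.
print([round(e, 2) for e in elasticities])  # → [-0.07, -0.25, -0.68, -1.74]
```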
Differential inhibition or discriminative inhibition: A procedure in classical conditioning in which a conditional stimulus is paired with the unconditional stimulus on some trials and another conditional stimulus is presented without the unconditional stimulus on other trials. The second CS may acquire inhibition.
Discriminative stimulus: In operant conditioning, a stimulus that signals whether or not the response will be reinforced. It is said to “set the occasion” for the operant response.
Drive: A theoretical construct which corresponds to motivation arising from biological needs, such as the need for food or water.
Drug tolerance: A reduction in the effectiveness of a drug that can occur with repeated exposure to the drug.
Early comparative psychologists: A group of primarily British biologists (e.g., C. Lloyd Morgan and George Romanes) who were active in the late 1800s and who sought to study the evolution of the mind by inferring the mental activities of animals from their behavior.
Edward L. Thorndike: (1874-1949) American psychologist whose experiments with cats learning to get out of puzzle boxes profoundly influenced our thinking about the importance of instrumental conditioning and the central place of animal learning experiments in psychology.
Edward Tolman: (1886-1959) American psychologist whose ideas about the value and scientific validity of using intervening variables to explain behavior had a profound impact on all of scientific psychology. Tolman also ran many important experiments that emphasized cognitive and motivational factors in behavior and learning.
Elemental theory: A theory that assumes that, when organisms receive conditioning with a compound conditional stimulus, they associate each element of the compound separately with the unconditional stimulus.
Elicited: Brought on by something that comes before. Respondent behaviors are elicited by an antecedent event.
Emitted: Literally, “to send forth.” Organisms are said to emit operant behaviors in the sense that such behaviors are not elicited by an antecedent event; they appear spontaneous (but are really controlled by their consequences).
Episodic memory: Memory for personal, often autobiographical, experiences and events that typically involve what, where, and when information.
Ethology: The study of how animals behave in their natural environments, typically with an emphasis on the evolution of the behavior.
Exaptation: A trait that has adaptive value but was not originally selected for its current function.
Excitation: In classical conditioning, the potential of a conditional stimulus to signal an unconditional stimulus or elicit a conditional response.
Excitor or CS+: A conditional stimulus that is associated with an unconditional stimulus, and has the potential to elicit a conditional response.
Exemplar theory: An approach to categorization which assumes that organisms store representations of a large number of individual members of a category and then respond to new items depending on how similar they are to the items that were presented before.
Explicitly unpaired: In classical conditioning, a procedure in which a conditional stimulus is presented alone and the unconditional stimulus is presented at another time.
Exposure therapy: A form of cognitive behavior therapy in which a patient is exposed, without consequence, to stimuli that elicit undesirable cognitions, emotions, or behaviors in order to weaken their strength. A form of either extinction (if the undesirable responses were learned) or habituation (if the undesirable responses were not learned).
External inhibition: Weakening of a conditional response elicited by a conditional stimulus when a neutral stimulus is added. Usually thought to occur through generalization decrement; that is, the organism does not generalize well between the conditional stimulus alone and its combination with the second stimulus.
Extinction: Reduction in the strength or probability of a learned behavior that occurs when the conditional stimulus is presented without the unconditional stimulus (in classical conditioning) or when the behavior is no longer reinforced (in operant or instrumental conditioning). The term describes both the procedure and the result of the procedure. Behaviors that have been reduced in strength through extinction are said to be “extinguished.”
Fading: A procedure in which a prompt or discriminative stimulus for a desired behavior is gradually withdrawn so that the organism is able to emit the behavior without the prompt.
Fear potentiated startle: An exaggerated startle reaction to a sudden stimulus that occurs when the stimulus is presented while the organism is afraid, e.g., in the presence of a fear excitor.
Feature stimulus: In feature-positive and feature-negative discriminations, the second conditional stimulus that is added to the other (target stimulus) conditional stimulus to signal trials on which the unconditional stimulus will or will not occur.
Feature theory: An approach to categorization which assumes that organisms associate the many features of category exemplars with reinforcers (or category labels) and then respond to new items according to the combined associative strengths of their features. Learning rules like the Rescorla-Wagner model would tend to isolate the most predictive features.
Feature-negative discrimination: A conditioning procedure in which a conditional stimulus is presented with the unconditional stimulus on some trials and without the unconditional stimulus on other trials. A second conditional stimulus is added to signal when the unconditional stimulus will not occur. See also conditioned inhibition.
Feature-positive discrimination: A conditioning procedure in which a conditional stimulus is presented with the unconditional stimulus on some trials and without the unconditional stimulus on other trials. A second conditional stimulus is added to signal when the unconditional stimulus will occur.
Fitness: An individual’s ability to survive and reproduce in a particular environment, and to have offspring that will survive and reproduce.
Fixed action pattern: An innate sequence of behaviors that is triggered by a specific stimulus and continues to its end without regard to immediate consequences or feedback.
Fixed interval schedule: A schedule of reinforcement in which the first response after a fixed amount of time has elapsed (since the last reinforcer) is reinforced.
Fixed ratio schedule: A schedule of reinforcement in which a fixed number of responses is required for the delivery of each reinforcer.
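The two fixed-schedule definitions above can be sketched as simple reinforcement rules; the function names and closure-based structure here are illustrative, not a standard API:

```python
# Sketch of fixed schedules as rules deciding whether a response is reinforced.

def fixed_ratio(n):
    """FR n: reinforce every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def fixed_interval(interval):
    """FI: reinforce the first response after `interval` time units
    have elapsed since the last reinforcer."""
    last_reinforcer = 0.0
    def respond(t):
        nonlocal last_reinforcer
        if t - last_reinforcer >= interval:
            last_reinforcer = t
            return True
        return False
    return respond

fr5 = fixed_ratio(5)
print([fr5() for _ in range(10)])             # reinforced on responses 5 and 10

fi10 = fixed_interval(10)
print([fi10(t) for t in [3, 7, 11, 12, 25]])  # reinforced at t=11 and t=25
```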
Focal sets: In probabilistic contrast theory, the idea that the contingency between two events is calculated over a relevant subset of the trials.
Frustration: Motivational response that occurs when a reward is smaller than expected.
Generalization: The transfer of a learned response from one stimulus to a similar stimulus.
Generalize: To respond to a new stimulus to the extent that it is similar to another stimulus that has been reinforced or trained.
Geometric module: A representation of the global shape of the environment that is thought to be separate from the representations of individual landmarks.
Geons: Short for geometric ions; primitive components of visual perception according to recognition by components theory.
Habituation: A decrease in the strength of a naturally elicited behavior that occurs through repeated presentations of the eliciting stimulus.
Hall-Pearce negative transfer: Interference with conditioning that is produced by pairing a conditional stimulus with a weak unconditional stimulus before pairing it with a stronger unconditional stimulus.
Hedonic shift: The observation that in taste aversion learning, the flavor conditional stimulus actually becomes unpleasant.
Hedonism: The pursuit of pleasure and the avoidance of pain.
Hidden units: Nodes or units in a connectionist network that come between the input and output units and usually have no other connections outside the network (and are thus not “visible” to outside systems).
Homeostasis: The tendency of an organism to maintain an internal equilibrium.
Homologous: Two or more traits that are similar in structure and evolutionary origin.
Immanuel Kant: (1724-1804) German philosopher who thought that the mind comes into the world with certain inborn assumptions or predilections with which it molds experience.
Imprinting: Learning in very young organisms that establishes attachment to a parent (or an object identified as a parent; sometimes called “filial imprinting”). In “sexual imprinting,” a similar process may influence later sexual behavior.
Incentive learning: A process in which organisms learn about the value of a specific reinforcer while they are in a particular motivational state.
Independents: Two or more commodities or reinforcers that do not “go together” in the sense that increasing the price of one causes its consumption to decrease without changing consumption of the other. Umbrellas and compact disks, for example.
Information processing: A model of cognition, based on a computer metaphor, in which the organism receives sensory input from the environment and then proceeds to operate on that information through a sequence of activities in sensory memory, short-term memory (working memory), and long-term memory (reference memory).
Inhibition of delay: In classical conditioning, inhibition that develops to the early portion of a conditional stimulus in a delay conditioning procedure. The early part of a conditional stimulus signals a period without the unconditional stimulus.
Inhibition: In classical conditioning, the capacity of a conditional stimulus to signal a decrease in the probability of the unconditional stimulus. More generally, an active process that suppresses excitation or reduces the strength of a response.
Inhibitor (CS-): A conditional stimulus that signals a decrease in the probability or intensity of the unconditional stimulus and therefore evokes inhibition.
Instrumental conditioning or instrumental learning: Any situation based on Thorndike’s method in which animals can learn about the relationship between their actions and consequences. Essentially the same as operant conditioning, except that in instrumental learning experiments the experimenter must set up each and every opportunity the organism has to respond.
Interference: Memory impairment caused by conflicting information that was learned at some other time.
Interim behaviors: Stereotyped behaviors that occur early in the interval between regularly delivered reinforcers.
Internal clock: A hypothetical cognitive device that codes or represents the passage of time.
Intertrial interval: The period of time between two successive trials.
Interval schedule: A schedule of reinforcement in which a response is reinforced only if it occurs after a set amount of time has elapsed since the last reinforcer.
Intervening variable: A theoretical concept that cannot be observed directly, but is used in science to understand the relationship between independent and dependent variables. To be scientific, intervening variables must be carefully defined in terms of the events that lead to them and the behavioral outputs they lead to. Also known as theoretical construct.
Ivan Pavlov: (1849-1936) Russian physiologist who published the first systematic observations of classical conditioning (also known as Pavlovian learning) and introduced many of the terms that are still used to describe such conditioning today.
Julien de la Mettrie: (1709-1751) French writer who believed that the body affects the mind.
Landmark: A cue that has a fixed relationship with a goal, but is not close to it, which organisms learn about and use to get around in space.
Latent inhibition or CS-preexposure effect: Interference with conditioning that is produced by repeated exposures to the conditional stimulus before conditioning begins.
Latent learning experiment: An experiment by Tolman and Honzik (1930) in which animals were not rewarded during initial trials, and then were rewarded for correct responding in a second phase. After the first rewarded trial, the rats began responding efficiently, as if they had previously been learning without reward. Although the reward was not necessary for learning, it did appear necessary to motivate performance.
Law of effect: Originally, Thorndike’s idea that responses that are followed by pleasure will be strengthened and those that are followed by discomfort will be weakened. Nowadays, the term refers to the idea that operant or instrumental behaviors are lawfully controlled by their consequences.
Learned helplessness effect: Interference with learning a new instrumental action, typically an escape response, that is produced by exposure to uncontrollable (inescapable) electric shock.
Learned helplessness hypothesis: The theoretical idea that organisms exposed to inescapable and unavoidable shocks learn that their actions do not control environmental outcomes.
Learned irrelevance: In classical conditioning, the finding that when there is no contingency between a CS and a US in an initial phase, animals have difficulty learning an association between the two events when the events are later paired.
Learning theory: The modern field in which principles of learning, cognition, and behavior are investigated by studying animals learning under controlled laboratory conditions.
Learning/performance distinction: The idea that learning is not the same as performance, and that behavior may not always be an accurate indicator of knowledge.
Long-delay learning: Conditioning that occurs when there is a long period of time between the conditional stimulus and the unconditional stimulus.
Long-term memory: A theoretical part of memory that has a very large capacity and can retain information over long periods or retention intervals. Also used to characterize situations in which an experience has a long-lasting effect on behavior.
Matching law: A principle of choice behavior which states that the proportion of responses directed toward one alternative will equal (match) the proportion of reinforcers that are earned by performing that alternative.
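With two alternatives, the matching law can be written B1/(B1 + B2) = R1/(R1 + R2), where B is response rate and R is obtained reinforcement rate. A quick numeric check with made-up rates:

```python
# Matching law: relative response rate equals relative reinforcement rate.
# Hypothetical reinforcement rates earned on two concurrent schedules.
R1, R2 = 40, 20                         # reinforcers per hour on each alternative

predicted_proportion = R1 / (R1 + R2)   # predicted share of responses to option 1
print(predicted_proportion)             # 2/3 of responses should go to option 1

# Behavior that matches the law exactly:
B1, B2 = 120, 60                        # responses per hour on each alternative
assert B1 / (B1 + B2) == predicted_proportion
```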
Matching-to-sample: A procedure in which the organism is reinforced for responding to a test stimulus if it is the same as a “sample” stimulus.
McCollough effect: In color perception, the evocation of an opposite-color after-image by black-and-white stimuli that have been associated with a color.
Mediated generalization: Treating two stimuli as alike not because they are physically similar but because they are associated with a common stimulus.
Melioration: An explanation of matching which claims that the organism will always respond so as to improve the local rate of reinforcement. This ultimately leads to a steady state of behavior that matches the rates of reinforcement on the two alternatives.
Memory reactivation: Restoration of forgotten information after reexposure to part of the learning situation.
Modulation: When a stimulus influences behavior by increasing or decreasing the response evoked by another stimulus, rather than by eliciting a response itself.
Modules: Hypothetical specialized cognitive mechanisms that have evolved to deal with information in a restricted domain.
Morgan’s Canon: A law proposed by C. Lloyd Morgan which states that a behavior should always be explained by the simplest mental process possible (also known as the law of parsimony).
Multiple oscillator model: A model of interval timing that represents time in terms of the status of a set of hypothetical units that cycle between different values, each with a different fixed period over time.
Multiple schedule: A procedure in which two or more reinforcement schedules, each signaled by its own discriminative stimulus, are presented one at a time and alternated.
Multiple-time-scale model: A model of interval timing which assumes that the start of a trial is recorded in short-term memory and then gradually fades over time. Animals time events by associating them with the strength of this memory at a given point in time.
Natural selection: A process that allows individuals with certain features to leave more offspring in the next generation; typically, individuals without those features are less successful.
Negative automaintenance: The finding that pecking at a keylight conditional stimulus in pigeons may persist even when the peck prevents the reinforcer from occurring.
Negative contingency: A situation where the probability of one event is lower if another event has occurred. In classical conditioning, if the unconditional stimulus is less probable when the conditional stimulus has occurred, the conditional stimulus becomes a conditioned inhibitor. In instrumental conditioning, a biologically significant event may likewise be less probable if a behavior occurs. If the significant event is negative or aversive, then escape or avoidance learning occurs; if the significant event is positive, it is called omission. Also called negative correlation.
Negative contrast effect: When “expectation” of a large positive reward decreases the positive reaction to a smaller positive reward.
Negative occasion setter: In classical conditioning, a type of modulator that decreases the response evoked by another conditional stimulus in a way that does not depend on the modulator’s direct inhibitory relation with the unconditional stimulus.
Negative patterning: In classical conditioning, a procedure in which two conditional stimuli are paired with an unconditional stimulus when they are presented alone, but occur without the unconditional stimulus when they are combined. Elemental theories have difficulty explaining how organisms learn to respond correctly in this procedure.
Negative reinforcement: A situation in which an operant behavior is strengthened (“reinforced”) because it removes or prevents a negative (aversive) stimulus.
Negative sign tracking: Movement away from a stimulus that signals either an aversive event or the reduced probability of a positive event.
Negative transfer: When learning one task interferes with learning or performance of a second task.
Network: A set of interconnected memory nodes.
Nodes: Memory representations of items in the world.
Occasion setter: In classical conditioning, a stimulus that may not itself elicit a response, but modulates behavior to another stimulus.
Omission: An instrumental or operant conditioning procedure in which the behavior prevents the delivery of a positive (reinforcing) stimulus. The behavior typically decreases in strength.
Operant conditioning: Any situation based on Skinner’s setup in which an organism can learn about its actions and consequences. The same as instrumental conditioning except that in an operant conditioning experiment the organism is “free” to make the operant response (e.g., lever-pressing) as often as it “wants” to.
Operant experiment: An experimental arrangement in which a reinforcer (such as a food pellet) is made contingent upon a certain behavior (such as lever-pressing).
Operant: A behavior that is controlled by its consequences. The canonical example is the rat’s lever-pressing, which is controlled by the food-pellet reinforcer.
Operant-respondent distinction: Skinner’s distinction between operant behavior, which is said to be emitted and controlled by its consequences, and respondent behavior, which is said to be elicited and controlled by its antecedents.
Operational behaviorism: An approach, started by Edward Tolman, which departs from radical behaviorism by using unobservable intervening variables (theoretical constructs) in the explanation of behavior. The approach is scientific as long as the theoretical constructs are carefully defined and falsifiable. It is the approach generally accepted by most modern scientific psychologists.
Opponent-process theory: A theory that emphasizes the fact that emotional stimuli often evoke an initial emotional reaction followed by an after-reaction of the opposite valence. With repeated exposure to the emotional stimulus, the after-reaction grows and the initial reaction weakens, which may fundamentally change the motivation behind instrumental behavior controlled by positive and negative stimuli.
Ordinal prediction: A hypothesis that specifies a greater-than or less-than relationship between two conditions or two groups.
Overexpectation effect: In classical conditioning, the finding that two conditional stimuli that have been separately paired with an unconditional stimulus may actually lose some of their potential to elicit conditional responding if they are combined and the compound is paired with the same unconditional stimulus.
Overshadowing: In classical conditioning, the finding that there is less conditioning to a weak conditional stimulus if it is combined with a more salient conditional stimulus during conditioning trials.
Panic disorder: A psychological disorder characterized by recurrent panic attacks and the fear of having additional ones.
Paradoxical reward effects: Any of several behavioral effects in which exposure to nonreinforcement appears to increase the strength of instrumental behavior (as in the partial reinforcement extinction effect), or exposure to larger reinforcers appears to decrease the strength of instrumental behavior (as in the “magnitude of reinforcement extinction effect”). Often involves frustration.
Partial reinforcement extinction effect (PREE): The finding that behaviors that are intermittently reinforced are more persistent (take longer to extinguish) than behaviors that are reinforced every time they occur.
Pavlovian-instrumental transfer: An effect in which a Pavlovian conditional stimulus is shown to influence the rate of an ongoing instrumental behavior if the conditional stimulus is presented while the organism is engaged in that behavior.
Peak procedure: A method for studying timing processes in which the first response made after a fixed interval has elapsed since the start of a signal is reinforced. Response rate as a function of time in the signal is used to assess the accuracy of timing.
Peak shift: In discrimination learning, a change in the generalization gradient surrounding S+ such that the highest level of responding shifts from S+ in the direction away from the S-.
Perceptual learning: An increase in the discriminability of two stimuli that results from simple exposure to the two stimuli.
Place cells: Cells in the rat hippocampus that become active when the animal is in a particular location.
Positive contingency: A situation where the probability of one event is higher if another event has occurred. In classical conditioning, if the unconditional stimulus is more probable when the conditional stimulus has occurred, the conditional stimulus becomes a conditioned excitor. In instrumental conditioning, a biologically significant event may likewise be more probable if a behavior occurs. If the significant event is negative or aversive, then punishment occurs; if the significant event is positive, then reward learning occurs.
Positive contrast effect: “Expectation” of a small positive reward can increase the positive reaction to a larger positive reward.
Positive occasion setter: In classical conditioning, a type of modulator that increases the response evoked by another conditional stimulus in a way that does not depend on the modulator’s direct association with the unconditional stimulus.
Positive patterning: In classical conditioning, a procedure in which two conditional stimuli are presented with the unconditional stimulus when they are presented together, but without the unconditional stimulus when they are presented alone.
Positive reinforcement: An instrumental or operant conditioning procedure in which the behavior is followed by a positive stimulus or reinforcer. The behavior typically increases in strength.
Pre-commitment strategies: A method for decreasing impulsiveness and increasing self-control in which the individual makes choices well in advance.
Premack principle: The idea that reinforcement is possible when a less-preferred behavior will allow access to a more-preferred behavior.
Preparedness: The extent to which an organism’s evolutionary history makes it easy for the organism to learn a particular association or response. If evolution has made something easy to learn, it is said to be “prepared.”
Primary reinforcer: An event that unconditionally reinforces operant behavior without any particular training.
Primed: When a node or representation has been activated in short-term memory.
Proactive interference: Memory impairment caused by information learned or presented before the item that is to be remembered.
Probabilistic contrast model: A model developed to explain associative learning in humans that computes contingencies between events by defining and comparing the probability of an event in the presence and absence of selected cues.
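The core computation of the probabilistic contrast model is the difference between the probability of the outcome when a cue is present and when it is absent (often called delta-P). A minimal sketch, with an illustrative encoding of trials as (cue present, outcome occurred) pairs:

```python
def delta_p(events):
    """Probabilistic contrast (delta-P) for a single cue:
    P(outcome | cue present) - P(outcome | cue absent).
    `events` is a list of (cue_present, outcome_occurred)
    booleans -- an illustrative encoding, not a standard API."""
    with_cue = [o for c, o in events if c]
    without_cue = [o for c, o in events if not c]
    return sum(with_cue) / len(with_cue) - sum(without_cue) / len(without_cue)

# 8 of 10 cue trials end in the outcome, but only 2 of 10 no-cue trials:
trials = ([(True, True)] * 8 + [(True, False)] * 2 +
          [(False, True)] * 2 + [(False, False)] * 8)
print(delta_p(trials))  # positive (about 0.6): a positive contingency
```

A positive delta-P corresponds to a positive contingency (excitatory learning), a negative value to a negative contingency (inhibitory learning), and zero to no contingency.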
Procedural memory: Memory for how to automatically execute or perform a particular behavioral or cognitive task.
Protection from extinction: In classical conditioning, the finding that extinction trials with a conditioned excitor may be ineffective at reducing conditional responding if the excitor is combined with a conditioned inhibitor during extinction.
Prototype: Representation of what is typical or average for a particular category.
Prototype theory: An approach to categorization which assumes that organisms learn what is typical or average for a category and then respond to new exemplars according to how similar they are to the average.
Pseudoconditioning: A process whereby a conditional stimulus comes to evoke responding merely because the organism has been exposed to the unconditional stimulus, rather than because of true associative learning.
Punisher: An aversive stimulus that decreases the strength or probability of an operant behavior when it is made a consequence of the response.
Punishment: An instrumental or operant conditioning procedure in which the behavior is followed by a negative or aversive stimulus. The behavior typically decreases in strength.
Quantitative law of effect: A more general, but still quantitative, statement of the matching law in which an operant response is viewed as being chosen over all other potential responses.
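The quantitative law of effect is usually written as Herrnstein's hyperbola, B = kR / (R + Re), where B is the rate of the operant response, R is its rate of reinforcement, k is the asymptotic total amount of behavior, and Re is the rate of reinforcement for all other behavior. A minimal sketch with illustrative (not fitted) parameter values:

```python
def response_rate(R, k=100.0, R_e=20.0):
    """Herrnstein's hyperbola: predicted operant response rate B
    for reinforcement rate R. k is the asymptote (total behavior)
    and R_e is the rate of 'extraneous' reinforcement earned by
    all other responses. The values of k and R_e here are
    illustrative, not empirical estimates."""
    return k * R / (R + R_e)

# Responding rises hyperbolically toward the asymptote k:
print(response_rate(20.0))   # 50.0 -- when R equals R_e, B is k/2
print(response_rate(180.0))  # 90.0 -- approaching the asymptote k = 100
```

The half-maximum point (R = Re) is a useful check: the response gets exactly half of the organism's total behavior when its reinforcement rate matches the rate from all competing sources.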
Radial maze: An elevated maze that has a central area from which arms extend in all directions.
Radical behaviorism: The type of behaviorism identified with B. F. Skinner which emphasizes the exclusive study of external events, such as observable stimuli and responses, and avoids any inferences about processes inside the organism.
Rapid reacquisition: In classical conditioning, the quick return of an extinguished conditional response when the conditional stimulus and unconditional stimulus are paired again. In instrumental conditioning, the quick return of extinguished behavior once the response and reinforcer are paired again.
Ratio schedule: A schedule of reinforcement in which the delivery of each reinforcer depends on the number of responses the organism has performed since the last reinforcer.
Rationalism: Term used to refer to Kant’s school of thought, in which the mind was thought to act on experience with a set of inborn predilections and assumptions.
Recuperative behaviors: Behaviors, such as licking a wound, which are elicited by tissue damage and function to promote healing.
Reference memory: Another name for long-term memory.
Reflex action: A mechanism through which a specific environmental event or stimulus elicits a specific response. The concept originated with René Descartes.
Reinforcement theory: A phrase used to describe learning theories, like Thorndike’s, which assume that reinforcement is necessary for learning.
Reinforcement: An instrumental or operant conditioning procedure in which the behavior’s consequence strengthens or increases the probability of the response. See positive reinforcement and negative reinforcement.
Reinforcer devaluation effect: The finding that an organism will stop performing an instrumental action that previously led to a reinforcer if the reinforcer is separately made undesirable through association with illness or satiation.
Reinforcer: Any consequence of a behavior that strengthens the behavior or increases the probability that the organism will perform it again.
Reinstatement: Recovery of the learned response in either classical or instrumental conditioning when the unconditional stimulus or reinforcer is presented alone after extinction.
Relapse: The return of undesirable cognitions, emotions, or behaviors after apparent improvement.
Relative validity: In classical conditioning, an experimental design and result that supports the view that conditioning is poor when the conditional stimulus is combined with a better predictor of the unconditional stimulus.
René Descartes: (1596-1650) French philosopher and mathematician who distinguished between mind and body, and also discussed reflex action as a mechanical principle that controls the activity of the body.
Renewal effect: Recovery of responding that occurs when the context is changed after extinction. Especially strong when the context is changed back to the original context of conditioning.
Respondent: A behavior that is elicited by an antecedent stimulus.
Response deprivation hypothesis: The idea that restricting access to a behavior below its baseline or preferred level will make access to that behavior a positive reinforcer.
Response form: The qualitative nature of the conditional response. Determined by both the unconditional stimulus and by the nature of the conditional stimulus.
Retardation-of-acquisition test: A test procedure that identifies a stimulus as a conditioned inhibitor if it is slower than a comparison stimulus to acquire excitation when it is paired with an unconditional stimulus.
Retrieval failure: Inability to recover information that is stored in long-term memory. A common cause of forgetting.
Retrieval-generated priming: Activation of an item, node, or representation in short-term memory that occurs when a cue that is associated with that item is presented.
Retroactive interference: Memory impairment caused by information learned or presented after the item that is to be remembered.
Reward learning: An instrumental or operant conditioning procedure in which the behavior is followed by a positive event. The behavior typically increases in strength.
rG-sG mechanism: A theoretical process that allowed Hull, Spence, and others to explain in S-R terms how “expectations” of reward motivate instrumental responding.
R-S* learning: Another term used to describe instrumental and operant conditioning that emphasizes the theoretical content of that learning (an association between a behavior, R, and a biologically significant event, S*).
Scalar property: A property of interval timing in which the probability of responding is a similar function of the proportion of time in the interval being timed, regardless of the actual duration of that interval.
Schedule-induced polydipsia: Excessive drinking that is observed if animals are given food reinforcers at regular intervals.
Search image: An attentional or memory mechanism that helps predators search for specific cryptic prey.
Second-order or higher-order conditioning: A classical conditioning procedure in which a conditional response is acquired by a neutral stimulus when the latter is paired with a stimulus that has previously been conditioned.
Self-generated priming: Activation of an item, node, or representation in short-term memory that occurs when the item is presented.
Semantic memory: A subset of declarative memory that corresponds to memory for various invariant facts about the world.
Sensitization: An increase in the strength of an elicited behavior that results merely from repeated presentations of the eliciting stimulus.
Sensory preconditioning: A classical conditioning procedure in which two neutral stimuli are first paired with each other, and then one of them is paired with an unconditional stimulus. When the other neutral stimulus is tested, it evokes a conditional response, even though it was never paired with the unconditional stimulus itself.
Sequential theory: A theory of the partial reinforcement extinction effect that suggests that extinction is slow after partial reinforcement because the behavior has been reinforced while the organism remembers recent nonrewarded trials.
Shaping or shaping by successive approximations: A procedure for training a new operant behavior by reinforcing behaviors that are closer and closer to the final behavior that is desired.
Short-term memory: A theoretical part of memory that has a small capacity and can retain information only briefly. Also used to characterize situations in which an experience has only a short-lasting effect on behavior.
Sign tracking: Movement toward a stimulus that signals a positive event or the reduced probability of a negative event.
Simultaneous conditioning: In classical conditioning, a procedure in which the conditional stimulus and unconditional stimulus are presented at the same time.
Skinner box: An experimental chamber that provides the subject something it can repeatedly manipulate, such as a lever (for a rat) or a pecking key (for a pigeon). The chamber is also equipped with mechanisms that can deliver a reinforcer (such as food) and other stimuli (such as lights, noises, or tones).
SOP theory: A theory of classical conditioning that emphasizes activation levels of elements in memory nodes corresponding to conditional stimuli and unconditional stimuli, especially as the activation levels change over time.
Species-specific defense reactions (SSDRs): Innate reactions that occur when an animal encounters a predator or a conditional stimulus that arouses fear. They have probably evolved to reduce predation. Examples are freezing and fleeing.
Specific hungers: The tendency for animals to seek and prefer certain foods that might contain specific nutrients they are currently deprived of.
Spontaneous recovery: The reappearance, after the passage of time, of a response that had previously undergone extinction. Can occur after extinction in either classical or instrumental conditioning.
S-R learning: The learning of an association between a stimulus and a response.
S-S learning: The learning of an association between two stimuli.
S-S* learning: Another term used to describe classical or Pavlovian conditioning that emphasizes the theoretical content of that learning (an association between a stimulus, S, and a biologically significant event, S*).
Standard operating procedures (SOP): An established procedure to be followed in carrying out a given operation or in a given situation. In SOP theory of classical conditioning, the standard dynamics of memory.
Standard pattern of affective dynamics: According to opponent process theory, the characteristic sequence of responses elicited by a novel emotional stimulus.
Stimulus control: When operant behaviors are controlled by the stimuli that precede them.
Stimulus elements: Theoretical stimuli or features that make up more complex stimuli.
Stimulus generalization gradient: A characteristic change in responding that is observed when organisms are tested with stimuli that differ in increasing and/or decreasing steps from the stimulus that was used during training.
Stimulus relevance: The observation that learning occurs more rapidly with certain combinations of conditional and unconditional stimuli (such as a taste and illness) than with other stimulus combinations (such as taste and shock).
Stimulus sampling theory: A mathematical theory proposed by Estes which extended Guthrie’s idea of stimulus elements.
Stimulus substitution: In classical conditioning, the idea that the conditional stimulus is associated with the unconditional stimulus and becomes a substitute for it (eliciting the same response).
Structuralism: A school of psychology, especially active in the late 1800s and early 1900s, which relied on introspection as a method for investigating the human mind.
Substitutes: Two or more commodities or reinforcers that can replace or be exchanged for one another, as demonstrated when increasing the price of one of them decreases its consumption and increases demand for the other. For example, Coke and Pepsi.
Successive negative contrast: A negative contrast effect in which exposure to a large positive reward decreases the subsequent positive reaction to a smaller positive reward than would ordinarily be observed.
Summation test: A test procedure in which conditional stimuli that are conditioned separately are then combined in a compound. The procedure can identify a stimulus as a conditioned inhibitor if it suppresses responding evoked by the other stimulus (and does so more than a comparison stimulus that might reduce responding through generalization decrement).
Superposition: The common finding in research on interval timing that responding as a function of the proportion of the interval being timed is the same regardless of the duration of the actual interval being timed—the curves appear identical when they are plotted on the same graph. Demonstrates the scalar property.
Superstitious behavior: A behavior that increases in strength or frequency because of accidental pairings with a reinforcer.
Suppression ratio: The measure of conditioning used in the conditioned emotional response or conditioned suppression method. It is the value obtained by dividing the number of responses made during the conditional stimulus by the sum of the responses made during the conditional stimulus and during an equal period of time before the stimulus. If the value is .50, no conditioned suppression has occurred. If the value is 0, a maximum amount of conditioned suppression has occurred.
Surprisingness of the US: The difference between the actual magnitude of the unconditional stimulus and that which is predicted by conditional stimuli present on a conditioning trial. In the Rescorla-Wagner model, learning only occurs if there is a discrepancy between the unconditional stimulus that is predicted and the one that actually occurs.
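In the Rescorla-Wagner model this discrepancy drives learning through the update rule ΔV = αβ(λ − ΣV), where λ is the magnitude of the US actually delivered and ΣV is the summed associative strength of all conditional stimuli present on the trial. A minimal sketch with illustrative learning-rate parameters:

```python
def rescorla_wagner_trial(V, present, lam, alpha=0.3, beta=1.0):
    """One trial of the Rescorla-Wagner rule. V maps each CS name to
    its associative strength; every CS in `present` is updated by the
    same prediction error. lam is the US magnitude on this trial
    (0 on nonreinforced trials). alpha and beta are illustrative
    salience/learning-rate parameters, not standard values."""
    surprise = lam - sum(V[cs] for cs in present)  # lambda - sum(V)
    for cs in present:
        V[cs] += alpha * beta * surprise
    return V

V = {"tone": 0.0}
for _ in range(10):
    rescorla_wagner_trial(V, ["tone"], lam=1.0)
# V["tone"] climbs toward lambda; once the CS fully predicts the US,
# the surprise term is 0 and no further learning occurs.
```

The same rule captures blocking and the overexpectation effect: when ΣV already equals (or exceeds) λ, the US is unsurprising and the stimuli present gain nothing (or lose strength).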
SΔ (S-): A discriminative stimulus that suppresses operant responding because it signals a decrease in the availability of reinforcement or sets the occasion for not responding.
Taste aversion learning: The phenomenon in which a taste is paired with sickness, and this causes the organism to reject that taste in the future.
Taste-reactivity test: A method in which experimenters examine the rat’s behavioral reactions to tastes delivered directly to the tongue.
Temporal bisection: A procedure used to study interval timing in which one response is reinforced after a signal of one duration, and another response is reinforced after a signal of another duration. When responding to stimuli with intermediate durations is tested, the middle point (the duration at which the animal makes either response with equal probability) occurs at the geometric mean of the two reinforced durations (e.g., 4 seconds if 2- and 8-second cues have been reinforced).
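The geometric-mean prediction in the example above is simple arithmetic:

```python
import math

def bisection_point(short_s, long_s):
    """Predicted indifference point in temporal bisection: the
    geometric mean of the two trained durations (in seconds).
    Note it falls below the arithmetic mean of the two values."""
    return math.sqrt(short_s * long_s)

print(bisection_point(2, 8))  # 4.0 -- matches the 2 s / 8 s example
```

The geometric mean (4 s) rather than the arithmetic mean (5 s) is the classic empirical result, and it is one signature of the scalar, ratio-based character of interval timing.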
Temporal generalization: A procedure for studying interval timing in which an animal is first reinforced if it responds after stimuli of a specific duration and then stimuli of increasing and/or decreasing durations are tested.
Terminal behavior: Stereotyped behaviors that occur toward the end of the interval between regularly delivered reinforcers.
Touchscreen: A device used for detecting the location of responses directed at a screen. Usually consists of perpendicular sets of photobeams across the surface of the screen.
Trace conditioning: A classical conditioning procedure in which the unconditional stimulus is presented after the conditional stimulus has been terminated.
Trace decay: The theoretical idea that forgetting is due to the actual loss or destruction of information that is stored in memory.
Transfer tests: A procedure in which an organism is tested with new stimuli or with old stimuli in a new situation. In categorization experiments, this is the method of testing the animal’s ability to categorize stimuli it has not categorized before.
Transfer-of-control experiments: Experiments that test for Pavlovian-instrumental transfer, and thus demonstrate the effects of presenting a Pavlovian conditional stimulus on the rate of an ongoing instrumental behavior.
Transposition: Differential responding to two stimuli, apparently according to their relation rather than their absolute properties or individual features. For example, after discrimination training with two stimuli that differ along a dimension (e.g., size), the organism might choose a more extreme stimulus along the dimension rather than the stimulus that was previously reinforced.
Two-factor theory (two-process theory): A theory of avoidance learning that states that (1) Pavlovian fear learning allows warning stimuli to evoke conditioned fear that motivates avoidance behavior and provides the opportunity for (2) reinforcement of the instrumental avoidance response through fear reduction. More generally, the theoretical idea that Pavlovian learning is always a second process at work in instrumental learning situations.
Unconditional response (UR): In classical conditioning, an innate response that is elicited by a stimulus in the absence of conditioning.
Unconditional stimulus (US): In classical conditioning, the stimulus that elicits the response before conditioning occurs.
US preexposure effect: Interference with conditioning that is produced by repeated exposures to the unconditional stimulus before conditioning begins.
Variable interval schedule: A schedule of reinforcement in which the behavior is reinforced the first time it occurs after a variable amount of time since the last reinforcer.
Variable ratio schedule: A schedule of reinforcement in which a variable number of responses are required for delivery of each reinforcer.
Warning signals: Environmental stimuli in avoidance learning situations that are associated with the aversive stimulus through Pavlovian conditioning.
Water maze: An apparatus used to investigate spatial learning in which the rat or mouse subject swims in a circular pool of milky water to find a submerged platform on which to stand.
Within-compound association: A learned association that may be formed between two conditional stimuli when they are presented together in a compound.
Working memory: A system for temporarily holding and manipulating information; another name for short-term memory.