A COGNITIVE PROCESS MODEL

 

OF PERSON EVALUATION AND IMPRESSION FORMATION

 

BASED ON A COMPUTER SIMULATION OF NATURAL LANGUAGE PROCESSING

 

 

 

 

by

 

Rainer von Königslöw

 

 

 

 

A dissertation submitted in partial fulfillment

of the requirements for the degree of

Doctor of Philosophy

(Social Psychology)

in The University of Michigan

1974

 

 

 

 

Doctoral Committee:

 

Professor Dorwin P. Cartwright, Co-Chairman

Assistant Professor Michael H. O’Malley, Co-Chairman, The University of California at Berkeley

 

Professor John P. Crecine

Professor Joyce B. Friedman

Professor Melvin Manis

 


 

DEDICATION

 

 

 

 

 

 

 

 

 

 

to Ernest Lindner

Canadian artist and educator

 

 

 

 

 

 

 

in appreciation for many good discussions


 

ACKNOWLEDGMENTS

 

I would like to thank the members of my committee, Dorwin Cartwright, Pat Crecine, Joyce Friedman, Melvin Manis, and Michael O’Malley. I particularly wish to thank Melvin Manis and Michael O’Malley for their help and encouragement and for arranging financial support. I would also like to thank Robert Axelrod for reading the thesis and sitting in on my oral defense.

 

Throughout my years in the program, Mrs. Catherine Hoch has always had my best interests at heart. She has been a good friend; without her things would have been much more difficult.

 

Bruce Wilcox helped a great deal by being willing to consider my special problems while he constructed the NTS-LISP programming language. Nina Macdonald was helpful in editing the thesis and Marilyn Miller was an excellent and dedicated typist.


TABLE OF CONTENTS

 

 

DEDICATION

ACKNOWLEDGMENTS

CHAPTER 1:  INTRODUCTION

CHAPTER 2:  COMPREHENDING AND ACTING ON SENTENCES: A PERSPECTIVE FOR CONCEPTUALIZING AND SIMULATING COGNITIVE PROCESSES

            Cognitive processing as a type of information processing

CHAPTER 3:  STRUCTURES AND PROCESSES: ANALYSIS AND EVALUATION

            Experiments, theories and models: analysis and reconstruction by simplification and analogy

            Finding an appropriate experimental task to preserve the similarity between the experimental situation and the ordinary social situation

            Knowledge about the experiment: data structures and measurement

            An analysis of the experimental situation as a social interaction using sentences as basic data points

            Data representations for the cognitive content of sentences

            Theoretical claims associated with cognitive process models: decomposition and delineation

            Evaluation of the simulation of hypothesized subprocesses where the output of the process is not immediately or directly reflected in observable behavior

            Specifying the stimulus situation: the context in which the prediction is made

            Knowledge, capabilities and goals the subject brings to the experiment

            Memories and expectancies developed during the experimental interaction

            Evaluating decomposition and structural delineation claims about the subject’s knowledge, capabilities and expectancies

CHAPTER 4:  REPRESENTING THE SUBJECT’S KNOWLEDGE ABOUT THE TERMS AND CONCEPTS NEEDED FOR COPING WITH IMPRESSION FORMATION TASKS

CHAPTER 5:  MODELLING THE SENTENCE PROCESSING CAPABILITY OF THE SUBJECT

            Parsing

            Sentence comprehension

            Sentence execution

CHAPTER 6:  CONCLUSION

APPENDIX:   PROGRAM LISTING

BIBLIOGRAPHY


 

Chapter 1

 

INTRODUCTION

 

 

The main goal of this thesis is to explore the feasibility of constructing a cognitive process model that can account for some of the phenomena found in impression formation and person evaluation studies. It is hoped that simulating the cognitive processes of the subject will lead to more comprehensive and integrative theoretical accounts of these phenomena.

 

It is assumed that the cognitive processes involved in coping with impression formation and person evaluation tasks are mainly based on the ability of the subject to comprehend and act on sentences. This includes the descriptive sentences on which the impressions and the evaluations are based. It also includes the instructions for the experiment as well as any questions on the questionnaire or asked by the experimenter. A computer simulation model was therefore constructed to simulate some of the processes that might be involved in comprehending and acting on these sentences. Let us now briefly illustrate how some of the impression formation and person evaluation behavior is simulated by the model.

 

Person evaluation is simulated in terms of the ability of the model to evaluate individuals on a scale. These evaluations are based on the impression formed as a result of reading sentences describing the individual to be evaluated. The sentences may contain one or more descriptors such as adjectives or verbs. One area of concern, investigated extensively, is how subjects combine the evaluative information from several descriptors to form an overall evaluation of the individual. Numerical algorithms, such as averaging and summation, that can represent this information integration have received particular attention. For a review of this area see Anderson (1971) and Slovic and Lichtenstein (1971). In this model the focus is more on how the descriptive information about the individual may be processed and integrated. It is hoped that a model for these processes will help to account for semantic problems such as redundancy.

 

The model simulates the complete experimental interaction, including reading the instructions for the task and then following the instructions to do the evaluative task itself. The instructions given to the model are simplified to fit the sentence processing restrictions of the model. The following reflects an actual interaction with the model. The sentences are typed in on a computer terminal. The input to the model is in italics.

 

 

?Take instructions!

Please enter the instructions for task1:

?Imagine students!

?Read an assertion!

?Evaluate the individual!

?Repeat!

?End instructions!

•   This completes the instructions for task1.

 

For the sake of convenience in describing the interaction we shall anthropomorphize the model, as if it were one of the subjects it is supposed to simulate.

 

The model indicates readiness to receive a sentence by printing a question mark on the left—hand side of a new line. By entering the command “Take instructions!”, the model is alerted to the fact that any commands that follow are not to be acted on immediately but rather should be remembered as the description of a task. To differentiate the tasks it knows how to do, it gives them names like ‘task1’ or ‘task2’. The next command, “Imagine students!”, tells the model that the task does not involve particular students that it already knows about. The command therefore implies that it cannot use any prior information about the characteristics of particular students while doing the task. The next two commands specify the body of the task, and the instruction “Repeat!” tells it to start the task over again. An imperative like “End instructions!” or “Quit!” then informs the model that the instructions are complete, that it should now remember the task and return to a normal mode of interaction. We could now proceed normally by making assertions or asking questions, or we may ask the model to do the task it has just learned about.

 

?Execute task1!

?Bill is intelligent.

$pleasant 1 2 3 4 5 6 7 unpleasant

•             $

$nil

 

?Bill is lazy.

$pleasant 1 2 3 4 5 6 7 unpleasant

.                           $

$nil

 

?Bill is intelligent and lazy.

$pleasant 1 2 3 4 5 6 7 unpleasant

.              $

$nil

 

? Quit!

•   This statement is not a ASSERTION, it is a IMPERATIVE.

*   Should I proceed with it? (Yes No) yes

•   End of the task.

 

 

The model now executes the commands that make up the task. It first imagines the students and then expects to read an assertion. The model again indicates its readiness to read and interpret a sentence by printing a question mark. The expression “the individual” in the next command causes it to examine its memory of the sentence or sentences it just read to locate this individual. It will then proceed to evaluate that individual on a scale.

 

To do so it examines the impression it has of the individual, based on whatever it has learned about him from preceding assertions. Assuming that the individual is a student, since the model had been told to imagine students, its knowledge about him is restricted to what it learned from the preceding assertion. Before making an evaluation it must know the dimension in terms of which the characteristics of this individual should be evaluated. Therefore it must first read a scale. Readiness to read a scale is indicated by a dollar sign prefix in the next line. The model determines the dimension to be used in the evaluation by examining the scale-end terms. It then uses the attributes specified in its impression of the individual to determine his evaluation on that dimension. This evaluation must be expressed in terms of the scale alternatives. The model responds by putting a dollar sign underneath the scale category it has chosen.
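The kind of computation just described can be sketched compactly. The model itself is programmed in NTS-LISP (the listing appears in the Appendix); the following Python fragment is offered only as an illustration, and its scale-end table, function names, and category rule are assumptions made for exposition rather than the program's own definitions.

    # A minimal sketch (not the model's NTS-LISP code) of reading a scale and
    # expressing an evaluation between 0.0 and 1.0 as one of its categories.

    import math

    # Assumed table: pairs of scale-end terms, the dimension they suggest, and
    # whether the left-hand end of the scale is the high-valued end.
    SCALE_END_TERMS = {
        ("pleasant", "unpleasant"): ("AFFECTIVE", True),
        ("unpleasant", "pleasant"): ("AFFECTIVE", False),
    }

    def read_scale(scale_line):
        """Split 'pleasant 1 2 3 4 5 6 7 unpleasant' into the dimension its
        end terms suggest, the orientation, and the number of alternatives."""
        tokens = scale_line.split()
        dimension, left_is_high = SCALE_END_TERMS[(tokens[0], tokens[-1])]
        return dimension, left_is_high, len(tokens) - 2

    def choose_category(evaluation, left_is_high, n_alternatives):
        """Assume the alternatives divide the dimension into categories of
        equal width and return the category containing the evaluation,
        counted from the left-hand end of the scale."""
        fraction = (1.0 - evaluation) if left_is_high else evaluation
        return max(1, math.ceil(fraction * n_alternatives))

    dimension, left_is_high, n = read_scale("pleasant 1 2 3 4 5 6 7 unpleasant")
    print(dimension, choose_category(0.70, left_is_high, n))   # AFFECTIVE 3

Under this rule an evaluation of 0.70 falls into the third of seven equal-width categories counted from the high-valued end, which is the choice the model verbalizes later in this chapter.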

 

Since more than one scale could be used to evaluate an individual, it will expect to read further scales. The term ‘nil’ is used here to tell the model that there are no more scales for that evaluation. To stop the task we can issue the command “Quit!” when the model is expecting a sentence. Since this command is not an assertion it will know that it is not part of the task. It will therefore interrupt and ask whether to act on the command. After completion of the task we are back in the normal interactive mode.

 

To make this process more explicit, we might ask the model to verbalize its evaluations.

 

?Verbalize evaluations!

.   Yes Sir.

?Execute task 1!

?Bill is intelligent.

The impression on which the evaluation is based:

BILL IS A PERSON AND A STUDENT.

BILL IS INTELLIGENT.

$pleasant 1 2 3 4 5 6 7 unpleasant

The terms “PLEASANT” and “UNPLEASANT” suggest that the evaluation should be based on the dimension “AFFECTIVE”.

Based on the predicate(s): INTELLIGENT

and on the variable(s): STUDENT

the evaluation on the dimension “AFFECTIVE” is: 0.70

•    I shall assume that the SEVEN response alternatives

•    divide the dimension into categories of equal width.

•    I therefore chose the scale alternative that is the THIRD from the left, the high-valued end.

 

$nil

 

?Bill is lazy.

•          The impression on which the evaluation is based:

BILL IS A PERSON AND A STUDENT.

•       BILL IS LAZY.

$pleasant  1 2 3 4 5 6 7 unpleasant

The terms “PLEASANT” and “UNPLEASANT” suggest that the evaluation should be based on the dimension “AFFECTIVE”.

Based on the predicate(s): LAZY

•          and on the variable(s): STUDENT

the evaluation on the dimension “AFFECTIVE” is: 0.35

 

I shall assume that the SEVEN response alternatives divide the dimension into categories of equal width.

•   I therefore chose the scale alternative that is the THIRD from the right, the low-valued end.

 

$ nil

 

?Bill is intelligent and lazy.

•          The impression on which the evaluation is based:

• BILL IS A PERSON AND A STUDENT.

BILL IS INTELLIGENT AND LAZY.

$pleasant 1 2 3 4 5 6 7 unpleasant

•          The terms “PLEASANT” and “UNPLEASANT” suggest that the evaluation should be based on the dimension “AFFECTIVE”.

Based on the predicate(s): INTELLIGENT and LAZY

and on the variable(s): STUDENT

the evaluation on the dimension “AFFECTIVE” is: 0.50

•    I shall assume that the SEVEN response alternatives

•    divide the dimension into categories of equal width.

•          I therefore chose the scale alternative that is the FOURTH from the left, the high-valued end.

 

$nil

 

? Quit!

•    This statement is not a ASSERTION, it is a IMPERATIVE.

*      Should I proceed with it? (Yes No) yes

•      End of the task.

 

The model incorporates an averaging algorithm for combining the evaluations. The first two evaluations were obtained by combining the values for the term ‘student’ (0.6) with the values for ‘intelligent’ (0.8) and ‘lazy’ (0.1) respectively. The third evaluation is based on the average of all three terms. For this illustration the values were chosen freely, but they could be based on values obtained from a specific subject or on population norms. But the main concern in the simulation is not with the particular algorithm for combining evaluations but rather with the cognitive processes involved in following the instructions, comprehending the descriptive sentences, forming an impression of the person, reading the scale and perceiving the scale dimension, and with the cognitive processes involved in utilizing the impression to make an evaluation of the person on that scale.
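Because the averaging rule is so small, it can be written out directly. The sketch below merely restates the illustrative values just given (0.6, 0.8, 0.1) in Python; the function and its name are assumptions for exposition, not the model's own routine.

    # The averaging rule described above, with the illustrative values from the
    # text: the evaluation is the mean of the value of the variable ('student')
    # and the values of the predicates asserted of the individual.

    VALUES = {"student": 0.6, "intelligent": 0.8, "lazy": 0.1}

    def evaluate(variable, predicates):
        terms = [variable] + list(predicates)
        return sum(VALUES[t] for t in terms) / len(terms)

    print(round(evaluate("student", ["intelligent"]), 2))          # 0.7
    print(round(evaluate("student", ["lazy"]), 2))                 # 0.35
    print(round(evaluate("student", ["intelligent", "lazy"]), 2))  # 0.5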

 

It is hoped that such an investigation of cognitive processes can help to account for some linguistic and semantic effects that may influence the evaluation. It has been shown for instance that the evaluation may be affected by redundancy among the attribute trait descriptors presented to the subject (Dustin and Baldwin, 1966; von Königslöw, 1970). We have also tried to conceptualize how subjects might interpret descriptions that are expressed negatively, such as “Bill is not intelligent”, and how they might interpret descriptions that are ambiguous about what traits should be attributed to the individual: “Bill is intelligent or he lies”.
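The transcripts that follow illustrate these cases. As a rough sketch of the kind of device that might be involved (not a description of the actual program), negation can be handled by substituting a dimensional opposite for the negated predicate, and a disjunctive description by generating the set of interpretations that are then each evaluated; in the transcript the three interpretation values 0.50, 0.70, and 0.33 average to the overall evaluation of 0.51. The antonym table and function names below are assumptions for illustration only.

    # A sketch (assumed, for illustration only) of two devices the transcripts
    # below exhibit: interpreting 'not X' through a dimensional opposite, and
    # expanding an ambiguous 'X or Y' description into its interpretations.

    ANTONYMS = {"intelligent": "unintelligent", "cautious": "bold",
                "honest": "lie", "lie": "honest"}

    def negate(predicate):
        # 'Bill is not intelligent' -> BILL IS UNINTELLIGENT
        return ANTONYMS[predicate]

    def interpretations(p, q):
        # 'Bill is p or he q' is ambiguous; all readings are kept.
        return [(p, q), (ANTONYMS[p], q), (p, ANTONYMS[q])]

    print(negate("intelligent"))                  # unintelligent
    print(interpretations("intelligent", "lie"))  # the three readings considered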

 

?Execute task1!

?Bill is cautious and indecisive.

$hey

   Yes, I’m listening.

?Describe Bill!

   BILL IS A PERSON AND A STUDENT.

   BILL IS INDECISIVE.

   [‘indecisive’ is seen as lying on the same dimension as ‘cautious’, and as being more extreme, so that it implies a cautious disposition as well. ‘cautious’ is therefore redundant and does not appear in the characterization of Bill.]

?Resume!

$pleasant 1 2 3 4 5 6 7 unpleasant

.                $

$nil

 

?Bill is not intelligent.

$hey

Yes, I’m listening.

?Describe Bill!

BILL IS A PERSON AND A STUDENT.

•    BILL IS UNINTELLIGENT.

 

?Resume!

$pleasant 1 2 3 4 5 6 7 unpleasant

.              $

$ nil

 

?hey

Yes, I’m listening.

? Verbalize evaluations!

?Restart!

?Bill is intelligent or he lies.

The impression on which the evaluation is based:

           BILL IS A PERSON AND A STUDENT.

    It may be true that:

            BILL IS INTELLIGENT AND HE/SHE LIES.

    or it may be true that:

            BILL IS HONEST AND INTELLIGENT.

    or it may be true that:

            BILL IS UNINTELLIGENT AND HE/SHE LIES.

    [The description was ambiguous. All interpretations are considered for the evaluation.]

$pleasant 1 2 3 4 5 6 7 unpleasant

•         The terms “PLEASANT” and “UNPLEASANT” suggest that the evaluation

          should be based on the dimension “AFFECTIVE”.

          Based on the predicate(s): INTELLIGENT and LIE

•   and on the variable(s): STUDENT

    the evaluation on the dimension “AFFECTIVE” is:                             0.50

•         Based on the predicate(s): HONEST and INTELLIGENT

•         and on the variable(s): STUDENT

•         the evaluation on the dimension “AFFECTIVE” is:                              0.70

•         Based on the predicate(s): LIE and UNINTELLIGENT

          and on the variable(s): STUDENT

    the evaluation on the dimension “AFFECTIVE” is:                              0.33

 

.         The overall evaluation on the dimension “AFFECTIVE” is: 0.51

.         I shall assume that the SEVEN response alternatives

.         divide the dimension into categories of equal width.

.         I therefore chose the scale alternative that is

.         the FOURTH from the left, the high-valued end.

 

$nil

? hey

• Yes, I’m listening.

?Don’t verbalize evaluations!

 ?Restart!

?Bill is not indecisive.

$hey

Yes, I’m listening.

?Assess Bill!

•   BILL IS A PERSON AND A STUDENT.

.    It may be true that:

BILL IS CAUTIOUS.

.    or it may be true that:

BILL IS BOLD.

?Resume!

$pleasant 1 2 3 4 5 6 7 unpleasant

.              $


$ nil

 

 

 

 

 

?Bill is not cautious.

$hey

Yes, I’m listening.

?Assess Bill!

•      BILL IS A PERSON AND A STUDENT.

•      BILL IS BOLD.

       [Since Bill is not cautious he is unlikely to be indecisive, which is even more extreme on that dimension.]

?Resume!

$pleasant 1 2 3 4 5 6 7 unpleasant

.            $

$nil

.

?Quit!

This statement is not a ASSERTION, it is a IMPERATIVE.

*      Should I proceed with it? (Yes No) yes

•               End of the task.

 

It has been shown that subjects may not simply rely on the information obtained in the descriptive sentences but that they may also make inferences about further characteristics of that person. For instance, Asch (1946) presented a group of subjects with a list of trait descriptors that supposedly all referred to the same individual. This list was composed of the terms: intelligent, skillful, industrious, warm, determined, practical, and cautious. Another group of subjects was given the same list but with the term ‘cold’ substituted for ‘warm’. Subjects were then given pairs of antonyms and asked to choose the terms that would best characterize the person. Asch found that 90 percent of those who had been given the descriptive list including the term ‘warm’ chose the term ‘generous’, while 92 percent of those who had been given the list containing ‘cold’ checked its antonym ‘ungenerous’. Similar results were found for some other pairs of antonyms.

 

It appears that the subjects used the descriptions of the individual to make inferences about him. These inferences must have been based on knowledge that the subject brought to the experiment. This knowledge could be conceptualized in terms of theories of the form: “If a person is warm then he must also be generous”. To represent the ability of the subject to make these inferences, this kind of theory can be defined in the model. The inferences can then be simulated by asking the model to apply these theories.

 

?Define theories!

Enter THEORY1.

?An intelligent student who is lazy is bold.

Enter THEORY2.

? Quit!

This is not an assertion and therefore cannot be a theory.

I shall act on it immediately.

ONE theory was defined.

?Bill is intelligent and lazy.

•          Ok.

?Is Bill bold?

No.

?Apply theory 1!

THEORY1 applies to BILL.

 

?Is Bill bold?

• Yes.

?Describe Bill!

BILL IS A PERSON AND A STUDENT.

BILL IS INTELLIGENT.

BILL IS LAZY.

BILL IS BOLD.

 

To simulate the inferences the subject might make, we have to have a representation of the theories on which these inferences are based. In view of this use of theories to make inferences, theories are conceptualized as conditional assertions, and can be entered into the model in this form. The command “Define theories!” alerts the model to the fact that the assertion that follows is to be interpreted and remembered as a theory. It therefore does not interpret the sentence as asserting something that should be remembered immediately. The sentence is used as an assertion only when we ask the model to apply the theory.
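The notion of a theory as a conditional assertion can be given a concrete, if simplified, rendering. In the sketch below (Python, for illustration only; the representation is assumed rather than taken from the program) a theory is stored as a condition together with a consequent, and applying it adds the consequent to the impression of every individual whose stored predicates satisfy the condition.

    # A sketch (assumed representation, not the model's code) of a theory as a
    # conditional assertion, and of applying it to the impression of Bill.

    impressions = {"BILL": {"person", "student", "intelligent", "lazy"}}

    # "An intelligent student who is lazy is bold."  (stored, not acted on yet)
    THEORY1 = ("THEORY1", {"intelligent", "student", "lazy"}, "bold")

    def apply_theory(theory, impressions):
        label, condition, consequent = theory
        for name, traits in impressions.items():
            if condition <= traits:              # the condition is satisfied
                traits.add(consequent)
                print(f"{label} applies to {name}.")

    apply_theory(THEORY1, impressions)           # 'Apply theory 1!'
    print("bold" in impressions["BILL"])         # 'Is Bill bold?' -> True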

 

There is also some evidence that the impression one has of a person can affect how one behaves toward the person. Kelley (1950) had a guest lecturer lead a twenty-minute discussion in three sections of a psychology course. Before the lecturer arrived, students were handed biographical notes describing him as a graduate student who was considered to be a rather warm (cold) person, industrious, critical, practical, and determined. The lecturer then led the class for twenty minutes of discussion. The number of times individual students initiated interaction with the guest was recorded. Students who had received ‘warm’ descriptions tended to initiate more interactions than those to whom the lecturer had been described as cold.

 

While the difference was not quite significant statistically, it does support the notion that the behavior of individuals toward another person may be affected by their impressions of that person.

 

An interesting problem arising from these findings is to consider what sort of cognitive processes might be involved in that sort of situation. Thus we might posit that the students in the discussions will attempt to approach and interact with the lecturer if they have a favorable impression of that lecturer. On the other hand we might assume that they would tend to avoid the lecturer if they have a negative impression of him.

 

We might now conceptualize this behavior of approaching and interacting with a person as a task that will be attempted whenever the circumstances are appropriate for it. We can therefore identify at least two facets involved in executing this task. On one hand the behavior is conditional in that it will only be done in the appropriate circumstances. On the other hand it is spontaneous and recurrent in that it might be engaged in repeatedly without specific instructions to do so. In the model we try to simulate only these two aspects of this kind of behavior.

 

?Take instructions!

Please enter the instructions for task2:

?If an intelligent student who is industrious is present, approach him!

?If an intelligent student who is lazy is present, avoid him!

?Quit!

*      Do you want me to execute this immediately? (Yes No) yes

•      This completes the instructions for task2.

 

This defines the task. The condition that the person be present represents the appropriate circumstances, while the condition that the person be an intelligent student who is industrious or lazy represents the impression on which the behavior depends.

 

?Attempt task2!

• Yes, Sir.

 

Normally the model will only execute a task if explicitly asked to do so. This command instructs the model to attempt executing this task whenever it has completed processing any other sentence or task that it was explicitly asked to deal with, and before it receives new instructions.

 

?Bill is intelligent and industrious.

Ok.

    [Nothing happens because the model has not been told that Bill is present.]

?Bill is present.

Ok.

.          I would like to speak to you, BILL.

?Bill is lazy.

Ok.

.    I don’t want to speak to you, BILL.

?Bill is absent.

Ok.
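The two facets identified above, conditional execution and spontaneous recurrence, can be rendered schematically. The fragment below is an illustrative Python sketch, not the model's implementation; the assertions are pre-digested into trait sets and the antonym handling is assumed, whereas the model of course parses and comprehends the sentences themselves.

    # A sketch (illustrative only) of a conditional task that is re-attempted
    # after every assertion, in the spirit of 'Attempt task2!'.

    ANTONYMS = {"industrious": "lazy", "lazy": "industrious"}
    traits = {"BILL": set()}

    def assert_traits(name, new_traits):
        for t in new_traits:
            traits[name].discard(ANTONYMS.get(t, ""))   # drop a contradicted trait
            traits[name].add(t)
        print("Ok.")

    def attempt_task2():
        for name, t in traits.items():
            if "present" in t and {"intelligent", "student"} <= t:
                if "industrious" in t:
                    print(f"I would like to speak to you, {name}.")
                elif "lazy" in t:
                    print(f"I don't want to speak to you, {name}.")

    for step in [{"student", "intelligent", "industrious"}, {"present"}, {"lazy"}]:
        assert_traits("BILL", step)
        attempt_task2()             # the task is attempted after every assertion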

 

Before examining in more detail how the model simulates these phenomena, we have to consider what it might mean to simulate cognitive processes, and how cognitive process models might differ from other types of models such as neurologically based models or behavioral models. We also have to develop some perspective for how such a model may be evaluated. This is particularly problematic for this type of model since we are attempting to describe and simulate processes that are not directly observable. Another problem that is particular to a simulation approach is to differentiate those aspects of the model that have substantive theoretical import from those that are merely heuristic. We should emphasize once again that this thesis is not intended to provide solutions for these problems, but rather that it is intended to explore and to provide some perspective on the feasibility and theoretical utility of constructing a cognitive process model to account for at least some of the phenomena found in impression formation and person evaluation studies.


Chapter 2

 

COMPREHENDING AND ACTING ON SENTENCES:

 

A PERSPECTIVE FOR CONCEPTUALIZING AND SIMULATING

 

COGNITIVE PROCESSES

 

 

In this chapter we try to gain a perspective on what it might mean to simulate the cognitive processes involved in doing an impression formation or person evaluation task. We shall start out by considering how to conceptualize these cognitive processes. We shall then go on to consider how we can analyze and decompose these cognitive processes so that they can be specified sufficiently to allow us to simulate them with computer algorithms. At the same time we have to consider how they can be linked to observable behavior so that we can evaluate the validity of our analysis and so that we can test the simulation.

 

In trying to account for the behavior of a subject in an experimental situation we can differentiate two kinds of problems toward which we might direct the account. The first type of problem focuses on the responses of the subject in the experimental situation. We would like to be able to predict how the subject will behave in a given situation. For person evaluation we would like to be able to predict from the information provided to the subject about an individual how he will evaluate that individual on some particular scale. A solution to this problem therefore would consist of an algorithm that allows us to predict the responses of the subject.

 

The second type of problem is concerned with accounting for the process which leads to the behavior observed in the experiment. We would like to know what internal processes the subject goes through in the evaluation, in dealing with the stimulus, and in coming up with the response. A solution to this problem would therefore consist of an accurate description or possibly a simulation of these processes.

 

In this thesis we are concerned with the second problem. We therefore face the problem of how these processes should be conceptualized and how they should be described or simulated. A physiological approach to this problem would base an account of these processes on a description or simulation of the physiological and neural processes that might be involved. Even though the account may be idealized, the description would still refer to the behavior of potentially observable elements such as neurons or neural nets. A cognitive approach on the other hand would base an account of these processes on a description of the thinking and reasoning of the subject, and of the knowledge that might be utilized. The description therefore is expressed in terms of concepts that do not directly refer to observable objects or processes. We therefore face two further problems. The first concerns the conceptualization and specification of these processes. The second problem is the evaluation of whatever descriptions or simulations we may propose.

 

To be able to describe these processes we have to find methods for analyzing or decomposing them. To be able to simulate these processes, we have to be able to describe them in sufficient detail so that we can construct algorithms or computer programs to represent the processes. We therefore have to analyze and decompose highly abstract cognitive notions such as “forming an impression”, “making inferences”, and “evaluating an individual” into concrete component processes that can be represented by algorithms. It would therefore help if we could find a basic set of fairly well-defined underlying processes that could be used to account for these more abstract notions.

 

This thesis is based on the assumption that we can use a limited set of processes concerned with the ability of the subject to recognize, comprehend, and act on sentences to account for and simulate most of the cognitive processes involved in impression formation and person evaluation tasks. We therefore have to show how we account for and simulate the sentence processing capabilities of the subject. We also have to show how these processes can be used to account for some of the more abstract processes found in impression formation and person evaluation.

 

There is a second problem that arises from the fact that cognitive processes are not directly observable. Even though there presumably are some physiological processes that correspond to the cognitive processes involved in sentence processing or in impression formation, we cannot use any direct observational methods to evaluate the correctness of our assumptions about the nature of the cognitive processes involved in impression formation and person evaluation tasks. Similarly we cannot use direct observational methods to evaluate the correctness or adequacy of the algorithms used to simulate the sentence processing capability of the subject. We therefore have to rely on fairly indirect evidence to support the validity of our assumptions.

 

There are several kinds of arguments that can be used to support the assumption that the cognitive processes involved in doing impression formation and person evaluation tasks can be analyzed in terms of the capability of the subject to comprehend and act on sentences. We can assume that the subject depends on the instructions for his knowledge of what to do in a particular task. These instructions are normally expressed in terms of a sequence of sentences. The cognitive dynamic involved in doing the tasks, i.e. what cognitive processes are involved and in what order they are invoked, therefore presumably depends on the subject’s comprehension of the sentences in the instructions. The subject has to remember what the instructions asked him to do.

 

We can therefore assume that the subject uses the instructions to construct a characterization of the task that is then used in executing the task. This characterization can be seen as a cognitive map that specifies what to do at any given point in the task. This cognitive map must be based on some aspects of the meanings of the sentences comprising the instructions. This assumption is corroborated by the fact that we can ask the subject to describe the task. The subject then uses sentences to describe the task. This description will presumably be based on his understanding of the task, on his cognitive map for the task. But the subject can use sentences to express this knowledge. Furthermore, his description will presumably be a loose paraphrase of the original instructions. In the model we simulate the ability of the subject to use the sentences making up the instructions to construct a cognitive map that is then used in executing or in describing the task.

 

?Take instructions!

Please enter the instructions for task1:

?Imagine students!

?Read an assertion!

?Evaluate the individual!

?Repeat!

?Quit!

*      Do you want me to execute this immediately? (Yes No) yes

•    This completes the instructions for task1.

 

?Execute task1!

?Bill is honest.

$pleasant 1 2 3 4 5 6 7 unpleasant

.                   $

$nil

.

?Bill lies.

$pleasant 1 2 3 4 5 6 7 unpleasant

.                         $

$nil

.

?Quit!

This statement is not a ASSERTION, it is a IMPERATIVE.

*      Should I proceed with it? (Yes No) yes

•           End of the task.

 

?Describe task 1!

TASK1 IS A TASK.

    Label: START-OF-TASK

             Execute the IMPERATIVE:

•   IMAGINE ALL STUDENTS!

       Execute the IMPERATIVE:

    READ SOME ASSERTIONS!

       Execute the IMPERATIVE:

EVALUATE ALL INDIVIDUALS!

    Start again from label: START-OF-TASK

          Label: END-OF-TASK

•           Execute the IMPERATIVE:

•   UNIMAGINE ALL STUDENTS!

 

The model infers that if it imagined students for the task, then after the task it should again refer to the students it originally knew before the task. It therefore ‘unimagines’ the students to re-associate the original characteristics of those students with their names.
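The printed description suggests how such a cognitive map might be represented: an ordered list of imperatives, with labels and a jump back to the start. The sketch below is an assumed Python rendering of that control structure, offered only to make the idea concrete; it is not the data structure actually used in the NTS-LISP program.

    # A sketch (assumed, for illustration) of the cognitive map for task1 as an
    # ordered list of imperatives that is executed in a loop until interrupted.

    TASK1 = ["IMAGINE ALL STUDENTS!",
             "READ SOME ASSERTIONS!",
             "EVALUATE ALL INDIVIDUALS!"]       # then start again from the top

    def execute(task, read_sentence):
        while True:                             # 'Repeat!': start again
            for imperative in task:
                if imperative == "READ SOME ASSERTIONS!":
                    sentence = read_sentence()
                    if sentence == "Quit!":     # not an assertion: end the task
                        print("(executing: UNIMAGINE ALL STUDENTS!)")
                        print("End of the task.")
                        return
                    print("(comprehending: " + sentence + ")")
                else:
                    print("(executing: " + imperative + ")")

    inputs = iter(["Bill is intelligent.", "Bill is lazy.", "Quit!"])
    execute(TASK1, lambda: next(inputs))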

 

Let us now look at the cognitive processes that are involved in following specific instructions. Let us consider the person evaluation task and let us assume that the subject has reached a point in the task where he has just completed reading an assertion about an individual. The cognitive map for the task specifies that he should now evaluate the individual. We might assume that the subject had received specific instructions on how to proceed in evaluating an individual on a scale. In this case the evaluation could be seen as another task for which he has been provided with a cognitive map. The cognitive map for the person evaluation task would then specify that the evaluation task should be executed at this point. We might thus get a hierarchy of tasks, where the most abstract, most general task is defined in terms of sub-tasks.

 

The subtask:  describing and evaluating the individuals under consideration

 

 

?Take instructions!

Please enter the instructions for task3:

?Describe the individual!

?Evaluate the individual!

?End instructions!

•         This completes the instructions for task3.

 

The main task

 

?Take instructions!

•   Please enter the instructions for task4:

?Imagine students!

?Read an assertion!

?Execute task3!      [At this point the subtask is called on.]

?Repeat!

?Quit!

*       Do you want me to execute this immediately? (Yes No) yes

•    This completes the instructions for task4.

 

?Execute task4!

?Bill is intelligent.

•         BILL IS A PERSON AND A STUDENT.

•   BILL IS INTELLIGENT.

.
    [Executing task3 as a subtask of task4]

$pleasant 1 2 3 4 5 6 7 unpleasant

.                    $

$nil

.

?Bill is lazy.

BILL IS A PERSON AND A STUDENT.

BILL IS LAZY.

    [A second call on task3]

$pleasant 1 2 3 4 5 6 7 unpleasant

.

$nil

.

?Quit!

• This statement is not a ASSERTION, it is a IMPERATIVE.

* Should I proceed with it? (Yes No) yes

• End of the task.

 

 

We could also assume that the subject may not have received instructions on how to evaluate an individual on a scale. In that case he could not rely on a set of instructions to construct a cognitive map for how to do the evaluation. We therefore have to assume that the subject already knows how to do this task, that he knows what cognitive processes to invoke for doing the evaluation. This knowledge or ability that the subject brings to the experiment must therefore specify a cognitive map somewhat like those considered above, that tells him what to do and in what order.

 

We must assume that the subject has at least some knowledge and some capabilities of this sort since we could not hope to fully explain all the tasks involved in doing the experiment. Thus we would not want to have to explain to the subject what he has to do to understand an assertion. Let us therefore assume that the subject brings to the experiment the capability of evaluating individuals on scales. The question then arises how we can decompose this capability, and how we can test the validity of our representation of this knowledge and our description or simulation of the processes assumed to be involved.

 

We might ask the subject to describe and explain what he is doing by reporting what he is thinking about while he is doing the task. This verbalization might then provide us with a cognitive map of what cognitive processes the subject goes through and what knowledge he utilizes while doing the task. We can then use this report based on the awareness of the subject about what he is doing to help us in representing and simulating these processes. We might even go further and try to simulate this ability to report from awareness. To the extent that we can get the model to generate reports that are similar to the subject’s we would have additional, though indirect, support for the validity of our analysis of the cognitive processes involved.

 

?Verbalize evaluations!

•   Yes Sir.

?Execute task 1!

?Bill is intelligent.

The impression on which the evaluation is based:

•               BILL IS A PERSON AND A STUDENT.

•               BILL IS INTELLIGENT.

$pleasant 1 2 3 4 5 6 7 unpleasant

The terms “PLEASANT” and “UNPLEASANT” suggest that the evaluation

•          should be based on the dimension “AFFECTIVE”.

•          Based on the predicate(s): INTELLIGENT

•          and on the variable(s): STUDENT

•          the evaluation on the dimension “AFFECTIVE” is: 0.70

•          I shall assume that the SEVEN response alternatives divide the dimension into categories of equal width.

•   I therefore chose the scale alternative that is the THIRD from the left, the high-valued end.

$nil

 

?Bill is lazy.

The impression on which the evaluation is based:

BILL IS A PERSON AND A STUDENT.

•       BILL IS LAZY.

$pleasant 1 2 3 4 5 6 7 unpleasant

•          The terms “PLEASANT” and "UNPLEASANT” suggest that the evaluation

•          should be based on the dimension “AFFECTIVE”.

•          Based on the predicate(s): LAZY

•          and on the variable(s): STUDENT

•          the evaluation on the dimension “AFFECTIVE” is: 0.35

•          I shall assume that the SEVEN response alternatives

•          divide the dimension into categories of equal width.

•          I therefore chose the scale alternative that is

•          the THIRD from the right, the low-valued end.

.

$unpleasant 14 13 12 11 10 9 8 7 6 5 4 3 2 1 pleasant

.         The terms “UNPLEASANT” and “PLEASANT” suggest that the evaluation

•          should be based on the dimension “AFFECTIVE”.

•         Based on the predicate(s): LAZY

•          and on the variable(s): STUDENT

•          the evaluation on the dimension “AFFECTIVE” is: 0.35

•          I shall assume that the 14 response alternatives

•         divide the dimension into categories of equal width.

•          I therefore chose the scale alternative that is

•          the FIFTH from the left, the low-valued end.

.

$ nil

 

?Bill is intelligent and lazy.

•          The impression on which the evaluation is based:

•               BILL IS A PERSON AND A STUDENT.

.      BILL IS INTELLIGENT AND LAZY.

$pleasant 1 2 3 4 5 6 7 unpleasant

•          The terms “PLEASANT” and “UNPLEASANT” suggest that the evaluation

•          should be based on the dimension “AFFECTIVE”.

•          Based on the predicate(s): INTELLIGENT and LAZY

•          and on the variable(s): STUDENT

•          the evaluation on the dimension “AFFECTIVE” is: 0.50

•          I shall assume that the SEVEN response alternatives

•          divide the dimension into categories of equal width.

•          I therefore chose the scale alternative that is

•          the FOURTH from the left, the high-valued end.

$ nil

.

?Quit!

•   This statement is not a ASSERTION, it is a IMPERATIVE.

·     Should I proceed with it? (Yes No) yes

·      End of the task.

 

Above we have conceptualized the processes involved in doing impression formation and person evaluation experiments as tasks that can be learned from instructions, or as tasks that the subject is capable of doing when he comes to the experiment. While this point of view is useful in analyzing some of the cognitive dynamic underlying the experimental tasks, it is not as helpful in clarifying the role of the experimental stimuli or that of the response. In accordance with our cognitive approach we also have to explain what it might mean to form an impression, or to have an evaluation of the individual. The impression presumably is based on knowledge that the subject has about the personality or the characteristics of that individual. The evaluation might be seen as based on an emotional or affective reaction to the traits of that individual, or it might be based on a sober, intellectual calculation of how much one would value an individual with these traits. In either case, the subject has to be aware of the evaluation since he has to express this evaluation in terms of a choice of alternative scale categories. We have also mentioned the awareness the subject may have of what he is doing.

 

Cognitive processing as a type of information processing

 

What is common to all of the above concepts is that they seem to relate to some notion of information exchange. The experimental stimulus can be seen as information the experimenter provides to the subject, while the response, the behavior of the subject, can be seen as providing information to the experimenter. We might thus see the experimental situation as an interaction between the experimenter and the subject that is primarily devoted toward an exchange of information. This interaction is public, and so the information exchanged is observable and can be represented as data that describes the interaction.

 

The information corresponding to the impression or the evaluation on the other hand is private to the subject and cannot be observed publicly. But even here we might posit a flow of information, where the subject’s knowledge of the language will affect how he interprets the descriptive statement, and where this interpretation in turn determines his impression of the individual. His evaluation will then be based on that impression as well as on his interpretation of the scale on which he is to express the evaluation.

 

We might therefore conceptualize cognitive processes as being involved in transforming and exchanging information. The subject’s knowledge and awareness can therefore be seen as information that may be utilized in doing the experimental tasks. This information may have been brought to the experiment or it may have been generated by preceding cognitive processes while doing the task or in learning about it. For a general discussion of an information processing perspective for verbal and other forms of information from both a physiological and a cognitive approach, see Lindsay and Norman (1972).

 

This information processing perspective suggests that the cognitive processing may be broken into stages, where each stage in the process takes some particular information as input and produces some other information. We would expect that the major stages in the information processing of our evaluative task would roughly correspond to each of the instructions in terms of which the task was defined.

 

Let us just take the instructions “Read an assertion!” and “Evaluate the individual!” In doing the task, the subject may perceive the sentence “Bill is intelligent” at the time that he is executing the command “Read an assertion”. The input information at this point might be analyzed as corresponding to the string of words ‘Bill’ ‘is’ ‘intelligent’. The cognitive process invoked by the command “Read an assertion” then presumably scans that list of words and recognizes it as an assertion. In the process of recognizing the stimulus sentence, the subject may also have to utilize other information besides that provided in the input. For instance he will have to draw on his knowledge about words and their meanings. If we assume that reading a sentence also involves comprehending it and at least briefly remembering the information contained in it, then we might assume that the output of this process is some knowledge about the individual Bill.

 

This knowledge can now be seen as the input to the next stage in processing, following the instruction: “Evaluate the individual”. The output from this process would be the information about which scale alternative the subject chooses to represent his evaluation of the individual.
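These two stages can be pictured as a small pipeline in which the output of one stage is the input to the next. The Python fragment below is purely illustrative; the value table repeats the numbers used earlier, and the two functions are crude stand-ins for the model's far more elaborate reading and evaluation processes.

    # A purely illustrative rendering of the two stages: the knowledge produced
    # by 'Read an assertion!' is the input to 'Evaluate the individual!'.
    # The names and the value table are assumptions, not the program's code.

    import math

    VALUES = {"student": 0.6, "intelligent": 0.8, "lazy": 0.1}   # illustrative

    def read_assertion(words):
        """Stage 1: recognize 'X is Y' and output knowledge about X."""
        name, copula, predicate = words
        assert copula == "is", "only the simplest assertion form is recognized"
        return {"individual": name.upper(), "traits": ["student", predicate]}

    def evaluate_individual(knowledge, n_alternatives=7):
        """Stage 2: turn that knowledge into the choice of a scale alternative,
        counted from the high-valued end of the scale."""
        values = [VALUES[t] for t in knowledge["traits"]]
        evaluation = sum(values) / len(values)
        return max(1, math.ceil((1.0 - evaluation) * n_alternatives))

    knowledge = read_assertion(["Bill", "is", "intelligent"])   # output of stage 1
    print(knowledge["individual"], evaluate_individual(knowledge))   # BILL 3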

 

We now have to consider how to represent this information. For the publicly observable information, the stimulus and the response, we normally use data representation schemes and data structures that allow us to identify and select those features of the stimulus and of the response in which we are particularly interested. We therefore simplify the information that could be gained from the observations. According to this perspective we are not concerned with all the details of the actually observed response behavior. Rather we want only that information generated by the response that is relevant for our research goals. In the data representation for the response we would presumably not want to differentiate between such diverse behaviors as underlining the chosen scale category, circling it, or indicating the choice through a sentence such as “the third from the left”. We might therefore use the notion of information reduction to refer to the process of selecting only the relevant information.

 

We can apply the same notions to the internal, private information of the subject. We shall also have to find information representation schemes and information structures to represent the knowledge and the awareness of the subject. Also we shall not be concerned with all the information, and all the knowledge that the subject may have. Rather we are concerned with finding out the minimum amount of information that the subject must have to be able to do the experimental tasks. In the context of constructing a simulation model, this minimal amount is fairly naturally defined as that information that allows us to simulate the behavior of the subject within some given set of constraints. The main constraints on the model under consideration are that only a few of the phenomena relevant to impression formation and person evaluation will be simulated, and that the stimulus sentences must fall into a very restricted subset of English sentences.

 

Let us now consider how we could investigate the cognitive information structures that are used in a task. To follow our example above, how can we find out what knowledge the subject has gained about the individual after reading the descriptive assertion about him? As mentioned, this cognitive information structure is not observable and therefore cannot be investigated directly. But there are some indirect methods we can use to determine what information must be represented in that cognitive structure. We can ask the subject to describe the individual or we can ask questions about that individual. There are two types of experimental procedures we might follow to gather this information. On one hand, we could modify the task by inserting an instruction asking the subject to describe the individual. On the other hand we could ask the subject to interrupt the task so that we could ask questions. We could specify these breaks in the instructions for the task or we could just interrupt the subject while he was doing the task.

 

?Take instructions!

Please enter the instructions for task5:

?Imagine students!

?Read an assertion!

?Interrupt!

    [This instruction tells the model to stop and interact with the experimenter whenever it reaches this point in the task.]

?Evaluate the individual!

?Repeat!

?End instructions!

This completes the instructions for task5.

 

 

?Execute task5!

?Bill is intelligent and lazy.

.   Yes, I’m listening.          [The model indicates its readiness to interact.]

?Is Bill intelligent?

Yes.

?Is Bill industrious?

No.

?Is Bill lazy?

Yes.

?Isn’t Bill lazy?

Yes.

?Resume!                         [The model is told to continue with the task.]

$pleasant 1 2 3 4 5 6 7 unpleasant

.              $

$nil

.

? Quit!

·     This statement is not a ASSERTION, it is a IMPERATIVE.

·      Should I proceed with it? (Yes No) yes

End of the task.

 

 

?Execute task1!

?Bill is intelligent and lazy.

$hey                       [The model is expecting a scale but it gets interrupted with the interjection “hey”.]

Yes, I’m listening.

?Is Bill intelligent?

• Yes.

?Is Bill lazy?

Yes.

?Is Bill industrious?

No.

?Isn’t Bill lazy?

Yes.

?Resume!

$pleasant 1 2 3 4 5 6 7 unpleasant

•                       $

$nil

.

?Quit!

*     This statement is not a ASSERTION, it is a IMPERATIVE.

Should I proceed with it? (Yes No) yes

End of the task.

 

 

It is hoped that this and the preceding chapter have established some perspective on how the problem of constructing a cognitive process model is envisaged here.


Chapter 3

 

STRUCTURES AND PROCESSES:

ANALYSIS AND EVALUATION

 

 

In this chapter we shall consider how we conceptualize and represent sentences as data structures. We need such data structures to represent the cognitive information the subject gathers from the sentence. The other problem we address here is how to analyze, decompose, and delineate the cognitive processes.

 

According to Hempel (1949) “. . . the meaning of a proposition is established by the conditions of its verification”. We shall therefore have to examine what sort of claims we want to make about the cognitive processes to be represented in the model, and what kind of data we would have to collect to support these claims. It is hoped that this discussion will help to bring out and clarify what it is that we might consider the substantive theoretical aspects of the model. Since the model was conceptualized and implemented as a computer simulation model, there is no easy and clearcut differentiation in its representation as a program between the substantive aspects of the model and the incidental and heuristic features that are necessary for its implementation.

 

There are two types of questions that may be asked about a model such as the one suggested here. On one hand we may wonder what the model is good for, what it contributes to our understanding of the phenomena under consideration, what we gain from it. This kind of question calls for justification of the approach in terms of the substantive theoretical contribution such models can make, a kind of cost-benefit evaluation. On the other hand, we have to test the model to seek assurances that it is true, correct, or valid, that it accurately reflects reality within some limits. Here we have to examine what is hypothesized in the model, what consequences we can derive from these claims and how we can test these empirically. In other words, we have to ask how we can operationalize the substantive aspects of the model and test them.

 

The two types of evaluation are of course not independent. Thus a model that cannot be operationalized, whose validity we cannot test, is not likely to be judged to be of much benefit or use. On the other hand, a model that is easily operationalizable and testable but does not deal with questions that we are interested in or that does not contribute toward a better understanding of some problem that we are concerned with is also not of much benefit.

 

A traditional approach, as conceptualized in behaviorism, would restrict our accounts, our hypotheses, and theoretical claims to deal only with directly observable phenomena. “. . . Its principal methodological postulate is that a scientific psychology should limit itself to the study of the bodily behavior with which man and the animals respond to changes in their physical environment, every descriptive or explanatory step which makes use of such terms from introspective or “understanding” psychology as ‘feeling’, ‘lived experience’, ‘idea’, ‘will’, ‘intention’, ‘goal’, ‘disposition’, ‘repression’, being proscribed as non-scientific.” (Hempel, 1949). But those “mentalistic” terms point to some of the most interesting processes and problems in social psychology. Especially in areas such as impression formation, attitudes, and influence, where we are dealing with the communication of ideas, it would seem worthwhile to have an account that deals with the comprehension of messages as well as with thinking and reasoning with ideas and concepts. The problem is how to utilize the insights gained from introspection and “understanding” that are expressed in terms of these notions without getting lost in vague, speculative theory. Rather we would want to represent these insights and notions in theories and models that can be as exact, predictive, operationalizable, and testable as behaviorist theories.

 

Now in evaluating such a model it is not sufficient to deter­mine how successfully it predicts the behavior of individuals in different situations. We also have to evaluate how well the model embodies or represents the insights gained from intuition. Let us illustrate this by examining a simplified conceptualization of behaviorist theories and discussing why it might not be sufficient to simply introduce mental states and processes as intervening variables in a behaviorist model.

 

Let us start with a view of behaviorist theories as given above. We then have to represent the features or characteristics of the physical environment which might change and to which men or animals might respond. Secondly we have to represent their “bodily behavior”. In one such theoretical approach, the features of the physical environment are conceptualized as stimuli that impinge on the individual and cause him to act. This conceptualization brings along such notions as stimulus vs. background, where the stimulus refers to the features of the environment that cause the individual to act and the background to features that do not. Just noticeable differences in stimuli conceptualize differences in environmental features that might cause different actions. The behavior or action is conceptualized in a similar manner, as having features so that the experimenter can select some features as relevant and categorize behaviors in terms of noticeable differences in those relevant features. The causal process by which the stimulus causes or brings about a response is conceptualized as an internal association between a stimulus and a response so that the stimulus automatically triggers the response. Alternatively the causal process can be seen as a nondeterministic process, where the different kinds of behaviors form a set from which one must be selected, and where the stimulus is seen as affecting the choice process. (According to the physicalistic orientation of the behavioristic approach, this selection process should be understood as a non-deterministic, random process such as in quantum mechanics rather than as a deliberate, conscious choice process where the actor consciously evaluates the alternatives and uses his free will.)

 

We could now take a similar approach for a cognitive theory and represent mental states either as mental, cognitive stimuli that can be differentiated so that different behaviors can be seen as caused by different mental states, or we can treat them as mental behaviors so that the individual might ‘respond’ with different mental states to different environmental stimuli. Mental processes such as thinking could then be conceptualized as sequences of mental states where a mental state that was in response to a previous state might in turn act as stimulus to the next, somewhat similar to conditioning models of learning.

 

This conceptualization reflects the approach cited above, where the goal of a theory might be seen as describing or predicting different behaviors that are associated with changes or differences in features of the environment. Basic to this approach is the notion of causation, where the stimulus somehow “causes” or determines the responses. But one of the most fascinating notions from “understanding” is the view of an individual as being ‘motivated’ to act, as having ‘will’ or ‘intentions’. It is because of this kind of notion that the insights of the intuitive approach cannot simply be introduced as intervening variables in a behaviorist model. Above, an information processing approach was outlined in which the individual is seen as expecting and looking for information from the environment that is relevant in dealing with a task. His motivation or goal—orientation is conceptualized in terms of a control structure that represents a cognitive map or impression of the task the subject is trying to accomplish. The behavior thus is not seen as determined by the stimulus alone but rather is a function both of what the individual is attempting to do, the task he is engaged in, and of the information he finds in the environment that is relevant for doing the task. The mental states and functions are conceptualized in terms of information structures and processes, where the awareness of the subject is analyzed in terms of his capability to examine the internal information structures and processes involved in his thinking. To the extent that we can simulate these structures and capabilities in terms of analogous information processing in a computer model, we can take the structures and algorithms of the computer program as a fully specified predictive theoretical model for at least some of the mental states and functions derived from intuition.

 

Consequently, in evaluating such a model it is not sufficient to evaluate the accuracy of the predictions the model makes about the responses of the subject to the stimuli in the experiment. We also have to evaluate how adequately the information processing in the computer model represents our intuition and insights about what individuals do, how they do it, and why they do what they do. We therefore have to see what sorts of tests we might perform to evaluate these aspects of the model, what data we could collect from subjects that might be relevant, and what evaluation procedures we might follow. Unfortunately we have no solutions to the problem of evaluation. Rather, the remainder of the chapter will be devoted to giving more of a perspective to show how such simulation models might fit into experimentation, data collection, and hypothesis testing paradigms, what sort of criteria and judgments might be involved in evaluating such models, and what lines of data collection and reasoning one might follow in developing evaluation procedures.

 

 

Experiments, theories and models:

analysis and reconstruction by simplification and analogy

 

 

We commonly study naturally occurring social behavior and the associated psychological processes under simplified and somewhat arti­ficial laboratory conditions that allow greater control and closer, more detailed investigation than otherwise possible. We shall pro­pose here that experimental studies can be seen as simulations of the naturally occurring social situations, where only relevant aspects of the situation are reconstructed in simplified form. Experimental studies also contain features that differentiate them from the ordi­nary social situations. One can therefore see the experimental situation as well as the processes and the behavior in that experiment as analogous to the cognitions and behavior of the individual in naturally occurring social situations. The features and characteris­tics common to both the experiment and the ordinary social situation form the positive analogy, and those in which they differ the negative analogy. Other characteristics of the experimental situation whose application in the naturally occurring social situation we are uncer­tain of might be called the neutral analogy (Hesse, 1966, p. 8).

 

 

empirical

     The naturally occurring social situation with its attendant behaviors and psychological processes

     The analysis and reproduction of these behaviors and processes in experiments

     The data representation of the relevant features of the experimental situation and of the behavior observed in it

     (simplification and information reduction at each step)

 

 

Similarly, theoretical models can be analyzed as functioning by analogy, where again we can separate positive, negative, and neutral analogies. Also, again like the experimental reconstruction of the naturally occurring social behavior, the theoretical model is a highly simplified and abstracted representation of only some of the features of the empirical situation. Below is a schematic representation of some of these relationships, paralleling the empirical schema above.

 

theoretical

     Delineation claims, about processes: The hypothesized processes are described in sufficient detail so they can be simulated. Application of a process to some information as input generates further information as output. For the model as a whole, the input is the stimuli and the features of the experimental situation as represented in the data. The output is the data representation of the observed behavior.

     Delineation claims, about structures: What the subject knows and uses in the experiment is specified in sufficient detail so that it can be represented by information structures in the model and used by the algorithms in the model.

     Decomposition claims, about processes: Hypothesized processes are identified and differentiated and placed into sequential as well as super— and subordinate relationships.

     Decomposition claims, about structures: What the subject knows and uses is identified and differentiated, and associated with the processes where the knowledge is either used or generated.

 

 

Let us now look at these relationships for our main example, impression formation, both as an illustration for the perspective suggested here and to develop more of a sense of what the critical issues are in evaluating simulations of the cognitive processes and the behavior of subjects in experimental situations. It is proposed that the evaluation depends on an examination of the basis for, the limits of, and the closeness of the analogy. Both the positive and the negative aspects of the analogy have to be investigated in terms of what substantive contribution to our understanding of the behaviors and processes we expect the model to make, what we desire to know or to be able to do with the help of the model that we did not know or could not do without it. Thus the criteria for evaluating the model, or for that matter for evaluating the experiment as a successful simulation, will differ depending on whether we only want to know whether a certain kind of information is likely to affect the behavior of the subject in some way, whether we want to predict his behavior in a given set of circumstances, or whether we want to explore and explain some of the processes involved in making his responses.

 

Let us examine some of these issues in the context of our example. In impression formation we are concerned with investigating how an individual utilizes information he receives about a person to evaluate that person. This process is conceptualized as having three aspects. The evaluator has to form an impression of the person to be evaluated. This impression then is used to make an evaluation. Since there may be more than one item of information received, information has to be consolidated and integrated in some way. So ‘impression’, ‘information integration’, and ‘evaluation’ are the three central theoretical concepts.

 

Let us look at some of the variety of social situations and processes that might be seen as involving impression formation. We might include courtship and marriage or job—interviews, where the impression is based on direct, personal interaction with the other person. On the other hand the impression may be based on very indirect information such as reading newspaper reports about poli­tical candidates or listening to evidence about the activities of an alleged criminal in a courtroom. These latter two are examples of cases where the evaluation is very specific and restricted: voting for or against the political candidates, and voting for or against conviction during the polling of a jury. Assigning grades to students is another example of an evaluation that involves a choice from a small set of alternatives.

 

On the other hand, the evaluation may be very complex and in terms of many dimensions, such as in biographies or psychiatric case reports. The evaluation may be emotional and affective, such as when we react with love or affection or with outrage and anger, or it may be detached and intellectual. Evaluating job applicants in terms of their potential value to the organization, based only on impersonal, standard application forms, may be an example of the latter.

 

The information on which the impression and evaluation is based may come in many forms. A very common form is verbal, language based material such as answers on application forms, gossip over the telephone, or paragraphs in newspapers. But visual and oral information may also be used, especially with television, tapes, and films. Often all forms of information may be involved. In a trial for instance the evidence may include audio tapes from wire—tapping, written statements, and cross—examinations where the jury may be watching for facial expressions and tone of voice of the witness as well as listening to what he says. In this paper we shall only consider written language material. The information may also contain inconsistencies and direct contradictions that have to be considered and somehow dealt with both in forming an impression and in making an evaluation.

 

Finding an appropriate experimental task to preserve the similarity between the experimental situation and the ordinary social situation

 

To investigate these processes in the laboratory we can identify several areas of choice. We have to decide on the kind of information to be presented as experimental stimuli, and on the type of judgments to be made by the subject. But most importantly we have to decide on the nature of the task we will ask our subjects to do. The naturally occurring processes and behaviors we include under the notions of impression formation and evaluation are very diverse and different one from the other. Different purposes are involved, with very different behaviors and different anticipated consequences, and with judgments based on very different sorts of information.

 

However, in our theoretical analysis we assume that these various situations and behaviors are not totally unconnected and unrelated to one another, and propose that they share common cogni­tive processes and structures, those identified as impression formation, information integration, and evaluation. Because of the divergence of the social situations, these processes and structures must be quite general so that they can be more or less independent of the type of information and of the particular kind of behavior involved. It is this assumption about common structures and processes that allows us to have laboratory investigations since we do not have to reproduce these social situations with any veri­similitude but only have to have some task for the experiment that is sufficiently similar to the naturally occurring social situations to involve the same general processes.

 

At some stage in our investigation we will probably want to test whether this assumption is true, whether and to what extent the processes in the experiment are the same or similar to those that are involved in the various social situations in daily life. In this context we would have to explore the effects of the features of the experiment that are in negative analogy to the naturally occurring social situations, those features that identify the exper­iment as an artificial situation, that differentiate it from an ordinary, daily life occurrence. This evaluation would tell us about the limitations in making inferences from experimental results, and the validity of generalizing our theoretical accounts.

 

But this does not become relevant until we do have good theories for these processes and behaviors. Let us therefore return to the problem of selecting suitable experimental tasks. A major consideration in the construction of experiments concerns the benefits we hope to gain from the results. In other words, the nature of the task will depend to some extent on its role in the ongoing investigation, that is, on what sort of data we need to collect for developing and testing the theories under consideration. The task as well as the stimuli and the responses will have to suit the particular type of model under consideration. For instance, for investigating sequential models, such as models of order of presentation effects, the stimuli will be presented in a time sequence, and we may ask the subject to make a response after each stimulus presentation or to wait and make a single response after all the stimuli for that task have been presented. The nature of the responses elicited from the subject will also be determined in part by the measurement techniques used for generating the type of data required for the hypothesis or model being tested.

 

Knowledge about the experiment:  Data structures and measurement

 

 

To construct and verify theoretical accounts of what is happening in the experiment we have to find some representation of what we know about the experimental situation and what we have observed about the behavior of subjects in that situation. Since we want to maintain the comparability of the data from different experiments, and collected from different observers, we have to find some fairly precise, standardized, and repeatable way of observing, specifying, and formulating what we know and have learned from a given experiment. In other words we have to find measurement tech­niques and data structures for the information we would like to have as a result of the experiment.

 

This problem has been discussed extensively in the psychologi­cal literature, such as by Stevens (1951), Edwards (1957), and Coombs (1964), as well as in Fishbein (1967). In terms of the perspective we are developing here we might just note that the description of the results of the experiment in terms of data points in a given set of data structures represents another level of abstraction. These data structures then are a part of an analogical model where various aspects of the experimental situation as well as the behavior of the subjects and their cognitive states and pro­cesses are conceptualized in terms of data points and relations between data points in these structures.

 

From our perspective of viewing the experiment as an exchange of information between the subject and the experimenter, we can see that the experimenter is not the only one who has to have some representation of what has happened in the experimental situation. While he is in the situation, the subject must also have some know­ledge and therefore some representation of what has happened in the experimental situation. This representation would presumably include his knowledge of what he has been asked to do, and his know­ledge of what information, what stimulus, has been provided for him to act on. Presumably he also has some cognitive representation of what has gone on before, and of what he intends to do. We can therefore see a parallel between the experimenter’s data represen­tation of the experimental interaction and the subject’s cognitive representation.

 

Furthermore, since the experimenter’s representation of the information in the experiment is supposed to account for the behavior of the subject, and since the subject’s behavior is presumably based on some features of his cognitive representation of the situation, the experimenter’s representation has to be in positive analogy to these features of the cognitive representation. Now some of the subject’s representation, particularly his knowledge of the task, could be represented by the algorithms the experimenter applies to his data to predict the behavior. In this case the experimenter’s data representation would only have to account for the subject’s understanding of the stimulus. (In the model we also try to account for the subject’s understanding of the task, based on the instruc­tions. The instructions therefore are also seen as stimulus presentations.) To evaluate this analogy we can ask whether the experimenter’s data representation of the stimulus is sufficiently analogous to the relevant features of the cognitive representation so that a trained subject could be given the data representation instead of the stimulus and still come up with the same behavior. Because of this functional equivalence, a good data representation would therefore also be an at least minimally adequate representation of the cognitive information structure used by the subject. We shall therefore go on to examine some alternative types of data representations.

 

In the most common type of data representation the experimental situation and the behavior are conceptualized in terms of independent and dependent variables. These variables are, in turn, conceptualized as geometric dimensions in a Euclidian space. The social and psychological processes that lead to the behavior are then conceptualized in terms of algorithms and functions that map the values of the independent variables into values of the dependent variables. The formalization of cognitive balance (Cartwright & Harary, 1956) represents another such paradigm, where structural aspects of the cognitive state of the individual are represented in terms of linear graphs. Some cognitive processes and behaviors of the individual can then be conceptualized as attempts to effect changes in this cognitive structure.
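
To make the graph representation concrete, a minimal sketch of a balance test is given below, in a present-day programming notation rather than in the notation of the model; the entities, the sign convention, and the example triad are invented for the illustration. It relies on the structure theorem that a signed graph is balanced exactly when its points can be divided into two clusters such that all positive lines fall within a cluster and all negative lines fall between the clusters. On this representation, the cognitive processes just mentioned could be sketched as operations that change the sign of a relation until the test succeeds.

    # Illustrative sketch: test whether a signed "attitude graph" is balanced.
    # Edges map pairs of entities to '+' (liking, association) or '-' (disliking).
    from collections import deque

    def is_balanced(nodes, edges):
        """Return True if the signed graph is balanced.

        Balanced means the nodes can be two-colored so that every '+' edge
        joins like-colored nodes and every '-' edge joins unlike-colored nodes.
        """
        adjacency = {n: [] for n in nodes}
        for (a, b), sign in edges.items():
            adjacency[a].append((b, sign))
            adjacency[b].append((a, sign))

        color = {}
        for start in nodes:
            if start in color:
                continue
            color[start] = 0
            queue = deque([start])
            while queue:
                current = queue.popleft()
                for neighbor, sign in adjacency[current]:
                    expected = color[current] if sign == '+' else 1 - color[current]
                    if neighbor not in color:
                        color[neighbor] = expected
                        queue.append(neighbor)
                    elif color[neighbor] != expected:
                        return False    # some cycle carries an odd number of '-' signs
        return True

    # A hypothetical cognitive state: P likes O, O likes X, but P dislikes X.
    edges = {('P', 'O'): '+', ('O', 'X'): '+', ('P', 'X'): '-'}
    print(is_balanced(['P', 'O', 'X'], edges))    # False: the triad is imbalanced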

 

As in the construction of the experimental situation to represent the naturally occurring social situation, simplification and information reduction are involved in selecting which aspects of the experimental situation to observe and to represent in the data structures. Similarly we can see the role of analogy in the measurement processes and the data structures, in that only some of the features of the situation and of the behavior can be reflected in the resultant data representation. The positive analogies should be obvious, for instance in the way that different numerical values on a given variable represent different choices, where the ordering of the numbers (greater than) might reflect relations such as preference or similarity judgments (preferred to, more similar than). The negative analogy arises from using features of a particular data representation that are not reflected in or justified by the judgments or behavior on which the representation is based. This may arise through limitations in the measurement technique or through limitations in the nature of the underlying dimension. For instance, let us consider an Osgood semantic differential scale, e.g., “cold 1 2 3 4 5 6 7 warm”. Let us assume that a subject has rated two stimulus items on this scale, assigning a value of 6 to one and 3 to the other. If we were to assume that the resultant data representation had ratio properties we might infer that one stimulus item is judged as twice as warm as the other. However, if we were to question the subject about this he might not confirm this inference.

 

As mentioned above, what information we need depends mainly on the type of theoretical account we are trying to construct and verify. So not only what is to be observed and measured but also the form of the data representation will depend heavily on the conceptual framework used in the theoretical analysis, and on what sort of analogies are utilized in expressing and representing the information that is contained in the theory. It is the theory that determines what sort of information we need to obtain from the experiment to test the theory. It is the theory again that deter­mines in what manner this data must be represented, and with what sort of precision it must be measured and expressed to be useful in testing the theory.

 

An analysis of the experimental situation as a social interaction using sentences as basic data points

 

 

We shall have to come back to the problem of data collection and data representation when we discuss theories below. At this point, however, we consider what basic types of data representation might be appropriate for testing models based on simulating verbal information processing. Given that our model should account for and reflect the comprehension and interpretation of verbal messages, we need to collect data that helps us to evaluate the language processing aspect of the model.

 

It stands to reason, therefore, that the data representation should reflect this orientation toward language and verbal messages. So at one level of analysis we might take sentences as basic data elements. For instance the descriptions on which the impressions and evaluations are based are usually provided in sentence form. These sentences themselves could then be seen as the data representation of the stimulus presentations. Also if the subject utters or writes sentences as experimental responses we could then take these sentences as the data representation of the responses. In developing the notion of reporting from awareness above, we discussed briefly how we might elicit such responses from the subject and how they might be used in evaluating the simulation of cognitive processing in the model. Similarly we mentioned that the sentences which compose the instructions can be seen as stimuli that influence or partially determine the flow of cognitive processing. So at this level of analysis it would seem that we can use sentences as basic representations for the stimuli and for some of the responses. Other responses, such as the choice of scale items on semantic differential scales, could of course be better represented in terms of traditional data structures, for instance as data points in a Euclidian data space.

 

Let us further consider this notion of using sentences as basic data points in a data structure representing the experimental situation and the behavior of the subject in the experiment. This perspective would lead us to analyze the experiment as a verbal interaction with the subject. The prototype would be a dyadic group composed of the experimenter and the subject, or of the interviewer and the interviewee. Of course in most impression formation studies we would have to substitute reading and answering a question­naire for listening to the experimenter and giving spoken responses.

 

But we can analyze both the interview and the questionnaire—taking situation basically as an exchange of verbal information, where some of the messages come from the experimenter and where the subject responds. Most of these messages will be composed of sentences. Thus the instructions are usually conveyed in terms of a list of sentences that are then followed by one or more descriptive stimulus sentences. The subject then responds to the stimulus according to the instructions, and so on. As part of this exchange there may be requests by the experimenter for the subject to report from his awareness, to explain his reasons for making a given response, to paraphrase the stimulus sentence, or to make further inferences on what the described person may be like.

 

The data we would therefore generate in such an experiment would consist of exact protocols, transcriptions of the sentences the subject read or listened to as well as the sentences the subject produced in response. We would of course also have to include a representation of his other response behavior, such as scale markings.

 

By ‘sentence’ in this sentence level analysis we would therefore mean the string of words that would appear in a transcription. It should be clear that this data collection and representation method would satisfy most standard measurement criteria for precision and reliability. In fact somewhat similar data collection techniques are used in sociolinguistics and in ethno—methodological studies.

 

The major problem with this level of analysis and with using the sentence as the basic unit of data representation is that it is considerably beyond the power of any type of psychological model at the moment to cope with the complexity of sentences usually used in instructions, or to predict exactly, word for word, the sentences the subject might use in responding to the task or to requests for information. If we want to go beyond descriptive analyses and build predictive models we therefore have to find some way of reducing the complexity of sentences and the amount and detail of information in the data so that we can start building models to account for experi­mental behavior in terms of language use and comprehension. We can of course reduce the magnitude of the problem by using only very simple sentences in the instructions and the stimulus presentations and by asking the subject to express his responses in short and simple sentences. We may also have to look for other evaluative methods that are less stringent than complete prediction. Thirdly, and most relevant to our discussion here, we can try to find methods of representing the data that allow us to simplify and to reduce the amount of detail we need to consider both in interpreting the stimulus presentations and in predicting the responses.

 

Because of the kind of simulation approach we are investigating here, a second consideration is that the data representation should, if possible, help in the construction and evaluation of models of cognitive processing. For the simulation model we have to find cognitive representations of the meaning of sentences that allow for the kind of information processing we want to simulate. But these model—specific cognitive representations that arise from the theories underlying the simulation might also be generalized and used as data representations for the results of experiments. As mentioned above, we can see the data representations as serving a dual function. On one hand, the form and structure of the data representation may be an integral aspect of the theoretical model. On the other hand, the data structures and measuring techniques form an analogical model of the experimental situation and of the behavior of the subject within that situation. The example we used above is one of numerical scale representations that serve both as variables in theoretical models expressed as mathematical functions and as experimental data representations. If we keep the assumption that the sentence should form a basic data unit then we can see the problem as finding a good representation for the content and meaning of sentences. Some requirements for the representation of the meaning of sentences are the following:

 

a) The derivation of the representation should be reliable. The representation optimally can be obtained mechanically from the actually observed sentences, or it can be derived with a fairly fixed and precise set of rules to guarantee reliability and repeatability.

 

b) The representation should be valid in the given theoretical context. It should fit into the theory and should reflect the assumptions made in the theory about what information the subject derives from the sentence or about what he intends to convey with the sentence. It should also reflect theoretical considerations about how the information is ‘cognitively’ represented internally, how the information is processed, used and remembered.

 

c) The method of obtaining the representation should be useful for data reduction. It should allow for simplification of the actually observed sentences to reduce the complexity of the experimen­tal situation, to help with problems of within and between indivi­dual variation.

 

Data representations for the cognitive content of sentences

 

One such data representation that basically reflects a sentence level analysis has been developed by Bales (1950). This categorical coding scheme represents the speaker of the sentence, to whom it is addressed, and whether the content is primarily addressed to the task or to socio—emotional aspects of the interaction. The scheme has been used in analysing small group interactions (Mills, 1953). This measurement scheme is very effective in reducing the informa­tion and is fairly reliable. But for our purposes it simplifies the data too much. It does not represent enough of the meaning or cog­nitive content of the sentences. For our impression formation pro­blem it does not allow us to represent those aspects of the content of a descriptive stimulus sentence to which the subject attends in forming an impression or in making an evaluation. In other words if we were to explain the Bales scheme to a subject and then present the subject with a Bales representation of the stimulus presentation instead of presenting him with the stimulus directly he could not use the information in the data representation to form an impression or to make an evaluation.
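
By way of illustration, a minimal sketch of what a Bales-style record of the experimental exchange might look like is given below, in a present-day programming notation; the acts and category labels shown are invented for the example. The point is simply that such a record preserves who addressed whom and the general character of each act, but not the content on which an impression could be based.

    # Illustrative sketch of a Bales-style coding of an experimental exchange.
    # Each act is reduced to who spoke, to whom, and a category label.
    from typing import NamedTuple

    class Act(NamedTuple):
        speaker: str      # who produced the sentence
        addressee: str    # to whom it was addressed
        category: str     # task versus socio-emotional category of the act

    protocol = [
        Act("experimenter", "subject", "gives orientation"),   # the instructions
        Act("experimenter", "subject", "gives orientation"),   # the stimulus sentence
        Act("subject", "experimenter", "gives opinion"),       # the subject's evaluation
    ]

    # The record shows a task-oriented exchange, but the content of the stimulus
    # sentence is gone, so an informed reader of this record could not use it to
    # form an impression of the person described.
    for act in protocol:
        print(act)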

 

Another scheme that does preserve some of the cognitive content of the stimulus sentence could be based on Osgood’s semantic differential (Osgood, Suci, and Tannenbaum, 1957). In this case we would obtain semantic differential ratings for all the content words in the sentence and use a list of these ratings as the data represen­tation. We could then for instance have a separate data representa­tion for the evaluative content of the sentence by listing only the ratings of the words on the affective dimension. This scheme is also very effective in reducing the information. Its reliability is determined by the reliability of the semantic differential ratings.
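
A minimal sketch of this kind of representation is given below; the words and the numerical ratings are invented for the example and stand in for ratings that would actually be collected from subjects on the usual seven-point scales.

    # Illustrative sketch: a sentence represented as semantic-differential
    # ratings of its content words (values are invented for the example).
    ratings = {
        # word:        (evaluative, potency, activity)
        "Bill":        (4.0, 4.5, 4.0),
        "intelligent": (6.2, 5.1, 5.0),
        "ruthless":    (2.1, 6.0, 5.5),
    }

    def sentence_representation(content_words):
        """Represent a sentence as a list of (word, rating triple) pairs."""
        return [(word, ratings[word]) for word in content_words]

    def evaluative_profile(representation):
        """Keep only the affective (evaluative) dimension of each word."""
        return [(word, triple[0]) for word, triple in representation]

    # "Bill is intelligent but ruthless", reduced to its content words.
    rep = sentence_representation(["Bill", "intelligent", "ruthless"])
    print(evaluative_profile(rep))

    # A simple averaging model of evaluation can operate on the trait words,
    # but the representation no longer says who did what to whom.
    trait_words = ["intelligent", "ruthless"]
    print(sum(ratings[word][0] for word in trait_words) / len(trait_words))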

 

This data representation would appear to be fairly valid at least for the evaluative aspect of the impression formation task. This general type of scheme can be seen as underlying most of the mathematical models for the evaluative aspect of impression formation. To evaluate the usefulness of this scheme for the impression forming (descriptive) and inferential aspects of impression formation tasks we can use the same criteria as above. If we presented an informed subject with the lists representing the ratings, including the dimensions (i.e. affective, potency . . .) on which the ratings were based, he would still have difficulties in forming an impression that was sufficiently concrete to allow him to describe the individual. Certainly he would not be able to describe the person in the same way that a subject who had received the original stimulus sentence would describe the person. He would presumably also have difficul­ties in making inferences about other traits of that person (Asch, 1946) and in using that information to guide his behavior (Kelley, 1950).

 

The limitations of this data representation scheme become even more obvious if we examine to what extent we could model his under­standing of the instructions for the task in terms of this kind of data scheme. Since we are assuming that the cognitive processing the subject goes through in doing the experimental task is dependent directly on the instructions he receives, we clearly need some data representation for the instructions. We furthermore assumed that the subject comprehends the cognitive content of the instructional sentences and forms some sort of impression of the task since he can describe the task and presumably could even evaluate it. It also seems reasonable to suppose that the subject uses a somewhat similar cognitive representation for both his impression of the task and of a person, so that we could best model his cognitive structures by using a common data representation scheme for both.

 

But if we were to present the subject with semantic differential ratings for the words of each of the sentences in the instructions he would not be able to get enough information out of this to know how to do the task. We would therefore conclude that while this data representation scheme might be valid for the evaluative aspects it is not sufficiently valid for the impression formation and inferential aspects and that it also is not valid for representing his impres­sion of the task and thus for his cognitive processing.

 

Another representation we might consider would represent a sen­tence as a list of the content words in that sentence. We might form this list by taking the common form of each of the main content words, say as it would appear in a dictionary and by listing them in alphabetic order. This type of scheme would of course be much less of a reduction in information since we could now have an almost infinite set of alternate data points made up of the combination of words that could possibly co—occur in a sentence. The limited information reduction we would obtain with this type of scheme is due to the different sentences that could be constructed from these words. If we also wanted to maintain the word order in the sentence we would have even less information reduction since we would essen­tially eliminate only alternative linguistic structures formed from the use of different tenses, the use of different transformations, the use of different connectives such as: and, or, not, if. . . then, the use of different determiners and prepositions, and the use of singular, plural and possessives. On the other hand, we might be able to gain a little further reduction in information by replacing a word with a synonym from some standardized list of such synonyms whenever possible.

 

Using the sort of rules outlined above, this representation scheme could be extremely reliable. Since the subject can presumably make his own evaluative ratings for each of the words, this representation would have at least the same validity for the evaluative aspect of the impression formation task as the semantic differential scheme described above. Since the subject could also use the words or their synonyms to form descriptions and to make inferences, this form of representation might be valid as long as the original sentences did not contain any negations, disjunctions, or connectives other than conjunction. With the use of an alphabetized list we would have problems with transitive verbs since “the boy hits the priest” and “the priest hits the boy” would both be represented as “boy hit priest”. We would also not be able to differentiate order of presentation effects.
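
A minimal sketch of such a word-list representation is given below; the stop-word list, the lemma table, and the synonym table are invented stand-ins for a dictionary and a standardized synonym list. The collapse of the two transitive-verb sentences into the same list illustrates the limitation just noted.

    # Illustrative sketch of the alphabetized content-word representation.
    STOP_WORDS = {"the", "a", "an", "is", "are", "and"}
    LEMMAS = {"hits": "hit", "boys": "boy", "priests": "priest"}
    SYNONYMS = {"clergyman": "priest"}    # replace a word by a standardized synonym

    def word_list(sentence):
        """Reduce a sentence to an alphabetized list of lemmatized content words."""
        words = []
        for token in sentence.lower().split():
            token = token.strip(".,")
            if token in STOP_WORDS:
                continue
            token = LEMMAS.get(token, token)
            token = SYNONYMS.get(token, token)
            words.append(token)
        return sorted(words)

    print(word_list("The boy hits the priest."))      # ['boy', 'hit', 'priest']
    print(word_list("The priest hits the boy."))      # ['boy', 'hit', 'priest'] -- identical
    print(word_list("The clergyman hits the boy."))   # ['boy', 'hit', 'priest'] -- synonym folded in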

 

In using this scheme to represent the instructions, we again might run into problems if the sentences were complex and contained transitive verbs or conditionals or other connectives. However if the instructions were to be composed only of very simple sentences it might be possible for the subject to reconstruct them sufficiently from the word list representation to be able to carry out the task. For instance an instruction sequence such as:

 

“Read a sentence. Form an impression. Make inferences. Evaluate the person.”

 

would be represented as a sequence of alphabetized lists:

 

“read sentence — form impression — inference make — evaluate person.”

 

So on the whole this type of representation scheme comes much closer toward being a valid representation of the information in­volved in an impression formation task, since, at least under some conditions, the subject might be able to do the task when presented with the information necessary for doing the task only in schematic form. However, this type of data representation still is inadequate for complex sentences. This type of representation scheme is also not very helpful for modelling the kinds of cognitive structures that might be involved in forming an impression of the person or in forming an impression of the task to be done. This limitation becomes more evident if we consider what sort of cognitive processes might be involved in doing the task, in interpreting and comprehending the sentences, in forming an impression, in making inferences or in doing the evaluation.

 

We might assume that the cognitive structures the subject uses for representing the information necessary for the task would be closely related to the algorithms he utilizes in actually doing the task. Thus his internal representation for the information in the sentences making up the instructions should be closely related to the algorithms he then uses in interpreting the meaning of these sentences to do the task. Similarly the internal representation of the infor­mation in the descriptive sentences on which the impression is based should fit into some model of the implicit personality theories (or other theories and assumptions he utilizes) to form an overall impression of the person, to make inferences about that person, and finally to evaluate that person.

 

Secondly we might assume that our data representation should somehow reflect whatever we might hypothesize about the internal cognitive representations and about the cognitive processes the sub­ject might utilize to interpret and comprehend the sentences and to do the various aspects of the task. Some of these structures and pro­cesses have been considered in linguistics, for interpreting the sen­tences, and in psycho—logic, and in logical theories, for modelling how the subject might reason with and act on the information in the sentences. Let us go on therefore and consider linguistic and logi­cal representation schemes for the sentences. In the model these schemes are used to simulate the subject’s internal, cognitive repre­sentation of the meaning of sentences.

 

Linguistic surface and deep structure trees can be seen as an­other type of data representation for sentences. There are fairly well defined rules for generating surface as well as deep tree repre­sentations for a large class of sentences (e.g., Burt, 1971). They therefore satisfy our criteria for reproducibility and reliability.

 

Used in the normal way, linguistic surface and deep structure tree representations do not simplify the sentences. However since they make the structure of the sentence explicit and identify various components, it seems quite feasible to build in information reduction methods by introducing extra transformations that might eliminate parts of the structure or transform it into a standard form. For instance we could delete superordinate parts of the sentence that have performative or modal functions:

 

“I think that Bill is rich” ——— “Bill is rich”

“It is likely that Bill is rich” ——— “Bill is rich”

 

We could also transform passives into active form, tenses into the present, and so on.
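
A minimal sketch of two such reduction transformations is given below; the string and triple formats are invented simplifications rather than deep structure trees, and the morphological rule for the passive is deliberately crude.

    # Illustrative sketch of reduction transformations on simplified sentence forms.
    MODAL_FRAMES = ("I think that", "It is likely that", "It seems that")

    def strip_modal_frame(sentence):
        """Delete a superordinate performative or modal frame, if one is present."""
        for frame in MODAL_FRAMES:
            if sentence.startswith(frame + " "):
                return sentence[len(frame) + 1:]
        return sentence

    def passive_to_active(triple):
        """Convert a (patient, 'is ...ed by', agent) triple to active form.

        The morphology is deliberately crude: 'is admired by' becomes 'admires'.
        """
        subject, verb_phrase, obj = triple
        if verb_phrase.startswith("is ") and verb_phrase.endswith(" by"):
            participle = verb_phrase[3:-3]            # "is admired by" -> "admired"
            return (obj, participle[:-1] + "s", subject)
        return triple

    print(strip_modal_frame("I think that Bill is rich"))         # Bill is rich
    print(strip_modal_frame("It is likely that Bill is rich"))    # Bill is rich
    print(passive_to_active(("Bill", "is admired by", "Mary")))   # ('Mary', 'admires', 'Bill')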

 

Now many of these transformations will introduce some change in meaning and therefore the reduced or standardized representation would not be completely psychologically equivalent to the original sentence. However the type and extent of the differences introduced by such transformations can be investigated experimentally. At least one such transformation, active—passive, has been investigated for stimulus sentences in an impression formation context (Howe, 1970) and was found to have little or no effect. In conclusion, then, the representation of sentences in terms of linguistic structures does allow for information reduction.

 

The third question is how well linguistic structures fit into the type of model we want to construct, how well they represent the meaning, and how adequately they allow for the kind of information processing envisaged in the model. At one level, of course, the linguistic analysis fits fairly well in that it provides models of how sentences are analyzed by the subject to provide internal cogni­tive structures, since the structural elements of the deep structure representation do identify what is being referred to (the NP’s) and what sort of characteristics or relations are being mentioned (the verbs or adjectives). Unfortunately, however, the linguistic struc­tures do not readily lend themselves for use in further information processing algorithms so that we can model how this information is being utilized by the subject.

 

The linguistic structures also do not lend themselves as struc­tural representations of processing algorithms. At least, at present, to my knowledge there are no computer languages that specify algorithms in terms of structures corresponding fairly directly to linguistic deep structure trees. Thus the deep structure represen­tation of an imperative sentence such as in the instructions could not be interpreted directly as the specification of an algorithm for carrying out this instruction. Yet the subject has to go somehow from the cognitive representation for the meaning of the imperative to a representation of that meaning in algorithmic form that would allow him to carry out the command conveyed by the sentence. In other words part of the meaning of that imperative sentence is its specification of what the subject should do.

 

Assuming again that our representation should somehow reflect these aspects of the cognitive representation of the sentence we would therefore conclude that linguistic structures by themselves are still not totally satisfactory as data representations for sentences.

 

Another possible kind of structure for representing the meaning of sentences is a representation in terms of logical formulas based on mathematical logic. In this case we have information processing methods that either deal directly with logical formulations such as inference and theorem proving, or theories that are closely related to logical formulations such as psychologic (Abelson & Rosenberg, 1958). Another advantage of using logical formulas as representations is that there are computer languages such as LISP that represent both algorithms and data structures in terms of structures that are very similar to logical formulas. Such a representation therefore allows for the information processing we want to model. The problem now is to find a method for reliably deriving logical representations for the sentences and for coping with simplification.
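
A minimal sketch is given below of how sentence meanings might be written as nested, formula-like structures of the kind that map naturally onto LISP list structures, together with one trivial inference step; the predicates and the implicit-personality rule are invented for the example, and the matching is far simpler than full theorem proving.

    # Illustrative sketch: sentence meanings as nested, formula-like structures,
    # plus one forward inference step with an invented rule.

    # "Bill is rich"; a conjunction such as "Bill is rich and intelligent" would
    # carry an explicit connective, e.g. ("and", ("rich", "Bill"), ("intelligent", "Bill")).
    fact = ("rich", "Bill")

    # Invented implicit-personality rule: whoever is rich is taken to be powerful.
    rule = ("implies", ("rich", "X"), ("powerful", "X"))

    def substitute(formula, binding):
        """Replace variables (upper-case atoms) throughout a nested formula."""
        if isinstance(formula, str):
            return binding.get(formula, formula)
        return tuple(substitute(part, binding) for part in formula)

    def apply_rule(rule, fact):
        """If the fact matches the rule's antecedent, return the instantiated consequent."""
        _, antecedent, consequent = rule
        if fact[0] == antecedent[0] and len(fact) == len(antecedent):
            binding = dict(zip(antecedent[1:], fact[1:]))
            return substitute(consequent, binding)
        return None

    print(apply_rule(rule, fact))    # ('powerful', 'Bill')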

 

Logical formulations do allow for some forms of information reduction through the conversion of a given logical expression to a logically equivalent normal form, such as prenex normal form. The main problem in this context is to find one or more such normal forms, with a minimal, or at least reduced, set of connectives and a standardized structure that will not only maintain logical equiva­lence but that will also be adequate for expressing and working with commands as well as questions and assertions. Besides the usual conjunctive and disjunctive normal forms, alternative interpretations might be expressed through the notion of logically possible universes (Burks, 1963) by using exclusive disjunctions, while conditional commands (Rescher, 1966) might be expressed in another standardized form.
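
As one illustration of conversion to a standardized, logically equivalent form, the sketch below reduces a small propositional formula to conjunctive normal form; the tuple syntax for the connectives is invented for the example, and quantifiers, questions, and commands are left aside.

    # Illustrative sketch: reduce a propositional formula to conjunctive normal form.
    # Formulas are strings (atoms) or tuples: ('not', p), ('and', p, q),
    # ('or', p, q), ('implies', p, q).

    def eliminate_implies(f):
        """Rewrite p -> q as (not p) or q, recursively."""
        if isinstance(f, str):
            return f
        if f[0] == "implies":
            return ("or", ("not", eliminate_implies(f[1])), eliminate_implies(f[2]))
        return (f[0],) + tuple(eliminate_implies(part) for part in f[1:])

    def to_nnf(f):
        """Push negations inward so that they apply only to atoms."""
        if isinstance(f, str):
            return f
        if f[0] == "not":
            g = f[1]
            if isinstance(g, str):
                return f
            if g[0] == "not":
                return to_nnf(g[1])
            if g[0] == "and":
                return ("or", to_nnf(("not", g[1])), to_nnf(("not", g[2])))
            if g[0] == "or":
                return ("and", to_nnf(("not", g[1])), to_nnf(("not", g[2])))
        return (f[0], to_nnf(f[1]), to_nnf(f[2]))

    def to_cnf(f):
        """Distribute 'or' over 'and' in a formula already in negation normal form."""
        if isinstance(f, str) or f[0] == "not":
            return f
        left, right = to_cnf(f[1]), to_cnf(f[2])
        if f[0] == "and":
            return ("and", left, right)
        if isinstance(left, tuple) and left[0] == "and":
            return ("and", to_cnf(("or", left[1], right)), to_cnf(("or", left[2], right)))
        if isinstance(right, tuple) and right[0] == "and":
            return ("and", to_cnf(("or", left, right[1])), to_cnf(("or", left, right[2])))
        return ("or", left, right)

    # "if Bill is rich then Bill is powerful or respected"
    formula = ("implies", "rich", ("or", "powerful", "respected"))
    print(to_cnf(to_nnf(eliminate_implies(formula))))
    # ('or', ('not', 'rich'), ('or', 'powerful', 'respected'))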

 

This leaves us with the problem of finding methods for reliably deriving logical representations for sentences. One possible solu­tion to this problem might be to derive the logical expression from the linguistic deep structure representation for the sentence. The advantage of this approach is that we could then use both the linguis­tic and the logical analyses for simplification. Also we would retain the natural advantage of the linguistic approach in accounting for the initial analyses or decomposition of the sentence by the subject. This possibility of conjoining linguistic and logical analysis to represent the meaning of sentences has been under con­sideration by generative semanticists such as Lakoff (1970) and others.

 

In conclusion it would seem that the most adequate data repre­sentation for the information contained in sentences would reflect a combination of linguistic and logical analyses of the sentence.

 

This is the form of representation used in the model. Since both linguistic and logical expressions can be represented as tree struc­tures, it seems feasible to use this kind of structure as a data representation that could also serve as a model of the internal or ‘cognitive’ structure that represents the meaning of the sentence for the subject. This data representation therefore has at least some theoretical validity for theories and models concerned with the cognitive processing of verbal materials, as in impression forma­tion. It would allow for fairly reliable measurement techniques: deriving the linguistic deep structure and then the logical expres­sions. It would also allow for simplification and information reduction both at the linguistic and logical levels of analysis.
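
A minimal sketch of the passage from a linguistic tree to a logical form is given below for a single predicate-adjective sentence; the tree format and the mapping rule are invented simplifications and are not the procedure used in the model itself.

    # Illustrative sketch: derive a formula-like meaning representation from a
    # toy deep-structure tree for "Bill is rich".
    deep_structure = ("S", ("NP", "Bill"), ("VP", ("V", "is"), ("ADJ", "rich")))

    def to_logical_form(tree):
        """Map a simple copular deep structure onto a (predicate, argument) formula."""
        label = tree[0]
        if label == "S":
            subject = to_logical_form(tree[1])
            predicate = to_logical_form(tree[2])
            return (predicate, subject)
        if label == "NP":
            return tree[1]                      # the referring expression
        if label == "VP":
            # keep only the predicating element; drop the copula "is"
            adjectives = [part[1] for part in tree[1:] if part[0] == "ADJ"]
            return adjectives[0]
        raise ValueError("unhandled constituent: %s" % label)

    print(to_logical_form(deep_structure))      # ('rich', 'Bill')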

 

 

Theoretical claims associated with cognitive process models: decomposition and delineation

 

 

In this section we are concerned with the kinds of theoretical claims that might be associated with cognitive process models, and with the kinds of arguments and evidence that might be adduced to support these claims. The primary concern, of course, is the problem of constructing and evaluating a cognitive process model to account for some of the phenomena found in impression formation and person evaluation studies. Since the overall perspective taken here is that models are analogies of the phenomena or processes we want to account for, the theories are seen as statements about the nature of the analogy. Evaluation of the theoretical claims then consists in probing the limits of the analogy. In this discussion we are mostly concerned with providing some perspective on the kinds of theoreti­cal claims that are involved in constructing such a model, and with identifying some of the problems that might be involved in evaluating these claims.

 

We shall consider four kinds of theoretical claims: claims about the decomposition and about the delineation of the processes involved, and claims about the decomposition and about the structural delineation of the knowledge or information used in these processes. An example of a decomposition claim is the assumption that the ability to comprehend and act on sentences is an important aspect of the cognitive processing involved in dealing with impression formation tasks. Sentence comprehension then presumably identifies a process that can be further decomposed. While these decomposition claims identify and characterize the processes that are hypothesized to be involved, they do not have to specify fully how the process works nor do they have to propose any specific algorithms to simulate the process. This is done by delineation claims, where a particular type of algorithm is proposed as capable of simulating the process and as being analogous to, or in a restricted sense equivalent to, the process under consideration.

 

Since these processes draw on the knowledge of the subject, and since they may develop and utilize new information such as a cognitive representation of a sentence and its meaning, we also need decompo­sition claims to identify and characterize what knowledge may be involved. Structural delineation claims then develop information structures to represent this knowledge. For instance, we propose that the linguistic and logical tree structure that was suggested above as a data representation for a sentence might also be seen as the cognitive representation of a sentence and its meaning. These structural representations then may be hypothesized to be partially equivalent in content and analogous in structure to the presumed ‘cognitive’ representation of the knowledge as used by the subject.

 

Since claims about decomposition do not propose full analogies but only suggest what should be considered in constructing such models they can be evaluated independently. Delineation claims on the other hand usually cannot be evaluated independently of the overall simulation model in which they are imbedded. The problem is that we cannot observe cognitive structures or processes directly but are restricted to observing the overt behavior of subjects. We therefore have to evaluate the information structures and algo­rithms specified by the delineation claims in terms of their func­tion in the overall simulation. One criterion for the evaluation is whether the given algorithm or information structure belongs to some combination of information structures and algorithms that is able or competent to simulate some observable behavior. We can obtain even stronger support for the adequacy of the algorithm or information structure if the model is competent to simulate the behavior of subjects in all contexts within the limitations of the model where the given algorithm or information structure is invoked, or if the model is furthermore able to predict the behavior of subjects in these contexts.

 

Let us therefore consider what such support would imply for our delineation claim. According to our perspective of viewing these algorithms and information structures as analogical models of cognitive structures or processes, we should be able to establish positive, neutral, and negative analogies and thus evaluate them in terms of the limits of the analogy between the information structure or algorithm and the cognitive structure or process they explicate. Support for the adequacy of an algorithm, for instance, would there­fore support an assumption that there is some positive analogy between the algorithm and the cognitive process it represents. But algorithms for a computer simulation model must be expressed in terms of a particular computer language. Many aspects of this represen­tation of the algorithm will be peculiar to the particular computer language and might be represented differently in some other computer language. Also some other aspects of the algorithm will be heuristic and not based on any particular theories or assumptions about the cognitive process. These aspects therefore would form a negative, or at best neutral analogy. But since the algorithm must be speci­fied in terms of some language, and since theories about the cogni­tive process are unlikely to be sufficiently detailed to specify all parts of the algorithm it may be difficult to isolate the features of the algorithm that are in positive analogy with the cognitive process. However we may be able to identify at least some aspects of the positive analogy by making decomposition claims on the process represented by the algorithm and then showing how the decomposed parts are represented in the algorithm.

 

 

Let us now proceed to examine how process decomposition claims may be evaluated and how we may evaluate the competence and predictiveness of the simulation model.

 

 

1.         Process Decomposition

 

 

Here we are concerned with claims that purport to identify and characterize different aspects of the cognitive processing that is involved in dealing with impression formation tasks, such as evaluating individuals that are described on questionnaire forms. Let us consider some examples.

 

One such claim is that the subject first forms an impression of the individual and then evaluates him on the basis of that impression. Another claim, concerned with a more particular aspect of the cognitive processing, is that the subject processes each sentence on that questionnaire through some form of linguistic analysis and is able to ‘understand’ or ‘comprehend’ the given sen­tence. Furthermore we might claim that the subject also goes through something akin to a logical analysis in understanding the sentence, since he can detect tautologies and contradictions as anomalous sentences. Yet another example might be taken from considering the evaluation process. Assuming that the subject is asked to evaluate the individual on a scale, we might claim that he first determines the dimension on which the evaluation is to be made, i.e. whether he is asked to rate the friendliness or the competence of the individual. If the response alternatives are indicated as numbers as in semantic differential scales, and if no explicit instructions about the dimension have been given, we might claim that the subject infers the dimension from the meaning of the end terms supplied with the scale.

 

For some claims of this sort we may have fairly direct exper­imental evidence, so that they can be evaluated on the basis of this evidence. Let us now consider the main sorts of arguments that might be adduced in support of such claims in the absence of direct experimental evidence.

 

a)         the decomposition is useful in identifying structures or component parts of algorithms that are useful in elucidating how the model should be constructed.

 

b)         success in model construction: that we can build a model based on the suggested decomposition that is able to exhibit the desired behavior under at least some circum­stances.

 

c)         no better alternative decompositions of the process are known

 

d)         that the claim fits common sense and our intuitive analysis of what happens

 

e)         some indirect evidence that the negation of the claim, or some obverse claim, would not be valid. Thus for instance, to support the first example above, we might examine whether the opposite claim, that the subject evaluates the individual before forming an impression, would also be possible. In this case we might say that the subject has formed an impression if he can describe the individual, and that he has evaluated the individual if he can rate him on an evaluative scale. Now if we present the subject with a description as given by another subject to the same stimulus, our subject can still do the evaluation. But if he were only given the evaluation made by the other subject in response to that stimulus he would have diffi­culties in describing the individual concerned. We might conclude from this that the information contained in the evaluation is not sufficient to form a description, while the information in the description is sufficient to form an evaluation. Therefore, if the subject were to do the evaluation first, he would also have to remember extra information from the stimulus sentences to then be able to form an impression, i.e. to make a description. But he would not have to carry this extra information if he were to form the impression first.

 

 

As seen from this example, the argumentation in support of the claim may be very indirect and not very conclusive.

 

It should not of course be inferred that these decomposition claims on cognitive processes are not open to experimentation and formal verification. In fact model construction may help to identify such theories and assumptions so they can then be considered for experimental verification. In this discussion we were mainly con­cerned with discussing the notion of decomposition and with con­sidering how claims about decomposition might be evaluated in the absence of more definitive experimental evidence.

 

Decomposition claims are not restricted to cognitive process models. Thus the identification of significant variables for causal process models can be seen as argumentation in support of decom­position claims. Here also we are only identifying factors without delineating the process, i.e. without showing precisely what role these factors may play in the process. Thus for instance showing that persons who smoke have a higher incidence of lung cancer, or that wealthy individuals are more likely than poor people to vote Republican, by itself does not specify the particular relationship or process involved. Rather it helps to identify factors that may play a role in the processes that lead to getting lung cancer, or to voting Republican. In these cases, of course, there are well developed techniques, such as analysis of variance designs with sampling methods, that can be used in evaluating these claims.

 

Competence and prediction

 

Under this heading we are concerned with developing some per­spective on evaluating what the model actually does, and how its behavior compares to that of the subjects it is supposed to model. The structures and processes of which the model is composed must therefore be specified with sufficient completeness and detail so that one can mechanically determine what the behavior or output of the model would be for a given stimulus or input. We shall start out by considering the behavior of the model in one particular situation. The problem then is whether the model can account for and predict the behavior of subjects in this given situation, for the particular stimulus under consideration. To see whether the model successfully simulates the behavior of subjects in the given situation, we have to determine both the behavior of subjects in this situation and the ‘behavior’ or output of the model for this situation. Since the model is non—deterministic in some circum­stances, we may actually have to determine the set of alternatively possible behaviors in response to the given situation.

 

To be able to compare the behavior of the model to that of actual subjects, we have to determine the behavior of these sub­jects. Since a given subject might respond somewhat differently to the same stimulus at different times, and because of interpersonal variations in behavior, we actually have a set of responses. Now some of the variation in responses might be due to systematic differences in factors such as the sex, age, and personality of the subject or the knowledge and experience he brings to the experiment. Other sources of variation may be due to other aspects of the exper­imental situation that are not represented explicitly as part of the stimulus presentation. These include the specific instructions given to the subject, the history of the experimental interaction such as what other tasks and stimulus presentations have preceded the particular stimulus presentation under consideration. These factors influence the knowledge and expectations of the subject.

 

Variation due to the instructions and the history of the exper­imental situation is simulated by the model. Some of the other factors could hopefully also be represented in the model at some future time, possibly as parameters affecting some processes, or in terms of the information and information structures used by these processes. We therefore have to partition the subject’s responses according to the factors incorporated in the model so that we have response sets for each combination of stimulus, personal factors, and experimental circumstances that is not further differentiated in the model. We also have to be able to specify the behavior of the model and that of the subjects in terms of the same type of data represen­tation, so that we can compare the responses. For verbal responses, we could use actual sentences, or we could use linguistic—logical tree structure representations.

 

Above we discussed the set of possible behaviors of the model and of the subjects as if only one behavioral item were involved, i.e. a single response unit as analyzed by the relevant data repre­sentation scheme. This might be a single scale response or a single sentence represented according to the scheme discussed above. But many behaviors will consist of sequences of such responses. Thus for instance if we asked a subject to describe the individual about whom he has received information on the questionnaire he may respond with more than one sentence. We may therefore consider the behavior sets as containing ordered sequences as members, where single responses would now be represented as sequences containing only one element.

 

We can now evaluate the ability of the model to simulate the behavior of subjects for a given combination of experimental circum­stances and stimuli as well as for personal factors, by examining the relationship between the set of alternative behaviors produced by the model and that obtained from the subjects. The simulation fails if there is no overlap between the two sets, and it is doing well indeed if the sets are equal. We could go even further by ordering the behavior sequences in the sets in terms of the relative frequencies with which the given behavior is observed. This would allow for probabilistic evaluations.

 

There is also one further set of alternative behavior sequences we should consider. This is the set of behaviors that would be considered acceptable in the given situations for persons of that kind. This may of course include behavior sequences that we would not normally observe from actual subjects. For instance if we described an individual Bill as being intelligent, talented and hard—working, a subject might paraphrase this description in terms of the sentences: “Bill is hard—working. He is also intelligent and talented.” Our subject would be fairly unlikely to respond with a list of simple sentences such as: “Bill is intelligent. Bill is talented. Bill is hard—working.”, even though he would probably agree that this list would be an acceptable paraphrase. Without considering for the moment how and from whom to collect acceptability judgments, we see that we can partition the set of behavior sequences generated by the model into those sequences that are acceptable and those that are not.

 

We can now get to the notions of competence and prediction. We shall call a model minimally competent to simulate the behavior of a subject in a given situation if all or at least most of the behavior sequences in the set of possible behaviors produced by the model for the situation are judged to be acceptable as responses to the situation. We might also include here the case where the behavior produced by the model would not be appropriate as the sole behavior in the situation, but where it might be an acceptable fragment of a more complete response.

 

This fragmentary response is included for situations in which we do not want to model a whole behavior sequence but are mainly concerned with the conditions under which it occurs. We might therefore take only a fragment of the behavior that is symbolic of the complete sequence. The following may be an illustration. Kelley (1950) showed that those to whom a lecturer was described as ‘warm’ tended to interact more with him than those to whom he was described as ‘cold’. In the model under consideration here we are concerned only with this effect as an aspect of impression formation. We are therefore not particularly interested in modeling such an interaction in any detail. We have therefore represented attempts at interaction with a given individual by a sentence that might be addressed to him to initiate the interaction. In other words the model cannot actually simulate the interactions observed by Kelley but it might be held to be minimally competent to account for the differential effect due to the impression if it can produce such symbolic behavior in the appropriate circumstances.

 

We might call a model competent to simulate the behavior of subjects in a given situation if it is minimally competent for that situation and if the set of possible behaviors produced by the model includes behavior sequences that are also members of the set of observed behaviors. The model is fully competent for that situation if it is minimally competent and if the set of possible behaviors generated by the model for this situation includes most, if not all, the actually observed behaviors in this situation that are also judged to be acceptable. The model is predictive to the extent that the frequency distribution over its set of possible behaviors for that situation corresponds to the frequency distribution over the set of observed behaviors for the given situation.
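
These definitions can be summarized in a small sketch. The following is not part of the simulation program; it is a minimal illustration, written in modern Common Lisp rather than the original NTS—LISP, of how a model's set of possible behaviors might be classified against the observed and acceptable behavior sets, with behavior sequences represented simply as lists. The "all or at least most" of the definitions is simplified here to "all", and the predictive comparison of relative frequencies is omitted.

    ;; Classify a model against the definitions above.  MODEL-SET, OBSERVED-SET,
    ;; and ACCEPTABLE-SET are lists of behavior sequences (themselves lists).
    (defun classify-model (model-set observed-set acceptable-set)
      (let ((minimal (every (lambda (b) (member b acceptable-set :test #'equal))
                            model-set))
            (overlap (some (lambda (b) (member b observed-set :test #'equal))
                           model-set))
            (covers  (every (lambda (b) (member b model-set :test #'equal))
                            (intersection observed-set acceptable-set
                                          :test #'equal))))
        (cond ((not minimal) 'not-minimally-competent)
              (covers        'fully-competent)
              (overlap       'competent)
              (t             'minimally-competent))))

    ;; Example: the model produces one of the two response sequences that
    ;; were actually observed and judged acceptable.
    ;; (classify-model '(((bill is intelligent) (bill is talented)))
    ;;                 '(((bill is intelligent) (bill is talented))
    ;;                   ((bill is hard-working)))
    ;;                 '(((bill is intelligent) (bill is talented))
    ;;                   ((bill is hard-working))))
    ;;   => COMPETENT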

 

 

Evaluation of the simulation of hypothesized sub—processes where the output of the process is not immediately or directly reflected in observable behavior

 

 

Assuming that we have decomposed the process leading from a given stimulus situation to the response into a sequence of sub— processes, we shall want to evaluate associated delineation claims by evaluating the adequacy of the algorithms in terms of which of these sub—processes are represented in the simulation model. But since none of these sub—processes will both start immediately with the stimulus and produce an observable response, we cannot use a comparison of observable behaviors for the evaluation.

 

Since any algorithms simulating such a sub—process must be invoked in some given situation that can be characterized more or less fully, since it may have to use some specific, known items of information, and furthermore since it will produce some response or effect, it should be possible to evaluate whether this response or effect is appropriate as the output of the algorithm in this situa­tion. In other words we can evaluate whether the output from the algorithm corresponds to the output or effect we would expect the hypothesized cognitive process to have. So we should be able to evaluate whether the algorithm is competent to simulate the hypo­thesized sub—process. This evaluation is made more complicated by the fact that the sub—process that is to be simulated may not be understood very well even in terms of the information it is hypo­thesized to act on, and what it is hypothesized to produce as output. We have touched on some of these problems in our discussion of decomposition.

 

Let us deal with an example. In the model under consideration in this paper, a stimulus sentence is first analyzed linguistically and logically to generate an information structure that is intended to simulate the internal ‘cognitive’ representation of the meaning of the sentence that the subject is hypothesized to construct to represent his ‘comprehension’ of the sentence. This process presum­ably takes place before the sentence is acted upon and therefore before any response might be generated. But this analytical process can be decomposed further into sub—processes. One such sub—process is invoked if there are pronouns in the sentence, to find the referents for these pronouns. Thus for instance: “Mary is talented and intelligent and she is honest”. The subject would infer that the pronoun ‘she’ refers to Mary and that the characteristic ‘is honest’ therefore is attributed to Mary. We might therefore posit that there is a cognitive process that determines the referent for the pronoun. We could therefore incorporate an algorithm into the model to deal with pronoun reference resolution.

 

The pronoun might of course also refer to a previous sentence that has already been acted upon, as for instance in the sequence:  “Describe Mary!” “Evaluate her!” In this case the algorithm has to refer to what is remembered about previous sentences to be able to assign a referent. It should be clear that this algorithm is at least partially independent of the other algorithms in the model in that it might be able to come up with the correct referent even though the model made inappropriate responses to the rest of the sentence or sentences. Thus it might be able to determine that ‘her’ referred to Mary in our second example even though the program could not correctly describe or evaluate her. It would therefore be useful to be able to evaluate this algorithm independently.
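
A minimal sketch of such an algorithm may make the idea concrete. It is written in modern Common Lisp rather than the original NTS—LISP, the names are invented, and the strategy (search the current sentence from the most recent name backwards, then fall back on referents remembered from previous sentences) is only one plausible candidate, not the procedure actually used in the model.

    ;; Gender information assumed known for the named individuals.
    (defparameter *gender* '((mary . female) (bill . male)))

    (defun gender-fits-p (name pronoun)
      (let ((g (cdr (assoc name *gender*))))
        (case pronoun
          ((she her) (eq g 'female))
          ((he him)  (eq g 'male))
          (t (not (null g))))))        ; other pronouns: any known individual

    ;; Search the words read so far in the current sentence (most recent
    ;; first), then the referents remembered from previous sentences.
    (defun resolve-pronoun (pronoun words-so-far remembered-referents)
      (find-if (lambda (word) (gender-fits-p word pronoun))
               (append (reverse words-so-far) remembered-referents)))

    ;; (resolve-pronoun 'she '(mary is talented and intelligent) '())  => MARY
    ;; (resolve-pronoun 'her '(evaluate) '(mary))                      => MARY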

 

Now we can get an appropriate response from subjects for testing this sub—process by interrupting them and asking them: “Who does the pronoun ‘her’ in the previous sentence refer to?” We discussed this type of test above under the notion of reporting from awareness. If the model could also simulate this behavior in response to an interruption we would have a fairly direct behavioral test of the algorithm so that we could determine to what extent it was competent or predictive in simulating the hypothesized cognitive process. However the model may not be competent to simulate the interruption or the response to that type of question, or there may be other cognitive processes unlike this example that are not open to direct interrogation in this manner. In this case we have to rely on more indirect methods for determining whether the algorithm finds the correct referent or otherwise simulates the hypothesized cognitive process appropriately.

 

To summarize briefly, the notions of competence and prediction can be extended to include indirect evaluation of delineation claims for algorithms which do not lead to responses directly but which only affect some aspects of the stimulus interpreting, response producing capabilities of the model. This indirect evaluation is very important also for assessing decomposition claims, since the main function of decomposition claims is to identify sub—processes and to hypothesize how they fit into the overall model. The usefulness of such claims therefore depends on whether we can construct algorithms to represent these sub—processes, and whether we can make them function appropriately in the places allocated to them by the decomposition claim.

 

 

Specifying the stimulus situation: the context in which the prediction is made

 

 

Above we have discussed the problem of evaluating the ability of the model to simulate the behavior and the cognitive processes of the subject in response to some particular stimulus situation. We have also mentioned that the prediction may apply only to some set of subjects that have the knowledge, capabilities, and goals that the subjects are assumed to have and that are represented in the model, and that match the personal factors such as sex, age, and personality that may be incorporated in the model as parameter values or as restrictive conditions. We now have to consider how we can specify and represent the stimulus situation including the assump­tions about the subjects so that we can match the conditions under which we observe the responses of the model to the conditions under which we observe the behavior of subjects. Another reason for finding such a representation is that we have to be able to specify the subjects, stimuli, and the experimental conditions for which the model is capable of generating responses that simulate the behavior of the subjects.

 

Specifying and guaranteeing comparability of the complete stimulus situation for both the model and the subjects presents special problems for an interactive model that is not primarily oriented toward repeated one—shot predictions for some type of stimulus presentation, but where an attempt is made to include the instructions and possible interruptions by the experimenter as part of the experimental interaction. Thus we could treat most of the information given to the subject during the course of the experiment as stimulus presentations since he is expected to utilize this information in making his responses. This would of course include the sentences in the instructions.

 

Problems arise when the model cannot treat each stimulus presentation as an independent trial, as when the response to a given stimulus may depend on the context in which it is presented, i.e. on previous presentations. An example of this is the imperative:  "Evaluate the individual!". Under normal circumstances we might expect this request to be executed immediately, the subject proceeding directly to locate the individual and the scale and then responding by evaluating that individual on the scale. But if the sentence were part of the instructions for a task, the subject would not execute the request but rather remember it as part of the specifications for the task so that he could act on it later. Another example occurs when the subject is told that the successive descriptions all apply to the same individual, an individual he may or may not have known previously. In this case each description will modify the knowledge the subject has about the individual, so that the impression formed on the basis of the third such descriptive sentence will also depend on the knowledge gained from the two preceding descriptions as well as on any knowledge about the individual he may have brought to the experiment. A good illustration of this is the Kelley (1950) experiment where the subjects could base a first impression of a lecturer on the description of his personality supplied on the hand—out announcing the lecture. They could then modify this impression by observing the lecturer and listening to his speech.

 

We now consider how to specify the experimental situation including the stimulus and the context based on the history of the interaction. Since the stimulus will usually consist of one or more sentences, we can represent these in terms of the data structures for sentences discussed above. At the moment we shall ignore the cases where the stimulus does not consist of sentences, such as scale presentations and where we try to catch the attention of the subject to interrupt him from what he is doing.

 

Aside from the stimulus itself, the subsequent processing of the stimulus can be seen as depending on the knowledge, expectancies, capabilities, and goals of the subject since we can represent the effect of the context and of previous presentations as changes in or additions to the knowledge, expectancies, capabilities, and goals of the subject. We can therefore approach the problem of specifying the present circumstances in two ways. On one hand we could start with a specification of the knowledge, capabilities, expectancies, and goals with which the subject entered the experimental situation and then supply a specific history of the experimental interaction up to this point. On the other hand we could attempt to specify the present cognitive state of the individual, including the knowledge, capabilities, expectancies, and goals he has at present. We would also have to include some specification of what he can presently remember about the preceding interaction, at least insofar as it may affect his subsequent behavior. This would have to include some memories of recent sentences in case he needs to refer back to them, for instance to resolve pronoun references. It would also have to include some account of his present expectations and goals such as which task he is presently executing and what he has to do to complete that task.

 

Since the history of the interaction can easily be specified by describing what was presented to the subject and how he responded, the main problem to be faced is in specifying the relevant aspects of the cognitive state and of the capabilities of the subject, either at present or as he entered the experiment. As most of the original knowledge and capabilities of the subject will be unaffected by the subsequent experimental interaction, we might start out by con­sidering just what we need to assume the subject knows and is able to do in order to be capable of participating in the experiment.  In effect then these are decomposition claims on what knowledge must be involved in doing the experimental tasks.

 

Knowledge, capabilities and goals the subject brings to the experiment

 

 We will of course only have to consider the knowledge and capa­bilities that are relevant to the particular experiment we are trying to simulate, since only these will have to be represented in the model. Let us begin therefore by considering the knowledge and the capabilities we would hypothesize a subject to have to be able to cope with the impression formation and person evaluation tasks that we are considering in this paper. Since we are assuming that the subject comprehends and then acts on the sentences presented in the experiment we must assume that the subject has the capability to recognize, comprehend, and act on the meaning of sentences. This includes the sentences making up the instructions, the descriptive sentences on which the impressions are based, and includes any questions the experimenter may ask of the subject.

 

This capability may be specified in terms of a decomposition of the general sentence processing capability into the set of processes that we may hypothesize to be involved in interpreting, analyzing, and executing different aspects of the sentence. Some of these processes will be involved in dealing with any sentence while other processes may be involved only for sentences with particular charac­teristics. We have already mentioned pronoun reference resolution, which would be dealt with by a process of this latter type. The sentence interpreting and executing processes must also leave a memory trace that may be referred to later, as for pronouns, or for sentences that describe tasks or theories that are not to be acted on immediately but that are to be remembered for later use. In this case something of the meaning of the sentences in terms of which the particular task or theory is defined must be stored in memory, since it is this meaning that must later be acted upon when executing the task or when applying the theory. This memory trace then corresponds to the information structure that represents the cogni­tive representation of the ‘meaning’ of the sentence.

 

Other memory structures will also be involved in interpreting and executing sentences. For assertions, for instance for the description “Bill is intelligent”, the sentence—executing processes must add to or modify the memory structures that represent what is known about the individual Bill. For questions, such as “Is Bill intelligent?” these memory structures then have to be searched and a response has to be generated. In the case of instructions such as “Describe Bill!”, if the request is to be executed at that time some of the information about what to do to satisfy this command, what process to invoke, will have to come from some memory structure that represents the knowledge about one of the uses of the term ‘describe’.
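
The role these memory structures play can be sketched as follows. The sketch is an illustration in modern Common Lisp, not the mechanism of the model itself, and the function names and the placeholder treatment of commands are invented.

    ;; What is known about each individual: a list of attributed predicates.
    (defparameter *known* (make-hash-table))

    ;; "Bill is intelligent."  -- an assertion adds to the memory structure.
    (defun accept-assertion (individual predicate)
      (pushnew predicate (gethash individual *known*)))

    ;; "Is Bill intelligent?"  -- a question searches the memory structure.
    (defun answer-question (individual predicate)
      (if (member predicate (gethash individual *known*)) 'yes 'unknown))

    ;; "Describe Bill!"  -- a command invokes a process associated with the
    ;; meaning of its predicate; here only 'describe' is given a meaning.
    (defun execute-command (predicate individual)
      (case predicate
        (describe (gethash individual *known*))
        (t        'no-stored-meaning-for-this-command)))

    ;; (accept-assertion 'bill 'intelligent)
    ;; (answer-question  'bill 'intelligent)   => YES
    ;; (execute-command  'describe 'bill)      => (INTELLIGENT)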

 

This brings us to the problem of a vocabulary. To be able to deal with sentences we have to assume some knowledge about words, their uses, functions, and meanings. We therefore need to assume that the subject brings to the experiment some information about the words that may show up in the experimental context. This know­ledge would have to include not only some information about what role these words may play in sentences, i.e. as nouns, adjectives, or verbs, but also about their meanings, uses, and functions when the sentences are to be acted upon. These uses and functions may depend very much on the context of the whole sentence in which the word appears. For instance “Mary quit” gives us information that we may remember about Mary. But the sentence “Quit!” has nothing to do with what we remember about Mary but gives us an instruction about what we should do at the time at which we execute this command. Executing this command presumably invokes a cognitive process that is associated with the meaning of the predicate ‘quit’. For our impression formation model we therefore have to assume that the subject brings to the experiment some knowledge about the uses and functions of the descriptive terms, names and common nouns that may be involved in the stimulus sentences on which the impressions are based. We also have to assume that the subject has sufficient know­ledge about the meanings of terms that may show up in the instruc­tions or in other requests by the experimenter to give him the capability of acting on these instructions. This knowledge of course has to be represented in the model as well.

 

If we look at sentences such as "Mary didn’t quit" and "Don’t quit!" we see that we may also have to require knowledge about the relations between terms so that negations can be interpreted and acted on. Similar information is needed for interpreting redundancy or for coping with sentences that are ambiguous, tautologous, or contradictory. We will also want to use the information conveyed by some of these words about some individual or object to make evaluative judgments about that individual or object. Assuming that the meaning of each of these terms somehow contributes something to the evaluation, we then have to assume that the subject must bring knowledge about the evaluative contribution of these terms on the given dimensions to the experimental situation.

We may also assume that the subject has attitudes, assumptions and prejudices about what some types of individuals are like or what they are likely to do. This type of information may be seen as knowledge in the form of theories and assumptions about the world. These theories play a role in making inferences about further characteristics of individuals that have been described (Asch, 1946). Unlike the previous type of knowledge about the uses and functions of words, which is likely to be fairly standard for most members of the same language community, the theories and assumptions about the world may vary from individual to individual, even though we may find commonalities based on sex, age, and other personality and experience factors.

 

Some of the goals and expectancies that the subject brings to the experiment may also vary between individuals. One would assume that most subjects will be willing to cooperate with the experimenter in doing the experimental tasks. But the tasks in the experiment are purposefully left somewhat open—ended so that the responses of the subject to the stimuli will depend not only on the instructions but also on the knowledge, capabilities and goals that the subject brings to the experiment. We have already discussed some of these capabilities in terms of their ability to deal with the instructions and with the descriptions and evaluations, and in terms of the theories they may apply to make inferences about individuals. But in trying to account for the behavior of subjects in situations where they can interact with the person of whom they have formed an impression (Kelley, 1950), we have to assume that they also bring to the situation some other goals such as approaching and interacting with individuals whom they evaluate positively. These goals might be conceptualized as implicit, conditional tasks that are invoked only in the appropriate circumstances.

 

 

Memories and expectancies developed during the experimental inter­action

 

 

We now have to consider the knowledge, capabilities, expectan­cies, and goals of the subject at some point during the interaction just before a new stimulus is presented. The purpose of this dis­cussion is to enable us to specify the cognitive state and the expectancies of the subject as he encounters the stimulus and then makes a response. It is this cognitive state that provides the con­text to which the stimulus is responded. To be able to make a valid response comparison with the model, to be able to test the competence and predictiveness of the model, we have to know that the model is in the equivalent context when it encounters the same stimulus.

 

We shall assume that the preceding interaction in the experiment has not affected the basic abilities of the subject to cope with sentences. Nor should his vocabulary, his theories, and his goals have changed. However, he may have learned new theories, and he will have developed expectancies about what kind of stimulus should come next in the task he is executing. If he instead finds a question, an imperative or something that is unintelligible, he is likely to interrupt the task to ask the experimenter about what to do next. The same would happen if he did not find a scale where he was expecting one.

 

We therefore have another possible context. That is where the subject has stopped in the middle of a task. This of course could also happen if the experimenter interrupted the subject. In either case the subject would presumably be willing to cooperate with the experimenter in answering any questions or in following his requests. But the subject would presumably also be willing to resume working on the task once the problem had been resolved. He would therefore have to remember what he was doing in the task when the interruption occurred.

 

 

Evaluating decomposition and structural delineation claims about the subject’s knowledge, capabilities and expectancies

 

 

We have already discussed how we might use the notions of competence and predictiveness to evaluate the representation in the model of the cognitive processing capabilities of the subject that are relevant for dealing with impression formation and person evaluation tasks. We have also discussed how we might test some of these processes separately. Implicit in the response comparison notion is the condition that the subject and the model are responding to the same stimulus in the same context. But to match the context in which the behavior of the model is to be compared, we have to represent the knowledge and expectancies of the subject. We therefore have to deal with the problem of how to evaluate this representation to assure that both the model and the subject are acting in the same context. But the best evaluation of the adequacy of that representation is through the performance of the model, through testing the use of this knowledge in responding to stimuli.

 

It may seem that this presents a circularity: we cannot evaluate the behavior of the model unless we can assure comparability, and we can only assure comparability, that is, evaluate the adequacy of the representation of the subject’s knowledge and capabilities, by evaluating the behavior of the model. But if we specify part of the immediate context in terms of the history of the interaction, then we only have to evaluate the representation of the knowledge and expectancies the subject brings to the experiment.

 

The model will of course only represent some of the knowledge, capabilities, and goals of the subject that might be relevant to the experimental interaction. We could design our experiments to fit the limitations of the model so that the information provided to the subject, the instructions and the other stimulus presentations, draws only on knowledge and capabilities that are represented in the model. Both the model and the subject can then go through the same inter­action so that both are exposed to the same stimuli in the same sequence. In this case we can assume that the contexts are equiva­lent.

 

Any subsequent behavioral comparison between the model and the subject then tests not only the ability of the model to simulate the behavior of the subject in response to the current stimulus but it also tests the ability to respond correctly to previous stimuli by providing the correct context in which the current stimulus is inter­preted and acted on. The representation of the knowledge of course is also tested indirectly since that knowledge may be utilized both in dealing with the current stimulus and previous stimuli. Thus for instance if the model has received instructions for a task, and if it then provides an inappropriate response for a stimulus while executing the task, then the incorrect response may be caused by an inappropriate interpretation of that stimulus or it may be caused by an inappropriate interpretation of the instructions for the task.

 

A problem that arises in this context concerns the completeness of the evaluation. Above we mainly considered how to evaluate the behavior of the model for a specific stimulus in a particular context. Since the model allows the user to define much of the vocabulary as well as the tasks and theories, there are many stimuli and many con­texts that could be explored. To gain an overall impression of the competence or predictiveness of the model one would have to sample the capabilities of the model by presenting a selection of different stimuli in various contexts. We have presented a small sample of this sort in the first two chapters. In a sense one would have to experiment with the model just as one does with human subjects.

 

It is hoped that the last two chapters have provided some per­spective on how one may analyze the problem of constructing models that simulate some aspects of the cognitive processing of subjects. Since the main focus of this thesis is on examining the feasibility of constructing such a model and on investigating the theoretical problems and implications that may be associated with the development of such a model, no specific empirical tests of the predictiveness of the model have been done as yet. Rather the model in its present form has been designed to be minimally competent to account for some of the results of previous studies in impression formation and person evaluation, as illustrated in the first two chapters.

 

The discussion in the preceding chapter was mainly intended to further elucidate the model from the perspective of what sort of data structures and theoretical claims might be involved and to show how one might develop empirical methods to test these. While many problems still remain unsolved in developing exact measures of the competence and predictiveness of the model and of the goodness of fit of its analogical representation of cognitive processes, it is hoped that the preceding discussion and the illustrations have developed enough of a perspective to allow for an informal evaluation of the particular computer simulation model that is under considera­tion here.

 

In the remaining chapters we shall examine this model in more detail by describing how the knowledge and capabilities of the subjects are represented. At the heart of the model is the representation of some of the knowledge about words, their linguistic, logical and semantic functions, and their relations to other words and concepts. While discussing sentence processing we shall also examine how knowledge about individuals is represented, how impressions might be formed, and how inferences are made.


Chapter 4

 

REPRESENTING THE SUBJECT’S KNOWLEDGE ABOUT THE TERMS AND CONCEPTS

 

NEEDED FOR COPING WITH IMPRESSION FORMATION TASKS

 

 

In this section we discuss how the subject’s knowledge about the meaning of terms and their relations to other terms may be represented. Of course we are not trying to represent all the words the subject may know, or reflect all the knowledge the subject may have about the uses and meanings of these words. Rather we are simply attempting to construct a representation that is adequate for coping with impression formation tasks and with the relevant experimental stimuli. To that end we want the representation for a given term to carry sufficient information expressed in a form that allows the algorithms in the model to operate correctly.

 

At this point we are particularly interested in those aspects of the representation and the use of terms that are utilized in the sentence comprehension and interpretation processes. We shall not at this point consider the representation of that aspect of the meaning of some terms that specifies the action to be taken when interpreting some types of sentences containing this term. Thus for instance the sentence "Describe Bill!" includes the predicate ‘describe’. A part of the meaning of the term ‘describe’ is a specification of what the subject should do to describe someone. As another example we might consider the sentence "Describe the individual!", where the use of the concept ‘individual’ in this context might invoke a process that locates the intended referent.

 

Let us now consider how the knowledge of the meanings and uses of terms that are used in the sentence processing and interpretation processes is represented in the model. We should reiterate that we are concerned only with terms that are needed for coping with impres­sion formation and person evaluation tasks, and that we are concerned only with making this representation adequate for simulating some of the phenomena found in impression formation and person evaluation studies. The structural delineation claims associated with these representations therefore are not of much independent theoretical interest since only a minimal, functional analogy is intended. The main positive analogy is based on the decomposition claims, that some of the differentiations between the types of knowledge made in the model must somehow be represented in the hypothesized cognitive structures embodying the knowledge of the subject, and that most of the information encoded in the representations must be included in some form in these cognitive structures.

 

Underlying the representation in the simulation model of the subject’s knowledge about the meaning and uses of terms are a series of decomposition claims that differentiate different types of terms. Since the model is based on processing sentences in written form, we also have to posit knowledge about the use and functions of symbols such as punctuation marks. We might start out by considering these symbols, as well as function terms, which play a role in sentence recognition and production algorithms but do not have independent semantic functions.

 

 

Punctuation marks:   ‘.’  ‘,’  ‘?’  ‘!’  and ‘;’

 

 

These symbols may be used to indicate whether a sentence is to be interpreted as an assertion, question, or command, such as in "Mary is intelligent?" In most contexts, however, these symbols are redundant in that the sentence structure indicates whether it is an assertion, question, or command.

 

 

Auxiliary verbs:                    is, isn’t, are, aren’t, do, don’t, does, and doesn’t

 

 

These terms function as tense markers and as indicators of negation. Only sentences in the present tense are accepted by the program. For this reason terms such as ‘have’ do not occur as auxiliary verbs. Modals such as ‘may’ or ‘can’ are also excluded.

 

 

Quotation marks:   ‘  and “

 

 

The use of quotation marks in the model is somewhat peculiar and reflects the quoting conventions of the underlying computer language, LISP. Thus terms or expressions that are to be quoted are preceded by a single quotation mark. But if the expression to be quoted contains blank spaces or punctuation marks, the expression must be surrounded by double quotes. The single quotation mark then precedes this expression.

 

We shall now consider terms that do have some semantic function, but whose meaning is contextually defined. Knowledge about these terms therefore is not represented in terms of information structures associated with the terms themselves, but rather is represented in terms of their use in other information structures and in algorithms that act on these information structures.

 

 

Pronouns:     you, he, she, it, they, him, her, and them

 

Relative pronouns:           who, which, and that

 

Determiners:        a, any, all, each, every, none, some, that, the, these, those, and how—many

 

Prepositions:        of

 

Connectives:         not, and, or, unless, if—then, if, only—if, either—or, and neither—nor

 

 

The next type of term to be considered plays a special role that is only indirectly related to normal sentence processing.

 

 

Interjections:            hey

 

 

The interjection ‘hey’ has a semantic function that can be conceptualized in terms of the cognitive processes that are invoked when the term is perceived by the subject. It might be claimed that interjections are special types of exclamations like "Help!" or "Fire!" and therefore have a meaning somewhat like the meaning of ordinary commands. In this model the exclamation "Help!" is processed like other sentences. But ‘hey’ causes an immediate interruption as soon as it has been read. The use of this interjection therefore allows us to simulate the ability of the experimenter to interrupt the subject in whatever he is doing. The interruption can then be used to obtain reports about what the subject is doing and thinking at that moment, as discussed in chapter two. After the interruption the model can resume what it was doing previously or start the task again from the beginning.

 

Since we are simulating responses to written material, and since computer terminals at least at present cannot accept written and spoken input simultaneously, the use of the term ‘hey’ is not quite sufficient for our purposes. It may be spoken by the experimenter while the subject is reading some sentences on a questionnaire form. We have therefore also utilized the ATTENTION—BREAK capability provided on most terminals. This allows us to interrupt the model while it is reading a sentence and has the same effect as reading the term ‘hey’.
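
The effect of the interjection can be sketched as follows. The sketch is written in modern Common Lisp rather than the original NTS—LISP, and the function name and the form of the result are invented; it is intended only to show the immediacy of the interruption while a sentence is being read.

    ;; Read a sentence word by word; the interjection 'hey' interrupts
    ;; immediately, before the rest of the sentence is processed.
    (defun read-with-interrupts (words)
      (let ((read-so-far '()))
        (dolist (word words (nreverse read-so-far))
          (when (eq word 'hey)
            (return (list 'interrupted :read-so-far (nreverse read-so-far))))
          (push word read-so-far))))

    ;; (read-with-interrupts '(bill is intelligent))
    ;;   => (BILL IS INTELLIGENT)
    ;; (read-with-interrupts '(bill is hey))
    ;;   => (INTERRUPTED :READ-SO-FAR (BILL IS))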

 

We now come to the representation of content words. Knowledge about these words is represented in terms of information structures that are uniquely associated with these terms. The semantic memory representation of the knowledge about these words and their meanings is not intended to be more than minimally adequate for the impression formation and person evaluation tasks. For a discussion of semantic memory representations from various points of view, see Tulving and Donaldson (1972).

 

We shall start out by discussing decomposition claims about the types and functions of words that are represented. We shall then consider the structural delineation claims about what information is associated with these terms and how this information is represented. Finally we shall discuss how knowledge about these words is introduced into the model.

 

A basic differentiation is made between words that refer, and words that are used to attribute some information to these referents or that are used to request that some action be taken. This corresponds to the linguistic differentiation between nouns, and verbs or adjectives. It also corresponds to the differentiation made between logical constants or variables, and predicates.

 

Starting with words that are used to refer, we differentiate between terms that name persons or objects, such as ‘Bill’ or ‘Mary’, terms that name concepts such as ‘Theory1’ or ‘Task1’, terms that name classes or sets of things, and terms that refer to but do not name persons or things, such as the nouns ‘student’, ‘person’, ‘task’, and ‘theory’. A type of term that is not represented in the model is a term that refers through specifying a relationship, such as ‘member of’ or ‘brother of’.

 

Let us now consider how knowledge about these terms is represented in the model. The terms are uniquely linked with tree structures that contain and represent whatever information we want to associate with that term. We shall therefore present an example for each type of term. (Special symbols such as ‘#’ are used to differentiate the special terms in the model from terms that may be defined by the user.)

 

The term ‘Bill’ refers to an individual who may be a member of a group of individuals called Groupl. He is a person and let us assume that he is also a student. This information is represented by the following tree structure:

 

 

 

Bill
    individual#
        member#:   Group1
        term#:     Noun
        type#:     (Person, Student)

 

 

There is more information about the characteristics of the individual that we shall later want to add to this tree structure. We shall therefore come back to this structural representation of the information about the individual Bill when we discuss sentence processing. We shall also consider other information structures associated with the individual when we discuss impression formation and evaluation.
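
For concreteness, a structure of this kind can be stored directly on the property list of the term, as sketched below in modern Common Lisp. The original model uses NTS—LISP and its own structure-handling routines, so this is only an approximation of the representation, not the actual code of the program.

    ;; The tree shown above, attached to the symbol BILL under the node
    ;; individual#.  The sub-nodes are kept as a property list.
    (setf (get 'bill 'individual#)
          '(member# group1
            term#   noun
            type#   (person student)))

    ;; Retrieving pieces of the structure:
    ;; (getf (get 'bill 'individual#) 'member#)   => GROUP1
    ;; (getf (get 'bill 'individual#) 'type#)     => (PERSON STUDENT)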

 

The term ‘Task1’ refers to a concept, a task that may be a member of a set of tasks called ‘Task’.

 

 

Task1
    concept#
        member#:   Task
        term#:     Noun
        type#:     (Task)

 

 

This term refers to a task that can be executed. In other words the subject must have some knowledge about what cognitive processing to go through to execute this task. This knowledge is represented in terms of an algorithm that is associated with this term. This knowledge can be displayed when we ask the subject to describe the task. We shall discuss the representation of this knowledge when we consider how instructions are comprehended.
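
The point that part of the meaning of such a term is an executable procedure can be sketched as follows. This is an illustration only: the stored step is invented, and the original model represents the procedure in its own algorithmic notation rather than as a Common Lisp function.

    ;; The descriptive part of the structure for Task1 ...
    (setf (get 'task1 'concept#) '(member# task term# noun type# (task)))

    ;; ... and, as part of its meaning, something that can be executed.
    ;; The step printed here is purely illustrative.
    (setf (get 'task1 'execute#)
          (lambda ()
            (format t "~&Read the next description and form an impression.~%")))

    (defun execute (name)
      (let ((procedure (get name 'execute#)))
        (if procedure
            (funcall procedure)
            'no-knowledge-of-how-to-execute)))

    ;; (execute 'task1)  carries out the stored step;
    ;; (execute 'group1) => NO-KNOWLEDGE-OF-HOW-TO-EXECUTE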

 

The task can also be characterized in other ways, such as in the assertion “Task1 is pleasant”. As is the case for individuals, we can form impressions of tasks and evaluate them. Theories are similar to tasks in that they can be evaluated and in that they invoke cognitive processes when they are being tested or applied.

 

The term ‘Group1’ refers to a set of individuals that belong to this group. Membership in this group may be restricted to persons and students.

 

 

Group1
    class#
        reference#
            individual#:  (Bill, Mary)
            variable#:    (Person, Student)
        term#:  Noun

 

 

The term ‘student’ is a noun that is used to refer to Bill and Mary. In this case the references are known, and therefore can be listed.

 

Student
    variable#
        range#
            individual#:  (Bill, Mary)
        term#:  Noun

 

 

The term ‘referent’ is an example of a noun that is used to refer, but whose reference must be determined at the time the term is invoked. The term therefore invokes a cognitive process that is used to generate the relevant reference.

 

 

 

 

Referent
    variable#
        generator#:  [algorithm representing the process that generates the reference]
        term#:       Noun

 

 

Let us now turn to adjectives and verbs. The only predicates that are represented in the model are terms that need at most one reference in the context in which they are used. This includes adjectives, transitive verbs, and verbs that are used in imperative sentences that take at most one referent. Adverbs are excluded.

 

Four basic contexts in which predicates are used are represented in the model. These contexts can be characterized as assertion, test, scale dimension identification, and command. We shall also differentiate between different types of predicates that may be used in these contexts.

 

In the assertion context the predicate is used to attribute some descriptive characteristics or some behavioral disposition to an individual, or possibly to a task or theory. As an example we might use the sentences "Bill is fat" or "Bill is honest". The predicates in these sentences are usually in adjectival form even though transitive verbs can also be used, such as "Bill cheats". These attributions are assumed to characterize some relatively invariant aspect of the person so that they can be used to form an impression of the personality of that individual and can therefore also be used in evaluating that individual.

 

As an example of predicates that do not fit this type, consider the sentences “Bill is tired” and “Bill is present”. The predicates in both of these sentences convey information that does not say any­thing about the personality of the individual. They may be concep­tualized as providing episodic information (Tulving, 1972). Some episodic information such as “Bill helped Mary yesterday” can of course be used to make inferences about the personality of the indi­vidual, and thus could be used in forming an impression or in evaluating the individual. However this type of information and the associated inferences and evaluations are not represented in the model.

 

In the second type of context, the test, the predicate is used to ascertain whether the individual is known to have the behavioral disposition or the descriptive characteristic that is associated with that predicate. As an example we may use the questions “Is Bill fat?” or “Is Bill honest?”. This context may also apply to other types of questions such as “Who is honest?” or “How many students are honest?” It may also apply to parts of sentences such as relative clauses or conditionals, where other parts of the sentence are used to convey an assertion or a command. As examples we may consider the use of the predicate ‘intelligent’ in the sentences “An intelligent student works”, “A student who is intelligent works” and “If Bill is intelli­gent, describe him!”

 

In the third type of context in which predicates are used in the model, scale dimension identification, the predicates occur as scale—end terms. These predicates specify the perspective or dimension in terms of which the individual is to be evaluated, as in the scale: “pleasant 1 2 3 4 5 unpleasant”. Since we are concerned with evaluating the personality of individuals the scale may very well use terms that can also be used to characterize that personality, such as ‘friendly — unfriendly’ or ‘warm — cold’.

 

The fourth type of context, the command, involves the use of predicates in the instructions for the experimental tasks, in commands or requests. Examples of this are the predicates in the imperatives “Describe Mary!” and “Evaluate Mary!” We are now considering a dif­ferent type of predicate from those used in the three preceding contexts. The type of predicate used in these commands could also be used in sentences that provide episodic information, such as in “Bill described Mary”. But, as discussed above, the model does not handle episodic information and it does not accept sentences contain­ing more than one referent. The only aspect of the meaning and use of these predicates therefore that is represented in the model is their use in imperatives that have at most one referent.

 

Let us now turn to some other considerations about the represen­tation of knowledge about predicates. The differentiation between verbs and adjectives is of course also represented in the model. Another differentiation between different types of predicates is based on the interpretation of negation. In the case of some predi­cates, the negation of the predicate in the sentence allows us to infer what other predicates must apply. For instance the sentence “Bill is not stupid” may allow us to infer that Bill is intelligent. On the other hand, from sentences like “Don’t describe Bill!” we cannot infer what we should do instead. We assume that the predicates whose negations are defined fit into a set whose members are mutually exhaustive. The negation of one of the members of the set therefore implies that one or more of the other members of the set must apply. If the predicates in the set are used to characterize individuals then we can view the set as defining a descriptive or evaluative dimension. We might then see a term like ‘intelligence’ as specify­ing such a dimension or set, where, for this example, the predicates ‘intelligent’ and ‘stupid’ would be members of this set. We might assume that it is this sort of dimension that is defined by the predicates that are used as scale—end terms. We shall furthermore assume that all the predicates that are used in the characterization of individuals fit into such dimensions.
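
A sketch of this negation rule follows. The dimension table and the function are invented for illustration and are written in modern Common Lisp rather than in the notation of the model; they show only how an exhaustive dimension set licenses the inference from a negated predicate to the remaining members.

    ;; Each dimension is a name followed by the predicates that exhaust it.
    (defparameter *dimensions*
      '((intelligence      intelligent stupid)
        (industriousness   hard-working lazy)))

    ;; "Bill is not stupid" => the remaining members of the dimension apply.
    (defun negation-implies (predicate)
      (let ((dim (find predicate *dimensions* :key #'cdr :test #'member)))
        (when dim
          (remove predicate (cdr dim)))))

    ;; (negation-implies 'stupid)    => (INTELLIGENT)
    ;; (negation-implies 'describe)  => NIL   ; no negation defined, as for commands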

 

Above we have discussed four types of contexts in which predicates are involved. The problem we have to discuss now is to what extent these different contexts invoke different cognitive processes that are specific to the use of the predicate in that particular context. We therefore have to consider whether we should represent these processes as different, context specific meanings for the terms.

 

For the assertion and scale dimension identification contexts the assumption made for this model is that the processes invoked in these contexts are not sufficiently peculiar to the particular pre­dicates to warrant representing them as meanings of the predicate rather than as general sentence processing capabilities. In other words it is assumed that there is not much difference between under­standing and remembering that Bill is fat, and understanding and remembering that Bill is honest. Furthermore whatever differences there may be would be unrelated to the specific meanings of ‘fat’ and ‘honest’. This assumption of course depends on what theories we have about memory representations. For instance if we assume that the information from the assertion “Bill is fat” is remembered as a visual impression, and that the information from the sentence “Bill is honest” is remembered in verbal form, then there would be significant differences in the processes involved in comprehending and remembering that information. These differences would therefore depend on some aspect of the meaning of these terms. But even under this assumption we may simply want to differentiate between predi­cates that specify visual characteristics and those that do not. We could then keep the assumption that the same processes are involved for all predicates of the same type. In this model, for the sake of simplicity and since visual information is not represented, it is assumed that all such information is remembered in verbal form and that the processes involved in remembering this information are part of the general language processing capability of the subject.

 

On the other hand, the command context clearly draws on the meaning of the specific predicate under consideration. At least some of the cognitive processes involved in executing the command “Evaluate Bill!” are directly related to the meaning of the predicate ‘evaluate’. Other predicates, when used in the same context, such as in “Describe Bill!”, invoke different cognitive processes. These different cognitive processes therefore have to be represented as parts of the meanings of those terms. These meanings are context specific since those cognitive processes are invoked only if the predicate is used in the command context. We discuss the representation of some of these processes in the model when we discuss the commands the system is capable of executing.

 

The test context could be seen as a context in which the pre­dicate is used to collect information, such as in “Is Bill fat?” There are two primary ways in which this information could be col­lected. On one hand the subject might simply search his memory to find out whether he knows that the individual is or is not fat. On the other hand he could go to the outside world to collect this information by looking at Bill or by asking others about him. In the second case the particular meaning of the predicate clearly is involved since the subject has to know what he is looking for. For instance he could not simply look at Bill and determine visually whether he is intelligent or whether he is honest. This type of information search is not represented in the model.

 

In the first case, the memory search, we might assume the cog­nitive processing involved to be similar to that for assertions in that it could simply be determined whether the information that Bill is fat is stored in the memory representation for Bill. On the other hand, if this information is not represented, the subject might use inferential processes to find out whether other aspects of what he remembers about the person would allow him to answer the question. These inferential processes would presumably depend on the meaning of the particular predicate involved. For instance the subject might inquire whether he knows Bill to be on a diet. While the model is not presently capable of inferential searches, it was decided to represent some of the cognitive processing involved in simple searches of memory as associated with the meaning of the predicate.

 

Let us now consider how knowledge about these terms is represented in the system. Again the terms are uniquely linked with tree structures that contain and represent whatever information we want to associate with that term. As above we shall leave out the algorithms that represent the ability of the subject to act on the context specific meaning of the term.

 

The term ‘intelligent’ is a predicate on the dimension ‘intelligence’. It is used to characterize some aspect of the personality or behavioral disposition of persons.

 

 

intelligent
    disposition#
        member#:   intelligence
        term#:     adjective
        type#:     (1)

 

 

The list of integers under the node ‘type#’ indicates how many arguments the predicate takes, or, in other words, how many referents must be specified in the sentence or clause in which the predicate is encountered.

 

The term ‘intelligence’ is the name of a class or set of things like the term ‘Group1’. In this case, of course, it refers to a set of predicates.

 

 

 

Intelligence
    class#
        reference#
            disposition#:  (intelligent, unintelligent)
        term#:  Noun

 

 

For predicates that are used in commands, such as ‘describe’, we also have to know how many arguments they take when they appear in imperatives. If their negations are defined they may also belong to a set of system imperatives (s#imperative).

 

 

describe
    predicate#
        member#:       s#imperative1
        term#:         verb
        type#:         (2)
        imperative#:   (1)

 

We now come to the problem of evaluative scales and of representing the contributions the various characteristics and dispositions might make to the evaluation. One assumption that could be made is that the evaluative contribution of these terms is an aspect of the meaning of these terms and therefore should be represented as part of the meaning of these terms. However in this model we have chosen to conceptualize the contributions of these terms to the evaluation of an individual on a dimension as inferences about the location or characterization of the individual on that dimension. We therefore represent these values as part of the meaning of the given evaluative dimension. Even though the evaluative dimension may be a predicate dimension like the term ‘intelligence’, we display here only the representation of the information relevant to evaluation.

 

 

affective
    class#    reference#    scale#term    high             pleasant
                                          low              unpleasant
                            term#         intelligent      0.8
                                          unintelligent    0.3
                            variable#     student          0.6
              term#         Noun
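To make the role of this structure concrete, a minimal sketch follows of how such a dimension record might be queried. It is written in Python purely as an illustration of the information displayed above, not as the model's NTS-LISP representation.

affective = {
    "scale": {"low": "unpleasant", "high": "pleasant"},
    "values": {"intelligent": 0.8, "unintelligent": 0.3, "student": 0.6},
}

def evaluative_value(dimension, term):
    # A term that does not contribute to the evaluation has the value NIL,
    # rendered here as None.
    return dimension["values"].get(term)

# evaluative_value(affective, "intelligent") -> 0.8
# evaluative_value(affective, "person")      -> None (i.e. NIL)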

 

 

Let us now consider how knowledge about these words and concepts is introduced into the model. Knowledge about some of these terms is built into the model, while others can be defined in a pre-simulation interaction with the model. Basically all the names of individuals and all the terms that can be used to describe them are defined during this interaction with the model. On the other hand, the predicates used in the instructions, special concepts, and all the function terms are built into the model.

 

Before illustrating the interaction with the model in which words can be defined we shall just list the predefined concepts and predicates. Their uses and functions will be discussed in later chapters in the contexts in which they become relevant.

 

Concepts:        Assertion, Assumption, Class, Date, Evaluation, Imperative, Individual, Inference, Information, Instruction, Noun-phrase, Parameter, Question, Referent, Sentence, Task, Theory, and Time

 

 

System predicates that do not take an argument:

cooperate, debug, execute, help, label,

          no-action, paraphrase, repeat, save, signoff, and test

 

System predicates that take one argument:

apply, assess, change, define, describe, empty, end, evaluate, execute, go-to, make, no-action, paraphrase, print, provide, read, save, set, take, and test

 

As can be seen from the above lists, there are a few predicates, such as ‘paraphrase’, that can appear in commands where the specification of a referent is optional. There are also a few more predicates that fit into dimensions. We shall start each such list with the name of the dimension.

 

Imagining:        imagine, unimagine
Next-action:      restart, resume, quit
Reacting:         approach, avoid
Task-execution:   attempt, discontinue
Verbalizing:      verbalize, unverbalize

 

Below is a sample interaction with the model where the names of some individuals are defined. An evaluative dimension and a few descriptive predicates are also defined in that interaction.

 

 

?Define Classes!

 

Defining classes and their members.

Please enter class names in the singular.

 

*      Do you wish to define object classes (Populations or Samples)? (Yes, No) yes

*      You are defining classes that have objects or Individuals as members.

*      Name one such class: Group1

Are the members of this class persons, some other animate creatures, or are they inanimate objects? (PERSON, CREATURE, OBJECT)

person

Please list the names of individual PERSONS that belong to this class

—Bill Mary nil

Are there terms (variables) other than “PERSON” that refer to

the members of this class? Please list the terms or type: NIL

—student nil

 

*      * More classes with objects, creatures or persons as members? (Yes, No) no

 

*      Do you want to define evaluative dimensions? (Yes No) yes

* You are defining dimensions for an evaluative semantic space.

The values for the dimension must lie between 0.0 and 1.0 inclusive.

You can also specify “NIL” if the term

does not contribute to the evaluation.

*      Name the dimension: Affective

*      Please enter the low value scale end term: unpleasant

*      Please enter the high value scale end term: pleasant

 

*      * More evaluative dimensions? (Yes, No) no

 

*      “AFFECTIVE” value for the variable: PERSON nil

 

*      “AFFECTIVE” value for the variable: STUDENT 0.6

 

*      Do you wish to define predicate classes (descriptive variables)? (Yes No) yes

* You are defining classes that have predicates as members.

These classes are assumed to function like dimensions, so that the members are jointly exhaustive.

 

 

*      Do you wish to define classes with adjectives as members? (Yes No) yes

*      Name the class: Affective

Please list the adjectives:

—pleasant neutral unpleasant nil

*      “AFFECTIVE” value for: NEUTRAL 0.5

*      “AFFECTIVE” value for: PLEASANT 0.8

*      “AFFECTIVE” value for: UNPLEASANT 0.2


*                    * More classes with adjectives as members? (Yes, No) yes

*      Name the class: intelligence
Please list the adjectives:

—intelligent unintelligent nil

*      “AFFECTIVE” value for: INTELLIGENT 0.8

*                    “AFFECTIVE” value for: UNINTELLIGENT 0.3

 

*                    * More classes with adjectives as members? (Yes, No) no

 

*                    Do you want to define classes containing intransitive verbs? (Yes No) yes

*                    Name the class: Action

Please list the intransitive verbs.

—work relax nil

*                    “AFFECTIVE” value for: RELAX 0.4

*                    “AFFECTIVE” value for:  WORK 0.6

 

Are there predicates (variables) that refer to the members of this class?

Please list the terms or type: NIL

—act nil

 

*                    * More classes with intransitive verbs as members? (Yes, No) no

 

*      Do you want classes containing both adjectives and verbs? (Yes No) yes

*                    Name the class: Honesty

Please list the adjectives.

—honest nil

Please list the intransitive verbs.

—lie nil

*                    “AFFECTIVE” value for: LIE 0.1

*                    “AFFECTIVE” value for:  HONEST 0.7

Are there predicates (variables) that refer to the members of this class?

Please list the terms or type: NIL

-nil

 

*                    * More classes containing both adjectives and verbs? (Yes, No) no

 

*                    * Any last minute additions? (Yes, No) no

 

 

For the sake of completeness we shall also list the other trait descriptors that are used in examples, beginning each list with the dimension, and with the ratings in parentheses.

 

Attitude:   industrious (0.6) and lazy (0.1)

Competence:   talented (0.9), competent (0.7), and incompetent (0.2)

Decisiveness:   bold (0.7), cautious (0.4), and indecisive (0.1)

Presence:   present and absent

 

This last dimension does not refer to trait information and therefore has no ratings associated with the terms.


Chapter 5

 

MODELLING THE SENTENCE PROCESSING CAPABILITY OF THE SUBJECT

 

 

The ability of the subject to recognize, comprehend, and act on sentences is assumed to play a central role in accounting for what he does in impression formation and person evaluation tasks. The simulation of the cognitive processes hypothesized to be involved in sentence processing therefore is at the heart of this computer simulation model. The representation of this capability takes up more than three quarters of the computer program which constitutes the model.

 

As discussed above, the main concern is to construct a model that is minimally competent to simulate some of the cognitive processes that are hypothesized to be involved in comprehending and acting on sentences. Furthermore, the model is intended to simulate only a few of the phenomena observed in impression formation and person evaluation studies. Only the simplest sentences that would be just sufficient to simulate these phenomena are accepted and acted on by the model. Even though the model is very limited in accepting only a very restricted class of sentences and only simulating a very few actions that might be taken in response to these sentences, the algorithms and functions representing this capability occupy more than five hundred thousand bytes of core storage on an IBM 360 computer. It would therefore go beyond the scope of this thesis to discuss the algorithms in any detail. For some other approaches to the computer simulation of language processing and comprehension see Woods and Kaplan (1971) and Winograd (1972).

 

In any case, most aspects of the algorithms do not have any solid psychological foundation. Again only a very limited analogy is intended to hold between the algorithms and the subject’s capability for the cognitive processes that they represent. We shall proceed by discussing some of the decomposition assumptions that are made concerning the cognitive processes involved in comprehending and acting on sentences, and illustrate the minimal competence of the algorithms to simulate these processes by considering some sample sentences. We shall therefore use the perspective developed in chapter three for demonstrating the competence of the algorithm to simulate processes that do not go directly from observable input to publicly observable responses. In our examination of the behavior of these algorithms we shall utilize information structures that represent some aspect of what the subject might be expected to know about the sentence as a result of the particular cognitive process we are simulating.

 

We shall start out with some broad decomposition claims about the nature and sequence of the processes and about the information structures involved in comprehending sentences. We shall then examine each of these processes in more detail by examining further decomposition claims that might be made about them.

 

The cognitive processes might, very roughly, be separated into four components: parsing (the recognition and initial linguistic processing of the sentence), sentence comprehension, sentence execution, and response generation if a response or acknowledgement is required. We normally expect these processes to occur in that order. But it is certainly possible that some initial part of the sentence would start to elicit a behavioral response before all of the sentence had been heard. For instance, if someone called out:

 

“Look at . . .”, we might start looking where he was looking or pointing before we had heard the rest of the sentence.

 

A major problem, therefore, in decomposing the cognitive processing into successive steps is to find out how much of the sentence is analyzed by one process before other, successive processes are called into operation to deal with preceding parts of the sentence. In this model the whole sentence is analyzed by a given processing stage before it is passed on to the next. Thus we assume that the whole sentence is recognized before any of it is analyzed logically to find anaphoric references, to deal with redundancy and negation, or to detect semantic anomalies.

 

But it is quite likely that subjects ‘comprehend’ at least some initial parts of a long sentence before reading or listening to the remainder of the sentence. Thus if the instructions or some stimulus material is presented orally, the subject may interrupt during the middle of a sentence to ask about something at the beginning of the sentence.

 

In this model the first process, parsing, is represented as a process converting the initial string of words into a linguistic deep structure tree representation of the sentence. The second process, sentence comprehension, is represented as a conversion of the linguistic deep structure into various standardized logical forms. This process includes anaphoric reference resolution. It also includes the processing of semantic problems arising from the interpretation of negation or of redundancy, or from the detection of tautologies or contradictions.

 

The third process, the execution, is represented in terms of a set of algorithms that act in response to the sentence by locating the referents and by invoking whatever algorithms may be needed to produce the action the sentence calls for, depending of course on whether it is an assertion, a question, or a command. The algorithm that locates the referents selects those that fit the restrictions that might be implied from relative clauses or from conditionals. If an assertion is involved, the representation of the knowledge about the given referent is modified if necessary to reflect the information obtained from the assertion. In the case of questions and commands, the algorithms associated with the use of the given predicate in that context are invoked and executed.

 

The fourth process, response generation, is primarily relevant for answering questions or for acknowledging assertions or commands. Most of the responses of the subject, however, are due to commands such as “Evaluate Bill!” or “Describe Mary!” These responses are part of the algorithms associated with the predicates that carry the meaning of the command.

 

There are at least two information structures involved in the sentence processing besides those associated with the particular words that are part of the sentence under consideration. On one hand, we have the sentence as a stimulus configuration, as a string of words. Since some words may be kept in memory while others are being processed, some memory structures may well be associated with this input process, especially for recognizing speech. Such a memory representation of parts of the input string would certainly be required if we assumed the processes involved look ahead or back up. This model utilizes back-up.

 

On the other hand, we assume that a cognitive representation of the meaning of the sentence is being built up as the various processes act on the sentence. Succeeding processes utilize the information gained from preceding processes. Thus the sentence-executing process acts on the representation of the meaning of the sentence that is generated by a comprehending process. Similarly the response generating process must know what the result of executing the sentence has been, since this determines what response should be given. But this information must be passed in some form from one process to the next. We shall therefore assume that there is an information structure that is constructed by the various processes. Later processes then utilize the information found in this structure and may add further information.
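The control flow assumed here can be sketched as follows. The sketch is only a schematic Python rendering of the four hypothesized processes and of the shared information structure they build up; the stage bodies are placeholders, not the model's algorithms.

# Schematic sketch: each stage adds its results to a shared 'cognitive
# structure' that the later stages read.  The stage bodies are placeholders.
def parse(words):                    # recognition; builds the deep structure tree
    return ("PROP", list(words))

def comprehend(structure, memory):   # logical forms, anaphora, negation, redundancy
    return ("CNF", structure["deep_tree"])

def execute(structure, memory):      # locate referents and act on the predication
    return ("DONE", structure["logic"])

def respond(structure):              # answer or acknowledgement
    return "Okay."

def process_sentence(words, memory):
    structure = {"words": list(words)}
    structure["deep_tree"] = parse(words)
    structure["logic"] = comprehend(structure, memory)
    structure["result"] = execute(structure, memory)
    structure["response"] = respond(structure)
    return structure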

 

We further hypothesize that this information structure may be interpreted as representing the ‘meaning’ of the sentence. We shall also assume that it is this structure that is kept in memory and that later represents what the subject knows about the previous sentence. This structure then is referred to for pronouns and for indexical expressions such as “those students”, to locate referents. Another assumption we make is that aspects of this structure are utilized to represent the ‘meaning’ of theories and instructions. We can therefore hypothesize that it is information in this structure that is remembered as a result of learning new theories or new tasks, and that it is this information that is invoked when the theory is being applied or when the task is executed.

 

This information structure, as represented in the model, may at best be claimed to be functionally analogous to whatever cognitive representation of the meaning of the sentence the subject may be constructing as he deals with the sentence. The only positive analogy between the two representations therefore is based on the claim that the information structure in the model carries sufficient information to allow us to simulate some of the capabilities of the subject to cope with simple sentences in an impression formation or person evaluation experiment. It also allows us to simulate the capability of the subject to deal with pronouns or indexical expressions. It furthermore allows us to simulate the ability of the subject to learn new theories and tasks, to apply or test the theories and execute the tasks, and it allows us to more or less simulate the ability of the subject to describe these theories and tasks. The only support we can generate, therefore, is for decomposition claims concerning what sort of information should be represented. Beyond this it can only be claimed that the tree structure representation used in the model seems to be functionally adequate at least for the simple sentences that are accepted in this model. We shall come back to the problem of evaluating this representation when discussing the role it plays in simulating some of these capabilities of the subject.

 

Before going on to examine the different processes, let us briefly discuss some of the general contexts in which sentences may be used, and the type of sentences that may be involved. There are four general contexts in which sentences are processed in the model. The initial context is one of cooperation with the experimenter when any of the acceptable sentences are processed and acted on immediately.

 

In another context, the subject has been told that the sentences to follow are instructions for an experimental task. In the model this context is brought about through the command “Take instructions!” In this context most commands are not executed but rather remembered as parts of the task to be learned. Assertions and questions as well as special commands such as “Quit!” or “End instructions!” are intercepted. The model will then ask whether these statements are part of the task or whether they should be executed immediately. Normal sentences in this context would be commands that specify an impression formation or person evaluation task, such as: “Read an assertion!” “Make inferences!” “Describe the individual!” or “Evaluate him!” Conditional commands can also be used to simulate some of the behavior of individuals in impression formation tasks, such as: “If an intelligent student who is pleasant is present, approach him!” There are also a number of commands that have special meanings in this context, such as “Label!”, “Go-to . . .!” and “Repeat!” These commands allow some control over the sequencing of instructions in executing the task.

 

For the third context, the subject has been told to read an assertion, or a noun-phrase. This may be a part of a task in which the subject is supposed to read the description of an individual whom he is then supposed to evaluate. If the sentence read is not of the type he has been told to read, the model interrupts and asks the experimenter what to do.

 

The fourth context is one in which theories or assumptions are defined. These theories allow us to simulate the ability of the subject to make inferences about the personality of the individual from the information that he knows about him. These theories are normally expressed as general assertions, such as “Intelligent students who are talented are pleasant”. The same information may of course also be expressed as a conditional assertion: “If a student is intelligent and talented, then he is pleasant”. These assertions are not executed but rather remembered as the definition of the theory. Theories can be tested to see whether they hold true of the individuals the subject knows about, or they can be applied, to make inferences. In this context questions and commands are intercepted and executed immediately.

 

Since the model is primarily concerned with sentences describing the characteristics of individuals, and with simple commands, only sentences containing at most one referent are accepted. This restriction greatly simplifies the kind of logical and other semantic processing required. All sentences must be in the present tense and may not contain adverbs or modals. This restriction again simplifies the problem since no logical and semantic representations for episodic information or for modals are necessary.

 

The sentences must be composed of words that have been defined in terms of their role in that sentence. Thus the term ‘bill’ could be a common noun or a verb as well as a name for an individual. It could be used in all these roles, but it can be recognized, say, as a verb only if it has been defined as a verb. Ambiguity in the use of a word for the same function is not accepted. If the term ‘Bill’ is used as a name for an individual, it can only represent one individual. The only exception to this, and only for individuals, is if the model has been told to imagine that individual. In that case the next references, using that name, are to an individual of the same type, i.e. a person and a student, but without any known characteristics. This allows simulation of the description of fictitious individuals in experimental settings. Reference to the original individual can be restored if the model is told to unimagine the individual.

 

For ambiguous sentences, the model assigns a single interpretation. While subjects also do not normally stop and ask questions to disambiguate sentences, they may do so occasionally. This capability is not represented in the model.

 

We now present the ability of the model to simulate the four main processes that have been hypothesized. For each process we shall consider only a few sample sentences that bring out some of the features of the process that might be theoretically relevant. The tree structures displayed by the model have the roots on the left and branch to the right and downwards.

 

 

Parsing

 

It is assumed that there is some process whereby the subject recognizes a list of words as being a sentence by finding a pattern, a structure or relationship between the words. We differentiate between two such structures.

 

The surface structure reflects some of the linguistic relationships between the words. The sequential ordering of the words in the sentence is maintained. There is some evidence from psycholinguistic investigations that the surface structure may reflect something about the cognitive processes involved in sentence recognition. For a general review of some of these results and their implications for sentence recognition and sentence comprehension see Bever (1968).

 

In the model the sentence recognition algorithm is decomposed so that the sub-algorithms essentially reflect the nodes that may appear in the surface tree. The main algorithms therefore deal with sentence composition, with referring expressions (NP’s and relative clauses), and with predicating expressions (VP’s). The flow of control between these algorithms reflects the structural relationships between the nodes in the surface tree representation for the sentence being parsed. While the algorithms can trace out a surface tree, that structure is not remembered as part of the information being built up about the sentence (e.g., Chomsky, 1965).

 

While the surface tree reflects the linguistic roles and relationships of the words in the sentence, the deep structure represents more of the semantic structure of the sentence. In the model the deep structure tree becomes part of the cognitive representation for the sentence. The deep structure is developed in parsing the sentence. Sentence recognition is a byproduct of parsing. The different parts of this structure are translated into the logical representation that allows for the semantic processing that will be discussed under the next heading.

 

Parsing, and the construction of the deep structure tree, is accomplished by a hierarchy of algorithms. The sentence level algorithms deal with sentential connectives such as ‘if-then’. The sentence type, such as assertion, question, or imperative, is identified mainly from the structure of the sentence. These algorithms in turn call on the algorithms for referring and predicating expressions.

 

The algorithms for referring expressions again have to deal with connectives such as ‘and’ for conjoined noun phrases. Four main modes of referring are differentiated. The most direct type of reference is through naming with terms such as ‘Bill’ and ‘Task1’. A second and third mode of referring depend on earlier parts of the sentence or on references made in earlier sentences. The second type uses pronouns and the third type uses indexical expressions such as “those students”. A fourth mode of referring depends on a description of the intended referent through the use of terms such as ‘student’. For this type we differentiate between definite descriptions such as “the students . . .” and indefinite descriptions. The third and fourth mode of referring may also use adjectival modifiers such as in “the intelligent student” and relative clauses “the students that are intelligent” to help select the intended referents. Complete referring expressions may utilize more than one mode, such as in: “those of them who are intelligent”. In cases of reference by naming, relative clauses are interpreted as nonrestrictive.
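The four modes of referring might be summarized in the following sketch. It is an illustration in Python of the distinctions just drawn, with hypothetical structures standing in for the model's memory representations; it is not a rendering of the actual algorithms.

def resolve_reference(ref, memory, last_referents):
    # ref is a small description of the referring expression, e.g.
    # {"mode": "description", "class": "student",
    #  "restrictions": ["intelligent", "industrious"]}
    mode = ref["mode"]
    if mode == "name":                            # 'Bill', 'Task1'
        return [ref["term"]]
    if mode == "pronoun":                         # 'him', 'they'
        return list(last_referents)
    if mode == "indexical":                       # 'those students'
        return [r for r in last_referents
                if ref["class"] in memory[r]["classes"]]
    if mode == "description":                     # '(the) intelligent student'
        return [r for r, entry in memory.items()
                if ref["class"] in entry["classes"]
                and all(t in entry["traits"] for t in ref.get("restrictions", []))]
    raise ValueError("unknown mode of referring: " + mode)

# memory = {"Bill": {"classes": {"person", "student"},
#                    "traits": {"intelligent", "industrious"}}}
# resolve_reference({"mode": "description", "class": "student",
#                    "restrictions": ["intelligent"]}, memory, []) -> ['Bill']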

 

Since the model accepts only the present tense, and at most one referring expression per basic sentence, the predicating expressions consist only of connectives, auxiliaries, adjectives, and verbs in the present tense. Verbs in present or past participial form can also be used as adjectives, as in “Bill is working”. No differentiation in meaning is made between this sentence and the sentence “Bill works”. Since, in the model, words are stored in uninflected form, the algorithms for both predicating and referring expressions call on yet other algorithms to recognize words.

 

To illustrate the surface and deep structure trees we shall choose a sentence that was used in chapter one to define a conditional task.

 

?If an intelligent student who is industrious is present, approach him!

ss conn IF
   s ref descr indef det AN
                     mod adj INTELLIGENT
                     class STUDENT
                     rs rpron WHO
                        pred aux IS
                                 adj INDUSTRIOUS
     pred aux IS
              adj PRESENT
   ss s pred verb APPROACH
        ref pron HIM

 

This statement is an imperative.

 

Deep structure tree:

COND PROP REF DESCR TYPE INDEF
                    QUANT ALL
                    QTERM AN
                    NUM SING
                    CLASS STUDENT
                    QUAL AND INTELLIGENT
                             INDUSTRIOUS
          PRED PRESENT
     PROP REF PRON TYPE DEF
                   QUANT NIL
                   QTERM NIL
                   NUM SING
                   TERM HIM
          PRED APPROACH

 

 

The main concern of the sentence recognition algorithm in the model is to deal with sentences that may be used in impression formation and person evaluation studies. Since these studies have emphasized the integration of stimulus information, a particular concern was to cope with sentences that convey more than one personality attribution by the use of connectives in predicating expressions and in relative clauses. Unfortunately, ambiguities arise if more than one kind of connective is used in the same expression. Consider for instance the sentence: “Bill is honest and talented or bold and intelligent”. There are several interpretations, depending on how the descriptive terms are clustered with the connectives. We might have “Bill is (honest and talented) or (bold and intelligent)”, so that Bill might not be honest, or we might interpret the sentence as “Bill is honest and (talented or bold) and intelligent”. According to the second interpretation the sentence definitely asserts that Bill is honest. In the model only one syntactic interpretation is found.

 

?Bill is honest and talented or bold and intelligent.

ss s ref name BILL
     pred aux IS
          adj HONEST
          conn AND
          adj TALENTED
          conn OR
          adj BOLD
          conn AND
          adj INTELLIGENT

 

This statement is an assertion.

 

Deep structure tree:

PROP REF NAME BILL
     PRED OR AND HONEST
                 TALENTED
             AND BOLD
                 INTELLIGENT

 

 

The model assigns the interpretation by a fixed set of rules that does not consider the meaning of the descriptive terms. While most individuals do not seem to notice the ambiguity of such sentences, it seems likely that they will consider the meaning of the terms in arriving at an interpretation. One could easily substitute different algorithms in the model to arrive at different interpretations. For instance one could try out algorithms that utilize the evaluative ratings associated with the descriptive terms. These interpretations could then be tested empirically since they might lead to different evaluations. In this way, it is hoped, the model might be of use in uncovering linguistic and semantic rules that better describe how subjects disambiguate such sentences. Similar considerations apply to the interpretation of negation.
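As an illustration of what such a meaning-blind rule looks like, the sketch below groups the descriptive terms purely from the sequence of connectives, taking ‘and’ to bind more tightly than ‘or’. A rule of this kind reproduces the grouping shown in the deep structure tree above, but it is offered only as one hypothetical possibility, not as the model's actual rule.

# The two readings mentioned above, written as nested tuples:
reading_1 = ("OR",  ("AND", "honest", "talented"),
                    ("AND", "bold", "intelligent"))      # honest not guaranteed
reading_2 = ("AND", "honest",
                    ("OR", "talented", "bold"),
                    "intelligent")                        # honest guaranteed

def group(tokens):
    # tokens alternate terms and connectives, e.g.
    # ["honest", "and", "talented", "or", "bold", "and", "intelligent"];
    # 'and' is taken to bind more tightly than 'or'.
    disjuncts, conjunct = [], [tokens[0]]
    for conn, term in zip(tokens[1::2], tokens[2::2]):
        if conn == "and":
            conjunct.append(term)
        else:                                # an 'or' closes the current conjunction
            disjuncts.append(("AND", *conjunct))
            conjunct = [term]
    disjuncts.append(("AND", *conjunct))
    return disjuncts[0] if len(disjuncts) == 1 else ("OR", *disjuncts)

# group(["honest", "and", "talented", "or", "bold", "and", "intelligent"])
#   -> ('OR', ('AND', 'honest', 'talented'), ('AND', 'bold', 'intelligent'))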

 

Sentence comprehension

 

Above we considered how the sentence might be recognized, and how the relationship between the words that is reflected in the deep structure tree might be determined. In this section we shall address the problem of how the information in the sentence might be further interpreted. We have to find a representation that allows us to simulate how the subject acts on the sentence. We would also like to simulate his ability to integrate the information in the sentence, to deal with redundancy and negation, and to detect semantic anomalies such as tautologies and contradictions. For this reason a logical representation was chosen.

 

There seems to be little direct evidence about the nature of the cognitive representation of the semantic information in the sentence. The main support for a logical representation therefore is its functional adequacy in allowing us to simulate the knowledge and behavior of the subject. For a discussion of logical representations from a linguistic point of view, see Lakoff (1970). For a discussion of logical representations from a psychological perspective, and of their use in portraying the structure of semantic memory, see Kintsch (1972).

 

Since we are primarily concerned with the usefulness of a sentence processing model as an account for impression formation and person evaluation tasks, we shall not consider any of the details associated with translating the deep structure tree into a logical representation. It might just be mentioned that indexical and descriptive referring expressions are seen as specifying classes of referents, where the variable terms such as ‘student’ specify possible members of the class, and where adjectival modifiers and restrictive clauses are seen as restricting the membership in the class. The referents for pronouns and indexical expressions that depend on previous sentences are located at this point. We shall just illustrate the translation with the two examples used above.

 

?Bill is honest and talented or bold and intelligent.

 

This statement is an assertion.

 

Deep structure tree:

PROP REF NAME BILL
     PRED OR AND HONEST
                 TALENTED
             AND BOLD
                 INTELLIGENT


Number of variables: 1
Number of quantified variables: 0
Imbedded quantifiers: No
Top level existential quantifiers: No
Definite description or indexical reference: No

Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

--Predication:
     OR AND HONEST X1
            TALENTED X1
        AND BOLD X1
            INTELLIGENT X1

 

?If an intelligent student who is industrious is present, approach him!

 

This statement is an imperative.

 

Deep structure tree:

COND PROP REF DESCR TYPE INDEF
                    QUANT ALL
                    QTERM AN
                    NUM SING
                    CLASS STUDENT
                    QUAL AND INTELLIGENT
                             INDUSTRIOUS
          PRED PRESENT
     PROP REF PRON TYPE DEF
                   QUANT NIL
                   QTERM NIL
                   NUM SING
                   TERM HIM
          PRED APPROACH

 

The quantifier: (ALL X1) was moved from argument: 1
  of connective: COND
  and advanced over the connective to deal with pronoun reference.

 

Number of variables: 1
Number of quantified variables: 1
Imbedded quantifiers: No
Top level existential quantifiers: No
Definite description or indexical reference: Yes

 

Reference description:
  Var  Classv  Term     Type   Num   Quant  Qterm  Descr  Class
  X1   CX1     STUDENT  INDEF  SING  ALL    AN     DESCR

  Reference restrictions: (ALL X1) IMPLY AND INDUSTRIOUS X1
                                             INTELLIGENT X1
                                         MEMB X1
                                              CX1

--Predication:
     (ALL X1) COND PRESENT X1
                   APPROACH X1

 

 

There are three main types of standardized logical representations that are used in the model. There is a representation for conditional commands. A second type of representation uses the conjunctive normal form. The third type of representation is an exclusively disjunctive normal form.

 

Imperatives like the statement “If an intelligent student who is industrious is present, approach him!” are conditional in that they apply only under certain circumstances and to a restricted set of persons. We therefore analyze such a conditional imperative as being composed of the question: “Is an intelligent and industrious student present?”, and the command: “approach that student”. For our example we therefore need a representation that allows us to apply this question to each student so that the execution of the command can be conditional on the answer to the question. For a discussion of a logic of commands that includes this conditional feature see Rescher (1966).

 

 

?If an intelligent student who is industrious is present, approach him!

 

Reference description:
  Var  Classv  Term     Type   Num   Quant  Qterm  Descr  Class
  X1   CX1     STUDENT  INDEF  SING  ALL    AN     DESCR

  Reference restrictions: (ALL X1) IMPLY AND INDUSTRIOUS X1
                                             INTELLIGENT X1
                                         MEMB X1
                                              CX1


--Command normal form:
     (ALL X1) COND PRESENT X1
                   APPROACH X1
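The way a representation of this kind is used can be sketched as follows: the question part is asked of each located referent, and the command part is executed only where the answer is positive. The sketch is an illustrative Python rendering with hypothetical helper functions, not the model's code.

def execute_conditional(referents, question, command, memory):
    # Ask the question of each located referent and execute the command
    # only where the answer is positive.
    acted_on = []
    for person in referents:
        if question(person, memory):          # e.g. "is this student present?"
            command(person, memory)           # e.g. "approach him"
            acted_on.append(person)
    return acted_on

def is_present(person, memory):
    return "present" in memory[person]["traits"]

def approach(person, memory):
    memory[person].setdefault("actions", []).append("approach")

# memory = {"Bill": {"traits": {"intelligent", "industrious", "present"}}}
# execute_conditional(["Bill"], is_present, approach, memory) -> ['Bill']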

 

 

Assertions can also be interpreted as commands that tell the subject to remember some facts about some individual. Assertions can also be conditional, as in the sentence “If an intelligent student is lazy then he is bold”. This assertion could also be used as a theory that only applies to a restricted class of individuals.

 

In considering the descriptive information conveyed by assertions, we can examine the certainty with which a given trait is asserted. For instance the sentence “Bill is honest and he is intelligent or industrious” assures us that Bill is honest. However it does not convey the same assurance about Bill’s intelligence. There of course are many ways of indicating degrees of certainty about descriptive traits, as for instance with modals: “Bill may be intelligent”. The only type of uncertainty that is represented in the model results from the consideration of alternative traits as in the sentences: “Bill is intelligent or industrious” and “Bill is intelligent unless he is industrious”. The conjunctive normal form (CNF) allows us to differentiate between the information that is asserted with certainty, and that information for which alternative interpretations are possible. The traits about which there is certainty are directly dominated by the main connective ‘and’, while the others are also dominated by the connective ‘or’.

 

 

?Bill is honest and he is intelligent or industrious.

 

Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

--Conjunctive normal form:
     AND HONEST X1
         OR INDUSTRIOUS X1
            INTELLIGENT X1
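The distinction the conjunctive normal form provides can be sketched very simply: if the clauses are kept as a list of sets of traits, the singleton clauses are the traits asserted with certainty and the larger clauses hold the alternatives. The Python rendering below is only an illustration of this reading of the CNF.

cnf = [{"honest"}, {"intelligent", "industrious"}]   # "Bill is honest and he is
                                                     #  intelligent or industrious"

def certain_traits(cnf):
    # Singleton clauses are the traits asserted with certainty.
    return {t for clause in cnf if len(clause) == 1 for t in clause}

def uncertain_alternatives(cnf):
    # Larger clauses hold the alternatives.
    return [clause for clause in cnf if len(clause) > 1]

# certain_traits(cnf)          -> {'honest'}
# uncertain_alternatives(cnf)  -> [{'intelligent', 'industrious'}]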

 

 

The exclusively disjunctive form (XDNF) is useful for deriving all the possible conjunctions of traits that may be applied to the individual as the result of an assertion. In other words, if we see the personality of an individual as represented by a conjunction of all the traits that apply to him then this representation gives us a list of the different possible personalities, the different trait configurations that might be inferred from the sentence. From “Bill is honest and he is intelligent or industrious” we can infer that he might be honest, intelligent, and industrious. On the other hand he may also be honest, intelligent, and lazy, or he may be honest, unintelligent, and industrious.

 

 

 

?Bill is honest and he is intelligent or industrious.

 

Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

--Disjunctive normal form(s):
     OR AND HONEST X1
            INDUSTRIOUS X1
        AND HONEST X1
            INTELLIGENT X1

 

 

In all interpretations he is seen as honest, since that was the trait that was asserted with certainty. This logical representation captures our conceptualization of how an impression of the individual is formed. According to this conceptualization, the alternative trait configurations correspond to alternative impressions, alternative interpretations of the personality of the individual. In the model each of these alternatives is considered in making an evaluation of the individual. If all the traits in a description are asserted with certainty, for instance, if only the connective ‘and’ is used, then there is only one trait configuration.

 

Unfortunately this conceptualization is still not adequate for representing how a subject actually interprets the information from a sentence, in that some fairly simple sentences such as “Bill is honest and talented or bold and intelligent” would lead to a large number of alternative impressions. It seems unlikely that the subject actually considers all these alternatives in making an evaluation.
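The enumeration of alternative impressions can be sketched from the clause representation by choosing one trait from every clause, which also makes the multiplicative growth just noted visible. The sketch below is only an illustration; the model's exclusively disjunctive form additionally expands the alternatives along the trait dimensions, as the trace that follows shows.

from itertools import product

def impressions(cnf):
    # cnf is a list of clauses (sets of traits); an impression is formed by
    # choosing one trait from every clause.
    return {frozenset(choice) for choice in product(*cnf)}

cnf = [{"honest"}, {"intelligent", "industrious"}]
# impressions(cnf) -> {frozenset({'honest', 'intelligent'}),
#                      frozenset({'honest', 'industrious'})}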

 

 

?Bill is honest and talented or bold and intelligent.


Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

--Conjunctive normal form:
     AND OR BOLD X1
            HONEST X1
         OR BOLD X1
            TALENTED X1
         OR HONEST X1
            INTELLIGENT X1
         OR INTELLIGENT X1
            TALENTED X1

--Disjunctive normal form(s):
     OR AND BOLD X1
            INTELLIGENT X1
        AND HONEST X1
            TALENTED X1

 

11 alternative interpretations are possible.

The dimension ‘competence’ is composed of the adjectives ‘talented’, ‘competent’, and ‘incompetent’.

 

The dimension ‘decisiveness’ refers to the traits ‘bold’, ‘cautious’, and ‘indecisive’.

 

XOR AND BOLD X1
        HONEST X1
        INTELLIGENT X1
        TALENTED X1
    AND BOLD X1
        COMPETENT X1
        INTELLIGENT X1
        LIE X1
    AND BOLD X1
        COMPETENT X1
        HONEST X1
        INTELLIGENT X1
    AND BOLD X1
        INCOMPETENT X1
        INTELLIGENT X1
        LIE X1
    AND BOLD X1
        HONEST X1
        INCOMPETENT X1
        INTELLIGENT X1
    AND BOLD X1
        INTELLIGENT X1
        LIE X1
        TALENTED X1
    AND CAUTIOUS X1
        HONEST X1
        TALENTED X1
        UNINTELLIGENT X1
    AND CAUTIOUS X1
        HONEST X1
        INTELLIGENT X1
        TALENTED X1
    AND HONEST X1
        INDECISIVE X1
        TALENTED X1
        UNINTELLIGENT X1
    AND HONEST X1
        INDECISIVE X1
        INTELLIGENT X1
        TALENTED X1
    AND BOLD X1
        HONEST X1
        TALENTED X1
        UNINTELLIGENT X1

 

 

We now come to two further issues related to the interpretation of sentences. The first of these concerns redundancy. The second relates to the interpretation of negation.

 

Two types of redundancy will be considered in this context. The first, which we shall call logical redundancy, arises if a term is repeated in a conjunction or disjunction of traits, as in “Bill is intelligent and intelligent”. It may also appear if a term in the reference restrictions is repeated in the predication: “An intelligent student is intelligent and industrious”. In the model, the repetitions are simply ignored. A whole disjunction may be removed if one of the terms was asserted as certain: “Bill is intelligent and he is intelligent or bold”.

 

The second, semantic redundancy, arises if two terms have similar meanings, so that one term implies the other. An obvious case would be the sentence “Bill is over 5’ 10” tall, and he is also over 6’.” In terms of a measurement representation we might think of these descriptors as lying on a Guttman scale. The same may be true of traits such as ‘indecisive’ and ‘cautious’, where a person who is characterized as indecisive is also believed to show all the traits of a cautious person. In the model we have conceptualized all the trait dimensions as bipolar Guttman scales, where the ordering on the scale is given by the ratings of the terms on one of the evaluative dimensions. So in encountering a conjunction such as “indecisive and cautious” the more extreme trait, i.e. the more negatively evaluated term if both terms are evaluated negatively, is seen as implying the other trait. The less extreme trait can therefore be ignored as redundant.
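A minimal sketch of this redundancy rule follows, assuming each dimension is given as a table of evaluative ratings (the values defined in chapter four) and taking 0.5 as the neutral point, an assumption made only for the illustration; the model's own trace for this case is shown below.

decisiveness = {"bold": 0.7, "cautious": 0.4, "indecisive": 0.1}

def drop_redundant(traits, dimension, neutral=0.5):
    # Of several conjoined traits on the same pole of the dimension, keep
    # only the most extreme one; the others are implied and hence redundant.
    on_dim = [t for t in traits if t in dimension]
    for same_pole in ([t for t in on_dim if dimension[t] < neutral],
                      [t for t in on_dim if dimension[t] >= neutral]):
        if len(same_pole) > 1:
            extreme = max(same_pole, key=lambda t: abs(dimension[t] - neutral))
            traits = [t for t in traits if t not in same_pole or t == extreme]
    return traits

# drop_redundant(["indecisive", "cautious"], decisiveness) -> ['indecisive']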

 

 

?Bill is indecisive and cautious.

 

Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

Semantic redundancy on dimension: DECISIVENESS
  before: AND INDECISIVE X1
              CAUTIOUS X1
  after:  AND INDECISIVE X1

--Conjunctive normal form:
     INDECISIVE X1

--Disjunctive normal form(s):
     INDECISIVE X1

This statement is unambiguous.

 

 

The interpretation of negation is also based on the dimensional conceptualization of the trait terms. In this case if a trait is negated then one of the other terms on the dimension must apply. Here again there may be a redundancy effect. If a less extreme trait is negated, such as in “Bill is not cautious”, then it can be assumed that the more extreme trait is negated as well.
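Under the same dimensional assumption, the interpretation of negation might be sketched as follows, again taking 0.5 as the neutral point for the illustration only; the model's traces for “not indecisive” and “not cautious” follow.

decisiveness = {"bold": 0.7, "cautious": 0.4, "indecisive": 0.1}

def interpret_negation(trait, dimension, neutral=0.5):
    # Negating a trait leaves a disjunction of the other terms on its
    # dimension; traits on the same pole that are more extreme than the
    # negated one are dropped as well (negative redundancy).
    value = dimension[trait]
    same_pole = lambda v: (v < neutral) == (value < neutral)
    return [t for t in dimension if t != trait
            and not (same_pole(dimension[t])
                     and abs(dimension[t] - neutral) > abs(value - neutral))]

# interpret_negation("indecisive", decisiveness) -> ['bold', 'cautious']
# interpret_negation("cautious", decisiveness)   -> ['bold']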

 

 

 

?Bill is not indecisive.

 

Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

* Interpreting negation *  expression: NOT INDECISIVE X1

Checking for negative redundancy on the dimension: DECISIVENESS
  before: OR BOLD
             CAUTIOUS
  after:  OR CAUTIOUS
             BOLD

--Conjunctive normal form:
     OR CAUTIOUS X1
        BOLD X1

--Disjunctive normal form(s):
     OR CAUTIOUS X1
        BOLD X1

 

?Bill is not cautious.

 

Reference description:
  Var  Classv  Term  Type  Num   Quant  Qterm  Descr  Class
  X1           BILL  NAME  SING

* Interpreting negation *  expression: NOT CAUTIOUS X1

Checking for negative redundancy on the dimension: DECISIVENESS
  before: OR BOLD
             INDECISIVE
  after:  BOLD

--Conjunctive normal form:
     BOLD X1

--Disjunctive normal form(s):
     BOLD X1

This statement is unambiguous.

 

 

We have not discussed the detection of tautologies, contradictions and other anomalies in the sentence. Tautologies and contradictions can easily be detected from the logical forms. Semantic anomalies may become apparent during an examination of redundancy and negation. Thus for instance the sentence “Bill is bold and indecisive” would be anomalous since the terms are on opposite poles of the same dimension. The relationship between the reference restrictions and the predication must also be examined in this context.

 

The different logical representations then become part of the cognitive structure that represents the meaning of the sentence. It is these logical forms that are used in executing or acting on the sentence. We shall just illustrate this cognitive structure for a couple of sentences.

 

 

?Bill is intelligent and lazy.

 

Internal “cognitive” representation of the information:

SENT ACT-REF : (BILL)
     CONTEXT# : ASSERTION
     DTREE : PROP REF NAME BILL
                  PRED AND INTELLIGENT
                           LAZY
     INT-REF : (BILL)
     POSS-REF : (BILL)
     PRED X1 CNF : AND INTELLIGENT X1
                       LAZY X1
     PROP PRED CNF : AND INTELLIGENT X1
                         LAZY X1
               DNF : AND INTELLIGENT X1
                         LAZY X1
               LOG : AND INTELLIGENT X1
                         LAZY X1
               TERMS : (INTELLIGENT LAZY)
               XDNF : AND INTELLIGENT X1
                          LAZY X1
     REF BILL VAR : X1
         VAR QUANTNO : 0
             VAR-L : (X1)
             VARNO : 1
             X1 NUM : SING
                TERM : BILL
                TYPE : NAME
     RETURN : (T)

 

?If an intelligent student who is industrious is present, approach him!

 

SENT ACT-REF : (BILL)
     CONTEXT# : IMPERATIVE
     DTREE : COND PROP REF DESCR TYPE INDEF
                            QUANT ALL
                            QTERM AN
                            NUM SING
                            CLASS STUDENT
                            QUAL AND INTELLIGENT
                                     INDUSTRIOUS
                  PRED PRESENT
             PROP REF PRON TYPE DEF
                           QUANT NIL
                           QTERM NIL
                           NUM SING
                           TERM HIM
                  PRED APPROACH

     INT-REF : (BILL)
     POSS-REF : (BILL MARY)
     PRED X1 CNF : OR ABSENT X1
                      APPROACH X1
             COMMAND LOG : (ALL X1) COND PRESENT X1
                                         APPROACH X1
             DNF : (ALL X1) OR ABSENT X1
                               APPROACH X1
             LOG : (ALL X1) COND PRESENT X1
                                 APPROACH X1
             TERMS : (APPROACH INDUSTRIOUS INTELLIGENT PRESENT)
             XDNF : (ALL X1) XOR AND ABSENT X1
                                     APPROACH X1
                                 AND ABSENT X1
                                     AVOID X1
                                 AND APPROACH X1
                                     PRESENT X1
     REF CLASS-L : (CX1)
         CX1 VAR : X1
             DEF-INDEX : T
         STUDENT CLASSV : CX1
                 VAR : X1
         TERMS : (HIM STUDENT)
         VAR QUANTNO : 1
             VAR-L : (X1)
             VARNO : 1
             X1 CLASSV : CX1
                CNF : AND INDUSTRIOUS X1
                          INTELLIGENT X1
                DESCR : DESCR
                DNF : AND INDUSTRIOUS X1
                          INTELLIGENT X1
                LOG : AND INTELLIGENT X1
                          INDUSTRIOUS X1
                NUM : SING
                QUANT : ALL
                REF-REST LOG : (ALL X1) IMPLY AND INDUSTRIOUS X1
                                                  INTELLIGENT X1
                                              MEMB X1
                                                   CX1
                TERM : STUDENT
                TYPE : INDEF
     RASSOC X1 : (BILL)
     RETURN : (T)
     RSUCCESS X1 : (ALL (BILL))

 

 

 

Before acting on the sentence, the global context has to be examined to see whether the sentence forms part of a definition of a task or of a theory. If so, the cognitive structure has to be examined to see whether the sentence fits into that context and thus should not be acted on at this time, or whether it is of a kind that calls for immediate execution. If the sentence is part of a task or of a theory, then this cognitive structure is remembered as part of the cognitive representation for that task or theory and is not executed. This cognitive structure then is used when the task is to be executed or when the theory is to be tested or applied.

 

 

Sentence execution

 

In acting on a sentence we have to identify the referents that might be involved before we can act on the predication. If there is no referring expression in the sentence, such as in “Quit!”, then the predication can be acted on immediately.

 

 

Most of the referring in the model is ultimately done by naming. In the statements “Bill is intelligent” or “Execute Task1” we directly name the individual or the concept to whom the predication is to be applied. We can also make assertions about groups of individuals, as long as the group has a name, such as ‘Group1’. But any assertion about such a group, such as “Group1 is pleasant” is remembered as a characterization of the group and no inferences about the individuals in the group are made. The model therefore cannot cope with statements such as “Group1 is working hard” where the normal inference would be that the individuals belonging to the group are working hard.

 

When a descriptive term such as ‘student’ is used, it is assumed to refer to a specific, finite set of individuals, where each of the individuals can be named. In the model this set is specified at the time that individual and class names are being defined, as illustrated at the end of chapter four. According to that example, the term ‘student’ refers to the members of Group1: Bill and Mary.

 

There are two interpretations for a generic statement such as “Students are intelligent”. On one hand, we might interpret it as a specific assertion that applies to the set of students that are known. In this case the predication “is intelligent” is applied to each of the specific individuals that has been located. So from “Students are intelligent” we would infer that Bill is intelligent and that Mary is intelligent.

 

On the other hand we might see the generic statement as a theory that could be applied to different individuals at different times. In the model we do not represent selective applications of the theory, where it might be applied to Bill but not to Mary. Rather, whenever the theory is applied, it is applied to all the students that are known at the time.

 

In reference by description, adjectival modifiers and relative clauses indicate reference restrictions. In “intelligent students who are industrious”, the term ‘student’ points to a set of possible referents. The intended referents must belong to this set and must also be characterized as being intelligent and industrious. Definite descriptions such as “the intelligent student who is industrious” may be seen as imposing restrictions on the size of the set of intended referents. If we know more than one such student then the referring expression above is ambiguous.

 

The two types of reference discussed above have involved a search of memory to identify and select the intended referents. Some pronouns and indexical expressions that do not relate back to previous references in the sentence can also be seen as involving a search of memory. In this case the memory of previous sentences has to be searched for the appropriate referents. We may illustrate with the sentences: “The technicians and Bill are industrious”, “They are intelligent and talented”. In this case the pronoun ‘they’ would be interpreted as referring to the technicians and to Bill. But if we had used the sentence “They are intelligent and he is talented” then the pronoun ‘he’ would refer to Bill, and ‘they’ would refer only to the technicians.

 

There presumably are many situations where the reference is not made by searching memory but rather by actively looking for the intended referent. For instance if someone said “This book is good” we might well look to see whether he is pointing at a particular book. The referring expression therefore invokes a cognitive process that is not particularly a search of memory. This type of reference is represented in the model through concepts such as ‘time’ and ‘date’. When these terms appear in a referring expression they invoke algorithms that determine the time or print out the date.

 

Finally there is a type of reference that does not involve either a search of memory or an active cognitive process. The most immediate example is quoting. Here the quoted expression is directly taken as the reference. Some abstract descriptive concepts could be interpreted as functioning in a similar manner in some contexts, where they are not used to point to particular, nameable objects or entities but where they function rather as modifiers for the predicates. For instance in the sentence “Provide information!” the term ‘information’ does not refer to particular pieces of information but rather modifies or restricts what is to be provided.

 

In chapter four we have already discussed the different contexts in which predicates may be used. One of these, in which the predicates are used to identify scale dimensions, is peculiar to evaluations and will not concern us here. Also, much of the cognitive processing involved in obeying commands is specific to the particular predicates in the command and therefore is not an aspect of general sentence processing. We shall therefore only be concerned with how these predicates are invoked. For assertions and questions we also have to consider what sort of effects the predicates may have. For assertions we therefore have to discuss how the memory structure is modified. For questions we have to consider how the result of the memory search is determined and how it is conveyed.

 

To start with, let us consider the memory structures built up by simple assertions. Let us start out with the representation of information about which the subject is certain, such as from the sentence “Bill is intelligent and industrious”.

 

 

Bill
    individual#    dispositions#    (and (attitude industrious t)
                                         (intelligence intelligent t))
                   member#          Group1
                   term#            Noun
                   type#            (person student)

 

Each trait is associated with its predicate dimension. The third term associated with each predicate is intended to indicate the permanence of the trait. There are some descriptors that are invariant over time, such as ‘male’ or ‘female’. Other attributions, here associated with a ‘t’, may change or may be reconsidered as a result of information from new assertions. In the model the invariant attributes are asserted as factual assumptions.

 

In the model, uncertain information is represented in terms of sets of alternative descriptors. Let us therefore examine the memory structure after we have added the assertion: “Bill is bold or talented”. For the sake of simplicity we shall leave out the nodes member#, term#, and type#.

 

 

 

 

Bill
    individual#    dispositions#    (and (attitude industrious t)
                                         (intelligence intelligent t))
                                    (or (or (decisiveness bold t)
                                            (competence talented t)))

 

 

There may of course be more than one disjunction of traits that is remembered about the individual.

 

To answer questions, a search is made of this tree structure to see whether the predicates are represented on the tree. If the question involves uncertain information, such as “Is Bill bold or talented?”, both nodes may have to be examined since the question can be answered positively if one of the traits is known with certainty, or if both traits are part of a disjunction of traits. A special problem arises if neither the trait nor its negation is represented. For instance if no assertion has been made about Bill’s intelligence then neither ‘intelligent’ nor ‘unintelligent’ will be represented on the tree. In this case, since we do not represent uncertainty outside of known alternatives, we have the problem of how to answer the questions: “Is Bill intelligent?” and “Is Bill unintelligent?” At present both questions are answered in the negative by the model. To deal with problems like this, a multi-valued logic would be needed. To further illustrate the problem, consider the conditional assertions and commands “If Bill is intelligent, he is talented” and “If Bill is intelligent, approach him!”
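The search that answers such questions can be sketched as follows, with certain traits stored directly and uncertain information stored as sets of alternatives; an unrepresented trait is answered in the negative, as just discussed. The Python rendering and its structures are illustrative only, not the model's tree representation.

memory = {"Bill": {"certain": {"intelligent", "industrious"},
                   "alternatives": [{"bold", "talented"}]}}

def assert_traits(person, certain=(), alternatives=None):
    entry = memory.setdefault(person, {"certain": set(), "alternatives": []})
    entry["certain"].update(certain)
    if alternatives:
        entry["alternatives"].append(set(alternatives))

def ask(person, traits):
    # Answer "is <person> t1 or t2 ...?"; a trait that is not represented
    # at all is simply answered in the negative.
    entry = memory.get(person, {"certain": set(), "alternatives": []})
    if entry["certain"] & set(traits):             # a trait known with certainty
        return True
    return set(traits) in entry["alternatives"]    # or the disjunction is stored

# ask("Bill", {"intelligent"})        -> True
# ask("Bill", {"bold", "talented"})   -> True
# ask("Bill", {"unintelligent"})      -> False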

 

 

As we can see from the above examples, for conditional assertions and commands the condition is interpreted as a question. In some cases where the condition applies to more than one referent, the command is applied only to those referents for whom the question is answered positively, as in “If an intelligent student who is industrious is present, approach him!”

 

It should be clear from the preceding discussion that sentences are always seen as involving a task or an implicit command, in the sense that some action has to be taken to deal with the sentence. Assertions are seen as carrying the implicit command that the information conveyed should be believed and remembered. Questions involve requests for searches of memory. Commands can also result in a hierarchy of actions to be taken as a result. For instance the command “Execute Task1!” involves remembering and acting on the instructions for Task1. One of these instructions may be the command to read an assertion. As conceptualized in the model, acting on the command “Execute Task1!” then involves acting on the sentence “Read an assertion!”, which in turn involves reading and acting on an assertion. We therefore have a hierarchy of actions corresponding to the different sentences. This is how the hierarchy of tasks that was discussed in chapter two is implemented in the model.

 

This hierarchy of sentences to be acted on forms a basic part of our conceptualization of the cognitive dynamic underlying the behavior of subjects. We have already discussed how the subject acts on the basis of instructions, such as “Execute Task1!” If we asked him why he was answering the questionnaire, he might give as a reason a reflection of this instruction, such as “I am doing Task1”. We might now extend this perspective to reasons for behavior for which the subject has not received explicit instructions. For instance, the subject might say that he is participating in the experiment because he is cooperating with the experimenter. According to our perspective, we might hypothesize that the subject is acting under an internal instruction to himself: “Cooperate with the experimenter!” In the model this is represented by the command “Cooperate!”, the top-level task which reads and acts on sentences. For a general discussion of a similar view of cognitive dynamic, from the perspective of plans, see Miller, Galanter and Pribram (1960).

 

We have already discussed the different memory representations that are involved in acting on sentences. The memory representation for the meaning and use of words does not normally change during the interaction and therefore would correspond to long-term memory. In the model, the memory representation for the individuals, tasks, and theories is built up and modified during the interaction, but it is not subject to forgetting, except in the cases where the model has been instructed to imagine the persons. The memory representation for sentences, however, can be forgotten. The string of words making up the sentence and the corresponding surface tree are forgotten as the sentence is processed. The deep structure tree that is part of the cognitive representation for the sentence is forgotten after two further sentences have been processed. The whole cognitive representation is also forgotten after four succeeding sentences have been dealt with, unless this cognitive representation has been remembered as a part of a theory or a task.
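The forgetting schedule for sentence representations can be sketched as follows; the record layout and the names are assumptions made for illustration, with the counts taken from the description above.

;; Each processed sentence leaves a record; after every further sentence
;; the records age by one.  The surface form is dropped at once, the
;; deep structure tree after two further sentences, and the whole
;; cognitive representation after four, unless it has been remembered
;; as part of a theory or a task.
(defstruct sentence-record surface deep cognitive (age 0) kept-p)

(defun age-and-forget (records)
  (loop for r in records
        do (incf (sentence-record-age r))
           (setf (sentence-record-surface r) nil)
           (when (>= (sentence-record-age r) 2)
             (setf (sentence-record-deep r) nil))
        unless (and (>= (sentence-record-age r) 4)
                    (not (sentence-record-kept-p r)))
          collect r))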

 

Finally, we should briefly discuss the generation of verbal responses. We differentiate between responses that are generated in consequence of specific commands, such as “Describe the individual!”, and responses that are generated as part of normal sentence processing. Except for answers to questions, the latter type of response is intended only as an acknowledgement.

 

Response generation is handled heuristically. The model does not generate complex sentences, and no unique cognitive structures or deep and surface structure trees are assembled for sentence generation. While there is a set of algorithms and information structures for recognizing sentences, no equivalent set of algorithms and information structures has been provided for generating sentences. Unlike the experimental instructions and the descriptive stimuli, the responses in impression formation and person evaluation studies have usually not been in sentence form, but rather have involved scale responses or the choice of fixed alternatives. A simulation of the cognitive processes involved in sentence generation, while of independent theoretical interest, was not considered crucial for this model.


Chapter 6

 

CONCLUSION

 

 

The computer simulation model discussed in this dissertation represents an attempt to construct an exactly specifiable, testable cognitive process model of some phenomena found in impression formation and person evaluation studies. The first two chapters illustrate how the model is minimally competent to simulate the ability of subjects to evaluate persons on rating scales, to make inferences, and to act on impressions. The computer model is able to deal with simplified but complete questionnaires that could be given to a subject. It is also able to engage in a limited verbal interaction with the experimenter. To simulate the cognitive processes of the subject in dealing with these impression formation and person evaluation tasks, the computer model simulates the subject’s ability to understand and act on sentences: the experimental instructions and the descriptive stimulus sentences.

 

There are two levels at which the model is intended to make theoretical contributions. At a very general level, the feasibility of constructing cognitive process models based on natural language comprehension is explored. At a more specific level, directly oriented toward impression formation, information integration, and person evaluation, a more integrative account for the subject’s ability to interpret descriptions, to form impressions, to make inferences, to evaluate individuals, and to act on impressions is suggested. This account includes some hypotheses about semantic effects such as redundancy, the interpretation of negation, and ambiguity.

 

Three types of redundancy are differentiated in the model. Logical redundancy arises from the repetition of terms in the description, while semantic redundancy arises if two terms have similar meanings so that one term implies the other. Inferential redundancy arises if the subject makes inferences about the personality of the individual and adds further trait descriptors on that basis. If the trait descriptor that would be added from an inference is already mentioned in the description, it may be considered redundant.
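These three cases can be sketched as follows; the two implication tables, standing in for the semantic meaning postulates and for the subject’s inference theories, are hypothetical stand-ins rather than the model’s own data structures.

;; A newly read descriptor is logically redundant if it repeats a term
;; already in the description, semantically redundant if an earlier term
;; already implies it by meaning, and inferentially redundant if it
;; would in any case be added by one of the subject's inferences.
(defparameter *meaning-postulates* '((brilliant . intelligent)))   ; hypothetical
(defparameter *inference-theories* '((industrious . reliable)))    ; hypothetical

(defun implied-p (term earlier table)
  (some (lambda (e) (eq (cdr (assoc e table)) term)) earlier))

(defun redundancy (term earlier-terms)
  (cond ((member term earlier-terms) 'logical)
        ((implied-p term earlier-terms *meaning-postulates*) 'semantic)
        ((implied-p term earlier-terms *inference-theories*) 'inferential)
        (t nil)))

(redundancy 'intelligent '(brilliant industrious))   ; => SEMANTIC
(redundancy 'reliable    '(brilliant industrious))   ; => INFERENTIAL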

 

The scale evaluation is analyzed or decomposed into a number of processes. The evaluative dimension is derived from the meanings of the scale end terms. To make an evaluation, one or more alternative impressions are considered, so that two levels of information integration are hypothesized. The first deals with the evaluation of a single coherent impression, conceptualized as a conjunction of traits. The second deals with the integration of the evaluations for each of these impressions to arrive at a single overall evaluation. Finally, this overall evaluation on the given dimension is translated into a scale choice.
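The two levels of integration can be sketched as follows; the numerical trait values and the averaging rules below are purely illustrative assumptions, since the model itself derives its evaluations from the meanings of the scale end terms rather than from a fixed table of numbers.

;; Each alternative impression, taken as a conjunction of traits, is
;; evaluated on the scale dimension; the impression evaluations are
;; then combined into one overall evaluation and mapped onto a scale
;; position.
(defparameter *likableness*
  '((intelligent . 2) (talented . 2) (bold . 1) (industrious . 1)))  ; illustrative values

(defun evaluate-impression (traits)
  (/ (reduce #'+ (mapcar (lambda (trait)
                           (or (cdr (assoc trait *likableness*)) 0))
                         traits))
     (length traits)))

(defun scale-choice (impressions)
  (round (/ (reduce #'+ (mapcar #'evaluate-impression impressions))
            (length impressions))))

(scale-choice '((intelligent industrious) (intelligent bold)))   ; => 2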

 

This approach has raised a number of questions about the interpretation of ambiguous and uncertain information that will be explored in future research. An extension of the model is also planned to include an account for adverbial modification of trait descriptions.

 

The ability of the subject to make inferences is conceptualized in terms of theories that the subject might bring to the experimental situation. These theories are represented in the model as general assertions that can be applied at the moment when inferences are to be made. The possibility that the subject’s impression of an individual might influence his behavior toward that individual is represented in the model. The behavior is conceptualized as a conditional task that is executed only in the appropriate circumstances.
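Such a theory can be sketched as a general assertion applied at inference time; the particular rule and the trait format below are illustrative assumptions only.

;; Any theory whose antecedent trait holds of the individual adds its
;; consequent trait to the impression when inferences are called for.
(defparameter *theories*
  '(((intelligence intelligent t) . (competence talented t))))   ; hypothetical rule

(defun apply-theories (traits)
  (remove-duplicates
   (append traits
           (loop for (antecedent . consequent) in *theories*
                 when (member antecedent traits :test #'equal)
                   collect consequent))
   :test #'equal))

(apply-theories '((intelligence intelligent t) (attitude industrious t)))
;; => ((INTELLIGENCE INTELLIGENT T) (ATTITUDE INDUSTRIOUS T) (COMPETENCE TALENTED T))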

 

At a general level, the model simulates at least some of the cognitive processes that are hypothesized to be involved in answering questionnaires in impression formation and person evaluation studies. The simulation is based on the assumption that comprehending and acting on sentences underlies most of the cognitive processing of the subject. The cognitive dynamic is conceptualized in terms of the ability of the subject to use the sentences in the instructions to build up a cognitive map based on the meaning of these sentences. This cognitive map then enables the subject to do the experimental task. The model simulates the subject’s sentence comprehension by constructing linguistic and logical representations for the meanings of sentences. The logical representation then provides a specification of the action to be taken. (For the descriptions, this logical representation specifies what information should be remembered about the individual, and what impressions could be formed.) The model then uses these representations of the meanings of sentences to simulate the subject’s ability to comprehend and act on instructions, and to assemble these instructions into a cognitive map for the complete task. The model also simulates the ability of the subject to describe the task.

 

Another aspect of cognitive processing is the ability of the subject to interrupt a task to interact with the experimenter, and then go back to the task. This is also simulated in the model.

 

It is hoped that this cognitive processing approach might lead to more general and more integrative accounts of experimental interactions. Underlying this approach is the view that experiments are simulations of psychological processes in naturally occurring social situations. According to this perspective, experiments, and the theoretical accounts of the processes and behaviors in experiments, have to be evaluated in their function as analogies to the processes and behaviors in the naturally occurring social situation. This dissertation thus explores the feasibility of constructing data structures and theoretical models that use linguistic and logical representations and algorithms, instead of numerical representations and functions, as the basis for analogy. This approach requires more complex data representations and algorithms, and it does not allow as easily for information reduction in the data, as in the averaging of responses; its advantage is that it allows for a closer analogy and an increased scope of account. The model presented here does not require translation of the stimulus sentences into numerical data structures. In addition to being more integrative in accounting for inferences and social behavior as well as for scale evaluations, the model also includes an account of the knowledge the subject is assumed to bring to the experimental situation, and an account of how the experimental instructions are interpreted and acted on.

 

As long as the main focus of experimental procedures is on the choice of fixed response alternatives, as in scale response behavior, an explicit account of sentence generating processes is not necessary. However, these measurement constraints could be seen as contributing to the negative analogy between naturally occurring social situations and experimental studies. An extension of this model is planned to include a simulation of the cognitive structures and processes involved in verbal responses.


Appendix

 

PROGRAM LISTING

 

 

 

The program is written in LISP (Hafner and Wilcox, 1974) for use under the Michigan Terminal System on the IBM System/360 Model 67 computer.

 

This volume does not contain a listing of the program. A taped version of the program and the program listing are available from the author at Queen’s University, Department of Psychology, Kingston, Ontario, Canada.


 

 

 

REFERENCES

 

Abelson, R. P. and Rosenberg, M. J. Symbolic psycho-logic: A model of attitudinal cognition. Behavioral Science, 1958, 3, 1-13.

Anderson, N. H. Integration theory and attitude change. Psychological Review, 1971, 78, 171-206.

Asch, S. E. Forming impressions of personality. Journal of Abnormal and Social Psychology, 1946, 41, 258-290.

Bales, R. F. Interaction process analysis. Cambridge, Mass.: Addison-Wesley, 1950.

Bever, T. G. A survey of some recent work in psycholinguistics. IBM Corp. Tech. Rep. No. 3, IV. Yorktown Heights, N.Y.: IBM Corp., 1968.

Burks, A. W. Cause, chance, and reason. Unpublished book, Department of Philosophy, The University of Michigan, 1963.

Burt, M. K. From deep to surface structure. New York: Harper & Row, 1971.

Cartwright, D. and Harary, F. Structural balance: A generalization of Heider’s theory. Psychological Review, 1956, 63, 277-293.

Chomsky, N. Aspects of the theory of syntax. Cambridge, Mass.: M.I.T. Press, 1965.

Coombs, C. H. A theory of data. New York: Wiley, 1964.

Dustin, D. S. and Baldwin, P. M. Redundancy in impression formation. Journal of Personality and Social Psychology, 1966, 3, 500-506.

Edwards, A. L. Techniques of attitude scale construction. New York: Appleton-Century-Crofts, 1957.

Fishbein, M. (Ed.) Readings in attitude theory and measurement. New York: Wiley, 1967.

Hafner, C. and Wilcox, B. Lisp/MTS programmer’s guide. Mental Health Research Institute Communication #302 and Information Processing Working Paper #21 (IP-21), Mental Health Research Institute, The University of Michigan, 1974.

Hempel, C. G. The logical analysis of psychology. Trans. W. Sellars. In H. Feigl and W. Sellars (Eds.), Readings in philosophical analysis. New York: Appleton-Century-Crofts, 1949.

Hesse, M. B. Models and analogies in science. Notre Dame: University of Notre Dame Press, 1966.

Howe, E. S. Passive transformation, cognitive imbalance, and evaluative meaning. Journal of Verbal Learning and Verbal Behavior, 1970, 9, 171-175.

Kelley, H. H. The warm-cold variable in first impressions of persons. Journal of Personality, 1950, 18, 431-439.

Lakoff, G. Linguistics and natural logic. Studies in Generative Semantics No. 1, Phonetics Lab, The University of Michigan, 1970.

Lindsay, P. H. and Norman, D. A. Human information processing: An introduction to psychology. New York: Academic Press, 1972.

Miller, G. A., Galanter, E., and Pribram, K. H. Plans and the structure of behavior. New York: Holt, Rinehart and Winston, 1960.

Mills, T. M. Power relations in three-person groups. American Sociological Review, 1953, 18, 351-357.

Osgood, C. E., Suci, G. J., and Tannenbaum, P. H. The measurement of meaning. Urbana: University of Illinois Press, 1957.

Rescher, N. The logic of commands. New York: Dover, 1966.

Slovic, P. and Lichtenstein, S. Comparison of Bayesian and regression approaches to the study of information processing in judgment. Organizational Behavior and Human Performance, 1971, 6, 649-744.

Stevens, S. S. Mathematics, measurement, and psychophysics. In S. S. Stevens (Ed.), Handbook of experimental psychology. New York: Wiley, 1951.

Tulving, E. Episodic and semantic memory. In E. Tulving and W. Donaldson (Eds.), Organization of memory. New York: Academic Press, 1972.

Tulving, E. and Donaldson, W. (Eds.) Organization of memory. New York: Academic Press, 1972.

von Königslöw, R. Semantic and probabilistic redundancy in impression formation. Unpublished research report, Department of Psychology, The University of Michigan, Ann Arbor, 1970.

Winograd, T. A program for understanding natural language. Cognitive Psychology, 1972, 3, 1-191.

Woods, W. A. and Kaplan, R. M. The lunar sciences natural language information system. BBN Report No. 2265, Bolt Beranek and Newman, Inc., Cambridge, Mass., 1971.