Causal models for diagnostic reasoning


Vijay R. Bandekar

Siemens AG, ZTI INF 3, 8000 Munich 83, Federal Republic of Germany

Theories of qualitative physics are crucial in developing knowledge-based systems in engineering. Basic models for diagnostic reasoning are based on causal connections between the system parameters. In this paper a knowledge representation and problem solving technique is presented as a ground-laying step. The technique is based on the concepts introduced by Iwasaki and Simon [9] and employs methods of analysis from the field of control engineering. The causal analysis of systems with feedback is done by using knowledge about their block diagrams. A constraint representing the feed-forward path is modified in order to eliminate the feedback parameter. This approach supports the confluence heuristic in de Kleer and Brown's qualitative physics [4]. Causal dependencies that partially describe the behaviour of a system are used to generate a search space for faulty components. This approach is based on reasoning about counterfactuals using modal categorizations, as proposed by Nicholas Rescher and Herbert Simon [12]. The scope of application of this method in real-time monitoring and diagnosis of large industrial processes is discussed.

Key Words: qualitative physics, structural equations, constraint net, causal ordering, feedback system, block diagram, diagnosis, monitoring, real-time.

1. INTRODUCTION

As Davis points out, 'an important part of the basic knowledge required for troubleshooting is understanding the mechanisms in terms of causalities' [3]. Diagnosing a malfunctioning device consists of pointing out its faulty components. Faults in a device are detected by comparing the expected and the observed behaviours of the device. For this purpose a description of the expected behaviour and the observed (measured) behaviour are required. Device behaviour is usually given by the relationships between its parameters. Several ways have been proposed to describe a device's behaviour. Davis [3], de Kleer and Brown [4], Forbus [7], Genesereth [8] and Iwasaki and Simon [9] each use a qualitative representation of the device behaviour. Causal dependencies between the parameters of a system may be derived from such qualitative descriptions either by constraint propagation [3,4,7] or from the structural equations using the method of comparative statics [9]. The behaviour of any device may also be partially described by a causality graph, which is a unidirectional relationship between the parameters of a given device. In diagnostic reasoning, we may either directly utilize information about the causal dependencies or use structural and behavioural information. Both de Kleer and Brown [4] and Iwasaki and Simon [9] permit the computation of causal dependencies. The existing model-based diagnosis systems [3,4,8] that use the structural and behavioural knowledge of the device deal basically with discrete systems.

The qualitative calculus [4] can produce ambiguous results when the propagation mechanism stops because of more than one unknown parameter in a confluence (qualitative differential equation). The presence of more than two parameters in a confluence indicates the existence of feedback. In case of ambiguities, de Kleer and Brown's qualitative physics introduces assumptions. Based on the assumptions and the choice of qualitative value, all possible solutions are obtained; these are called interpretations. Interpretations are the results of all consistent models, only one of which is authentic; the rest are spurious.

Paper accepted January 1989. Discussion closes August 1989.

We adopt the representation method for physical devices introduced by Iwasaki and Simon [9]. The equations that describe a device's behaviour are called structural equations. Structural equations usually are algebraic interpretations of the laws of physics governing the device. Causal ordering is a systematic procedure to identify the asymmetries between the dependent and independent variables and to represent this relationship explicitly. This type of causal ordering is useful in diagnostic reasoning, whether or not it has something to do with the intuitive meaning of causality. We go beyond the analysis of Iwasaki and Simon, especially when it comes to systems with feedback. We derive causalities among the variables that correspond to the causalities often mentioned by empirical scientists and practising engineers. Reasoning about systems with feedback may be considered as analysing two or more different systems in a particular order: one for the main path and the others for the feedback paths.

For inherent feedback systems, i.e. systems in which no physical feedback can be noticed (this aspect of inherent feedback is discussed in Section 4), we do the analysis in a different way. Instead of splitting the main and feedback paths, the feedback variables are temporarily eliminated from the primary path. The result is the same, since eliminating a variable or assuming it to be constant results in the same causal ordering. Our approach achieves the same purpose as the confluence heuristic of de Kleer and Brown [4], and explains why their analysis produces the same causal orderings as ours.

© Computational Mechanics Publications 1989 Artificial Intelligence in Engineering, 1989, Vol. 4, No. 2 79

Causal models for diagnostic reasoning: V. R. Bandekar

2. ORGANIZATION OF THE PAPER

The basic aspects of representing device behaviour using structural equations are described in Section 3, together with the method of comparative statics and the concept of causal ordering. The classification of feedback systems is also discussed there. Section 4 is devoted to causal ordering in feedback systems. In Section 5 we discuss an application (electrical converters and dc drives) in detail; examples of both 'inherent' and 'intentional' feedback systems are presented. A method of diagnosis using causal dependencies is presented in Section 6, with a simple algorithm for generating a search space for faulty components. The scope of application of this proposal in real-time monitoring and diagnosis of industrial processes is described in Section 7.

3. FUNDAMENTALS OF CAUSAL ORDERING

The statement 'any change in parameter X causes a change in parameter Y' expresses a unidirectional relationship between X and Y. Such relationships are fundamental to qualitative reasoning. A device model (a system of equations) contains asymmetrical relationships between parameters (system variables). Simon defined a linear structure [13] as a set of linear, non-homogeneous equations in which the number of variables is at least equal to the number of equations. The set of equations describing a device in equilibrium is a linear structure. These equilibrium equations are called structural equations [10]. This method of representing the steady-state behaviour of physical devices, introduced by Iwasaki and Simon [10], is presented in this section. We shall also briefly describe the concepts of causal ordering and the method of comparative statics.

3.1. Structural equations as a knowledge representation scheme

Structural equations are non-homogeneous linear equations. They describe the behaviour of a physical device in a limited but important sense. Structural equations may be obtained directly from the laws of physics governing the device behaviour, e.g. Ohm's law for electrical devices or Newton's laws of motion for mechanical devices. The simple circuit given in Fig. 1a behaves in accordance with Ohm's law. Once we specify the parameters of this circuit, its structural equation is given as follows:

Eo = Ei - R*I    (1)

Such device models may be stored in a library. To build a model of a system using structural equations one has to identify the system parameters and instantiate the applicable laws of physics from the library.
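As a small illustration of such a library, a structural equation can be stored as an executable constraint and instantiated on demand. This is a hedged sketch in modern Python, not the paper's implementation; LAW_LIBRARY, instantiate and the numerical tolerance are our own inventions.

```python
# A minimal sketch of a model library of structural equations, assuming
# each law is stored as an executable constraint over named parameters.
LAW_LIBRARY = {
    # Ohm's law for the series circuit of Fig. 1a: Eo = Ei - R*I
    "ohm_series": lambda Ei, Eo, R, I: abs(Eo - (Ei - R * I)) < 1e-9,
}

def instantiate(law_name):
    """Fetch a law from the library to build a device model."""
    return LAW_LIBRARY[law_name]

eq1 = instantiate("ohm_series")

# A consistent operating point satisfies the constraint ...
print(eq1(Ei=10.0, Eo=6.0, R=2.0, I=2.0))   # True
# ... and an inconsistent one violates it (a symptom of a fault).
print(eq1(Ei=10.0, Eo=9.0, R=2.0, I=2.0))   # False
```

Checking a measured operating point against the instantiated constraint is exactly the expected-versus-observed comparison used later for diagnosis.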

Fig. 1. (a) A circuit with voltage sources connected to a coil. (b) Matrix of structural equations:

          Ei  I  Eo
     1)    1  1   1
     2)    1  .   .
     3)    .  .   1

(c) Causal ordering: Ei, Eo → I.

3.2. Properties of the system of structural equations

A system of structural equations is self-contained if the number of unknowns is equal to the number of equations. Self-contained structural equations allow a complete solution. If the number of equations describing a device's behaviour is less than the number of unknowns, then we need to introduce assumptions in order to make the system of structural equations self-contained. The parameters that can be changed directly by some external means are termed exogenous variables [9]. In the circuit shown in Fig. 1a, Ei and Eo are exogenous, as both can be set to a particular value externally. Thus we have:

Ei = ei    (2)

Eo = eo    (3)

Note that lower-case letters mean that the parameters are set to a particular value.

A proper subset of a self-contained structure that is itself self-contained, and that does not have any proper self-contained subset, is called a minimal complete subset. The set of all the minimal complete subsets of the original self-contained structure is termed the set of complete subsets of order zero.

If the minimal complete subsets of order zero are solved for their unknowns, and the values of the unknowns are substituted into the rest of the structural equations, we obtain a new set of structural equations that is also self-contained. This is called the derived structure of order one, and its set of complete subsets is said to be of order one. The procedure of deriving new sub-structures by substituting for the variables of the minimal complete subsets is continued until no more substitutions are possible.
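The procedure just described, repeatedly collecting minimal complete subsets and substituting their variables away, can be sketched as follows. This is our reading of the method, not the paper's program; equations are represented simply as the sets of variable names occurring in them.

```python
from itertools import combinations

def minimal_complete_subsets(eqs):
    """All minimal self-contained subsets of the structure: n equations
    in exactly n unknowns, with no smaller self-contained subset inside."""
    found = []
    for n in range(1, len(eqs) + 1):
        for rows in combinations(range(len(eqs)), n):
            unknowns = set().union(*(eqs[i] for i in rows))
            if len(unknowns) == n and not any(set(f) < set(rows) for f in found):
                found.append(rows)
    return found

def causal_ordering(equations):
    """Return the variable sets V0, V1, ...; a level containing more
    than one variable signals a feedback system (see Section 3.4)."""
    eqs = [set(e) for e in equations]
    levels = []
    while eqs:
        mcs = minimal_complete_subsets(eqs)
        if not mcs:
            break  # the remaining structure is not self-contained
        used = {i for rows in mcs for i in rows}
        solved = set().union(*(eqs[i] for i in used))
        levels.append(solved)
        # Derived structure: drop the solved equations and substitute
        # the solved variables away in the remaining ones.
        eqs = [e for e in (eqs[i] - solved
                           for i in range(len(eqs)) if i not in used) if e]
    return levels

# The circuit of Fig. 1a: (1) Eo = Ei - R*I, (2) Ei = ei, (3) Eo = eo.
circuit = [{"Ei", "I", "Eo"}, {"Ei"}, {"Eo"}]
levels = causal_ordering(circuit)
print(levels == [{"Ei", "Eo"}, {"I"}])   # True: Ei and Eo determine I
```

The exponential subset search mirrors the definition directly; it is adequate for the small device models discussed here.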

3.3. Causal dependency

Iwasaki and Simon [9] define direct causal dependency as follows: if one denotes the set of variables in the complete subsets of the i-th order by Vi, where i ≥ 1, then the variables in Vi are said to be directly causally dependent on the variables in Vi-1. Thus, for a given system of self-contained structural equations, the causal orderings among the sets of variables may be obtained by using the above definition. A flow-chart of the general causal ordering algorithm is given in Fig. 2. It is not always possible to derive a causal ordering between the variables themselves, as this depends upon the number of variables each minimal complete subset contains. The above definition is too general to be used in deriving a detailed causal structure.

The system of structural equations may be represented by a matrix mij, where mij = 1 if the j-th variable occurs in the i-th equation, and mij = 0 otherwise.

Fig. 1b shows the matrix of the structural equations for the circuit example. The causal ordering obtained from the causal dependency algorithm is given in Fig. 1c.
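Building this matrix is straightforward; the sketch below (our own code, not the paper's) reproduces Fig. 1b for the circuit example, with the column order Ei, I, Eo.

```python
# Build the incidence matrix m[i][j] = 1 iff variable j occurs in equation i.
def incidence_matrix(equations, variables):
    return [[1 if v in eq else 0 for v in variables] for eq in equations]

# Circuit of Fig. 1a: (1) Eo = Ei - R*I, (2) Ei = ei, (3) Eo = eo.
eqs = [{"Ei", "I", "Eo"}, {"Ei"}, {"Eo"}]
m = incidence_matrix(eqs, ["Ei", "I", "Eo"])
for row in m:
    print(row)
# [1, 1, 1]
# [1, 0, 0]
# [0, 0, 1]
```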

3.4. Feedback systems

If a minimal complete subset of some order contains more than one parameter, the system is called a feedback system. Feedback systems are classified in two different categories:

1. Inherent feedback systems - these systems contain parameters that influence each other through the inherent nature of the physics involved.

2. Intentional feedback systems - in these systems, certain parameters are artificially made to influence other parameters in order to achieve a required functionality.

While reasoning about the behaviour of large systems, it has been observed that people usually choose the asymmetries between two individual variables rather than the asymmetry between groups of variables at the time of investigation. We shall discuss this aspect of causal ordering in the presence of feedback in the next section.

4. CAUSAL ORDERING IN SYSTEMS WITH FEEDBACK

The techniques used by control engineers to analyse complex systems [11] are quite useful for our purpose of determining dependencies between the individual parameters and components. Such techniques are basically qualitative in nature, although a considerable amount of calculus is involved. One of these techniques, the block diagram, is discussed here, as it is central to reasoning about feedback systems.

4.1. Block diagrams in control engineering

A control system consists of many components. In order to show the functions performed by each component, control engineers commonly use a diagram called the block diagram. In a block diagram, all system variables are linked to each other through functional blocks. In general, the functional operation of a system can be visualized more readily by examination of the block diagram than by examination of the physical system itself. Formally, a block diagram may be represented as a directed graph with labelled nodes and arcs. Three different types of nodes exist: (a) summing points, (b) functional blocks and (c) branch points. The arcs are labelled by the variable names and represent unidirectional signal flows. Note that block diagrams contain information concerning dynamic behaviour, but do not contain any information concerning the physical construction of the system. Many physically different systems may have the same block diagram, and a number of different block diagrams may be drawn for a system, depending upon the viewpoint of the analysis. To construct a block diagram, the input and output of the given system are identified, and then the block diagram that connects the input variable to the output variable through the intermediate variables is drawn.

Input variables are those which can be directly manipulated by an observer and the output variables are those which cannot be directly manipulated.

In order to demonstrate this with a feedback system, let us consider the inductance of the coil, which we ignored in the previous example (Fig. 1). The voltage E' due to the inductance (L) is a result of the current flow through the coil. This voltage modifies the behaviour of the circuit, which may now be given by the following structural equations:

Eo = Ei - E' - R*I    (1.1)

E' = f(I)    (1.2)    (we assume this relationship to be linear)

Ei = ei    (1.3)

Eo = eo    (1.4)

Fig. 2. A flow-chart of the causal ordering algorithm. (Roughly: set M to the number of variables and i to 0; find the smallest n-tuple S of rows whose unknowns number at most n; let Vi be the variables in S; remove the rows of S, and the columns of the variables in Vi, from the matrix; decrement M accordingly, increment i and repeat until the matrix is exhausted.)

After the initial substitutions, equations 1.1 and 1.2 together are left with two unknowns, and neither alone is self-contained. Hence they form a self-contained system of two unknowns, viz. E' and I; E' and I are feedback parameters. Iwasaki and Simon [9] claim that there is no causal ordering between the feedback parameters, as they influence each other. But, in practice, we know that the inductive voltage is caused by the current, and the reverse is not true. In order to derive this ordering we construct a block diagram of the system. We assume Ei and Eo to be the inputs and I the output. The unique block diagram is given in Fig. 3. It is clear from the block diagram that E' is the feedback parameter and must be eliminated from equation 1.1. Hence we have

Eo = Ei - R*I    (1.1')

This equation, together with the remaining ones, has no self-contained subsets of more than one unknown. The matrix of structural equations and the causal ordering are given in Fig. 4.
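This elimination step can be checked mechanically with the causal ordering sketch from Section 3 (our own code, not the paper's; E' is written Ep).

```python
from itertools import combinations

def causal_ordering(equations):
    """Levels V0, V1, ...; a multi-variable level is a feedback set."""
    eqs = [set(e) for e in equations]
    levels = []
    while eqs:
        mcs = []  # minimal complete subsets of the current structure
        for n in range(1, len(eqs) + 1):
            for rows in combinations(range(len(eqs)), n):
                unknowns = set().union(*(eqs[i] for i in rows))
                if len(unknowns) == n and not any(set(f) < set(rows) for f in mcs):
                    mcs.append(rows)
        if not mcs:
            break
        used = {i for rows in mcs for i in rows}
        solved = set().union(*(eqs[i] for i in used))
        levels.append(solved)
        eqs = [e for e in (eqs[i] - solved
                           for i in range(len(eqs)) if i not in used) if e]
    return levels

# Equations 1.1-1.4 with the feedback variable E' (written Ep) present:
with_feedback = [{"Eo", "Ei", "Ep", "I"}, {"Ep", "I"}, {"Ei"}, {"Eo"}]
# Equation 1.1' with Ep eliminated, plus the unchanged 1.2-1.4:
modified = [{"Eo", "Ei", "I"}, {"Ep", "I"}, {"Ei"}, {"Eo"}]

# Ep and I form one two-variable level: a feedback set.
print(causal_ordering(with_feedback))
# After the elimination the ordering is Ei, Eo -> I -> Ep, as in Fig. 4.
print(causal_ordering(modified))
```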

This is an example of an inherent feedback system, since the inductive voltage is inherent in the physical behaviour of the circuit.

5. VARIABLE SPEED DC DRIVES

In this section we present an application. A variable speed dc drive is chosen because it consists of devices of both the inherent feedback type and the intentional feedback type. A direct current (dc) motor is of the inherent feedback type because the induced electromotive force (back emf) is an inherent property of the motor. Control systems are of the intentional feedback type, as they have physical structures that give rise to feedback systems, e.g. a pressure regulator.

Fig. 3. Block diagram of the circuit with inductance (a forward block for equation 1.1 and a feedback block for equation 1.2).

Fig. 4. (a) Matrix of structural equations for the inductive circuit:

            Ei  I  Eo  E'
     1.1')   1  1   1   .
     1.2)    .  1   .   1
     1.3)    1  .   .   .
     1.4)    .  .   1   .

(b) Causal ordering: Ei, Eo → I → E'.

5.1. Modelling an armature-controlled dc motor

Variable speed dc drives are controlled by regulating the armature voltage. The schematic diagram of the motor is given in Fig. 5. The functionality of the whole system is described by the laws of physics for each phenomenon occurring in the devices, which are interpreted as follows:

1. The torque (T) developed by the armature-controlled dc motor is proportional to the armature current (Ia).

T = Km*Ia, where Km is the motor torque constant. Note that the field current (If) is held constant in an armature-controlled dc motor.

2. The induced voltage, or the back emf (Eb), in the armature is proportional to the angular velocity of the output shaft (Ω).

Eb = Kb*Ω, where Kb is the back emf constant.


3. The sum of the potential drops across the armature circuit is equal to the supply voltage (Ea).

Ra*Ia + La*d/dt(Ia) + Eb = Ea

where Ra is the armature resistance and La is the armature inductance.

4. The torque (T) produced by the motor is consumed by the equivalent moment of inertia (J) and the viscous friction (F) of the shaft and the load.

T = J*d/dt(Ω) + F*Ω

Note that (3) and (4) are differential equations. Let the differential equation y = dx/dt + x represent some device behaviour; this equation contains two state parameters. If y is assumed to be exogenous, x can be computed by solving the differential equation. If x is assumed to be exogenous, then x is constant and hence dx/dt is zero, which solves for y. The time derivative of a variable does not describe its relationship with other variables; it describes how the variable changes with time. The causality will therefore not be altered if the above equation is written as y = f(x).

Thus (3) and (4) may be written in the forms f(Ia, Ea, Eb, Ra, La) = 0 and g(T, Ω, J, F) = 0, respectively.

5.2. Causal analysis of the dc motor

There are four equations and eleven unknowns in the above set of structural equations. Therefore we need seven assumptions in order to make the system self-contained. The voltage applied to the armature is exogenous. Hence we have:

Ea = ea    (5)

The unknowns that are physical constants depend upon the physical dimensions of the device components; e.g., the diameter of the coil, the resistance, etc. are assumed to be constant. Thus we may introduce the following assumptions. These assumptions also imply that all the related parts are functioning properly; e.g., if Km is constant, the motor is functional.

Km = km    (6)

Kb = kb    (7)

J = j    (8)

F = f    (9)

La = la    (10)

Ra = ra    (11)

This system of structural equations is now self-contained and can be used by the causal ordering algorithm. The matrix of structural equations is shown in Fig. 6. The matrix of the derived structure of order one is shown in Fig. 7; this is a minimal complete subset. So the causal ordering for the whole system may now be given as shown in Fig. 8. Notice that T, Ia, Eb and Ω are feedback parameters, and hence there is no causal ordering between them. In order to elaborate the causal dependencies we have to consider the block diagram of the system. The block diagram is constructed assuming Ea to be the input and Ω the output, as shown in Fig. 9. Because Eb is identified as the feedback parameter, it may be eliminated from equation (3), deriving a new structure. This structure has a minimal complete subset with a single unknown, which can be further substituted in the rest of the equations.
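Both the four-variable feedback set and its resolution after eliminating Eb can be reproduced with the causal ordering sketch from Section 3 (our own code, not the paper's; Ω is written Omega).

```python
from itertools import combinations

def causal_ordering(equations):
    """Levels V0, V1, ...; a multi-variable level is a feedback set."""
    eqs = [set(e) for e in equations]
    levels = []
    while eqs:
        mcs = []  # minimal complete subsets of the current structure
        for n in range(1, len(eqs) + 1):
            for rows in combinations(range(len(eqs)), n):
                unknowns = set().union(*(eqs[i] for i in rows))
                if len(unknowns) == n and not any(set(f) < set(rows) for f in mcs):
                    mcs.append(rows)
        if not mcs:
            break
        used = {i for rows in mcs for i in rows}
        solved = set().union(*(eqs[i] for i in used))
        levels.append(solved)
        eqs = [e for e in (eqs[i] - solved
                           for i in range(len(eqs)) if i not in used) if e]
    return levels

CONSTS = ["Ea", "Km", "Kb", "J", "F", "La", "Ra"]
motor = [
    {"T", "Ia", "Km"},               # (1) T = Km*Ia
    {"Eb", "Kb", "Omega"},           # (2) Eb = Kb*Omega
    {"Ia", "Ea", "Eb", "Ra", "La"},  # (3) armature circuit equation
    {"T", "Omega", "J", "F"},        # (4) torque balance
] + [{c} for c in CONSTS]            # (5)-(11) exogenous assumptions

# Level 1 is the four-variable feedback set {T, Ia, Eb, Omega}.
print(causal_ordering(motor)[1])

# Eliminate the feedback parameter Eb from equation (3), as suggested
# by the block diagram of Fig. 9, and the set resolves into a chain.
motor[2] = {"Ia", "Ea", "Ra", "La"}
print(causal_ordering(motor)[1:])   # Ia, then T, then Omega, then Eb
```

The singleton chain matches the derived structures of orders 1 to 4 in Fig. 10 and the causality graph of Fig. 11.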


Fig. 5. Schematic diagram of the motor.
  Ea  Voltage applied to the armature
  Eb  Back emf induced due to rotation
  Ia  Armature current
  T   Torque developed by the motor
  Ω   Angular velocity of the shaft
  If  Field current (constant)
  J   Equivalent moment of inertia of the shaft, rotor, coupling and the impeller
  Ra  Resistance of the armature circuit
  La  Inductance of the armature circuit
  F   Total viscous friction, due to the liquid flow through the pump

Fig. 6. The matrix of structural equations describing the dc motor:

           T  Ia  Ea  Eb   Ω  Km  Kb   J   F  Ra  La
     1)    1   1   .   .   .   1   .   .   .   .   .
     2)    .   .   .   1   1   .   1   .   .   .   .
     3)    .   1   1   1   .   .   .   .   .   1   1
     4)    1   .   .   .   1   .   .   1   1   .   .
     5)    .   .   1   .   .   .   .   .   .   .   .
     6)    .   .   .   .   .   1   .   .   .   .   .
     7)    .   .   .   .   .   .   1   .   .   .   .
     8)    .   .   .   .   .   .   .   1   .   .   .
     9)    .   .   .   .   .   .   .   .   1   .   .
    10)    .   .   .   .   .   .   .   .   .   .   1
    11)    .   .   .   .   .   .   .   .   .   1   .

Fig. 7. The matrix of the derived structure of order one:

            T  Ia  Eb   Ω
    1.1)    1   1   .   .
    2.1)    .   .   1   1
    3.1)    .   1   1   .
    4.1)    1   .   .   1


Fig. 8. Causal ordering of the motor parameters.

Fig. 9. Block diagram of the dc motor: Ea → f1(Ra,La) → Ia → f2(Km) → T → f3(J,F) → Ω, with Eb fed back to the input summing point through f4(Kb).

Fig. 10. The matrices of the derived structures of orders 1, 2, 3 and 4:

    Order 1:     T  Ia  Eb   Ω      Order 2:     T  Eb   Ω
      1.1)       1   1   .   .        1.2)       1   .   .
      2.1)       .   .   1   1        2.2)       .   1   1
      3.1)       .   1   .   .        4.2)       1   .   1
      4.1)       1   .   .   1

    Order 3:    Eb   Ω              Order 4:    Eb
      2.3)       1   1                2.4)       1
      4.3)       .   1

The stepwise derivation of sub-structures is given in Fig. 10. The causality graph of the dc motor is shown in Fig. 11.

5.3. Variable speed dc drive

The overall control system consists of a main 3-phase alternating current (ac) supply, a static converter, a trigger set, and controllers for motor speed and current. Feedback control systems are designed so that they can compare the actual outputs with a desired value (set point) and modify the output accordingly. They display the characteristics of intentional feedback systems.

The structural equations of the individual components shown in Fig. 12 are given below:

Speed-controller (C-N): Iset = f1(Nset - Nobs)
Current-controller (C-I): α = f2(Iset - Iobs)
Trigger-set (S): PT = f3(Vsyc, α)
Converter (T): Iobs = f4(Vi, PT)
Motor: Nobs = f5(Iobs)

Here the motor behaviour is described as a relationship between armature current and shaft speed.

The parameters that are common to any two components describe the connections between those components. The parameters Nset, Vi and Vsyc are exogenous because they are not connected; i.e., they do not occur in more than one mechanism.
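This criterion, that a parameter is exogenous precisely when it occurs in only one mechanism, is easy to check mechanically. A hedged sketch with our own spellings (α is written alpha):

```python
# Identify connections and exogenous parameters from shared occurrences.
from collections import Counter

mechanisms = {
    "C-N": {"Iset", "Nset", "Nobs"},     # speed controller
    "C-I": {"alpha", "Iset", "Iobs"},    # current controller
    "S":   {"PT", "Vsyc", "alpha"},      # trigger set
    "T":   {"Iobs", "Vi", "PT"},         # converter
    "M":   {"Nobs", "Iobs"},             # motor
}

counts = Counter(p for params in mechanisms.values() for p in params)
exogenous = {p for p, c in counts.items() if c == 1}
connections = {p for p, c in counts.items() if c > 1}

print(sorted(exogenous))    # ['Nset', 'Vi', 'Vsyc']
print(sorted(connections))  # ['Iobs', 'Iset', 'Nobs', 'PT', 'alpha']
```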

Each of the parameters in the mechanisms may have either some numerical values or vectors of numerical and qualitative values. E.g., the set points are voltage signals that lie between 0 and -10 volts, whereas PT represents six different pairs of pulses. Although the time behaviour of a device is described by differential equations, it is not necessary to use them to explain the behaviour in causal terms. A static model of the device behaviour is sufficient for a major part of diagnostic reasoning.

Fig. 11. Causality graph for the dc motor. In this graph arrows indicate causality; for example, the causes of T are Km and Ia. Ea is an exogenous quantity. The darkened variables are the constant quantities. Those in a box are measurable quantities and those without boxes are non-measurable quantities. Also refer to Section 6.3.

Fig. 12. Feedback control of a dc drive.
  Nset  Set point for the speed
  Iset  Set point for the current
  α     Electrical angle at which the pulse is generated
  Vsyc  Synchronising 3-phase ac voltage
  PT    A vector of pulses
  Vi    3-phase ac voltage supply
  Iobs  Actual (observed) dc voltage
  Nobs  Actual (observed) speed
  C-N speed controller; C-I current controller; S trigger set; T static converter; M motor; tacho-generator.

Causal analysis of the dc drive will produce no ordering between the variables in the feedback loop. We have to consider the block diagram of the system in order to modify certain mechanisms.

Before constructing the block diagram for the control system, we shall first introduce some modifications to the standard approach to block diagram construction [11]. Normally, a functional block represents the transfer function, which is the ratio of the Laplace transform of the output to the Laplace transform of the input. This means that a node of type functional block can have only two arcs, each representing either the input or the output parameter. We define another type of node, called a complex block, that may have more than two arcs; only one of these arcs represents an output. This is a hierarchical abstraction of the detailed block diagram. Another modification to the standard approach is that the arcs represent the parameters themselves and not their Laplace transforms. The block diagram for a mechanism in which the addition of two parameters is explicitly represented contains an error detection node. Fig. 13 shows the block diagram for each mechanism in the variable speed dc drive.

Nobs and Iobs are the feedback parameters; they occur in the speed controller and the current controller, respectively. Hence we modify both controller mechanisms by eliminating Nobs from C-N and Iobs from C-I.

Speed-controller (C-N): Iset = f1'(Nset)
Current-controller (C-I): α = f2'(Iset)

The minimal complete subsets of any order of the new structure are singleton sets. Hence all the parameters of the dc drive are causally ordered. The physical constants of the respective mechanisms are considered to be additional parameters that affect the output parameters; these are symbols representing the mechanisms. E.g., C-N represents the physical constant of the speed controller and hence affects the output of the speed controller, Iset. The causality graph for the dc drive is given in Fig. 14.
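Running the causal ordering sketch from Section 3 (our own code, not the paper's; α is written alpha) on the modified mechanisms confirms that every level is a singleton, as in Fig. 14.

```python
from itertools import combinations

def causal_ordering(equations):
    """Levels V0, V1, ...; a multi-variable level is a feedback set."""
    eqs = [set(e) for e in equations]
    levels = []
    while eqs:
        mcs = []  # minimal complete subsets of the current structure
        for n in range(1, len(eqs) + 1):
            for rows in combinations(range(len(eqs)), n):
                unknowns = set().union(*(eqs[i] for i in rows))
                if len(unknowns) == n and not any(set(f) < set(rows) for f in mcs):
                    mcs.append(rows)
        if not mcs:
            break
        used = {i for rows in mcs for i in rows}
        solved = set().union(*(eqs[i] for i in used))
        levels.append(solved)
        eqs = [e for e in (eqs[i] - solved
                           for i in range(len(eqs)) if i not in used) if e]
    return levels

drive = [
    {"Iset", "Nset"},            # C-N with Nobs eliminated
    {"alpha", "Iset"},           # C-I with Iobs eliminated
    {"PT", "Vsyc", "alpha"},     # trigger set S
    {"Iobs", "Vi", "PT"},        # converter T
    {"Nobs", "Iobs"},            # motor and tacho-generator
    {"Nset"}, {"Vi"}, {"Vsyc"},  # exogenous parameters
]

for level in causal_ordering(drive):
    print(sorted(level))
# The exogenous set {Nset, Vi, Vsyc} comes first, then Iset, alpha,
# PT, Iobs and Nobs follow as singleton levels.
```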

6. DIAGNOSTIC REASONING

6.1. Motivation of the proposed approach

Causal structure provides a unidirectional relationship between the device parameters. It is an abstract description of the influences certain variables have on others. Therefore it provides a means to reason about changes in the parameters of a device and how these changes are propagated.

Causal models for diagnostic reasoning: V. R. Bandekar

We shall consider an example from Rescher and Simon 12 in order to demonstrate the idea of using causal dependencies in diagnostic reasoning.

Consider the physical situation in Fig. 15, in which a gas is confined in a chamber by a movable piston. A weight rests on the piston. If we assume that the gas law holds in this situation, then in equilibrium: (1) PV = kT

where P is the pressure, V is the volume of gas in the chamber and T the absolute temperature of the confined gas. k is the gas constant.

Additional constraints have to be imposed in order to obtain a complete structure and a causal ordering.

Assuming that heat is conducted through the cylinder walls (the walls are taken to be perfect conductors of heat), the inside temperature at equilibrium always equals the outside temperature:

(2) T = t
(3) P = p (exogenous).

Let the actual values of pressure, temperature and volume be P0, T0 and V0 respectively; then we have the following information-base:

Law:   PV = kT
Facts: P = P0, T = T0, V = V0

Let us introduce the belief-contravening hypothesis T = a*T0 and compute its counterfactual consequences.

Fig. 13. Block diagram for the dc drive

Fig. 14. Causal ordering for the variable speed dc drive



V: volume of the confined gas; P: pressure applied to the piston; T: temperature of the gas. Modal indices: [M0] for P and T, [M1] for V.

Fig. 15. (a) Gas confined in a chamber. (b) Matrix of structural equations. (c) Causal ordering

This may be accomplished by making the information-base consistent with T = a*T0 as a new fact. Obviously we retain the law. In this case, we have two ways to make the information-base consistent:

(A) P = a*P0
(B) V = a*V0

One of these propositions has to be accepted and the other rejected. A principle of acceptance and rejection has been proposed by Rescher and Simon 12. The procedure is to sort the propositions in question into modal categories M0, M1, ..., Mn, where the subscript is the modal index. The lower the modal index, the smaller the susceptibility to rejection; correspondingly, the greater the modal index, the greater the susceptibility to rejection. In our approach, the further down a parameter is in the causal ordering, the higher its modal category. The modal index of a proposition is taken as the position in the causal ordering of the variable it contains.

In Fig. 15c the causal graph is labelled with the respective modal indices. Proposition (A) has modal index 0 and (B) has modal index 1. Hence P = P0 is accepted and V = V0 is rejected. In general terms: if the temperature had been changed by a factor a, then the volume would have changed by the same factor.
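This acceptance/rejection step can be sketched in Python; the function name and the list encoding of rival propositions are ours, not Rescher and Simon's:

```python
def resolve_counterfactual(candidates, modal_index):
    """Among the rival propositions that would restore consistency,
    reject the one whose variable has the highest modal index (most
    susceptible to rejection) and accept the rest."""
    rejected = max(candidates, key=lambda v: modal_index[v])
    accepted = [v for v in candidates if v != rejected]
    return accepted, rejected

# Gas example: the hypothesis T = a*T0 forces either P or V to change.
modal_index = {"P": 0, "T": 0, "V": 1}   # from the causal ordering of Fig. 15c
accepted, rejected = resolve_counterfactual(["P", "V"], modal_index)
# P = P0 is retained; V absorbs the change, i.e. V = a*V0.
```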

Modal categorization thus provides a model for diagnostic reasoning.

6.2. An overview of the approach
A system is malfunctioning if discrepancies are observed in the values of one or more parameters that describe the system. Discrepancies are noted if the expected value of a variable differs from its observed value. For example, if the pressure gauge at the output flange of a pump shows a reading different from the one specified by a process engineer, then there is a discrepancy in the value of the output pressure. Therefore we may call the output pressure symptomatic.

A natural way to troubleshoot such a system is to consider one or more of the following possibilities:

1. Some or all of the parameters that cause the change in the symptomatic variable may themselves be symptomatic.
2. Until each of the parameters mentioned in (1) is measured and the discrepancies (if any) are noted, all of them are considered symptomatic.
3. The parameters mentioned in (1) that deal with physical sizes and shapes may be pointed out as defective if the rest of the parameters are measured to be correct.

6.3. Definitions
The following definitions of terms are necessary for understanding the proposed method of diagnosis.

Quantity: All the unknowns of a system of structural equations, including the device constants, are called quantities.

Measurable quantity: In any device there are certain quantities that are measurable with standard techniques and instruments without disassembling the device. These quantities are termed measurable quantities. Note that this definition is deliberately loose and allows the categorization to depend on the device. The 'Observable Data (OB)' in Genesereth's DART 8 are of this type. E.g., the supply voltage and current of a motor, pressures at inlets and outlets, etc., are measurable quantities. Induced voltage is a non-measurable quantity.

Constant quantity: The quantities that deal with the physical construction of the device are called constant quantities. E.g., length, diameter, resistance, capacitance, inductance, etc.

Causes of a quantity: In the causality graph, all the quantities that cause a given quantity are called its causes.

Exogenous quantity: Any measurable quantity that has no causes is called an exogenous quantity. These quantities are usually the inputs to a device. 'Achievable Data (AC)' in DART 8 has a similar meaning. E.g., supply voltage or current, pressure at the inlet of a pipe, etc.

Symptom: In any device, if a measurable quantity has an observed value different from its expected value, then the quantity is a symptom. The quantities may either be measured at random or be chosen by assigning them weights based on the cost of measurement 5. An unmeasured quantity is considered to be a symptom.

Culprit: A constant quantity that is a cause of a symptom is a culprit.

The values of constant quantities such as armature resistance or the motor constant depend upon the physical construction of the device. The resistance of a coil, for instance, depends upon the cross-sectional area and material of the wire, the number of windings in the coil, etc. Should there be a defect in the construction, the actual value may deviate from the expected one. But discrepancies in these quantities are very expensive to discover, because measuring their values is not possible without dismantling the device. Therefore we may choose not to measure such quantities. Instead, these quantities are searched for while troubleshooting a malfunctioning device.

6.4. Theory of diagnosis
Searching for the culprits in a given causality graph is one of the goals of diagnosis. If we have a symptom in a given causal graph, all the causes of this symptom are diagnosed. The procedure of diagnosis is recursively defined as follows:

DIAGNOSE (symptom)
    let culprits := {}
    causes := GetCauses (symptom)
        [The function GetCauses returns all the parameters that have causal precedence over its argument.]
    for each cause in causes loop
        if cause is a constant quantity then
            add cause to culprits
        elseif cause is symptomatic then
            [This test is performed by a tester and the truth value is returned after the measurement.]
            if cause is exogenous then
                add marked cause to culprits
                [A marked cause implies that the fault lies external to the device.]
            else
                add DIAGNOSE (cause) to culprits
    RETURN culprits


This algorithm gives a set of potential culprits. It is possible that one particular cause may be diagnosed more than once in the above procedure. We may avoid the repetition by maintaining a list of already diagnosed causes. Note that the procedure given above is in its simplest form and can be modified.
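The procedure, including the list of already-diagnosed causes suggested above, can be written as runnable Python. This is a sketch: the graph encoding, the `is_symptomatic` tester, and the `("external", cause)` marking convention are our assumptions, not part of the original formulation:

```python
def diagnose(symptom, causes_of, constants, exogenous, is_symptomatic,
             visited=None):
    """Return the set of potential culprits for `symptom`.
    causes_of[q]   -- causal predecessors of quantity q in the causality graph
    constants      -- set of constant quantities
    exogenous      -- set of exogenous quantities
    is_symptomatic -- tester: measures q and compares it with its expected value
    """
    if visited is None:
        visited = set()
    culprits = set()
    for cause in causes_of.get(symptom, ()):
        if cause in visited:
            continue                    # avoid diagnosing a cause twice
        visited.add(cause)
        if cause in constants:
            culprits.add(cause)
        elif is_symptomatic(cause):
            if cause in exogenous:
                culprits.add(("external", cause))   # the marked cause
            else:
                culprits |= diagnose(cause, causes_of, constants,
                                     exogenous, is_symptomatic, visited)
    return culprits
```

For a small graph in which H is caused by Q1, Q2 and a constant k, and Q2 is caused by the exogenous V, a symptomatic chain through Q2 and V yields the culprits k and the marked (external) V.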

Suppose that P is the set of potential culprits computed by the diagnosis procedure, in which each p ∈ P is either a culprit or a set of culprits. The set of culprits is recursively defined, and its depth in a given causality graph is given by the highest order of causation (or modal index). The set of actual culprits C belongs to the power set of P and is non-empty. Determination of C depends upon the measurement or observation of the parameters. If a symptom has no causes that are themselves symptoms, its constant causes are the culprits. The actual culprits are only tractable if this condition is satisfied.

6.5. Strategies for selecting quantities for measurement
For a given causality graph, if we assume that all the measurable quantities except the exogenous quantities are symptoms, then our algorithm produces the entire search space for the culprits. We may choose to measure the quantities whose causes occur most often. Here experiential knowledge plays a crucial role. The distinction between measurable and non-measurable quantities in our approach has already been discussed. This classification based on the measurability of a quantity is entirely specified by the user. It may be based on the cost of measuring a quantity; the least-cost quantities are measured first.

Bhandari, Simon and Siewiorek 2 present an optimal polynomial-time algorithm for selecting a probe (sub-chain) in a given causal chain. Given a causal chain whose mth node is OK and whose nth node is not-OK, this algorithm finds a probe selection strategy that results in the shortest expected time per troubleshooting session. Here a single fault is assumed. The optimisation is done on the basis of the expected time required for a particular section of the chain. In their causal chain each node represents a parameter and the edge connecting two nodes represents a set of components. Each edge is associated with a probability of failure.
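The flavour of probe selection can be conveyed with a much simpler halving heuristic (our own illustration, not the optimal algorithm of Bhandari et al.): probe the node that splits the failure-probability mass of the suspect interval as evenly as possible, so that each probe rules out roughly half the mass.

```python
def select_probe(edge_fail_prob, m, n):
    """Given a causal chain whose node m is OK and node n is not-OK,
    the single fault lies on one of the edges m .. n-1.  Return the
    node to probe: the one splitting the remaining failure-probability
    mass of the suspect interval most evenly (a simple heuristic)."""
    total = sum(edge_fail_prob[m:n])
    acc, best, best_gap = 0.0, m + 1, float("inf")
    for k in range(m, n):       # probing node k+1 separates edges m..k from k+1..n-1
        acc += edge_fail_prob[k]
        gap = abs(acc - total / 2)
        if gap < best_gap:
            best, best_gap = k + 1, gap
    return best
```

With uniform edge failure probabilities the heuristic probes the middle of the chain; with a highly failure-prone first edge it probes just after that edge.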

7. SCOPE OF APPLICATION IN REAL-TIME PROCESS CONTROL

The troubleshooting technique discussed in this paper can be applied for both on- and off-line diagnosis of engineering systems (mechanical, electrical, hydraulic, pneumatic, etc.). In large industrial systems, on-line troubleshooters may be integrated with the existing process control hardware and software. In real-time diagnosis of industrial processes, causal dependencies play a major role. The concepts presented in this paper can be extended and tailored to suit such applications. The 'sensory and control' unit of a standard automation system can be coupled with a 'diagnostician', which can receive and send signals from and to the process. In case of discrepancies, the 'diagnostician' can immediately refer to the causality graph of the system and track down the culprit(s) and/or cure the fault. This is demonstrated by the following example.

In Fig. 16a a liquid container is shown. The liquid flows into the container at a rate Q1 and is drained out at a rate Q2. The drainage rate is proportional to the valve position V. The structural equations of this system (Fig. 16b) are:

1) The change in liquid level (ΔH) is proportional to the difference between the in- and out-flow rates: ΔH = k(Q1 - Q2).
2) The drainage rate is a function of the valve position: Q2 = f(V).
3) V = v (exogenous).
4) Q1 = q1 (exogenous).

In the causal graph (Fig. 16c), Q1 and Q2 cause H, and V causes Q2.

Fig. 16. A system for liquid level control

Let the valve at the discharge pipe be driven by a servo-device that receives commands from the diagnostician. If a discrepancy is noted, say in H, the 'diagnostician' will refer to the graph and check whether Q1 and Q2 are symptomatic. If Q1 is symptomatic, it has to be reset before proceeding to check Q2. If even after resetting Q1, H is symptomatic, then Q2 is checked. If Q2 is symptomatic, then the servo-device is sent a message to reset the valve position V. Otherwise there exists a serious fault in the system. The 'diagnostician' can act as an early warning system in such cases.
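The diagnostician's response to a discrepancy in H can be sketched as a simple decision procedure. The sensor interface `read`, the actuator interface `reset`, and the table of expected values are our assumptions, not part of the original system:

```python
def diagnostician_step(read, reset, expected, tol=1e-3):
    """One monitoring step for the liquid-level system of Fig. 16.
    read(q)  -- returns the measured value of quantity q
    reset(q) -- commands the servo/control unit to restore q
    Returns a short status string."""
    def symptomatic(q):
        return abs(read(q) - expected[q]) > tol

    if not symptomatic("H"):
        return "ok"
    if symptomatic("Q1"):
        reset("Q1")                     # reset the inflow first
        if not symptomatic("H"):
            return "cured: Q1"
    if symptomatic("Q2"):
        reset("V")                      # servo-device resets the valve position
        return "cured: Q2 via V"
    return "serious fault: early warning"
```

If H and Q2 deviate from their expected values while Q1 is correct, the procedure commands the servo-device to reset V; if neither Q1 nor Q2 explains the discrepancy in H, it raises the early warning.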

8. CONCLUSION

We have presented a way to represent the behaviour of physical devices. The behaviour of a device is given partially by the causality graph. The causality graph gives us the complete search space for culprits (faulty components). A simple algorithm that generates this search space has been presented. In the demonstration system, all the minimal complete subsets contain just one variable. This means that there is no feedback in the system. Although the system actually has feedback, in the model it has been eliminated using controlled conditions; in other words, by choosing assumptions that yield causal structures with the fewest feedback variables.

Humans always observe systems under controlled conditions, especially when doing diagnosis. The causalities discussed in this paper are also derived from such controlled circumstances. Should the circumstances change, the causalities also change. Hence statements about causalities must always be accompanied by the conditions under which they are valid. Qualitative process theory 7 allows the incorporation of causal information quite similar to the kind discussed in this paper. But we feel that the analysis procedure of qualitative process theory involves a large amount of computation, which may not be realistic in practical applications.

Iwasaki and Simon 9 claim that the causal analysis of a defective component can be accomplished only by redescribing the component in its faulty state. There is no clear picture of how this could be done. Intuitively, it seems that the troubleshooting has to be done first in order to redescribe the system.

Modelling physical systems using structural equations may not be adequate, but it is useful in computing the causal relationships between the various parts of the system. We have demonstrated some ideas about practical applications of this approach and believe that a subset of real engineering problems could be solved using this framework.



9. ACKNOWLEDGEMENT

This work was done as part of the 'TEX-B' project supported by the Federal German Ministry for Research and Technology, Contract Number ITW 8506 E4.

BIBLIOGRAPHY

1 Bandekar, Vijay R. Diagnosis Using Causal Dependencies, 2nd Conference on the Applications of AI in Engineering, Boston, USA, August 1987

2 Bhandari, Inderpal S., Simon, Herbert A. and Siewiorek, Daniel P. Optimal Diagnosis for Causal Chains, Carnegie-Mellon University, Computer Science Department, CMU-CS-87-151, 10 September 1987

3 Davis, R. Diagnostic Reasoning Based on Structure and Behaviour, Artificial Intelligence, December 1984, 24, 1-3

4 de Kleer, J. and Brown, J. S. A Qualitative Physics Based on Confluences, Artificial Intelligence, December 1984, 24, 1-3

5 de Kleer, J. and Williams, B. C. Diagnosing Multiple Faults, Artificial Intelligence, April 1987, 32, 1

6 de Kleer, J. and Brown, J. S. Theories of Causal Ordering, Artificial Intelligence, July 1986, 29, 1

7 Forbus, Kenneth D. Qualitative Process Theory, Artificial Intelligence, December 1984, 24, 1-3

8 Genesereth, M. R. The Use of Design Descriptions in Automated Diagnosis, Artificial Intelligence, December 1984, 24, 1-3

9 Iwasaki, Y. and Simon, H. A. Causality in Device Behavior, Artificial Intelligence, July 1986, 29, 1

10 Iwasaki, Y. and Simon, H. A. Theories of Causal Ordering: Reply to de Kleer and Brown, Artificial Intelligence, July 1986, 29, 1

11 Ogata, K. Modern Control Engineering, Prentice-Hall International, Inc., 1982

12 Rescher, Nicholas and Simon, Herbert A. Cause and Counterfactual, Philosophy of Science 1966, 33, 323-340

13 Simon, Herbert A. Models of Discovery, D. Reidel Publishing Company, Boston, USA, 1977
