
E-Contract Fulfilment and Agents' Attitudes

Aspassia Daskalopulu, Department of Computer Science, King's College London, The Strand, London WC2R 2LS, UK. [email protected]

Theo Dimitrakos, CLRC Rutherford Appleton Laboratory, Oxfordshire, OX11 0QX, UK. [email protected]

Tom Maibaum, Department of Computer Science, King's College London, The Strand, London WC2R 2LS, UK. [email protected]

Abstract: One aspect of the development of e-market services for the facilitation of business-to-business electronic commerce concerns the provision of automated support for contract performance. Assessing the parties' performance of an agreement, once it comes into force, requires reasoning with the contract terms (obligations, rights, powers and other legal relations that obtain between parties) as they go about conducting their business exchange, sometimes complying and sometimes deviating from their pre-agreed prescribed behaviour. Compliance with prescribed behaviour is typically evaluated individually by each partner to an agreement and where parties' views differ, disputes arise that require some form of resolution.

In this paper we present a simple architecture for an e-market, where an artificial (controller) agent undertakes such resolution. The controller's decision-making is informed by the agreement and each party's view of whether its own and the counter-party's behaviour comply with it. Thus, the controller forms an opinion on the basis of such evidence (and possible additional recommendations from agents representing the parties), in a spirit similar to a (human) judge's process of reasoning in arriving at her ruling. We consider this as a belief formation problem and explore the potential of using subjective reasoning to represent an individual's (possibly partial) views and to reason about their joint conflict and consensus formation. We comment on the relation of such belief formation to the establishment of trust between partners to an agreement and between the latter and the controller of an e-market.

Keywords: Contract performance monitoring, contract execution, trust-based reasoning, theory of evidence

1. INTRODUCTION

A considerable volume of recent research and industrial application efforts has concentrated on the provision of automated support for the establishment and subsequent implementation of electronic contracts (cf. [7], [1], [6], [13], [11], amongst many others). In general terms, e-contracting is viewed by many researchers as conducted within an electronic market place (a so-called e-market), which offers a variety of services, such as: brokering to identify and match prospective business partners; negotiation facilitation; lodging electronic documents; arbitration and dispute resolution; contract performance monitoring and enforcement. In this paper we are concerned with the last, that is, with the development of appropriate services for contract performance monitoring.

During a business exchange that is regulated by some agreement, the main issues of interest are:

(i) what each party is obliged (or permitted, or prohibited, or empowered and so on) to do at a given point in time;

(ii) whether each party complies with the behaviour stipulated for it by the agreement; and

(iii) where a party deviates from prescribed behaviour—intentionally or due to force majeure—what remedial mechanisms are applicable that might return the business exchange to a normal course.

To our knowledge, most of the current work on electronic support for contract performance has been concerned with (i). However, the model that it envisages for contract performance is such that (ii) and (iii) are not addressed at all, because the possibility of deviation from prescribed behaviour is ruled out. In general terms, such a model (cf. [11] for an overview) relies on extracting workflow information from a contract and then implementing functions that realise various activities within their specified time-bounds.

Such functions are either triggered centrally by an e-market module that controls the workflow systems of contracting parties, or in a distributed fashion by agents that are generated by the e-market controlling module and migrate to parties' computers. In either case, contractual obligations (and their related permissions, prohibitions etc.) are interpreted as specifications that the overall system comprising the e-market and the parties' hardware and software must satisfy. The parties' fulfilment of a contract is thus ensured because there is no longer any scope for choice related to changing business priorities (requiring some deviation from the behaviour prescribed in one contract in order to fulfil another), uncontrolled environmental conditions, lack of due care and attention by relevant parties or of opportunism.

We think that this scenario may be undesirable. Although contracting parties are supposed to operate in good faith and anticipate compliance with an agreement, it is often the case that deviation from prescribed behaviour, including violation of a particular obligation, might become necessary, or it might serve a party's individual business goal better under certain circumstances. The same entity may be a party in more than one contract, in which case the entity may be represented by a coalition of agents (one for each contract) who have to meet the business objectives of the entity as a collective. An individual agent may then have to deviate from the behaviour prescribed in the associated contract for the benefit of the collective. (This is mimicking the real marketplaces where deviation may be sustained under certain circumstances.) Developing more realistic e-marketplaces requires effective and acceptable means for monitoring the parties' compliance with an agreement. In conventional (non-electronic) contracting practice, compliance with prescribed behaviour is typically evaluated individually by each partner and where parties' views differ, disputes arise that require some form of resolution. (In such cases under English law, the concept of "balance of probabilities" is used.) In this paper we present a simple architecture for an e-market, where an artificial (controller) agent undertakes the analysis of the contract execution and the assessment of any deviations, including potential violations of obligations. The controller's decision-making is informed by the agreement, and each party's view of whether its own and the counter-party's behaviour comply with it. Thus, the controller forms an opinion on the basis of such evidence (and possible additional recommendations from agents representing the parties), in a spirit similar to a (human) judge's process for arriving at his ruling.

2. EXAMPLE CONTRACT

Consider the following pizza-ordering exchange where a contract is formed between a purchaser (Peter) and a seller (Susan). Although the example is superficially simple, it exhibits some features that are typical of lengthier, more complex agreements: a certain future behaviour is stipulated for the parties, the obligations they assume are time-bounded, provision is made in case an obligation is not fulfilled (the price is reduced in case of late delivery), a third party (the anonymous driver) is involved, a certain description is stipulated for the goods that are to be delivered and (dynamic) pricing and payment arrangements are made, implied terms (the menu description of the pizza) are assumed to be incorporated in the agreement, and so on. The example is, in this sense, a prototypical version of the kinds of agreements that have been analysed in practice.

Peter: I would like to order a pizza from your menu please.
Susan: Certainly. What kind of pizza would you like and what size?
Peter: The "Good Earth Vegetarian"* please, but without onions. Large, please.
Susan: Very well, that will be £13.95, cash please. What is the address?
Peter: 12 Hunger Lane. How long will that be?
Susan: It is now 7 pm and we promise to deliver within half an hour. If our driver takes any longer than that, we deduct £1.00 from your bill.
Peter: Ok, thank you.

*The menu description of "Good Earth Vegetarian": mushrooms, onions, red and green peppers, all topped with mozzarella.

We assume that Peter and Susan entered into such an agreement via an e-market, by communicating their respective requirements for their transaction via an e-market facilitator module that observes their exchange or relays messages between the parties. Such messages may be in EDI form or using the more recently proposed FLBC framework (cf. [3]). Although there are many advantages to using the latter, the particular choice of communication language does not matter much for the present discussion. What is important to note is that we assume that the e-market facilitator is able to extract the content of the parties' agreement. That is, we assume that the e-market facilitator holds a representation of the contract between Peter and Susan in some form, which can then be used by the e-market controller module to monitor the evolution of the actual exchange between parties and to determine whether it complies with the agreement.

In [2] the possible ways in which the actual transaction between Peter and Susan might evolve were summarised by means of a state diagram such as the one illustrated in Figure 1. Each state offers a (possibly partial) view of the status of the exchange as it evolves, given the parties' actions with respect to their obligations. That is, each state corresponds to the obligations (and more generally other legal relations such as rights, powers and so on) that obtain between the parties. Transitions correspond to the performance or non-performance of actions by parties that effect transformations to the status of the exchange. There is no attempt here to provide a formal account of transition labels. Expressions of the form a:A denote that agent a brings about state of affairs A. Similarly, expressions of the form not a:A denote that it is not the case that agent a brings about state of affairs A.

At state S0 Susan is obliged to deliver (or bring about a delivery of) the specified pizza to Peter. If Susan fulfils her obligation, then the exchange results in state S1, where Peter's obligation to make payment comes into force. If Susan violates her obligation at S0, then the exchange results in state S3. If some recovery mechanism is specified in the agreement, or determined by the e-market controller, then it might be possible for Susan to make amends for her violation (for example by delivering afresh at a reduced price, or by offering some financial compensation to Peter) and have the exchange return to state S1. In similar spirit, at state S1 Peter may fulfil his obligation to pay (and the exchange terminates happily at S2) or he may violate it, in which case the exchange results in S4, from where it might be possible to recover, either through some provision agreed upon by the parties or through the controller's direction.

Figure 1 State diagram for simplified pizza-ordering example (key: s = Susan, p = Peter; transitions correspond to Susan's delivery and Peter's payment actions, performed or not)

As was noted in [2], the controller may hold a representation of the contract in a modal language, such as modal action logic [10] or dynamic logic [12], which implicitly defines state spaces such as the one illustrated in Figure 1.
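The state space just described can, for illustration, be encoded as a small transition table. This is a sketch only: the action labels and the recovery transitions (including their target states) are assumptions drawn from the prose above, not part of the paper's formal treatment.

```python
# Illustrative encoding of the pizza-exchange state space (cf. Figure 1).
# States: S0 awaiting delivery, S1 awaiting payment, S2 completed,
# S3 delivery obligation violated, S4 payment obligation violated.
# The "recover" labels are hypothetical names for the remedial mechanisms
# mentioned in the text (redelivery at a reduced price, compensation, etc.).
TRANSITIONS = {
    ("S0", "s:delivery"): "S1",      # Susan fulfils her delivery obligation
    ("S0", "not s:delivery"): "S3",  # Susan violates it
    ("S1", "p:payment"): "S2",       # Peter fulfils his payment obligation
    ("S1", "not p:payment"): "S4",   # Peter violates it
    ("S3", "s:recover"): "S1",       # assumed recovery path back to S1
    ("S4", "p:recover"): "S2",       # assumed recovery path to completion
}

def run(start, actions):
    """Replay a sequence of logged actions from a start state."""
    state = start
    for action in actions:
        state = TRANSITIONS[(state, action)]
    return state

print(run("S0", ["not s:delivery", "s:recover", "p:payment"]))  # S2
```

A controller holding such a table can replay the parties' logged actions to determine which state the exchange has reached.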

3. CONTRACT PERFORMANCE MONITORING

Instead of assuming that software agents acting on behalf of the parties automatically perform what is required by the contract, we will use in this paper a simple architecture for an e-market illustrated in Figure 2. The e-market controller holds a representation of the contract in terms of the obligations (and other legal relations) that obtain between parties. Such a representation is accessible to each party (or its software agent) so that it may establish what it is supposed to do and what it may expect from the counter-party. As parties (or their agents) perform their activities during the business exchange, they log them with the e-market controller. Similarly, parties may obtain information about their own or their counter-party's logged activity through the controller. The controller also has access to information provided by advisors outside the parties directly participating in contract execution. These may be certification authorities, regulators or controllers of other associated markets. In our example, these may be Peter's credit card company (should Peter decide to pay by credit card), the pizza-delivery agent, a geo-positioning system monitoring the position of the pizza-delivery agent, a receptionist in Peter's location certifying that a pizza is being delivered, etc.

Figure 2 E-market architecture for contract performance monitoring

Let us now consider contract performance in this setting with reference to the example that we introduced earlier. In conventional contracting practice, compliance with prescribed behaviour is typically evaluated individually by each partner to an agreement and where parties' views differ, disputes arise that require some form of resolution.

For instance, it may well be the case that Susan dispatches a pizza of what she believes to be the agreed description and quantity to Peter and that as a result she believes that she has fulfilled her obligation and is now owed payment from Peter. Peter, on the other hand, might not receive the pizza at all (say, because the driver made a mistake and took it to a different address), or might receive a pizza which he does not consider conformant with the agreed description (say, it contained one extra topping that he did not specify, or it was delivered in a box which he deems unsuitable), or might receive a pizza of the agreed description but not at a time which he believes to be within the agreed bounds (say, a minute later than was promised). In terms of the state transition system representation, we could see this disagreement as being focused on which of S1 or S3 had been reached by some action(s) in the system. So, the evidence-based reasoning illustrated below is helping some agent determine what transitions had actually taken place. In a more realistic situation, the number of potential transitions to choose from (including none!) may be very large.

Thus Peter might disagree with Susan and not consider her obligation to deliver a pizza as fulfilled. Similar scenarios can be thought of in the case of Peter's obligation to pay. In cases where payment is made electronically, it is perhaps easier to think of situations where a party may consider its obligation fulfilled, while the other party may hold the opposite belief. Such situations, where parties disagree about whether an obligation was fulfilled or not, require some form of arbitration and resolution. Where parties undertake such resolution themselves, they try to present evidence that supports their view of what was agreed and what actually happened in relation to that. Where a third party, an arbitrator or a judge, undertakes such resolution, similar evidence has to be presented to it.

In our example, the e-market controller plays the role of the arbitrator or judge, and is called upon to resolve conflicts between the parties' views of what was agreed and what actually happened, based on the evidence presented to it (i.e. the messages issued by each party) and the representation of the agreement that it holds. The controller has no means of establishing with absolute certainty that this or that action is indeed performed as specified by each party—parties may maliciously or inadvertently misinform the controller. The controller is therefore required to adopt an attitude about what actually happened, based to some extent on the parties' attitudes. Several calculi and logics have been proposed for reasoning on the basis of insufficient evidence, or in the presence of uncertainty and ignorance. Here, we investigate the usefulness of Jøsang's [8] proposal, Subjective Logic, which is summarised in the following section.

4. SUBJECTIVE LOGIC

Trust by a party A in a party B for a service X is defined in [4] as the measurable belief of A in B behaving dependably for a specified period within a specified context. Trust allows one agent to reasonably rely for a critical period on behaviour or on information communicated by another agent. Its value relates to the subjective probability that an agent will perform a particular action (which the trustor may not be able to monitor) within a context, and in a dependable manner.

Subjective logic addresses the problem of forming a measurable belief about the truth or falsity of an atomic proposition denoting a state, event or identifying an agent, in the presence of uncertainty. It is defined by integrating classical logic and a theory of subjective probabilities [8], based on an extension of the Dempster-Shafer theory of evidence [14]. An agent's opinion is a representation of a belief and is modelled as a triple¹ ω(φ) = ⟨b(φ), d(φ), u(φ)⟩ where:

– b(φ) measures belief, represented as the subjective probability that a proposition is true;

– d(φ) measures disbelief, represented as the subjective probability that a proposition is false;

– u(φ) measures uncertainty, represented as the subjective probability that a proposition is either true or false;

– b(φ), d(φ), u(φ) ∈ [0, 1] and b(φ) + d(φ) + u(φ) = 1, for any proposition φ.

A strong correlation between this opinion model and the probability density functions associated with the beta distribution [8] ensures that opinions can be deterministically established if all available evidence can be analysed statistically. Subjective logic has been used in [9] in order to reason about decisions involved in authenticating public keys based on recommendations and certificates and to illustrate that, in this case, trust in remote agents can be determined by embedding trust recommendations inside public key certificates.
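As a minimal illustrative sketch (not the authors' implementation), the opinion triple and its constraints, together with Jøsang's probability expectation E = b + a·u (with the relative atomicity a fixed at 0.5, as in the rest of the paper), might be encoded as:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Opinion:
    """A subjective-logic opinion <b, d, u> about a single proposition."""
    b: float  # belief: subjective probability that the proposition is true
    d: float  # disbelief: subjective probability that it is false
    u: float  # uncertainty: the remaining probability mass

    def __post_init__(self):
        # b, d, u must each lie in [0, 1] and sum to 1
        assert all(0.0 <= x <= 1.0 for x in (self.b, self.d, self.u))
        assert abs(self.b + self.d + self.u - 1.0) < 1e-9

    def expectation(self, a: float = 0.5) -> float:
        """Probability expectation E = b + a*u, with relative atomicity a."""
        return self.b + a * self.u

# Example: a strongly positive opinion about a delivery having taken place.
susan = Opinion(0.9, 0.0, 0.1)
print(susan.expectation())  # ~0.95
```

The expectation collapses the triple to a single probability estimate, which is what a risk assessment would consume.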

4.1 Subjective Logic Operators

The following operations of subjective logic will be used in this paper:

4.1.1 Conjunction

The conjunction ω(φ) ∧ ω(ψ) is an opinion ⟨b, d, u⟩ such that:

b = b(φ)·b(ψ),  d = d(φ) + d(ψ) − d(φ)·d(ψ),  u = b(φ)·u(ψ) + u(φ)·b(ψ) + u(φ)·u(ψ)  (1)

Conjunction forms an opinion about the conjunction of two propositions that refer to distinct judgements by the same agent. Belief in the conjunction is viewed as belief in both propositions being true simultaneously, which is measured by multiplying the probabilities of each of the two propositions being true individually. Disbelief in the conjunction is viewed as belief in either of the two propositions being false, which amounts

1 In [8] an opinion is defined as a quadruple ⟨b, d, u, a⟩ where a denotes relative atomicity. Roughly, assuming a frame of discernment Θ, which delimits a set of possible states of affairs exactly one of which is assumed to be true at any one time, relative atomicity normalises uncertainty by incorporating the percentage of the elementary states of affairs that are covered by the state of affairs about which the opinion is formed. In short, relative atomicity indicates the likelihood that uncertainty should have been belief. For the purpose of this paper, we have taken the simplifying assumption that a = 0.5 and have therefore omitted relative atomicity in order to simplify the example. In real applications, though, relative atomicity is important as it relates uncertainty to probability expectation. The latter is defined as E(φ) = b(φ) + a·u(φ).


to the union of the probabilities of each proposition being false individually. Uncertainty in the conjunction is viewed as the belief that either one proposition is true and the truth of the other one is uncertain or that the truth of both propositions is uncertain.
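Equation (1) can be prototyped directly; a sketch using plain (b, d, u) tuples:

```python
def conjunction(x, y):
    """Opinion about 'phi and psi', per equation (1); x, y are (b, d, u) triples."""
    bx, dx, ux = x
    by, dy, uy = y
    b = bx * by                      # both propositions true simultaneously
    d = dx + dy - dx * dy            # at least one proposition false
    u = bx * uy + ux * by + ux * uy  # everything else remains uncertain
    return (b, d, u)

# A dogmatic 'true' operand leaves the other opinion unchanged:
print(conjunction((1.0, 0.0, 0.0), (0.9, 0.0, 0.1)))  # (0.9, 0.0, 0.1)
```

Note that the resulting triple still sums to 1, as required of any opinion.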

4.1.2 Disjunction

The disjunction ω(φ) ∨ ω(ψ) is an opinion ⟨b, d, u⟩ such that:

b = b(φ) + b(ψ) − b(φ)·b(ψ),  d = d(φ)·d(ψ),  u = d(φ)·u(ψ) + u(φ)·d(ψ) + u(φ)·u(ψ)  (2)

Disjunction forms an opinion about the disjunction of two propositions that refer to distinct judgements by the same agent. Belief in the disjunction is viewed as belief in either of the two propositions being true, which amounts to the union of the probabilities of each proposition being true individually. Disbelief in the disjunction is viewed as belief in both propositions being false simultaneously, which is measured by multiplying the probabilities of each of the two propositions being false individually. Uncertainty in the disjunction is viewed as the belief that either one proposition is false and the truth of the other one is uncertain or that the truth of both propositions is uncertain.

4.1.3 Negation

The negation ¬ω(φ) is an opinion ⟨b, d, u⟩ such that:

b = d(φ),  d = b(φ),  u = u(φ)  (3)

The negation of an opinion about a proposition being true amounts to the opinion about the same proposition being false.
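Definitions (1)-(3) can be cross-checked mechanically: under them the De Morgan identity ω(φ) ∨ ω(ψ) = ¬(¬ω(φ) ∧ ¬ω(ψ)) holds exactly, which makes a useful sanity test for any implementation. A sketch:

```python
def negation(x):
    """Opinion about 'not phi', per equation (3): swap belief and disbelief."""
    b, d, u = x
    return (d, b, u)

def conjunction(x, y):
    """Opinion about 'phi and psi', per equation (1)."""
    bx, dx, ux = x
    by, dy, uy = y
    return (bx * by, dx + dy - dx * dy, bx * uy + ux * by + ux * uy)

def disjunction(x, y):
    """Opinion about 'phi or psi', per equation (2)."""
    bx, dx, ux = x
    by, dy, uy = y
    return (bx + by - bx * by,            # at least one proposition true
            dx * dy,                      # both propositions false
            dx * uy + ux * dy + ux * uy)  # everything else remains uncertain

x, y = (0.5, 0.2, 0.3), (0.9, 0.0, 0.1)
lhs = disjunction(x, y)
rhs = negation(conjunction(negation(x), negation(y)))
# De Morgan: the two computations agree component by component.
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```

The identity follows by substituting (3) into (1) and comparing with (2) term by term.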

4.1.4 Recommendation

The recommendation ω^AB(φ) = ω^A(i_B) ⊗ ω^B(φ), where i_B denotes B's advice, is an opinion ⟨b, d, u⟩ such that:

b^AB(φ) = b^A(i_B)·b^B(φ),  d^AB(φ) = b^A(i_B)·d^B(φ),  u^AB(φ) = d^A(i_B) + u^A(i_B) + b^A(i_B)·u^B(φ)  (4)

The recommendation operator² ⊗ combines agent A's opinion about agent B's advice with agent B's opinion about a proposition φ expressed as an advice from agent B to agent A. Note that B's opinion about φ is weighted against A's trust in B's advice, while A's distrust and uncertainty in B's advice increase A's uncertainty about φ.³ This is consistent with the trust model described in [4]: "Distrust propagates through trust and it obstructs the propagation of trust. If A distrusts an intermediary T for a service X, then A will ignore T's mediation to the extent of the distrust." Here, agent B plays the role of the intermediary (viz. an advisor) for assessing the truth of φ.

The operator ⊗ is associative, but not commutative. This means that, in the case of a chain of recommendations, the same discounting of opinions will be calculated by starting at either end of the chain. However, the order of the recommendations is important. Finally, independence of opinion dictates that the same advisor is not allowed to appear more than once in a recommendation chain. This, in effect, requires that advisors refrain from mixing "first-hand" and "second-hand" evidence in their recommendations: an advisor should always include the source of the evidence in her recommendation (e.g. first-hand evidence, or agent X's recommendation).
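A sketch of the discounting operator (4), with illustrative figures (a trustor's opinion of an advisor, and that advisor's opinion about a proposition); the final assertion illustrates the non-commutativity just noted:

```python
def discount(trust, advice):
    """A's opinion about a proposition phi, formed from B's advice about phi
    weighted by A's opinion of B's advice, per equation (4)."""
    bt, dt, ut = trust    # A's opinion about B's advice (i_B)
    ba, da, ua = advice   # B's opinion about phi
    return (bt * ba,            # B's belief survives only through A's trust
            bt * da,            # likewise B's disbelief
            dt + ut + bt * ua)  # A's distrust/uncertainty in B add to uncertainty

# Illustrative figures: a controller's trust in Peter's advice,
# and Peter's own opinion about a delivery.
trust_in_peter = (0.6, 0.1, 0.3)
peters_opinion = (0.2, 0.3, 0.5)
print(discount(trust_in_peter, peters_opinion))  # roughly (0.12, 0.18, 0.70)

# Not commutative: swapping trustor and advisor changes the result.
assert discount(trust_in_peter, peters_opinion) != discount(peters_opinion, trust_in_peter)
```

Distrust and uncertainty in the advisor flow entirely into the uncertainty component, matching the quoted trust model: the trustor ignores the advisor's mediation to the extent of the distrust.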

2 This operator is called "discounting" in [8] and is similar to the "discounting" function of [14]. In fact, if c denotes the "discounting" rate of [14], then, by setting c = d^A(i_B) + u^A(i_B), the two definitions become equivalent.

3 As it is hard to comprehend physical belief discounting, the definition of an operator for discounting opinions lends itself to different interpretations. In particular, it is not obvious how to describe the effect of A disbelieving that B will give good advice. In this case, the latter contributes to agent A's uncertainty about the truth-value of φ regardless of what B's advice is.


4.1.5 Consensus

The consensus ω^{A,B}(φ) = ω^A(φ) ⊕ ω^B(φ) is an opinion ⟨b, d, u⟩ such that:

b^{A,B}(φ) = (b^A(φ)·u^B(φ) + b^B(φ)·u^A(φ))/κ,
d^{A,B}(φ) = (d^A(φ)·u^B(φ) + d^B(φ)·u^A(φ))/κ,
u^{A,B}(φ) = (u^A(φ)·u^B(φ))/κ  (5)

where κ = u^A(φ) + u^B(φ) − u^A(φ)·u^B(φ), and ⊕ is undefined when u^A(φ) = u^B(φ) = 0.

The consensus operator ⊕ combines the opinions of two agents A and B about the same proposition φ in a fair and equal way, resulting in the opinion of an imaginary agent A + B about φ.⁴ This operator assumes a set of mutually consistent opinions and reduces uncertainty. Uncertainty is therefore interpreted as "room for influence", meaning that it is possible to reach a consensus only with somebody who maintains some uncertainty. The effect of this operator is analogous to the situation in court where several witnesses give consistent testimony, hence amplifying each other's views in the judge's opinion.

The operator ⊕ is both associative and commutative; the order in which opinions are combined to form a consensus is irrelevant. However, opinion independence has to be assumed in order to avoid allowing an agent to unfairly influence consensus. This basically means that the opinion of an agent should not be counted more than once.

Finally, note that dogmatic opinions cannot be combined into a consensus. This is because uncertainty is interpreted as "room for influence". Having conflicting dogmatic opinions is philosophically counter-intuitive (unless deception is admitted as a possibility). In this model, opinions about the same "real" observation either incorporate some uncertainty or are necessarily equal.
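A sketch of the consensus rule (5), guarding against the undefined case of two dogmatic opinions; the figures are illustrative, and the assertion at the end checks commutativity:

```python
def consensus(x, y):
    """Combine two independent opinions about the same proposition, per (5)."""
    bx, dx, ux = x
    by, dy, uy = y
    k = ux + uy - ux * uy
    if k == 0.0:
        # two dogmatic (u = 0) opinions leave no room for influence
        raise ValueError("consensus is undefined for two dogmatic opinions")
    return ((bx * uy + by * ux) / k,
            (dx * uy + dy * ux) / k,
            (ux * uy) / k)

# Illustrative figures: two already-discounted views of the same event.
cs = (0.45, 0.00, 0.55)
cp = (0.12, 0.18, 0.70)
joint = consensus(cs, cp)
print(tuple(round(v, 2) for v in joint))  # (0.44, 0.11, 0.45)

# Commutative: the order of combination does not matter.
assert consensus(cs, cp) == consensus(cp, cs)
```

Note how the combined uncertainty (0.45) is lower than either input's uncertainty (0.55 and 0.70): consistent evidence from independent sources reduces the "room for influence".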

5. APPLICATIONS OF SUBJECTIVE REASONING IN CONTRACT PERFORMANCE MONITORING

Recall the simple architecture for an e-market illustrated in Figure 2. The e-market controller holds a representation of the contract in terms of the obligations (and other legal relations) that obtain between parties. Such representation is accessible to each party (or its software agent) so that it may establish what it is supposed to do and what it may expect from the counter-party. As parties (or their agents) perform their activities during the business exchange, they log them with the e-market controller. They also have access to information provided by advisors outside the parties directly participating in contract execution.

We discuss potential applications of subjective reasoning in contract performance monitoring on the basis of two illustrative scenarios:

1. Scenario 1: A single agent, who is informed of the agents' actions in order to form an opinion about the state of the contract, and may also seek external advice when disputes arise, controls the e-market. The e-market controller then communicates its decision to the participating parties, who have to conform by updating their information about the state of the contract execution or carry on the dispute outside the e-market.

2. Scenario 2: The e-market is self-regulated; the controller module is an imaginary agent that reflects the consensus of the collective. All parties automatically update their information about the state of the contract execution so that it conforms to the view of the collective.

5.1 First illustrative scenario

Assume that the contract execution was at state S0 and the e-market controller has formed the following opinions about the trustworthiness of Susan and Peter, based on previous experience and other sources of information (e.g. recommendations from third parties). In addition, the e-market controller has opinions

4 Consider that agents A and B observe a machine over two different time intervals and have potentially different opinions about it depending on the behaviour of the machine in the respective periods. The consensus corresponds to the opinion that a single agent A + B would have after observing the machine over both periods.

about its own dependability⁵ and the dependability of its advisor(s) to provide recommendations about s:a. Note that these opinions measure the dependability of the agents for a specific service, viz. s:a, in the context specified by the contract.

Trust representations for service s:a:

Advisors: ω^C(i_R) = ⟨0.8, 0.1, 0.1⟩
Self: ω^C(i_C) = ⟨0.3, 0.1, 0.6⟩
Susan: ω^C(i_S) = ⟨0.5, 0.4, 0.1⟩
Peter: ω^C(i_P) = ⟨0.6, 0.1, 0.3⟩

Then, Susan suggests, based on first-hand evidence, that s:a has most probably been met, bringing the contract to state S1. Susan communicates this information to the market controller by means of an opinion about s:a. That is, ω^S(s:a) = ⟨0.9, 0, 0.1⟩. According to Susan, the contract execution reaches state S1, since: ω^S(state:S1) = ω(state:S0) ∧ ω^S(s:a) = ⟨1, 0, 0⟩ ∧ ⟨0.9, 0, 0.1⟩ = ⟨0.9, 0, 0.1⟩.

Unfortunately, Peter, based on his first-hand evidence, is not convinced that s:a has been met. Peter communicates this to the market controller by means of an opinion ω^P(s:a) = ⟨0.2, 0.3, 0.5⟩. Peter's opinion entails ω^P(S1) = ⟨0.2, 0.3, 0.5⟩ and ω^P(S3) = ⟨0.3, 0.2, 0.5⟩. Peter does not have enough evidence to determine the state of the contract to his satisfaction.

As Susan and Peter have very different opinions about the state of the contract, a dispute arises. In the presence of a potential dispute, the e-market controller has to weigh its own evidence and consider the recommendations from its advisors, in addition to the information sent by Susan and Peter, in order to form a judgement. We assume that the e-market controller has formed an opinion ω_C(s:a) = ⟨0.5, 0.2, 0.3⟩ on the basis of its first-hand evidence and that the advisors send the recommendation ω_R(s:a) = ⟨0.8, 0.1, 0.1⟩.

The following formula encodes the opinion formation process of the e-market controller, where all evidence is first discounted by trust in the corresponding agent (⊗) and the results are then combined by consensus (⊕), so that each source is taken into consideration equally and fairly:

ω_C^{S,P,R,C}(s:a) = ω_{C,C}(s:a) ⊕ ω_{C,R}(s:a) ⊕ ω_{C,S}(s:a) ⊕ ω_{C,P}(s:a)

where:

- ω_{C,C}(s:a) = ω_C(i_C) ⊗ ω_C(s:a)
- ω_{C,R}(s:a) = ω_C(i_R) ⊗ ω_R(s:a)
- ω_{C,S}(s:a) = ω_C(i_S) ⊗ ω_S(s:a)
- ω_{C,P}(s:a) = ω_C(i_P) ⊗ ω_P(s:a)

The formula gives the following numerical results:

ω_{C,C}(s:a) = ⟨0.15, 0.06, 0.79⟩
ω_{C,R}(s:a) = ⟨0.56, 0.16, 0.28⟩    ω_{C,C}(s:a) ⊕ ω_{C,R}(s:a) = ⟨0.57, 0.17, 0.26⟩
ω_{C,S}(s:a) = ⟨0.45, 0.0, 0.55⟩
ω_{C,P}(s:a) = ⟨0.12, 0.18, 0.70⟩    ω_{C,S}(s:a) ⊕ ω_{C,P}(s:a) = ⟨0.44, 0.11, 0.45⟩

ω_C^{S,P,R,C}(s:a) = ⟨0.63, 0.17, 0.20⟩
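The discounting and consensus steps behind these figures can be sketched in Python (the operator definitions follow Jøsang [8], [9]; the helper names are ours). The sketch reproduces the weighted opinions for Susan and Peter and their combination:

```python
# An opinion is a triple (b, d, u): belief, disbelief, uncertainty, with b + d + u = 1.

def discount(trust, opinion):
    """Weigh an agent's reported opinion by trust in that agent
    (the recommendation/discounting operator, written (x) above)."""
    b1, d1, u1 = trust
    b2, d2, u2 = opinion
    return (b1 * b2, b1 * d2, d1 + u1 + b1 * u2)

def consensus(w1, w2):
    """Combine two independent opinions (the consensus operator, written (+) above)."""
    b1, d1, u1 = w1
    b2, d2, u2 = w2
    k = u1 + u2 - u1 * u2  # normalisation factor
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)

# Trust in Susan and Peter applied to their reported opinions about s:a
w_CS = discount((0.5, 0.4, 0.1), (0.9, 0.0, 0.1))  # -> (0.45, 0.0, 0.55)
w_CP = discount((0.6, 0.1, 0.3), (0.2, 0.3, 0.5))  # -> (0.12, 0.18, 0.70)
w_CSP = consensus(w_CS, w_CP)                      # -> approx (0.44, 0.11, 0.45)
```

The remaining entries in the table are obtained in the same way; small differences are due to rounding of intermediate values.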

The e-market controller can use this result in order to make an informed decision about the state of the contract. The corresponding probability expectation estimates may be used as input to the assessment of the risks associated with accepting the validity of s:a and hence updating the state of the contract execution to S1. Of course, it is very difficult to decide what "the balance of probabilities" should be! In the example above, it is clear that the controller, in accepting the validity of s:a, assumes that 0.63 is greater than the threshold required. (Any value above 0.5 could reasonably be considered as the standard.) If the threshold is greater than 0.63, then the controller needs to make another decision: support Susan, as Peter's assertion was not proved on "the balance of probabilities", or seek further evidence if available.

⁵ As explained in [5], "trust in oneself is measurable … self-assessment underlies the ability of an agent to delegate or offer a task to another agent in order to improve efficiency or reduce risk". In this case ω_C(i_C) enables the e-market controller to measure the dependability of the opinion it has formed based on first-hand evidence and balance this against the recommendations received from the other agents.
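The probability expectation mentioned above can be computed from an opinion as E(ω) = b + a·u, where a is the relative atomicity of the proposition (cf. [8]); a minimal sketch, assuming a binary frame with a = 1/2:

```python
def expectation(opinion, atomicity=0.5):
    """Probability expectation E = b + a*u of an opinion (b, d, u)."""
    b, d, u = opinion
    return b + atomicity * u

# The controller's combined opinion about s:a from the scenario above
print(round(expectation((0.63, 0.17, 0.20)), 2))  # 0.73
```

On this reading, the controller could compare E = 0.73, rather than the bare belief component 0.63, against its threshold; which of the two is appropriate is itself a design decision.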

As an example, it might be the case that the time of the delivery is in dispute and the threshold for imputing blame is greater than 0.63. The controller decides to consult (seek a recommendation from) the geopositional system used to track deliveries, which it considers to be a reliable source of information, i.e. ω(i_G) = ⟨0.9, 0.0, 0.1⟩. The system offers the opinion, with great certainty, that its estimate of the time of arrival supports Susan, say ω_G(s:a) = ⟨0.95, 0.0, 0.05⟩. As a consequence the e-market controller updates its opinion about s:a to

ω_C^{S,P,R,C,G}(s:a) = ω_C^{S,P,R,C}(s:a) ⊕ (ω_C(i_G) ⊗ ω_G(s:a)),

which evaluates to ⟨0.83, 0.08, 0.09⟩. Hence, the contract can be put back on course by requiring Peter to pay the full amount (or seek recourse to courts outside the e-market place).
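The update step can be reproduced with the same two operators (a sketch; `discount` and `consensus` implement Jøsang's discounting and consensus operators [8], [9], and the variable names are ours):

```python
def discount(trust, opinion):
    """Weigh a source's opinion by trust in that source (Josang's discounting)."""
    b1, d1, u1 = trust
    b2, d2, u2 = opinion
    return (b1 * b2, b1 * d2, d1 + u1 + b1 * u2)

def consensus(w1, w2):
    """Combine two independent opinions (Josang's consensus)."""
    b1, d1, u1 = w1
    b2, d2, u2 = w2
    k = u1 + u2 - u1 * u2
    return ((b1 * u2 + b2 * u1) / k, (d1 * u2 + d2 * u1) / k, (u1 * u2) / k)

w_prior = (0.63, 0.17, 0.20)                         # controller's opinion so far
w_CG = discount((0.9, 0.0, 0.1), (0.95, 0.0, 0.05))  # trust in GPS applied to its report
w_new = consensus(w_prior, w_CG)                     # -> approx (0.83, 0.08, 0.09)
```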

5.2 Second illustrative scenario

Consider a self-regulated e-market, where the controller module is an imaginary agent that reflects the consensus of the collective. When entering the collective, all parties have accepted to update automatically their information about the state of the contract execution so as to conform with the view of the collective. Recommendations can still be accounted for, provided that they come from agents who have been registered with the collective.

Assume that Susan and Peter, as in the previous scenario, have opinions ω_S(s:a) = ⟨0.9, 0, 0.1⟩ and ω_P(s:a) = ⟨0.2, 0.3, 0.5⟩ about s:a, based on first-hand evidence. The opinion of the collective is

ω_C(s:a) = ω_S(s:a) ⊕ ω_P(s:a) = ⟨0.85, 0.05, 0.1⟩,

which means that Peter is required to pay the full amount (or leave the collective and seek recourse to courts outside the e-market place).

Note that, in this model, members of the same collective are expected to co-operate and to refrain from attempting to deceive the collective. Consequently, the opinions expressed by the members of the collective are assumed to be consistent and their combination merely contributes to the reduction of uncertainty in the collective. Thus, uncertainty is interpreted as "room for influence": the stronger the opinion of a member is, the more it influences the opinion of the collective.

There is a need, however, to assess the dependability of agents outside the collective, such as advisors or other collectives cohabiting a larger e-market. Assume, for example, that Susan and Peter have opinions ω_S(i_R) = ⟨0.5, 0.2, 0.3⟩ and ω_P(i_R) = ⟨0.8, 0.1, 0.1⟩ about the recommendations provided by the advisor R for s:a. The collective takes R's recommendation into consideration by first forming a consensus about R's dependability as an advisor for s:a and then using it to weigh R's actual recommendation. That is, ω_{C,R}(s:a) = ω_C(i_R) ⊗ ω_R(s:a), where ω_C(i_R) = ω_S(i_R) ⊕ ω_P(i_R), which evaluates to ⟨0.78, 0.14, 0.08⟩. Hence, if ω_R(s:a) = ⟨0.8, 0.1, 0.1⟩, as in the previous scenario, then ω_{C,R}(s:a) = ⟨0.62, 0.08, 0.30⟩.

If this recommendation is taken into account, then the opinion of the collective changes to ω_C^{R,C}(s:a) = ⟨0.86, 0.06, 0.07⟩.
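The collective's figures can be reproduced with the same operators (a sketch; the operator definitions follow Jøsang [8], [9], and the variable names are ours):

```python
def discount(trust, opinion):
    """Weigh a source's opinion by trust in that source (Josang's discounting)."""
    b1, d1, u1 = trust
    b2, d2, u2 = opinion
    return (b1 * b2, b1 * d2, d1 + u1 + b1 * u2)

def consensus(w1, w2):
    """Combine two independent opinions (Josang's consensus)."""
    b1, d1, u1 = w1
    b2, d2, u2 = w2
    k = u1 + u2 - u1 * u2
    return ((b1 * u2 + b2 * u1) / k, (d1 * u2 + d2 * u1) / k, (u1 * u2) / k)

w_S, w_P = (0.9, 0.0, 0.1), (0.2, 0.3, 0.5)
w_collective = consensus(w_S, w_P)                     # -> approx (0.85, 0.05, 0.09)
trust_R = consensus((0.5, 0.2, 0.3), (0.8, 0.1, 0.1))  # -> approx (0.78, 0.14, 0.08)
w_CR = discount(trust_R, (0.8, 0.1, 0.1))              # -> approx (0.63, 0.08, 0.29)
w_final = consensus(w_collective, w_CR)                # -> approx (0.86, 0.06, 0.07)
```

Small differences from the figures quoted in the text arise from rounding intermediate results.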

6. CONCLUSION AND FURTHER WORK

It is unrealistic (in the extreme) to assume that parties to an electronic contract (humans or software agents) will always execute the contract perfectly. It is only to be expected that there can be deviation from the prescribed norms, requiring arbitration and corrective action. We have sketched a model of the e-market place in which evidence from various sources (the parties to the contract themselves, third parties) relating to the putative breach of the contract is weighed in order to inform actions for setting the contract back on course. An independently proposed formalism, subjective logic, appears to offer support for representing this kind of reasoning, and we illustrate its application in scenarios related to a simple example of contract execution.


The technical elements of dealing with evidence would appear to involve: attributing levels of trust to the parties or market advisers involved, accounting for and combining recommendations, and forming consensus views. A potential advantage of subjective logic is the strong correspondence between the opinion model and posterior probabilities presented by means of the beta distribution. This allows for a standard presentation (by means of opinions in subjective logic) of results from posterior statistical analyses, such as opinion polling, frequency analysis, etc. This is particularly useful for incorporating evidence collected from auditing, quantitative risk analysis and reputation systems. However, there are some strict assumptions related to the applicability of subjective logic (such as the assumption that the propositions combined by the logical connectives refer to states of affairs in distinct frames of discernment) and we need to explore whether these are too restrictive for practical purposes.

Further research will be directed towards combining an analysis of the normative aspects of contracts (obligations, rights, powers and other legal relations that obtain between parties) with subjective reasoning, guided by a case study of substantial size and complexity. In order to prepare the ground for such integration, subjective logic will have to be extended with operators for the combination of interdependent opinions, guarded events, and sequential and parallel compositions of events. In addition to applications, there is also a need for fundamental research, as there is likely to be a subtle interplay between the different formalisms that will need to be understood. The ultimate goal is the demonstration, by means of a prototype system, of the usefulness of the approach sketched in this paper for supporting a dependable electronic market place that is more realistic, and therefore more attractive to do business within.

REFERENCES

[1] Allen T. & Widdison R. Can Computers Make Contracts? Harvard Journal of Law and Technology, 9(1), 1996.

[2] Daskalopulu A. Modelling Legal Contracts as Processes. Proceedings of the 11th International Conference and Workshop on Database and Expert Systems Applications, IEEE C. S. Press, 2000, pp. 1074–1079.

[3] Daskalopulu A. & Sergot M. Computational Aspects of the FLBC Framework. Decision Support Systems, special issue on Formal Modelling in E-Commerce, forthcoming (2001).

[4] Dimitrakos Th. System Models, e-Risk and e-Trust: Towards Bridging the Gap? In Towards the E-Society: E-Business, E-Commerce, and E-Government (Proceedings of the 1st IFIP Conference on e-Commerce, e-Business, e-Government, Zurich), Kluwer Academic Publishers, 2001.

[5] Dimitrakos Th. Towards a Formal Model of Trust in e-Commerce. Proceedings of the Novel E-Commerce Applications of Agents Workshop, AI 2001, Ottawa, Canada, NRC-44883, pp. 13–22, 2001.

[6] Dellarocas C., Klein M. & Rodriguez-Aguilar J. A. An Exception-Handling Architecture for Open Electronic Marketplaces of Contract Net Software Agents. Proceedings of the 2nd ACM Conference on Electronic Commerce, Minneapolis, pp. 225–232, 2000.

[7] Ibbotson J. & Sachs M. Electronic Trading Partner Agreement for E-Commerce. IBM Corporation, 1999.

[8] Jøsang A. A Logic for Uncertain Probabilities. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 9(3), 2001.

[9] Jøsang A. An Algebra for Assessing Trust in Certification Chains. Proceedings of the Network and Distributed Systems Security Symposium, The Internet Society, 1999.

[10] Kent S. J. H., Maibaum T. S. E. & Quirk W. J. Specifying Deontic Behaviour in Modal Action Logic. FOREST Deliverable Report WP1.R2, 1992.

[11] Merz M., Griffel F., Tu T., Müller-Wilken S., Weinreich H., Boger M. & Lamersdorf W. Supporting Electronic Commerce Transactions with Contracting Services. International Journal on Cooperative Information Systems, 7(4), 1998.

[12] Meyer J.-J. Ch. A Different Approach to Deontic Logic: Deontic Logic Viewed as a Variant of Dynamic Logic. Notre Dame Journal of Formal Logic, 29(1), pp. 109–136, 1988.

[13] Reeves D. M. et al. Towards a Declarative Language for Negotiating Executable Contracts. Proceedings of the AAAI-99 Workshop on Artificial Intelligence in Electronic Commerce, 1999.

[14] Shafer G. A Mathematical Theory of Evidence. Princeton University Press, 1976.