Linköping University
Linköpings universitet
Department of Science and Technology
Institutionen för teknik och naturvetenskap
SE-601 74 Norrköping, Sweden

LiU-ITN-TEK-A-14/028-SE

Beräkningsmodell för moral och emotioner i EmoBN
(Computational model for morality and emotions in EmoBN)

Henry Fröcklin

(2)

LiU-ITN-TEK-A-14/028-SE

Beräkningsmodell för moral och

emotioner i EmoBN

Examensarbete utfört i Medieteknik

vid Tekniska högskolan vid

Linköpings universitet

Henry Fröcklin

Handledare Pierangelo DellAcqua

Examinator Pierangelo Dell'Acqua


Copyright

The publishers will keep this document online on the Internet - or its possible

replacement - for a considerable time from the date of publication barring

exceptional circumstances.

The online availability of the document implies a permanent permission for

anyone to read, to download, to print out single copies for your own use and to

use it unchanged for any non-commercial research and educational purpose.

Subsequent transfers of copyright cannot revoke this permission. All other uses

of the document are conditional on the consent of the copyright owner. The

publisher has taken technical and administrative measures to assure authenticity,

security and accessibility.

According to intellectual property law the author has the right to be

mentioned when his/her work is accessed as described above and to be protected

against infringement.

For additional information about the Linköping University Electronic Press

and its procedures for publication and for assurance of document integrity,

please refer to its WWW home page:

http://www.ep.liu.se/

Institutionen för teknik och naturvetenskap
Department of Science and Technology

Master's thesis

Design of a computational model for morality and emotions in EmoBN

Master's thesis carried out in Artificial General Intelligence at the Institute of Technology, Linköping University

by Henry Fröcklin
LiU-ITN-TEK-A-14/028-SE

Norrköping 2014

Department of Science and Technology, Linköpings tekniska högskola, Linköpings universitet, Campus Norrköping

Design of a computational model for morality and emotions in EmoBN

Master's thesis carried out in Artificial General Intelligence at the Institute of Technology, Linköping University

by Henry Fröcklin
LiU-ITN-TEK-A-14/028-SE

Supervisor: Pierangelo Dell'Acqua, ITN, Linköpings universitet
Examiner: Pierangelo Dell'Acqua

Division, Department: MIT, Department of Science and Technology, SE-601 74 Norrköping
Date: 2014-08-28
Language: English
Report category: Examensarbete (Master's thesis)
URL for electronic version: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-XXXXX
ISRN: LiU-ITN-TEK-A-14/028-SE
Title: Design av en beräkningsmodell för moral och emotioner i EmoBN / Design of a computational model for morality and emotions in EmoBN
Author: Henry Fröcklin

Sammanfattning

This report presents a method for designing moral behaviour in a scenario with een. een is an iteration of emobn, which is based on bn, an action selection system with dynamic activation among modules, goal oriented and capable of prediction and planning. The design is based on current research from prominent psychologists such as Haidt and uses Mikhail's umg framework for causal and intentional validation. Roseman's appraisal model and Haidt's mft are also used to determine moral emotions in a moral context. The design is tested against empirical results from a philosophical experiment known as the trolley problem, a well-known moral dilemma.

Abstract

This master's thesis presents an approach to designing moral behaviour in a scenario with een. een is an iteration of emobn, which is based on bn, an action selection system with activation dynamics among modules, goal oriented and capable of prediction and planning. The design is based on current research from prominent psychologists like Haidt and uses Mikhail's umg framework for causal and intentional validation. Roseman's appraisal model and Haidt's mft are also used for determining moral emotions in a moral context. The design is tested against empirical results from a philosophical experiment known as the trolley problem, a well-known moral dilemma.

Acknowledgments

Foremost, I would like to thank my advisor Pierangelo Dell'Acqua for the continuous support during the thesis, for his patience, motivation, enthusiasm, and immense knowledge.

Besides my advisor, I would like to thank my opponent for his insightful comments and interesting questions.

Last but not least, I would like to thank my family: my parents Ingela Fröcklin and Victor Mendéz, for giving birth to me in the first place and supporting me spiritually throughout my life, and my Anna, for understanding and keeping me sane.

Norrköping, May 2014
Henry Fröcklin

Contents

List of Figures
List of Tables
Notation
List of Definitions
1 Introduction
  1.1 Problem description
  1.2 Report outline
2 Background
  2.1 Emotion theory
  2.2 Morality theory
  2.3 Moral emotions theory
    2.3.1 Other-condemning family
    2.3.2 The Self-Conscious family
    2.3.3 The Other-Suffering Family
    2.3.4 The Other-Praising Family
  2.4 Moral emotional decision making
    2.4.1 Ethics and morality
    2.4.2 Decision theory
3 emobn architecture
  3.1 Introduction
  3.2 Components of emobn
    3.2.1 States
    3.2.2 Behaviour Modules
    3.2.3 Goals
    3.2.4 Resources
    3.2.5 Emotions
    3.2.6 Emotional Links
  3.3 Activation Spreading
    3.3.1 Network Parameters
    3.3.2 Activation Spreading
  3.4 Action Selection
4 Models
  4.1 Background theory summary
  4.2 Moral Foundation Theory
    4.2.1 The six moral foundations
  4.3 The trolley problem
    4.3.1 Bystander scenario
    4.3.2 Footbridge scenario
  4.4 Universal Moral Grammar
5 The moral design
  5.1 een, extension of the emobn
    5.1.1 Introduction
    5.1.2 Components of een
    5.1.3 Graphical representation of een
    5.1.4 Transformation Γ
  5.2 Implementing the trolley problem with een
    5.2.1 Expected moral emotions process
    5.2.2 Weighting
6 Testing
  6.1 Aim of the tests
  6.2 Bystander (Test 1)
  6.3 Footbridge (Test 2)
7 Related work
  7.1 Empathy in social agents
  7.2 Modelling theory of mind
  7.3 lida
8 Discussion
  8.1 Conclusions
  8.2 Future work
Bibliography
A Code
  A.1 Bystander code

List of Figures

2.1 Loewenstein's model of how emotions affect behaviour [20]
4.1 A diagram of the theories from the background chapter and how the different concepts are linked according to our understanding
4.2 A diagram of the bystander case
4.3 A diagram of the footbridge case
4.4 umg intentional structure of the bystander dilemma [22]
4.5 umg intentional structure of the footbridge dilemma [22]
5.1 Graphical representation of an een component
5.2 Roseman's model of common emotions and how they are triggered [30]
5.3 A diagram showing activation flow of the bystander case

List of Tables

3.1 Affective impact on the network parameters
4.1 The five moral foundations [16]
5.1 Emotions from actions for the bystander case
5.2 Emotions from states for the bystander case
5.3 Emotions from actions for the footbridge case
5.4 Emotions from states for the footbridge case
6.1 Result of bystander test
6.2 Result of footbridge test

Notation

Abbreviations

Abbreviation  Meaning
agi           Artificial General Intelligence
tem           Transient Episodic Memory
mft           Moral Foundation Theory
lida          Learning Intelligent Distribution Agent
bn            Behaviour Network
ebn           Extended Behaviour Network
emobn         Emotional Behaviour Network
een           Expected Emotion Network
ug            Universal Grammar
umg           Universal Moral Grammar
pomdp         Partially Observable Markov Decision Process
fmri          Functional Magnetic Resonance Imaging
npc           Non-Player Character
aicg          Artificial Intelligence and Computer Graphics
bbc           British Broadcasting Corporation

List of Definitions

Morality, from the Latin moralitas "manner, character, proper behavior", is the differentiation among intentions, decisions, and actions between those that are good (or right) and bad (or wrong). A moral code is a system of morality (for example, according to a particular philosophy, religion, culture, etc.) and a moral is any one practice or teaching within a moral code. The adjective moral is synonymous with "good" or "right". Immorality is the active opposition to morality (i.e. good or right), while amorality is variously defined as an unawareness of, indifference toward, or disbelief in any set of moral standards or principles.

Emotion is a complex psychophysiological experience of an individual's state of mind as interacting with biochemical (internal) and environmental (external) influences. In humans, emotion fundamentally involves "physiological arousal, expressive behaviors, and conscious experience." Emotion is associated with mood, temperament, personality, disposition, and motivation. Motivations direct and energize behavior, while emotions provide the affective component to motivation, positive or negative.

Behaviour or behavior (see American and British spelling differences) refers to the actions and mannerisms made by organisms, systems, or artificial entities in conjunction with their environment, which includes the other systems or organisms around them as well as the physical environment. It is the response of the system or organism to various stimuli or inputs, whether internal or external, conscious or subconscious, overt or covert, and voluntary or involuntary.

Social Action, in sociology, refers to an act which takes into account the actions and reactions of individuals (or 'agents'). According to Max Weber, "an Action is 'social' if the acting individual takes account of the behavior of others and is thereby oriented in its course" (Secher 1962).

Ethics, sometimes known as philosophical ethics, ethical theory, moral theory, and moral philosophy, is a branch of philosophy that involves systematizing, defending and recommending concepts of right and wrong conduct, often addressing disputes of moral diversity.

Judgment: a value judgment is a judgment of the rightness or wrongness of something or someone, or of the usefulness of something or someone, based on a comparison or other relativity. As a generalization, a value judgment can refer to a judgment based upon a particular set of values or on a particular value system. A related meaning of value judgment is an expedient evaluation based upon limited information at hand, an evaluation undertaken because a decision must be made on short notice.

Empathy is the capacity to recognize emotions that are being experienced by another sentient or fictional being. One may need to have a certain amount of empathy before being able to experience accurate sympathy or compassion.

Reciprocity, in social and political philosophy, is the concept of reciprocity as in-kind positive or negative responses to the actions of others.

Faculty is an ability, skill, or power; often plural (faculties).

1 Introduction

The research done in this thesis is primarily based on psychology articles from prominent researchers such as Jonathan Haidt, Jesse Graham, Craig Joseph and Selin Kesebir. Another point of view comes from a professor in legal theory, John Mikhail. Also, to get a better understanding of the inner workings of the human psyche, the fields of biology and neuroscience have been taken into consideration during the preliminary research. We have based the design primarily on the work of one particular psychologist, Jonathan Haidt, mainly to avoid contradictions between competing theories and to remain consistent.

Starting from a common dilemma, we designed a model, ran tests and compared our results with a psychology experiment. We used the well-known trolley problem as the scenario for validation. This scenario significantly reduced the number of relevant emotions. Since our model is not general, each emotion and what it does is hard coded into the system for this specific scenario, to have full control and to narrow the scope as much as possible.

1.1 Problem description

There is a great demand for more realistic models of human behaviour in behaviour simulation and for npcs. To further increase the believability of artificial agents, the decision making needs to incorporate morality as well. Recently, in the aicg lab (http://webstaff.itn.liu.se/piede/aicglab/) at LiU, an architecture for representing emotional influence in decision making, based on the research [20] of Loewenstein, has been implemented. The model is based on an extension of behaviour networks [18, 19]. At the current state of development, the system lacks morality. Thus, what is needed is to design a computational model of morality that will allow us to analyse specific scenarios and dilemmas studied in research ethics and decision making in psychology. The existing system emobn (see chapter 3) has been used for a public display installation and as an interactive application for emotional behaviour in games. Even the simplest model of artificial moral behaviour similar to that of humans is very complex; even so, this is an attempt to evaluate whether emobn can be used, with or without modifications. The aim is to design and investigate a possible implementation of a computational model of morality with the emobn architecture, with focus on aspects of behaviour.

1.2 Report outline

We start by introducing the background theory of emotions and morality in chapter 2. Then we move on to briefly explain the emobn architecture in chapter 3; this chapter is very much based on the PhD thesis of Anja Johansson and the work of Pierangelo Dell'Acqua. We proceed with chapter 4, where we describe the relevant models from the background theory used in the design of the implementation. Next, in chapter 5, we go through the moral design. Chapter 6 contains the testing of the moral model. After that we present three related research projects and existing implementations with similar aims to this thesis in chapter 7. The 8th and last chapter discusses the topic and future work.

2 Background

This chapter presents the theoretical background upon which this thesis is based. The underlying concepts come with different hypotheses, and this is a selection based on our intuitive judgement with our problem statement in mind.

2.1 Emotion theory

It is evident that there is no one model of emotions (the subjective experiencing of the arousal of the nervous system; see appendix). Baumeister et al. summarize what emotions are in one short sentence [1]:

Put simply, the quick affective responses mainly indicate either good or bad evaluations, which activate either the approach or avoidance systems.

There are various hypotheses on what constitutes an emotion. Wikipedia states that emotion could be the driving force of motivation and therefore plays a major part in decision making; humans need motivation to act upon a goal. This is coupled with the statement that behaviour (actions made in an environment; see appendix) strives to change emotion, for the purpose of becoming happier or reducing pain [1]. Another statement says that emotion does not directly cause certain behaviour [1], although the same article does not deny that emotions can have a direct effect on behaviour. This coincides with the statement that human beings function well when emotion directly stimulates cognition, and not so well when emotion directly stimulates behaviour [31]. Damasio also claims that emotion is feedback, that it comes after the relevant behaviour and is therefore too late to cause it [2], and another statement suggests that emotion is the result, not the cause, of behaviour [31].

Mostly, emotions work toward personal interests and the benefits that can be gained. A drastic, and perhaps the most basic, example would be the instinct to survive and procreate; a not so drastic instinct could be to increase one's status or happiness. Living emotional organisms only have emotions about things that matter to them, according to Damasio [2]. Shaver et al. write that behaviour is aimed at producing change in emotion [31]; since emotion can drive behaviour, this indicates a relationship between behaviour and emotion, and thus makes emotion a prerequisite for moral behaviour as well.

Another paper describes how emotion helps learning [1]; this connects emotions with the social intuitionist model that Haidt writes about [8]. These intuitions are learned from every interaction with the environment and later applied as intuitive gut-feeling judgements, similar to a muscular reflex in the physical world but in this case as a mental reflex. Also, in neuroscience, evidence of a connection between emotion and morality can be seen in fmri images [6]: when dealing with personal moral dilemmas, the fmri image displays the same areas that activate during emotional processing. This indicates a strong connection between moral information processing and emotional processes.

2.2 Morality theory

According to Jonathan Haidt, morality (intentions, decisions and actions which are good or bad; see appendix) is based upon what he calls moral foundations [12, 13, 16]; these foundations are groups of moral values. The foundations entail different capabilities of morality, or faculties (ability, skill or power; see appendix). The two most important for morality are reciprocity (response actions; see appendix) and empathy (recognizing emotions; see appendix) [3]. An older belief about morality has been that it can only be applied by humans and is thus something not existing in the animal kingdom; Frans de Waal shows how monkeys act morally by not accepting food until a friend also gets the same kind [3]. Historically, morality has been interpreted as human reasoning; however, this is not correct according to current research, since morality is also used in subconscious actions and judgements. On a cognitive level there are two kinds, moral intuition and moral reasoning, according to Haidt [14]. The former involves a fast or intuitive reaction to something that happens; this is also something that Haidt writes about when describing the social intuitionist model [8]. However, from analysing fmri scans, Haidt et al. come to the conclusion [6] that when trying to strip off the various processes that the concept of moral judgement is comprised of, there is not much left. They write in [6]:

Thus, the interrelationships among these overlapping concepts is complicated, and many relevant details remain unclear. What is becoming increasingly clear, however, is that there is no specifically moral part of the brain. Every brain region discussed in this article has also been implicated in non-moral processes.

There are two approaches to morality: one depicts morality as something that is only learned in childhood, and the other suggests that it has been built into humans by evolution. According to Haidt, morality is both natural and learned [12]. Even though cultures around the globe are very different, they still need to solve the same problems (e.g. power, resources, care, conflicts). Different routines are formed, but the basic moral modules (the most distinct are suffering, hierarchy, reciprocity and purity) are the same across civilizations.

Evolution favours group selection (as natural selection, but between groups in evolutionary biology). Morality is vital for cooperation to take place and create positive results in a large group, which is necessary for competing against other groups and becoming victorious. But if there are selfish individuals in a cooperative group, cheaters or defectors, group selection cannot take place. Evolution solves this by putting everyone in the same boat [11]; this has happened many times in nature, from bacteria to insects and humans. When humans started to divide work into groups they became much more effective and became part of a greater cause; much of this behaviour is due to morality. As Jonathan Haidt said in one of his talks [11]:

The most powerful force known on this planet is human cooperation - a force for construction and destruction

This means that humans have a very advanced understanding of reciprocity and that we can apply it in our social interaction, for good and for bad. Haidt argues that humans evolved to see sacredness all around us and to join into teams to preserve objects, people or ideas that benefit the group [11]. Most people want to do something good and noble during their lifetime, overcome pettiness and become part of something larger, and that is where morality comes in: to aid the group cooperation that makes such desires possible.

Dispute often occurs between two parties, both with the belief that they are right; that this can happen indicates that they rely on different standards and values. There will always be human differences in perception and interpretation of events. Just as we need a common understanding of words to communicate, we need a common understanding of behaviour to act morally. Morality has not always been the same; it has evolved with the advancement of human civilization. For example, not long ago it was considered impermissible to steal medicine for a mortally ill relative from an apothecary that would not give the medicine away to a poor person who could not pay the price. This was generally accepted in society, in contrast to the modern consensus, which holds that it is permissible to steal the medicine, since in this example the value of a life outweighs greed and laws. Today people think that the right thing to do would be to acquire the medicine to save the person no matter the cost, especially if it is someone close.

2.3 Moral emotions theory

A moral emotion is an emotion related to an action or event that affects the outcome for another individual or for society in either a good or a bad manner, i.e. moral conduct; such an emotion then becomes a moral emotion according to Haidt [9]. Haidt et al. describe this in [14]:

Moral intuitions are about good and bad. Sometimes these affective reactions are so strong and differentiated that they can be called moral emotions, such as disgust or gratitude, but usually they are more like the subtle flashes of affect that drive evaluative priming effects.

There are two component features of an emotion that identify a moral emotion: elicitors and action tendencies. The moral emotions are gathered into families by Haidt; the families are other-condemning, self-conscious, other-suffering and other-praising [9].

2.3.1 Other-condemning family

This family contains the negative emotions humans get from other individuals or from actions that represent cheating, lies or faking the appearance of being reliable.

Anger

Anger is an emotional response to a threat or provocation; it becomes the dominant emotion when a person takes action to stop that threat. It has physical correlates such as increased heart rate and blood pressure [wiki].

Elicitors: Betrayals or unwarranted insults give rise to anger; goal blockage and frustration will also trigger anger, and it is associated with injustice [9].

Action tendencies: Revenge, either physical or emotional; to hurt, either by attacking or by humiliating the person who has acted in an unjust way [9].

Disgust

The response to something offensive, unpleasant or revolting, primarily a sensation of taste but also of smell, touch or vision [wiki]. It helps to distinguish between groups of different status.

Elicitors: Mainly violations of physical purity, but also objects or ideas, for example food and sex, and moral depravity. It can also arise in response to hypocrisy, betrayal or cruelty [9].

Action tendencies: To avoid, expel, segregate or otherwise break off contact with the offender [9].

Contempt

It is easily mistaken for disgust and sometimes for anger; considered a secondary emotion, it is a mix of anger and disgust. In contrast to anger and disgust, it is characterized by low arousal, bordering on indifference [wiki].

Elicitors: The emotion is triggered when experiencing violations of duty and hierarchy, and also by disrespect [9].

Action tendencies: It does not motivate withdrawal or attack, only a change in mentality, such as less respect, and it weakens other positive moral emotions [9].

2.3.2 The Self-Conscious family

These emotions are self-triggered and are used to correct one's behaviour in groups, so as to avoid triggering the anger, disgust or contempt of others. Shame and embarrassment are very similar; whether a person develops a distinction between them mostly depends on culture.

Embarrassment

Embarrassment is related to hierarchy and is thus considered a social emotion; it is the weakest of the self-conscious emotions.

Elicitors: When one's social persona is threatened, when a social convention rule is breached by the self, or through events out of one's control. It is most often experienced when one is around people of higher status [9].

Action tendencies: Hide, withdraw or disappear, in both movement and speech, with the intention of reducing the punishment; with a lesser urge than shame [9].

Shame

Shame is more aligned towards moral norms and the violations of them. It is more cognitively oriented than embarrassment.

Elicitors: It triggers when the self believes that someone else knows about the self-caused violation, resulting from a failure to live up to morality, aesthetics or competence [9].

Action tendencies: Hide, withdraw or disappear, in both movement and speech, with the intention of reducing the punishment; with a larger urge than embarrassment [9].

Guilt

This emotion is often confused with shame since it also relates to hierarchy, but it is aligned more towards relationships and focuses on the behaviour rather than the self-image.

Elicitors: It can trigger when the self believes that one has caused harm, loss or distress to another person. The emotion is strengthened by the closeness of the relationship [9].

Action tendencies: It motivates one to help one's victim or otherwise make up for one's transgression to restore or improve the relationship [9].

2.3.3 The Other-Suffering Family

This is one of the more basic families of emotions; it develops early, even within the first year of a child, and these emotions are also found in chimpanzees. They are characterized by being moved by, or understanding, the suffering of others.

Compassion

It is also known as empathy and sympathy.

Elicitors: Perceiving suffering or sorrow in another person; it can, but does not always, bring the desire to help. It can be felt for a complete stranger and can be strongest towards a person with a close relationship [9].

Action tendencies: It creates a desire to help, comfort or reduce the suffering of others [9].

2.3.4 The Other-Praising Family

The family of emotions associated with good actions and positive moral perceptions belongs to this category. They work differently than the negative emotions: usually there is no procedure or action to correct wrong behaviour; rather, they open perception to novel ideas, relationships or possibilities. These emotions encourage people to improve themselves to better meet future events, and they greatly motivate good behaviour.

Gratitude

Encourages people to repay their benefactors.

Elicitors: It is the feeling that triggers when someone has done a good deed to one's self; the more costly and unexpected the deed, the greater the feeling of wanting to repay will be [9].

Action tendencies: Attempts to return a similar favour; acts as a motivator for morally good behaviour [9].

Awe

Simply put, awe is a combination of surprise and fear, but it is also likely that it is a very complex emotion comprised of many more components such as reverence, wonder, admiration, respect, genius, great beauty, might, etc.

Elicitors: Awe is triggered by beauty and exemplary actions, and also by kindness and charity [9].

Elevation

This emotion has the opposite composition of disgust.

Elicitors: Acts or actors of charity, kindness, loyalty or self-sacrifice are triggers [9].

Action tendencies: It motivates people to help others and to become better persons [9].
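To make the families above easier to work with later in the design, the following sketch shows one possible way to encode a few of the moral emotions with their family, elicitors and action tendencies as plain data. The structure and the selection of entries are our own illustration and are not part of Haidt's formulation; a full encoding would cover every emotion listed above.

# Illustrative encoding of some of Haidt's moral emotion families [9].
# The field names (family, elicitors, action_tendencies) are an assumption
# made for this sketch, not something prescribed by the theory.
MORAL_EMOTIONS = {
    "anger": {
        "family": "other-condemning",
        "elicitors": ["betrayal", "unwarranted insult", "goal blockage", "injustice"],
        "action_tendencies": ["revenge", "attack", "humiliate"],
    },
    "guilt": {
        "family": "self-conscious",
        "elicitors": ["belief of having caused harm to a close person"],
        "action_tendencies": ["help the victim", "make amends"],
    },
    "compassion": {
        "family": "other-suffering",
        "elicitors": ["perceived suffering or sorrow in another person"],
        "action_tendencies": ["help", "comfort", "reduce suffering"],
    },
    "gratitude": {
        "family": "other-praising",
        "elicitors": ["received a costly or unexpected good deed"],
        "action_tendencies": ["return a similar favour"],
    },
}

def emotions_in_family(family):
    # Return the names of all encoded emotions belonging to a given family.
    return [name for name, e in MORAL_EMOTIONS.items() if e["family"] == family]

print(emotions_in_family("other-condemning"))  # ['anger']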

2.4 Moral emotional decision making

2.4.1 Ethics and morality

Morality can sometimes be used interchangeably with the word ethics; however, there is a distinct difference between them. While both concern the essence of right and wrong behaviour, morality centres around the self-image and is subjective and intuitive to the actor; it can also change depending on that person's beliefs. Ethics, on the other hand, is a code of conduct imposed by, for instance, a group of people or a culture [4], meaning that ethics depicts objectively how different persons should act in a given situation and environment. Both concepts are important in understanding how morality affects behaviour. This is how Haidt describes how his moral modules affect our behaviour [13]:

These modules generally have as one of their outputs the emotion of compassion: the individual is motivated to act so as to relieve suffering or otherwise protect the child.

Therefore a moral person would be someone who makes good decisions according to ethics, or in other words society's values, along the lines of the definition of a good person. Two approaches are suggested in the literature for processing moral behaviour: top down and bottom up.

Top down

In a moral context this approach consists of moral theories derived from consequentialism, expressed as a set of rules. These rules set boundaries for the algorithms driving the behaviour [34]. This approach encounters difficulty with computational load, since a vast amount of human knowledge is required to process such concepts. It is also difficult to set the initial conditions for this approach. The top down approach is a symbolic way of implementing morality.

Bottom up

This approach represents routines used for action selection, with values and norms formed by experience. It considers goals or standards that may or may not be based on moral theory; if they are, it is to declare the task for the action selection process. In contrast to the top down approach, the bottom up approach depends on the learning ability and the sensors of a system [34]. As such, this approach encounters technological constraints. It can mimic the subjective part of morality and is considered dynamic and flexible.
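As a toy illustration of the difference, a top down implementation might filter candidate actions against explicit rules, while a bottom up one would score them from values formed by experience. The rule, the candidate actions and the learned weights below are invented for this sketch only.

# Top down: explicit rules set hard boundaries on behaviour.
FORBIDDEN_ACTIONS = {"push person"}          # hypothetical hand-written rule

def top_down_filter(actions):
    # Keep only the actions that do not violate any explicit rule.
    return [a for a in actions if a not in FORBIDDEN_ACTIONS]

# Bottom up: values learned from experience drive the selection instead.
learned_value = {"throw switch": 0.4, "push person": -0.7, "do nothing": -0.2}

def bottom_up_choice(actions):
    # Pick the action with the highest experience-based value.
    return max(actions, key=lambda a: learned_value.get(a, 0.0))

candidates = ["throw switch", "push person", "do nothing"]
print(top_down_filter(candidates))   # ['throw switch', 'do nothing']
print(bottom_up_choice(candidates))  # 'throw switch'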

2.4.2 Decision theory

A true moral behaviour process will arguably need both the top down and the bottom up approach to fully cover the different aspects of moral behaviour. Most of human behaviour is composed of rapid intuitive response mechanisms. Loewenstein's model of decision making describes how emotions can affect the cognitive process [20] and create such mechanisms, the same mechanisms that Haidt calls intuitive social behaviour when they are conducted in a moral context. Loewenstein's process is shown in Figure 2.1. This emotional process is modelled in emobn, created by Johansson [17] and based on ebn. Behaviour caused by intense emotions can even go against the subject's interests; this could explain how emotions can override reason and how decisions can seem irrational during strong emotional episodes, since with less information about the world a person's perception is narrower than it would be under a weaker emotional influence.

3 emobn architecture

This chapter is based upon Johansson's PhD thesis [17] and the article [18].

3.1 Introduction

The behaviour network approach was first introduced by Maes [21] in 1989 as an energy-driven approach to decision making. A behaviour network consists of goals, states, behaviour modules and parameters. The decision making mechanism is based on the notion of activation. At each cycle of the behaviour network, activation is spread from the goals of the network to behaviour modules, which in turn spread activation to other behaviour modules. The activation of a behaviour module can be seen as the utility of the behaviour. The more activation, the more desirable it would be to perform that behaviour.

In 2009, Johansson and Dell'Acqua [18] extended the behaviour networks with emotions to create a general affective decision making model. They called the new model Emotional Behaviour Network (emobn). The parameters of the network were made subject to the emotional state of the agents. The authors introduced the notion of influences to let emotional states directly affect the activation of behaviour modules without being preconditions. They also let emotions affect the probabilities of the effects to mimic the pessimistic vs. optimistic judgement of humans.

The different components of emobn, the activation spreading and the action selection mechanism are described in detail below.

3.2 Components of emobn

An emobn is represented by a tuple ⟨S, B, G, R, E, L⟩ of six components: a set of states S, a set of behaviour modules B, a set of goals G, a set of resources R, a set of emotions E, and a set of emotional links L.

3.2.1 States

A state represents a belief the agent has about some aspect of the environment (and itself). We write v(s) to indicate the value of a state s ∈ S. We let 0 ≤ v(s) ≤ 1. Given a state a, we write ā to indicate its opposite.

States are graphically represented as rectangles.

3.2.2 Behaviour Modules

A behaviour module represents an action that can be performed by the agent. A behaviour module k ∈ B consists of a set of preconditions Prec(k), a set of effects Effects(k), and a set of emotional links eLinks(k).

Preconditions. Preconditions are represented by states, that is, Prec(k) ⊆ S. A behaviour module has none, one or possibly several preconditions. They must be true for the behaviour to be executable. When the preconditions take continuous values, the executability value of k is obtained by multiplying the values of the preconditions. (See step (2) in Figure 3.4.)

Effects. A behaviour module k has one or several effects representing the consequences of the behaviour. Effects are represented by states, Effects(k) ⊆ S. Each effect s of k is coupled with a probability p_k(s), with 0 ≤ p_k(s) ≤ 1.

Emotions affect how optimistic and pessimistic we are, and also how much risk we are willing to take. Johansson [17] suggested that emotions should affect the probability of the effects. To do this, each effect must know the perceived "goodness" of the effect. If the agent is optimistic, the probability of a negative effect should decrease while the probability of a positive effect should increase. The perceived goodness is called benevolence and has value +1 if it is positive, −1 if it is negative, or 0 if it is neutral. The benevolence is specified for each effect of the behaviour module. Using this value, the new emotional probability p̂_k(s) of an effect s of k is defined as:

p̂_k(s) = p_k(s) × (1 + benevolence × (posE − negE) × π)

where π is a constant used for determining to what extent emotions should alter the perceived probability, and posE and negE are the average values of emotions that affect risk-taking positively and negatively, respectively.

For example, happiness and anger make humans subjectively increase the probabilities of favourable events occurring, while at the same time lowering the probabilities of negative events happening. In this case, the formula above changes the probability to a higher value if anger is high and benevolence is positive.
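As a minimal sketch of the definition above, the function below computes the emotionally adjusted probability p̂_k(s) from the base probability, the benevolence of the effect and the averaged emotion values. The value of π and the clamping to [0, 1] are our own assumptions; the definition in the text does not fix them.

def emotional_probability(p, benevolence, pos_e, neg_e, pi=0.2):
    # p̂_k(s) = p_k(s) * (1 + benevolence * (posE - negE) * pi)
    #
    # p           base probability p_k(s) of the effect, in [0, 1]
    # benevolence +1 (positive effect), -1 (negative effect) or 0 (neutral)
    # pos_e/neg_e average values of the emotions affecting risk-taking
    # pi          constant scaling the emotional distortion (assumed value)
    p_hat = p * (1 + benevolence * (pos_e - neg_e) * pi)
    # Clamping keeps the result a valid probability; this step is an assumption.
    return max(0.0, min(1.0, p_hat))

# A predominantly happy or angry agent perceives a positive effect as more likely:
print(emotional_probability(0.6, benevolence=+1, pos_e=0.8, neg_e=0.1))  # 0.684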

3.2.3 Goals

A goal specifies what an agent wants to accomplish. Every goal has one or more conditions, represented by states, that need to be fulfilled in order for the goal to be successful.

A goal has a static importance x, with 0 ≤ x ≤ 1, and a dynamic importance y that is a value linked to states and emotions. This enables the goal to have a varying importance depending on the value of that state. The importance i_g of a goal g ∈ G is defined as:

i_g = f(x, y)

where f is any continuous triangular norm. In our framework we use multiplication.

Goals are depicted as diamonds.

3.2.4 Resources

A resource represents a necessary means to execute a behaviour module. Each resource is coupled with the number of available units and the number of bound units (units of this resource that are currently being used by behaviours).

3.2.5 Emotions

E is a set of emotions. For each emotion e ∈ E we write v(e) to indicate its value, and we let 0 ≤ v(e) ≤ 1.

3.2.6 Emotional Links

Emotions should be able to affect individual behaviour modules directly. For instance, people might be more inclined to dance if they are happy than if they are sad, regardless of the effect of dancing. However, being happy is not a requirement for dancing and should therefore not be a precondition.

Emotional links allow us to couple emotions to behaviour modules. An emotional link ⟨e, k, z⟩ ∈ L expresses a connection between an emotion e and a behaviour module k with a given strength z (with −1 ≤ z ≤ 1). z determines the extent to which e affects the selection of k for execution. We let eLinks(k) be the set of all emotional links to a behaviour module k. eLinks(k) contains the tuple ⟨e, z⟩ for every emotional link ⟨e, k, z⟩ ∈ L.

Given a behaviour module k, the emotional influence n(k) on k is defined as:

n(k) = Σ_{⟨e,z⟩ ∈ eLinks(k)} v(e) × z

where v(e) is the value of the emotion e and z its strength.
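The emotional influence n(k) can be written directly from the definition. The sketch below assumes that emotion values are stored in a dictionary and that the emotional links of a module are given as (emotion, strength) pairs; these containers are our own choice, not part of the emobn specification.

def emotional_influence(e_links, emotion_values):
    # n(k) = sum over <e, z> in eLinks(k) of v(e) * z
    return sum(emotion_values[e] * z for e, z in e_links)

# Happiness encourages dancing, sadness discourages it (illustrative values):
links_dance = [("happiness", 0.7), ("sadness", -0.5)]
print(emotional_influence(links_dance, {"happiness": 0.9, "sadness": 0.1}))  # 0.58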

3.3 Activation Spreading

Every behaviour module receives activation from each goal in the network. The activation is spread from the goals to the behaviour modules. In turn, the behaviour modules spread activation internally from module to module. The total activation for each behaviour module is calculated and used to select which behaviour to execute.

3.3.1 Network Parameters

The activation spreading in emobn is controlled by the following parameters.

γ: activation influence. It determines how much activation is spread through positive links (positive links occur when the effect of a behaviour module is the same as the precondition of another behaviour module; likewise, negative links signify a link where the effect of a behaviour module is the opposite of the precondition of another behaviour module). The activation influence implicitly determines the amount of planning the agent is capable of.

δ: inhibition influence. It determines how much activation is spread through negative links. The inhibition influence implicitly determines the amount of planning the agent is capable of.

β: inertia. It determines how much the last activation affects the current one. Having a high inertia value will suppress erratic, indecisive behaviour. However, having a high inertia will decrease reaction time.

θ: activation threshold. It determines the threshold that the execution value must exceed for the behaviour to be executed.

The parameters affect the activation spreading in different ways. The inertia parameter β affects how easily the agent switches behaviour. A low value of β will result in a more flip-flop behaviour, where the agent has a tendency to switch behaviours more often and more easily. The parameters δ and γ both affect how far into the future the agent can plan. Using low values for these parameters will give much less activation to behaviour modules that are further away from goals. The parameters γ, δ and β of the behaviour network have to lie within the interval [0, 1] for the system to be stable. One must take this into account when letting emotions influence the parameters. Setting the initial values of the parameters is fairly straightforward using trial-and-error methods. It has furthermore been proven that a behaviour network is goal converging no matter the parameter values if it is dead-end free and terminating [24].

In emobn, the network parameters are affected by the emotional state of the agent. Different emotions affect different parameters. For example, the emotions used in [18] are given in Table 3.1.

Table 3.1: Affective impact on the network parameters. For each parameter, the emotions listed have either a positive or a negative impact:

γ: sadness, fear, fatigue, hunger, anger
δ: sadness, fear, fatigue, hunger, anger
β: anger, happiness, sadness, fear

The overall influence of the emotions in Table 3.1 is controlled by an emotional impact parameter ε. Any parameter P ∈ {γ, δ, β} has a corresponding emotional version P̂ defined as:

P̂ = P × (1 + (posE − negE) × ε)

where posE is the average value of the emotions with positive impact for that parameter P and negE is the average value of the emotions with negative impact.
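A small sketch of how the emotional version P̂ of a network parameter could be computed. Which emotions count as positive or negative for a given parameter is passed in as data, since Table 3.1 only lists the affecting emotions; the grouping used in the example call and the value of ε are assumptions made for the sketch.

def emotional_parameter(p, pos_emotions, neg_emotions, emotion_values, epsilon=0.3):
    # P̂ = P * (1 + (posE - negE) * epsilon), for P in {gamma, delta, beta}.
    def avg(names):
        return sum(emotion_values[n] for n in names) / len(names) if names else 0.0
    p_hat = p * (1 + (avg(pos_emotions) - avg(neg_emotions)) * epsilon)
    # Keep the parameter inside [0, 1] so the network stays stable (see above).
    return max(0.0, min(1.0, p_hat))

emotions = {"anger": 0.8, "sadness": 0.1, "fear": 0.2, "happiness": 0.4}
# Hypothetical grouping: sadness and fear raise gamma, anger lowers it.
print(emotional_parameter(0.8, ["sadness", "fear"], ["anger"], emotions))  # 0.644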

3.3.2 Activation Spreading

Each cycle time t, activation propagates from the goals to the behaviour modules. There are four ways by which a behaviour module can receive activation. Below, we let k be a behaviour module, g a goal with importance i_g, and t the cycle time.

Case 1. k receives positive activation A from g at t if there is an effect s of k that is one of the conditions of g:

A = γ × i_g × p̂_k(s)

Case 2. k receives negative activation B from g at t if there is an effect s of k that is the opposite of one of the conditions of g:

B = −δ × i_g × p̂_k(s)

Case 3. Consider all behaviour modules j having a precondition s_j that is one of the effects of k. The amount of activation C that k receives from g at t via all j's is:

C = γ × Σ_{j ∈ B : ∃ s_j ∈ S (s_j ∈ Prec(j) ∧ s_j ∈ Effects(k))} σ(a_{jg}^{t−1}) × p̂_k(s_j) × (1 − v(s_j))

where

σ(x) = 1 / (1 + e^{λ(μ−x)})

σ(x) is a Sigmoid function used here to filter the previous activation value for that particular goal and module. The parameters λ and μ are constants used to modify the shape of the Sigmoid curve.

The equation above states that the less fulfilled a precondition of a module is, the more activation will be sent to modules that fulfil that precondition.

Note that using a_{jg}^{t−1} in the definition of the activation implies that each module must store the activation received from each goal during the previous cycle time t − 1.

Case 4. Consider all behaviour modules j having a precondition that is the opposite of one of the effects of k. The amount of activation D that k receives from g at t via all j's is:

D = −δ × Σ_{j ∈ B : ∃ s_j ∈ S (s_j ∈ Prec(j) ∧ s̄_j ∈ Effects(k))} σ(a_{jg}^{t−1}) × p̂_k(s_j) × (1 − v(s_j))
    − δ × Σ_{j ∈ B : ∃ s_j ∈ S (s̄_j ∈ Prec(j) ∧ s_j ∈ Effects(k))} σ(a_{jg}^{t−1}) × p̂_k(s_j) × (1 − v(s_j))

Total activation. The activation a_{kg}^t given to the behaviour module k by the goal g at cycle time t is set to the activation that has the highest absolute value:

a_{kg}^t = absmax(A, B, C, D)

This implies that only the strongest path from each goal to a behaviour module is used. Combining activations from the different paths is not allowed.

The total activation a_k^t for a behaviour module k is the sum of the activations given to the module from all goals in the network, plus the total activation calculated at the previous cycle multiplied by the inertia parameter β:

a_k^t = β × a_k^{t−1} + Σ_{g ∈ G} a_{kg}^t
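The sketch below puts the four activation cases, the absmax rule and the total activation together for a toy network. The data layout (plain dictionaries for goals and modules, a "not_" prefix for opposite states, default parameter values) is our own simplification; only the arithmetic follows the definitions above.

import math

def sigmoid(x, lam=10.0, mu=0.5):
    # σ(x) = 1 / (1 + e^{λ(μ − x)}); λ and μ are assumed shape constants.
    return 1.0 / (1.0 + math.exp(lam * (mu - x)))

def opposite(state):
    # Represent the opposite of a state by a "not_" prefix (our convention only).
    return state[4:] if state.startswith("not_") else "not_" + state

def goal_activation(k, g, modules, v, prev, gamma=0.8, delta=0.7):
    # a^t_{kg}: activation module k receives from goal g (Cases 1-4 plus absmax).
    # v maps state -> value, prev maps (module name, goal name) -> a^{t-1}_{jg}.
    A = B = C = D = 0.0
    for s, p_hat in k["effects"].items():                 # effect state -> p̂_k(s)
        if s in g["conditions"]:                          # Case 1: fulfils a condition of g
            A = max(A, gamma * g["importance"] * p_hat)
        if opposite(s) in g["conditions"]:                # Case 2: undoes a condition of g
            B = min(B, -delta * g["importance"] * p_hat)
    for j in modules:
        prev_a = sigmoid(prev.get((j["name"], g["name"]), 0.0))
        for s, p_hat in k["effects"].items():
            if s in j["preconditions"]:                   # Case 3: k fulfils a precondition of j
                C += gamma * prev_a * p_hat * (1.0 - v[s])
            if opposite(s) in j["preconditions"]:         # Case 4: k undoes a precondition of j
                D += -delta * prev_a * p_hat * (1.0 - v[s])
    return max((A, B, C, D), key=abs)                     # only the strongest path counts

def total_activation(k, goals, modules, v, prev_total, prev, beta=0.5):
    # a^t_k = β * a^{t-1}_k + Σ_g a^t_{kg}
    return beta * prev_total + sum(goal_activation(k, g, modules, v, prev) for g in goals)

With these two functions, one spreading cycle reduces to calling total_activation for every module and storing the per-goal activations for the next cycle.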

3.4 Action Selection

The action selection mechanism for emobn is specified by the following procedure. Let t be the cycle time.

1. Calculate the total activation a_k^t for every k ∈ B:

   a_k^t = β × a_k^{t−1} + Σ_{g ∈ G} a_{kg}^t

2. Calculate the executability e_k^t for every k ∈ B:

   e_k^t = Π_{s ∈ Prec(k)} v(s)

3. Calculate the execution value h_k^t for every k ∈ B:

   h_k^t = a_k^t × e_k^t × (1 + n(k))

4. Sort all behaviour modules by their execution values, largest value first.

5. For each behaviour module k in the sorted list, check that:
   (i) the execution value h_k^t exceeds θ, the required activation threshold;
   (ii) there are enough unbound resources required to execute k.
   If both conditions (i)-(ii) are met, bind the resources and choose k for execution.

6. Unbind all resources, increase the cycle time t by one unit.

Figure 3.4: Action selection mechanism.

Remark 1. The executability at step (2) is calculated by multiplying the values of the preconditions of k. In general, any triangular norm over the preconditions of k can be used. The executability is a measure of how likely it is that a behaviour can execute successfully.

Remark 2. At step (3), since a_k^{t−1} and not h_k^{t−1} is used in the calculation of a_k^t, emotional influences will only affect the module locally. The change in the execution value is not spread to other modules in the next cycle.

Remark 3. The action selection mechanism presented above differs from the original one [18] in the handling of the activation threshold θ. The original definition contains a threshold decay Δθ that determines how much the activation threshold should be lowered if no behaviour module could be selected for execution. Here, the activation threshold is fixed and cannot be lowered. Only a change in the values of the emotional influences can trigger a behaviour that could not be triggered earlier.
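A sketch of the selection procedure of Figure 3.4, assuming the total activations a^t_k have already been computed (for instance with the spreading sketch above). Resource handling is reduced to a counter of available units, the dictionary field names are our own, and the fixed threshold of Remark 3 is kept.

def executability(k, v):
    # e^t_k: product of the precondition values of k (step 2).
    e = 1.0
    for s in k["preconditions"]:
        e *= v[s]
    return e

def select_behaviours(modules, activations, v, emotion_values, resources, theta=0.3):
    # Steps 3-6 of the action selection mechanism; returns the chosen module names.
    scored = []
    for k in modules:
        n_k = sum(emotion_values[e] * z for e, z in k.get("elinks", []))  # n(k)
        h = activations[k["name"]] * executability(k, v) * (1.0 + n_k)    # h^t_k (step 3)
        scored.append((h, k))
    scored.sort(key=lambda pair: pair[0], reverse=True)                   # step 4
    selected, free = [], dict(resources)                                  # unbound units
    for h, k in scored:                                                   # step 5
        needs = k.get("resources", {})
        if h > theta and all(free.get(r, 0) >= n for r, n in needs.items()):
            for r, n in needs.items():
                free[r] -= n                                              # bind resources
            selected.append(k["name"])
    return selected  # resources are unbound again at the start of the next cycle (step 6)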

4 Models

We begin this chapter with a summary of the theories in the background chapter and then move on to explain the models relevant to the moral design; these models and theories are used as tools for the design.

4.1 Background theory summary

From the presented background theory, emotions are triggered by sensory input and/or thoughts, to drive the agent's actions towards outcomes beneficial for the self. They can be controlled by conscious thought, which will lead to either increased or decreased strength of the emotion. Conscious thought processes create and modify knowledge and store it in memory as values, which are always available for access during interaction with the environment. This extraction of the values without consciousness is the concept of intuition. Behaviour that is purely driven by strong emotions could result in uncontrolled and/or irrational behaviour. The moral emotions are a combination of emotions and personal values that bring positive outcomes for others, and in the best case also for the self. Figure 4.1 is a conceptual diagram of how emotions and reasoning can be linked, created from our understanding of the theory researched for this thesis.

Figure 4.1: A diagram of the theories from the background chapter and how the different concepts are linked according to our understanding.

4.2 Moral Foundation Theory

Haidt describes mft as evolutionarily prepared basic routines linked to appraisal in specific social behaviours [13]. mft is an attempt to categorize similarities across different cultures with respect to morality. It has been coined intuitive ethics by Haidt, who is a co-founder of the moral foundations theory. He argues that morality is not based around one core value but many. To better explain the moral foundations, Jonathan Haidt uses an analogy with taste: humans have different receptors for different basic tastes, and he argues that morality could be described in a similar way. In mft, care, fairness, liberty, loyalty, authority and sanctity are the receptors of different basic moral values that exist in every culture, and some even in the animal kingdom. It is also an organized way of displaying a measurement of the moral concerns among individuals, groups or cultures. mft is standardized and can be used to quantify morality in a systematic way [12, 13, 16]. mft is based on earlier work by Fiske, Shweder and Hogan. Haidt's view on morality is that it is both inherent and learned during childhood. Haidt has found evidence for the existence of inherent morality, as he writes [13]:

It is therefore implausible that mammals learn entirely through domain-general learning mechanisms how to recognize suffering or distress in their offspring. Rather, many mammals have innate harm-detection modules that were shaped by evolution to be responsive to the proper domain of signs of suffering in their own offspring.

The two most basic and best understood foundations are care and fairness. According to Frans de Waal, these two moral foundations are also found in the animal kingdom, for instance in monkeys and elephants [3]. In this thesis we will focus on one foundation, care/harm, perhaps the most basic one, for our design and implementation.

Virtues are the positive traits that are morally good, and a person behaving in a virtuous way is considered a moral being. The inverse concepts are vices; they can be considered bad behaviours that are counterproductive towards morally good behaviour.

4.2.1 The six moral foundations

Haidt et al. have established [16] six moral foundations that morality consists of; here is a short description of the six pillars.

• Care/Harm is the virtue of kindness, gentleness, and nurturance; the opposite is violence.

• Fairness/Cheating is the pillar that contains justice, rights, and autonomy.

• Liberty/Oppression contains solidarity, to oppose or take down an oppressor.

• Loyalty/Betrayal can be summed up in the saying "one for all, all for one"; virtues of patriotism and self-sacrifice for the group describe this pillar well.

• Authority/Subversion contains virtues of leadership and followership; this also holds for traditions.

• Sanctity/Degradation entails living in an elevated, less carnal, more noble way; for example, by not living up to certain moral values you desecrate yourself.

Five of the moral foundations, which are fully mapped out, can be viewed in Table 4.1. The liberty foundation has only recently begun to be evaluated in Haidt's latest paper and is therefore not fully mapped out.

Table 4.1: The five moral foundations [16].

Care/harm
  Adaptive challenge: Protect and care for children
  Proper domain: Suffering, distress, or threat to one's kin
  Actual domain: Baby seals, cute cartoon characters
  Characteristic emotions: Compassion for victim, anger for perpetrator
  Virtues: Caring, kindness
  Vices: Cruelty

Fairness/cheating
  Adaptive challenge: Reap benefits of two-way partnership
  Proper domain: Cheating, cooperation, deception
  Actual domain: Marital fidelity, broken vending machines
  Characteristic emotions: Anger, gratitude, guilt
  Virtues: Fairness, justice, honesty, trustworthiness
  Vices: Dishonesty

Loyalty/betrayal
  Adaptive challenge: Form cohesive coalitions
  Proper domain: Threat or challenge to group
  Actual domain: Sports teams, nations
  Characteristic emotions: Group pride, rage at traitors, belongingness
  Virtues: Loyalty, patriotism, self-sacrifice
  Vices: Treason, cowardice

Authority/subversion
  Adaptive challenge: Forge beneficial relationships within hierarchies
  Proper domain: Signs of dominance and submission
  Actual domain: Bosses, respected professionals
  Characteristic emotions: Respect, fear
  Virtues: Obedience, deference
  Vices: Disobedience, uppitiness

Sanctity/degradation
  Adaptive challenge: Avoid communicable diseases
  Proper domain: Waste products, diseased people
  Actual domain: Taboo ideas, racism, deviant sexuality
  Characteristic emotions: Disgust
  Virtues: Temperance, chastity, piety, cleanliness
  Vices: Lust, intemperance

4.3 The trolley problem

We have chosen a well-known dilemma called the trolley problem [25]. This is a philosophical thought experiment with various cases. The different cases have seemingly small differences in the composition of the dilemma, while the behaviour to execute remains the same, but the permissibility changes from case to case. In the study of the trolley problem, compassion is the primary emotion, and therefore we can narrow the mft foundations down to just the one concerning care and harm.

The scenario plays out at a track where an accident is witnessed in first person. A runaway trolley is heading towards five people who will be hit by the trolley and die; this is certain, and there is no way around this outcome according to the setup of the scenario. Thus this is the belief of the observer. However, the observer has the ability to, also with 100% certainty, direct the trolley away from the five persons, but by doing so the observer's action will with certainty kill one other person.

Since the trolley problem is a well-known dilemma both with the public and in cognitive psychology, we consider it a good reference point. These cases have been studied by linguists, philosophers and psychologists. The two most polar cases, which can be seen in the graphs of [22], are the bystander and the footbridge cases. With a small rearrangement of the dilemma, the action to take suddenly becomes completely impermissible in contrast to the previous arrangement. The simplicity of the dilemma, together with the differing outcomes, makes this a suitable test bed for establishing the validity of our design.

4.3.1 Bystander scenario

In the bystander case you are standing at the side of the track with a lever that can switch the track, so that the runaway trolley diverts onto one person instead of the five towards whom it was heading. A diagram of the bystander case is given in Figure 4.2.

Figure 4.2: A diagram of the bystander case.

4.3.2

Footbridge scenario

For the footbridge you are standing on a bridge next to another human that you do not know, however you do know for certain that his weight will stop the trolley heading towards the five persons on the track, which is depicted in the figure 4.3. This other person on the bridge has not given consent to being pushed down on the track.

Figure 4.3: A diagram of the footbridge case.



4.4 Universal Moral Grammar

umg is a parallel perspective to ug (Universal Grammar) in the respect that it stays true to the same concepts and models used to figure out what the elements of language are and how language works. In the case of umg, however, the goal is to formulate the building blocks of morality and how it works. Mikhail proposes a framework for umg [22] that can aid in understanding and modelling moral behaviour.

According to Mikhail, umg might be innate [22]. He claims that we have innate moral principles and that they are the foundations upon which we develop our moral values. Morality would then rest on an innate base of right and wrong in children, not taught by parents, society or culture, but present as an inborn trait that functions as a rule system guiding the process of building up a person's morality. In situations never encountered before, these principles guide our behaviour, producing the so-called gut feelings. On this view we always have some instinctively preferred behaviour, which is also what Haidt suggests with his social intuitionist model. Together these different perspectives strengthen the theory of an innate moral foundation, and this gives ground to why certain feelings of right and wrong cannot be explained by the elicitor [8, 9, 10, 12, 13, 16, 23].

An example from language of how grammar is innate is clearly given by a quote from Chomsky:

Colourless green ideas sleep furiously

It is a non-semantic and nonsensical sentence, but it is well-formed English from a syntactic point of view. Somehow native English speakers feel that a sentence with the format adjective adjective noun verb adverb sounds good. When asked in experiments, people often cannot explain why they hold that opinion, yet they still have a notion that the sentence is acceptable. The same sentence in reverse order, for example, is not well-formed English syntax-wise and somehow feels wrong.

An intentional structure diagram with umg for the bystander case can be viewed in figure 4.4. A similar intentional structure diagram with umg for the footbridge case can be viewed in figure 4.5.

With umg we can verify the causality, intentions and morality of a given state or behaviour. This is used as a validation layer when designing the states and behaviours of the een and how they are sequentially linked together.



Figure 4.4: umg intentional structure of the bystander dilemma [22].


5 The moral design

The extension of the emobn and the process of how our design is implemented are explained in this chapter.

5.1 EEN, extension of the EMOBN

The addition of moral emotions to the decision making created a need for new components in emobn. We added expected emotions to the existing architecture and called the result een, short for Expected Emotion Network.

5.1.1 Introduction

In emobn all goals should be affective [17]. As the author points out, psychological theories suggest that humans make decisions that try to maximize positive emotions while minimizing negative emotions (for more information, see Section 2.3.2 of Anja's PhD thesis). The ultimate reason behind many ordinary behaviours is to increase positive emotions, such as happiness, and likewise to minimize negative emotions, such as fear or sadness.

While many of our choices are not consciously made to improve our mood, subconsciously the prediction of the future consequences for our emotional state is a key factor in decision making.
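To make the idea concrete, the following is a minimal sketch, not taken from the thesis, of how a behaviour could be scored by its predicted emotional consequences. The additive valence × importance × strength formula and all names below are assumptions made purely for illustration.

# Hypothetical sketch (Python): pick the behaviour whose predicted
# emotional consequences are most positive (or least negative). The
# additive valence * importance * strength score is an assumption made
# for this example, not the formula used by emobn/een.

def emotional_score(expected_emotions):
    """Sum the contribution of each expected emotion a behaviour induces."""
    return sum(e["valence"] * e["importance"] * e["strength"]
               for e in expected_emotions)

# Toy numbers, chosen only to illustrate the idea.
predicted = {
    "help":   [{"valence": +1.0, "importance": 0.8, "strength": 0.9}],  # e.g. pride
    "ignore": [{"valence": -1.0, "importance": 0.7, "strength": 0.6}],  # e.g. guilt
}

best = max(predicted, key=lambda b: emotional_score(predicted[b]))
print(best)  # -> "help": maximizes positive, minimizes negative expected emotion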

5.1.2 Components of EEN

An een is represented by a tuple ⟨S, B, R, E, L, X⟩ of six components: a set of states S, a set of behaviour modules B, a set of resources R, a set of emotions E, a set of emotional links L, and a set of expected emotions X. We assume that S contains both states and their opposites, that is, k ∈ S if and only if k̄ ∈ S.

(53)

30 5 The moral design

Emotions, states, resources, and behaviour modules are defined as in emobn.

Expected Emotions

An expected emotion t ∈ X has a valence represented as Val(t) and an importance represented as Imp(t).

An expected emotion is induced by states and behaviours. Given t ∈ X, we write Ind(t) and Rel(t) to indicate the set of states and behaviours that induce, resp. release, the value of t:

Ind : X → (S ∪ B)        Rel : X → (S ∪ B)

Given an expected emotion t and a state or behaviour s, we indicate with q_t(s) the strength by which s induces or releases t, with 0 < q_t(s) ≤ 1.

Expected emotions are depicted as rectangles with a triangle on the left side.
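As a concrete illustration, the een components described above can be sketched as plain data structures. The Python classes, field names and the toy example below are assumptions for this sketch, not the thesis implementation.

# Illustrative sketch of the een tuple <S, B, R, E, L, X> and of an
# expected emotion with Val(t), Imp(t), Ind(t), Rel(t) and q_t.
from dataclasses import dataclass, field

@dataclass
class ExpectedEmotion:
    name: str
    valence: float                                 # Val(t)
    importance: float                              # Imp(t)
    inducers: dict = field(default_factory=dict)   # Ind(t): state/behaviour -> q_t, 0 < q_t <= 1
    releasers: dict = field(default_factory=dict)  # Rel(t): state/behaviour -> q_t, 0 < q_t <= 1

@dataclass
class EEN:
    states: set            # S (assumed to contain each state and its opposite)
    behaviours: list       # B, the behaviour modules
    resources: set         # R
    emotions: set          # E
    emotional_links: set   # L, triples (e, k, z)
    expected: list         # X, the expected emotions

# Example: expected guilt, induced by the state "one_person_killed"
# with strength q_t = 0.8 (toy value).
guilt = ExpectedEmotion("guilt", valence=-1.0, importance=0.9,
                        inducers={"one_person_killed": 0.8})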

5.1.3 Graphical representation of EEN

Let ⟨S, B, R, E, L, X⟩ be an een. A graphical representation of an een network is built in two phases: in the first phase p-arcs are added to the network, and in the second phase n-arcs.

Phase 1

• For every t ∈ X, add a p-arc from any element in Ind(t) to t.
• For every k ∈ B, add a p-arc
  - from every precondition of k to k, and
  - from k to every effect of k.
• For every ⟨e, k, z⟩ ∈ L with z ≥ 0, add a p-arc from e to k.

Phase 2

• For every t ∈ X, add an n-arc from any element in Rel(t) to t.
• If there is a p-arc from a state s to a behaviour module k, add an n-arc from s̄ to k.
• For every ⟨e, k, z⟩ ∈ L with z < 0, add an n-arc from e to k.

Below we use plain arrows to represent positive arcs (p-arcs) and dashed arrows to represent negative arcs (n-arcs).
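The two-phase construction can be sketched in code as follows. The dict-based representation, the use of a "~" prefix for a state's opposite, and the reading that the inhibiting n-arc for a precondition s comes from its opposite s̄ are assumptions made for this illustration.

# Illustrative sketch of the two-phase arc construction described above.
# States are strings, "~s" stands for the opposite of state s, behaviour
# modules are dicts, and expected emotions list their inducers/releasers.

def opposite(state):
    return state[1:] if state.startswith("~") else "~" + state

def build_arcs(behaviours, emotional_links, expected):
    p_arcs, n_arcs = [], []

    # Phase 1: positive arcs.
    for t in expected:
        for x in t["ind"]:                       # inducers of t
            p_arcs.append((x, t["name"]))
    for k in behaviours:
        for s in k["preconditions"]:             # precondition -> module
            p_arcs.append((s, k["name"]))
        for s in k["effects"]:                   # module -> effect
            p_arcs.append((k["name"], s))
    for (e, k, z) in emotional_links:            # emotional links with z >= 0
        if z >= 0:
            p_arcs.append((e, k))

    # Phase 2: negative arcs.
    for t in expected:
        for x in t["rel"]:                       # releasers of t
            n_arcs.append((x, t["name"]))
    for k in behaviours:
        for s in k["preconditions"]:             # opposite state inhibits the module
            n_arcs.append((opposite(s), k["name"]))
    for (e, k, z) in emotional_links:            # emotional links with z < 0
        if z < 0:
            n_arcs.append((e, k))
    return p_arcs, n_arcs

# Toy example.
behaviours = [{"name": "pull_lever",
               "preconditions": ["at_lever"],
               "effects": ["track_switched"]}]
links = {("fear", "pull_lever", -0.5)}
expected = [{"name": "guilt", "ind": ["one_person_killed"], "rel": []}]
print(build_arcs(behaviours, links, expected))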


5.1 Example

Figure 5.1: Graphical representation of an een component.

5.1.4 Transformation Γ

In this section we introduce a transformation, called Γ, that maps an een into an emobn. Γ maps an een ⟨S, B, R, E, L, X⟩ into an emobn ⟨E′, S′, R′, G, B′⟩ as follows.

(55)

32 5 The moral design

Transformation Γ

• E′ = E
• S′ contains all the states in S (that is, S ⊆ S′).
• R′ = R
• B′ contains all the behaviour modules in B that neither induce nor release any expected emotion.
• For every t ∈ X do:
  - Add two new states a and ā to S′, both with value 0.5.
  - Add a new goal g to G. g has static importance Imp(t) and dynamic importance 0.
  - If Val(t) > 0, the condition of g is a, otherwise ā.
  - For every state s ∈ Ind(t), add a new behaviour module j to B′. The precondition of j is s and its effect is a, with benevolence = 0 and p_j(a) = q_t(s).
  - For every state s ∈ Rel(t), add a new behaviour module j to B′. The precondition of j is s and its effect is ā, with benevolence = 0 and p_j(ā) = q_t(s).
  - For every behaviour k ∈ Ind(t), add a new behaviour module j to B′. j has the same preconditions and effects as k, plus the effect a with benevolence = 0 and p_j(a) = q_t(k).
  - For every behaviour k ∈ Rel(t), add a new behaviour module j to B′. j has the same preconditions and effects as k, plus the effect ā with benevolence = 0 and p_j(ā) = q_t(k).
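As a rough illustrative sketch, the transformation Γ can be written out as below. The data layout, the naming of the new states a and ā, and the treatment of effects as state-to-probability maps are assumptions for this example only; it is not the thesis implementation.

# Rough sketch of Γ, mapping an een (plain dicts/lists) into an emobn.
# Effects map a state to p_j(state); "~" marks the opposite state ā.

def gamma(een):
    emobn = {
        "E": set(een["E"]),
        "S": dict(een["S"]),                     # S ⊆ S'
        "R": set(een["R"]),
        "G": [],
        # keep only behaviour modules that neither induce nor release
        # any expected emotion
        "B": [k for k in een["B"]
              if not any(k["name"] in t["ind"] or k["name"] in t["rel"]
                         for t in een["X"])],
    }
    by_name = {k["name"]: k for k in een["B"]}

    def add_module(x, q, target):
        """Add a behaviour module for inducer/releaser x with strength q."""
        if x in by_name:                         # x is a behaviour k
            k = by_name[x]
            emobn["B"].append({"name": k["name"] + "_" + target,
                               "preconditions": list(k["preconditions"]),
                               "effects": {**k["effects"], target: q},
                               "benevolence": 0.0})
        else:                                    # x is a state s
            emobn["B"].append({"name": x + "_to_" + target,
                               "preconditions": [x],
                               "effects": {target: q},
                               "benevolence": 0.0})

    for t in een["X"]:
        a, not_a = "a_" + t["name"], "~a_" + t["name"]   # new states a and ā
        emobn["S"][a] = 0.5
        emobn["S"][not_a] = 0.5
        emobn["G"].append({"condition": a if t["val"] > 0 else not_a,
                           "static_importance": t["imp"],
                           "dynamic_importance": 0.0})
        for x, q in t["ind"].items():            # inducers set a
            add_module(x, q, a)
        for x, q in t["rel"].items():            # releasers set ā
            add_module(x, q, not_a)
    return emobn

# Minimal usage with toy data.
een = {"E": {"compassion"}, "S": {"at_lever": 1.0, "~at_lever": 0.0},
       "R": set(), "B": [], "L": set(),
       "X": [{"name": "guilt", "val": -1.0, "imp": 0.9,
              "ind": {"one_person_killed": 0.8}, "rel": {}}]}
print(gamma(een)["G"])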
