
Responsible Innovation & Ethical-Constructive Technology Assessment

ASLE H. KIRAN | 10.06.2014

• ELSA | responsible innovation | integrated research → nano- & biotech, agriculture, neuroscience, health care, etc.

• technologies are not mere neutral means; they “have politics”
  a. they cannot be (fully) controlled after the fact
  b. but we can anticipate (some of) their consequences
  c. and therefore shape (some of) their consequences proactively
  → responsible innovation

• furthermore: new, emerging technologies impact the ethical and normative aspects of society and practices → consequently: we should also (be able to) shape these aspects

• “specific normative effects from specific technology design” → easier said than done!

introduction: ‘responsible innovation’

• from policy to design

• technology-society relation - externalism & internalism - hard & soft impacts

• some normative design methodologies

• ‘ethical-constructive technology assessment’

• eCTA-example from health care

overview

• 1st generation TA: external control (governance)
  - 1960s: a push for greater social responsibility in innovation
  - reveal unintended and adverse future effects of innovation
  - TA should pave the way for new and better policy-making

• Collingridge’s dilemma: “the social consequences of a technology cannot be predicted early in the life of the technology. By the time undesirable consequences are discovered, however, the technology is often so much part of the whole economic and social fabric that its control is extremely difficult” (1980, p. 11)

→ a criticism of the practical scope and legitimacy of TA
→ undermines the assumption of (full) socio-political control/governance over social impacts of technological development

technology assessment | from policy to design

• 2nd generation TA (1980s→): internal shaping
  - anticipate potential effects of new technologies in order to feed these insights back into decision making
  - interaction between technology developers and social scientists and/or stakeholders
  - “a new design practice” (Schot & Rip 1997)
  - constructive TA, participatory TA, prospective TA

• 3rd generation TA (2000s→): ‘normative turn’
  - new generation?
  - midstream modulation, embedded ethics, ethical TA, value sensitive design, etc.

technology assessment | from policy to design

• technology and society external to each other:
  → belong to two different and distinct domains
  → technology is foreign to society
  → relation to technology: distrust, suspicion

• technology and society bound together in a relation of tension → a threat to human forms of life

• technology development and innovation become a power struggle: → we must control technology, or it controls us (cf. TA-1)

• externalist ethics of technology:
  → based on the precautionary principle
  → focused on damage control

technology-society relation | externalism

• society and technology shape each other (TA 2/3)

• in order to influence technology development, social actors need to enter ‘midstream’ and engage in the innovation process → TA should not just map effects (and make policies), but influence innovation trajectories/streams (“a design practice”)

• acknowledges the last part of Collingridge’s dilemma (“governance has limits”)

• but denies the first part (“early anticipation difficult”) → broad and multidisciplinary mapping methods, stakeholder input, and “ethics on the lab floor”

• democratic: stakeholders should have a say in how (and if) the technologies are developed

technology-society relation | internalism

• hard impacts:
  - nanotech in textiles | GMOs: environmental issues
  - scanning tech in health care: radiation issues
  - unemployment, health risks, deaths, etc.
  → often of unknown degree, but part of risk/benefit analyses

• technologies also shape and rearrange “soft” aspects:
  a) social structures of practices and routines; roles, identities
  b) moral norms underlying these; ethics and values

• often in unpredictable and surprising ways:
  - birth control pills
  - TV dinners
  - power-saving light bulbs (revenge/rebound effects)

technology-society relation | soft impacts

• increasing investment in projects that deal with soft impacts:
  - ELSA, MVI (‘responsible innovation’), Horizon 2020
  → multidisciplinary (engineers, social scientists, ethicists, etc.)

• midstream modulation | embedded ethics
  → different degrees of “softness”
  → ranges from mere monitoring to interaction
  → mainly ethicists talking to/informing tech developers
  → paternalist, but some contact with stakeholders

• eTA | value sensitive design
  → preoccupied with adverse effects
  → pre-defined set of ethical issues
  → universalist
  → also primarily paternalist

normative design | some methodologies

• negative effects: “ethical technology assessment will serve as a tool for identifying adverse effects of new technologies at an early stage” (Palm & Hansson 2006, p. 543)

• checklist approach to ethical issues:
  “1. Dissemination and use of information
  2. Control, influence and power
  3. Impact on social contact patterns
  4. Privacy
  5. Sustainability
  6. Human reproduction
  7. Gender, minorities and justice
  8. International relations
  9. Impact on human values” (p. 551)

normative design | some methodologies

• universalist: “value sensitive design builds from the psychological proposition that certain values are universally held, although how such values play out in a particular culture at a particular point in time can vary considerably” (Friedman, Kahn, Borning 2008, p. 86)

normative design | some methodologies

• Kiran, Verbeek, Oudshoorn: ‘Beyond checklists’ (2014)

• a focus on adverse effects restricts TA to evaluating how new technologies put constraints on and violate norms and values → the ways in which technologies co-produce positive norms or identities of future users are made invisible

• for instance, telecare technology means losing face-to-face contact, but prevents discriminatory attitudes (age, gender, ethnicity) towards patients

ethical-constructive technology assessment

• use of checklists implies that technologies are evaluated according to fixed and pre-given ethical principles and rules

• but society and ethics co-evolve with technology → norms and values are not given, but will be (re)constituted in relation to new technologies and vice versa…

• this interaction may change the foundations of normative judgments (cf. birth control pill)

• today’s checklists might be outdated by the time society is dealing with the emerging technologies

• two meanings of ‘soft impacts’:
  a) how technologies impact ethical aspects in a society
  b) how technologies impact a society’s norms and values

ethical-constructive technology assessment

• universalist approaches to values neglect differences between technologies and users
  → cannot adequately address unforeseen and unanticipated ethical consequences in different local, cultural settings
  → nor the diversity in how users appropriate new technologies

• ethics perspective in eCTA:
  → pragmatist ethics (context-sensitive)
  → technological mediation (human – technology – world)
  → soft impacts (roles, identities, relations)
  → how to construct a ‘good life’ (possibilities-constraints)

• not primarily focused on ‘do/don’t’ or ‘good/bad’ technology, but on how a specific user, with specific abilities and interests, can shape a way of living with a specific technology

ethical-constructive technology assessment

• technological (co-)shaping of identity: ‘good life’ → ‘good patienthood’:
  → how ATs (assistive technologies) enable and constrain possibilities for patients to construct a safe and satisfying being-with-frailty

• methodological implications:
  a) proactive engagement in innovation, and
  b) strategies for constructing patienthood within practices

eCTA | example from health care

• for many patients, the use of ATs is not a question of choosing, but a matter of relating to technologies already there → patienthood is constituted through ATs

• normative challenges from the perspective of innovation:
  → do developers and policy makers have a moral responsibility to warrant contextual, user-based construction of patienthood?
  → how should this responsibility be enacted?
  → is it possible to do this through design?

• normative challenge from the perspective of users: how should care receivers and caregivers (nurses, family, etc.) proceed in order to construct patienthood; how to live with ATs?

eCTA | example from health care

• in order to anticipate use context, should user requirements be translated (hardwired) into technical requirements?

• Stewart & Williams: the design fallacy: “the assumption that the primary solution to meeting user needs is to build ever more extensive knowledge about the specific context and purposes of various users into technological design” (2005, p. 198)

• more precisely defined technical requirements might lead to more rigid technological systems, leaving less room for individual modification

• rigid systems imply that patients have similar skills and needs, but what constitutes ‘good care’ and ‘good patienthood’ differs greatly

eCTA | example from health care

• developers and policy makers should recognize the limits of being proactive and the possibilities in situated shaping → methodological insecurity | flexible design

• ATs should encourage users’ (care receivers and formal and informal caregivers) involvement in the construction of patienthood → involved implementation facilitates trust and responsibility (Kiran & Verbeek 2010)

• fine idea, but does it work in practice?

• health care moves towards personalization in many respects → but is personalization responsible?

eCTA | example from health care

• ‘responsible innovation’ aims to implement or reinforce a specific, desired normative understanding of a technological practice

• “Kiran’s dilemma”: this idea seems to re-introduce instrumentalism: innovation is “used” to attain a pre-defined, socially desired, responsible goal
  → design instrumentalism
  → Q: are Winner’s bridges examples of design instrumentalism?

• Ihde: the designer fallacy
  → what technologies mean in their use contexts always differs from their intended meanings: typewriter, telephone, SMS
  → their social effects are unpredictable
  → because technologies are multistable

responsible innovation | the designer fallacy

• “designers’ intentions do not always correspond with the users’ practice; in fact, the relation between design and use is very complex and principally unpredictable” (Albrechtslund on VSD, 2007, p. 63)

• positivist: symmetry between design and user context is assumed
  → problem: this symmetry often does not hold

• counter-productive: not recognizing this means losing influence over responsible use because it underestimates outside influence

• dangerous: legislators, policy makers et al. might think that technologies developed under ‘responsibility’ are foolproof with regard to future ethical problems and dilemmas

responsible innovation | the positivist problem

• multistability?

• “technological artefacts can enter into many very different human–technology relations and […] a technology is defined by its particular relational context” (Albrechtslund 2007, p. 68)
  → examples: can of sardines, the bow

• however, ‘multistability’ is too descriptive; it has little explanatory or methodological force

• what implications does it have for design work?
  → we must understand why & how technologies are multistable
  → the dynamics in co-shaping, co-evolving, co-constituting
  → postphenomenology | technological mediation | STS

responsible innovation | the positivist problem

• conceptual and ethical worries aside, design is (inherently) proactive: how to do design if a specific use is not intended?

• many designs have normative behavior-steering effects:
  - Latour’s safety belt & speed bumps (coercive/moralist)
  - urinals in Schiphol (nudging/persuasive)
  - metro/lottery tickets (seducing)
  → some technologies invite specific behaviors, others inhibit them

• choosing a coercive strategy in order to safeguard a specific type of behavior might backfire; rigidity increases the danger of rejection

• only 25% of telecare technologies make it past the prototype stage; not because they are user-unfriendly, but because they are not well suited to the existing norms and values in health care practices

normative design strategies