Summary: Ethics of Technology (FI3V19019)
Lecture 1 - What is technology (from an ethical point of view)?
The Question Concerning Technology (Heidegger)
• “Everyone knows the two statements that answer our question. One says: Technology is a means to an end. The other says: Technology is a human activity. The two definitions belong together. …to posit ends and to procure and use the means to them is a human activity.”
  o Instrumental theory (technology as a means to an end)
  o Anthropological theory (technology as a human activity)
  o Heidegger adds that technology is also a form of “revealing”
• The current conception of technology, according to which it is a means and a human activity, can therefore be called the instrumental and anthropological definition of technology.
• Modern technology too is a means to an end. That is why the instrumental conception of technology conditions every attempt to bring man into the right relation to technology. Everything depends on our manipulating technology in the proper manner as a means.
• Technologies are tools
• Technologies are means to human ends
• Technologies are in themselves value-neutral
  o “Guns don’t kill people, people kill people”

Robots Should Be Slaves (Bryson)
• The potential of robotics should be understood as the potential to extend our own abilities and to address our own goals.
• A robot is any artificial entity situated in the real world that transforms perception into action. If a digital assistant listens and talks to a human, it is a robot: it is an agent, an actor, living in and changing the world. (A minimal perception→action sketch follows at the end of this section.)
• Agents transmit, create and may even destroy information, including human opinions and reputations. Digital agents may use the Internet to actively purchase goods or services, thus causing the movement of physical objects as well as ideas.
• Robots should be servants you own
• Technologies are our human property, which we can buy and sell
• Therefore: we should regard technologies as tools and as means to our ends (“robots should be slaves”)
• Therefore: we should avoid creating robots that trigger social & non-instrumental responses in people
• Fundamental claims:
  1. Having servants is good and useful, provided no one is dehumanised.
  2. A robot can be a servant without being a person.
  3. It is right and natural for people to own robots.
  4. It would be wrong to let people think that their robots are persons.
• We choose those motivations and design the decision-making system. All their goals are derived from us.
• Robots are often overly personified.
  1. First, this is because of our desire to have the power of creating life.
  2. Second, this is because we are not certain what it means to be human, so we currently offer the term to anything that senses, acts, remembers and speaks.


• My conclusion is that we are obliged not to the robots, but to our society. We are obliged to educate consumers and producers alike to their real obligations with respect to robotics.
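Bryson’s definition above (an artificial entity that transforms perception into action, with every goal derived from us) can be pictured as a simple sense–decide–act loop. The sketch below is only an illustration of that reading, not anything from the lecture or Bryson’s paper; the names ServantRobot, perceive, decide, act and owner_goals are invented for the example.

```python
# Illustrative sketch (assumption, not from the source): a "robot" in
# Bryson's sense, i.e. an artificial entity that transforms perception
# into action, with every goal supplied by its human owner.

class ServantRobot:
    def __init__(self, owner_goals):
        # The robot has no ends of its own: all goals are derived from us.
        self.goals = owner_goals

    def perceive(self, world):
        # Turn the state of the (simulated) world into an observation.
        return {"obstacle_ahead": world.get("obstacle_ahead", False)}

    def decide(self, observation):
        # Human-designed decision-making in service of owner-given goals.
        return "stop" if observation["obstacle_ahead"] else self.goals[0]

    def act(self, world, action):
        # Acting changes the world: an agent "living in and changing the
        # world", but still a tool and a means, not a person.
        world["last_action"] = action
        return world


world = {"obstacle_ahead": False}
robot = ServantRobot(owner_goals=["vacuum the floor"])
world = robot.act(world, robot.decide(robot.perceive(world)))
print(world["last_action"])  # -> vacuum the floor
```

On this picture the normative claims come out naturally: nothing in the loop gives the robot ends of its own, so treating it as a mere means (a servant we own) does not wrong it.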

Post-Phenomenology
• An updated version of the anthropological theory of technology
• Peter-Paul Verbeek: technologies “mediate” our relation to the world
• Technologies mediate:
  o (1) our experiences/perceptions (“inputs”) &
  o (2) what we are able to do and how we are able to do it (“outputs”)
  o …including what we are inclined to do
• Against the idea that technologies are value-neutral: in shaping our perceptions and actions, technologies also shape our values and goals
  o “Guns don’t kill people, people kill people”? A man with a gun is a different agent: a gunman
• Technologies contain intended and unintended “scripts” (they “tell us” what to do)

Kant & Bryson comparison
• Kant: persons vs. things
• Kant: anything that is not a person can be treated as a mere means. Persons should be treated as ends-in-themselves, and never as mere means
• Bryson (translated into Kantian terms): technologies should always be treated as things and as mere means, and never as ends-in-themselves

Some very different perspectives that are becoming increasingly accepted by philosophers of technology:
• Some technologies can be moral agents
• Some technologies (e.g. robots with AI) can be persons
• Some technologies should be treated with moral consideration, i.e. not as mere means, but as ends-in-themselves
• Some technologies can be our friends or romantic partners
  o These views are very far away from the instrumental theory of technology!

Conclusion: a range of ways of understanding what technologies are/can be:
• The traditional instrumental theory (= all technologies are value-neutral tools, and means to human ends)
• Bryson’s normative version of the instrumental theory
• The anthropological perspective (including post-phenomenology and mediation theory)
• The non-instrumental theory (= some technologies can be persons, moral agents, have moral status, be our friends, etc.)

Materializing Morality: Design Ethics and Technological Mediation (Verbeek)
• Phenomenology: the philosophical analysis of the structure of the relations between humans and their life-world.
• Within the praxis perspective, the central question is how artifacts mediate people’s actions and the way they live their lives. While perception, from a phenomenological point of view, consists in the way the world is present for humans, praxis can be seen as the way humans are present in their world.
• An important difference with respect to the mediation of perception, however, is the way in which action-mediating artifacts are present. Artifacts mediate action not only from a ready-to-hand position but also from a present-at-hand position. A gun, to mention an unpleasant example, mediates action from a ready-to-hand position, translating “express my anger” or “take revenge” into “kill that person.” A speed bump, however, cannot be embodied. It will never be ready-to-hand; it exerts influence on people’s actions from a present-at-hand position.
• The translation of action has a structure of invitation and inhibition, the transformation of perception a structure of amplification and reduction.

Lecture 2 – Methods of Technology Ethics: the ethics of self-driving
cars as a case study
The Reflective Equilibrium Method
• Reflective equilibrium as a methodology for ethics research
  o A state of balance or coherence among a set of beliefs, arrived at by a process of deliberative mutual adjustment among general principles and particular judgements (a schematic sketch of this adjustment loop follows below).
  o For example, if we find out that a baby has been murdered, we might think that this is terrible and feel quite bad about it. Given such a moral intuition, the method of reflective equilibrium says that we should try to generalize it and come up with a general moral principle that explains the intuition.
  o The method of reflective equilibrium serves the aim of defining a realistic and stable social order by determining a practically coherent set of principles that are grounded in the right way in the source of our moral motivation, such that we will be disposed to comply with them.
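The “deliberative mutual adjustment” can be pictured as an iterative loop that keeps revising both sides until they cohere. The sketch below is only a schematic illustration of that idea, not an algorithm from the lecture or the literature; the helper functions conflicts, revise_principles and revise_judgements are hypothetical placeholders.

```python
# Schematic sketch (illustration only): reflective equilibrium as mutual
# adjustment between general principles and particular judgements.

def reflective_equilibrium(principles, judgements, conflicts,
                           revise_principles, revise_judgements,
                           max_rounds=100):
    """Iterate until principles and judgements cohere, or give up."""
    for _ in range(max_rounds):
        clashes = conflicts(principles, judgements)
        if not clashes:
            return principles, judgements      # equilibrium reached
        # Neither side is privileged: sometimes a principle gives way,
        # sometimes a particular judgement (intuition) is revised.
        principles = revise_principles(principles, clashes)
        judgements = revise_judgements(judgements, clashes)
    return principles, judgements              # best coherence found so far
```

The point of the loop is that neither the principles nor the intuitions are treated as fixed: either can be revised until the whole set of beliefs is in balance.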

Ethics by analogy: e.g. the trolley problem
• 3 worries about comparing the ethics of self-driving cars with the trolley problem:
  1. Simplified thought experiments vs. the messiness of real-life ethics
  2. Moral and legal responsibility
  3. Known and certain facts vs. risks and uncertainty
• Empirical ethics: 3 worries:
  1. Lack of real-world experience (cf. Collingridge dilemma)
  2. Gut reactions vs. arguments
  3. Inconsistent attitudes

Some challenges related to applying the traditional moral theories to new technologies
(like robots and AI):
• The “who is the moral agent?” problem
• The theories were developed with human-human interaction in mind, before there were technologies like AI, robots, self-driving cars, etc.
• Our own shared human moral sensibilities (which express themselves in our traditional ethical theories) also developed before we lived in the modern world

Controversies about methods – and should all ethics researchers use the same method?
• Nassim JafariNaimi: some methods, such as the method of constructing trolley problems, are morally problematic!
  o JafariNaimi: discussing trolley problem cases displays moral insensitivity
• John Harris: the moral machine project (“empirical ethics”) also makes morally problematic assumptions!
  o Harris: it makes the morally problematic assumption that we should allow machines to make life-and-death decisions
• One possible response to JafariNaimi & Harris: even if trolley problems and the moral machine project are problematic, they can still be indirectly useful → explaining what is
