This document contains the highlighted points, definitions and sections of the required reading for the Algorithmic Persuasion in the Digital Society course. All readings are included except for the last two of Week 7.

Algorithmic Persuasion in the Digital Society

Readings
Lecturers: Sophie Boerman and Brahim Zarouali

Semester 2 Block 2

Week 1: Introduction to algorithms and the digital society
What exactly is an algorithm? Algorithms explained | BBC Ideas
• Premise of an algorithm: data goes in, a set of instructions is followed, and a product comes out (see the sketch below)
o Comparable to a recipe
o Algorithms remove the issue of human error, but complications arise when the humans creating the algorithms make an error.
o We should worry about who is designing the algorithms rather than be scared of the algorithms themselves.
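
A minimal illustration (my own, not from the BBC video) of the data-in, instructions-followed, product-out idea: a toy Python "recipe" that sorts a list of grades. The function name and data are made up for this sketch.

def sort_grades(grades):
    # Data goes in: a list of unsorted grades.
    ordered = []
    for grade in grades:
        # The set of instructions: place each grade before the first larger one.
        for i, existing in enumerate(ordered):
            if grade < existing:
                ordered.insert(i, grade)
                break
        else:
            ordered.append(grade)
    # A product comes out: the same grades, now sorted.
    return ordered

print(sort_grades([7.5, 6.0, 9.1, 8.2]))   # -> [6.0, 7.5, 8.2, 9.1]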



All Hail to the Algorithm - A five-part series exploring the impact of algorithms on our daily lives (Al Jazeera)

Part 1: 'Trust me' - I'm an algorithm
• The Australian government was caught in a 'robo-debt' scandal in which thousands of citizens were accused of owing the government welfare payments.
• People are not necessarily angry at the technology/algorithms themselves but at those who created them (i.e., the government).
• In the US, it is often the poor and marginalized who get the worst deal from the digitization of government and business programs.
• These automated systems are not just administrative errors; they are linked to a history of inequality and a lack of welfare rights.
• We should not ask whether we can trust algorithms, but whether the quality of the data fed into them and the individuals designing them can be trusted (considering biases both internal and systemic).

Part 2: 'Like me' - The popularity & power of big tech
• Data colonialism: a new 'land' is being appropriated, namely us. Our important and personal data are being appropriated by these companies for their own benefit.
o Not as horrific and violent as the original colonialism, but it follows the same premise of forcing people into a new system that serves to improve someone else's economy.
• The US and China are the biggest players.
o China has been investing in Africa for 20-30 years to expand its reach, and the ways it does so are not always ethical.
o Big countries could not behave in their own countries the way they do in developing countries; that is why the 'rush to be connected' in developing countries is concerning.
▪ Examples: Facebook with its Express Wi-Fi and Internet.org initiatives in Sub-Saharan Africa; Alphabet with Loon (a balloon service offering connectivity to remote locations).
o Many African countries, especially, have weak or no data protection laws being enforced by their states.

▪ A critical perspective on what happens with our data is often painted as opposition to the 'net positive' benefits that come with being connected. This leads those in power in these states to go along rather than risk missing out on the digital world.
o Algorithms and digitization are not the problem in themselves; the real need is to mitigate the harm caused by data collection and to understand critically how people's personal data are being exploited while 'Big Tech' is the only one who profits from it.

Part 3: 'Click me' - The algorithm made me do it
• Large amounts of misinformation and fake news are spread across the internet; they not only serve government and political ideologies but have also been shown to hurt the lives of innocent people.
• Obviously non-human bots are now outdated, so human-like bots are flooding the internet and social media platforms with political misinformation.
• Mexico is a country where fake news is spread quite frequently, and it has led to innocent people losing their lives because others were misinformed.

Part 4: 'Follow me' - You can run, but you can’t hide
• Biometric systems are increasingly being used to track people’s data.
• They are said to enhance security, but who is to say we have any control over that data?
o Police systems are being created that track people through their biometric data without their consent, which is unethical.
o Children are being tracked in schools through their biometric data, which is not ethical.
• Before we know it, we will be reduced to 0s and 1s with no opportunity to protect our privacy anymore, unless we prioritize enforcing ethical legislation for these new technologies.

Part 5: 'Read me’ or just tap ‘I agree!’
• Terms and conditions act as if they give us a sense of control, but we have little real control because we do not, and often cannot, read them; they are simply too long.
• The design of the 'agree' buttons makes it clear that tech companies want to encourage us to accept their terms.
o They ensure that we still disclose our data.
• All apps compete for our attention and try to keep us on the website/app.
• Design has the power to manipulate behavior (Skinner Box).
o This is the foundation of casino slot machines and social media apps (Instagram, Pinterest).
• We need to move from human-centered design to human-protective design to ensure technology truly 'extends our best selves' and does not harm our mental health.



Zarouali et al. (2022): The Algorithmic Persuasion Framework in online communication: Conceptualization and a future research agenda
The APF consists of five conceptual components:
• Input
• Algorithm
• Persuasion attempt
• Persuasion process
• Persuasion effects

In short, it addresses how data variables serve as inputs to different algorithmic techniques and algorithmic objectives, how these shape the manifestation of algorithm-mediated persuasion attempts, and how such attempts are processed and produce intended and unintended persuasive effects.
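
Purely as a study aid (this sketch is not part of Zarouali et al.'s paper), the five components can be read as stages of a pipeline. The Python below is a toy rendering in which every class, function, and field name is hypothetical:

from dataclasses import dataclass

@dataclass
class Input:
    # 1. Input: data variables about a (fictional) user.
    user_data: dict

@dataclass
class PersuasionAttempt:
    # 3. Persuasion attempt: the algorithm-mediated message shown to the user.
    message: str

def algorithm(inp: Input) -> PersuasionAttempt:
    # 2. Algorithm: a technique serving an objective (here, crude interest targeting).
    topic = max(inp.user_data, key=inp.user_data.get)
    return PersuasionAttempt(message=f"Recommended for you: more about {topic}")

def persuasion_process(attempt: PersuasionAttempt) -> dict:
    # 4. Persuasion process: how the recipient handles the attempt (e.g., attention, ad recognition).
    return {"noticed": True, "recognized_as_ad": False}

def persuasion_effects(processing: dict) -> dict:
    # 5. Persuasion effects: intended and unintended outcomes of the attempt.
    return {"attitude_change": processing["noticed"],
            "privacy_concern": not processing["recognized_as_ad"]}

effects = persuasion_effects(
    persuasion_process(algorithm(Input(user_data={"sports": 3, "politics": 7}))))
print(effects)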


Week 2: Algorithmic persuasion in online advertising
Boerman et al. (2017): Online behavioral advertising: A literature review and research agenda

Online behavioral advertising (OBA): “The practice of monitoring people’s online behavior and using the collected information to show people individually targeted advertisements”
• Also called ‘online profiling’ and ‘behavioral targeting’
• Refers only to advertising based on people’s online behavior

Online behavioral advertising uses personal information to tailor ads so that they are perceived as more personally relevant. What is new about this form of personalization is that the tracking of online activities, the collection of behavioral data, and the dissemination of information often happen covertly. This covertness may be harmful and unethical because consumers are unaware of the persuasion mechanisms that OBA entails; it has led to a call for transparency.
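
A hypothetical illustration of the basic OBA mechanism described above (track behavior, build a profile, pick a matching ad). This sketch is not from Boerman et al.; all names and data are invented.

from collections import Counter

# 1. Monitoring: pages a fictional user visited, tagged by interest category.
browsing_log = ["sports", "travel", "sports", "news", "sports"]

# 2. Profiling: aggregate the tracked behavior into an interest profile.
profile = Counter(browsing_log)

# 3. Targeting: show the ad whose category best matches the profile.
ads = {"sports": "Ad for running shoes", "travel": "Ad for city trips"}
top_interest, _ = profile.most_common(1)[0]
print(ads.get(top_interest, "Generic ad"))   # -> Ad for running shoes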


(Figure: Overview of the theoretical position of online behavioral advertising)
