Module 1: Web & Social Media
Algorithm
● A step-by-step procedure for solving a problem or accomplishing some end
● “A procedure for solving a mathematical problem in a finite number of steps that frequently involves repetition of an operation” (see the sketch at the end of this section)
● Examples:
○ Advertisements on social media
○ People we elect for office (Cambridge Analytica)
○ How the economy is run (algorithmic trading, automated supply chains)
● Algorithmic Accountability
○ Who governs the algorithms that govern our lives?
■ Algorithmic systems are increasingly used as part of decision-making processes, with potentially significant consequences for individuals, organizations and societies as a whole
■ Increasing concerns that many of these systems are
opaque to the people affected by their use and lack
clear explanations for the decisions they make
■ This lack of transparency risks undermining meaningful scrutiny and accountability, which is a significant concern when these systems are applied in decision-making processes that can have a considerable impact on people’s human rights. Eg. safety-critical decisions in automotive systems
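As a minimal illustration of the definition above (my own sketch, not from the lecture): Euclid’s algorithm computes the greatest common divisor in a finite number of steps by repeating a single operation.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a step-by-step procedure that repeats one
    operation (replacing (a, b) with (b, a mod b)) until b is 0."""
    while b != 0:
        a, b = b, a % b  # the repeated operation
    return a

print(gcd(48, 18))  # -> 6
```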
Digitalization and the growing influence of social media are revolutionizing politics
● Eg. Donald Trump. Five years ago it would have been unimaginable that a US president would communicate threats against foreign countries on his own social media account
● Eg. In the UK, bots were used on Tinder to change people’s opinions about voting against Brexit
Article: When algorithms think you want to die
● Background: A British teenager sought images of self-harm online before committing suicide. It was later discovered that these images were also being delivered to her, recommended by social media platforms. In the face of intense criticism, Instagram has banned graphic self-harm images
○ Problem: Some platforms host the most troubling
content and recommend troubling content to the
most vulnerable
○ The problem lies deeper than just banning certain kinds
of content. Algorithms should be able to recognize the
following things:
■ When content is right for some and not for others
■ When finding what you searched for is not the
same as being invited to see more
■ When what is good for the individual may not be
good for the public as a whole
○ Platforms make problematic content easy to find
■ Algorithmic recommendation intends to give users a personalized and more enjoyable experience. With problematic content, however, the platform learns the user’s interests, recommends more of it, and sends the user further down the rabbit hole (see the sketch at the end of this article’s notes)
■ Social media companies need more in-house
knowledge about mental health, to better judge how to
handle content that is both objectionable and valuable
● Further difficulties
○ Content moderation is difficult
○ Bans are not only imperfect, they can be harmful in and of themselves
○ Recommendations have become the primary means for
social media to keep users on the site and clicking
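The rabbit-hole dynamic described above can be made concrete with a toy simulation (a hypothetical sketch, not any platform’s actual system): each click raises the weight of the clicked topic, so an engagement-optimized feed gradually concentrates on whatever the user engages with.

```python
import random
from collections import Counter

# Toy engagement-optimized recommender: topic names and weights are
# purely illustrative. Every click on a topic reinforces it, so the
# feed drifts toward that topic over time (the "rabbit hole").
topics = ["cooking", "sports", "troubling"]
weights = {t: 1.0 for t in topics}

def recommend() -> str:
    # Sample a topic in proportion to its learned weight.
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

def simulate(clicks_on: str, rounds: int = 2000) -> Counter:
    feed = Counter()
    for _ in range(rounds):
        item = recommend()
        feed[item] += 1
        if item == clicks_on:      # the user engages with one topic only,
            weights[item] *= 1.01  # and the system reinforces exactly that
    return feed

print(simulate("troubling"))  # the feed skews heavily toward the clicked topic
```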
Article: Filter Bubbles
● Political polarization (division into two sharply contrasting
groups or sets of political opinions or beliefs) long predates the
rise of social media
● Filter Bubbles: a term coined by internet activist Eli Pariser – a state of intellectual isolation that can result from personalized searches, when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history (see the sketch at the end of this article’s notes)
● Discusses recent research concerning the influence of filter bubbles
on political polarization
○ 2011: Pariser warned that personalized news feeds and
search results would undermine civic discourse by steering
people towards information that appeals to their
preconceptions
■ We would search for, like, and retweet the ideas we
already agreed with, and algorithms optimized for
engagement would serve us more of the same—
crowding out anything that might trouble our worldview.
○ 2016: Brexit and Trump election seemed to validate this theory
● Findings
○ Internet is not the primary driver of rising political polarization
■ Americans aged 18–39 were hardly more polarized in 2012 than they had been in 1996 (before social media existed)
■ Polarization has been driven primarily by the
demographic groups that spend the least time online
● Americans 75+ experienced by far the greatest
ideological divergence of any age group (only 20%
use social media)
■ In fact, people are more likely to encounter opposing views
in online media than in their day-to-day activities
○ Another study revealed that news feed algorithms do filter
ideologically “cross-cutting” news to some extent but less so
than users’ own choices of what to read.
● Limitations
○ Approach captures differences between age groups, but
may obscure trends within age groups along lines such as
party affiliation, income, education level
○ Data capture only the first few years of the algorithmic
personalization trend
● Yet, after 2020, it is hard to argue that social media filter bubbles are not a thing
○ People switched platforms because of misinformation and hate speech
○ COVID-19 content: surfacing of conspiracy theories about the virus that are reshared by people who believe them
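The “selective guessing” in the filter-bubble definition above can be sketched as ranking by similarity to past clicks (a hypothetical toy example; real feed-ranking systems are far more complex):

```python
# Toy personalization: rank candidate articles by tag overlap with the
# user's click history, so preconception-confirming items rise to the
# top and ideologically "cross-cutting" items sink. All data invented.
past_clicks = [
    {"tags": {"politics", "left"}},
    {"tags": {"politics", "left"}},
    {"tags": {"economy", "left"}},
]

candidates = [
    {"title": "Left-leaning op-ed",  "tags": {"politics", "left"}},
    {"title": "Right-leaning op-ed", "tags": {"politics", "right"}},
    {"title": "Neutral explainer",   "tags": {"economy"}},
]

def score(article: dict) -> int:
    # Total tag overlap with everything the user clicked before.
    return sum(len(article["tags"] & click["tags"]) for click in past_clicks)

for article in sorted(candidates, key=score, reverse=True):
    print(score(article), article["title"])
# 5 Left-leaning op-ed
# 2 Right-leaning op-ed
# 1 Neutral explainer
```

Note, consistent with the findings above, that the user’s own clicks supply most of the filtering signal here; the ranking only amplifies it.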
Internet of Things (IoT)
● A vision where low-cost sensors, processors, and communication
are embedded into a wide array of products and our environment,
allowing a vast network to collect data, analyze input, and
automatically coordinate collective action. Eg. smart lamps, watches,
thermostats, door locks, etc.
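A minimal sketch of that sense-analyze-act pattern, using a smart-thermostat control loop (the function names and the simulated sensor are illustrative, not a real device API):

```python
import random
import time

def read_temperature() -> float:
    # Stand-in for a real low-cost temperature sensor.
    return random.uniform(15.0, 25.0)

def set_heating(on: bool) -> None:
    # Stand-in for a real actuator, e.g. a smart-thermostat relay.
    print("heating", "ON" if on else "OFF")

TARGET_C = 20.0

# Control loop: collect data, analyze input, coordinate action.
for _ in range(5):
    temperature = read_temperature()
    set_heating(temperature < TARGET_C)  # heat only below the setpoint
    time.sleep(0.1)
```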
● Big Data
○ Massive data sets from which many organizations collect insights
○ Data analytics, business intelligence, and so-called machine learning are driving discovery and innovation, redefining modern marketing, and creating a shifting knife-edge of privacy concerns that can shred corporate reputations if mishandled
● In the previous decade, tech firms have created profound shifts in
the way firms advertise and individuals and organizations
communicate
○ New technologies turn sophisticated computing into a utility
available to even the smallest businesses and nonprofits.
○ New technologies have fueled globalization, redefined
concepts of software and computing, crushed costs,
fueled data-driven decision making, and raised privacy and security concerns
○ Content adjacency: the concern that an advertisement will run near offensive material, embarrassing the advertiser (see the sketch below)
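A toy brand-safety check for the content-adjacency concern (hypothetical keyword matching; production ad systems use far more sophisticated classifiers):

```python
# Refuse to place an ad on a page whose text contains flagged terms,
# so the ad never runs next to offensive material. Terms are examples.
FLAGGED_TERMS = {"graphic violence", "hate speech", "self-harm"}

def safe_to_place_ad(page_text: str) -> bool:
    text = page_text.lower()
    return not any(term in text for term in FLAGGED_TERMS)

pages = [
    "Ten easy weeknight recipes",
    "Forum thread full of hate speech",
]
for page in pages:
    print(safe_to_place_ad(page), "-", page)
# True - Ten easy weeknight recipes
# False - Forum thread full of hate speech
```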