The Limits of AI
Parissa Akefi
Abstract
The growing popularity of artificial intelligence (AI), and the use of AI-related products, has raised questions about the trustworthiness and the limits of these systems. In this essay I try to outline some of these limits, e.g., biases in the data used by algorithms, but also privacy concerns and the control we have over AI.

Introduction
Artificial Intelligence (AI) has become extremely popular. Companies work with AI for various goals and reasons, consumers use products with intelligent systems, and the AI degree programme has piqued the interest of prospective students so much that, in 2018, the Dutch news platform RTL nieuws published an article about a temporary enrolment stop for the programme. AI-related technology has come so far that there is now emotion-, speech- and face-recognition software. With all the media coverage, new technologies, AI start-ups and AI products, AI is almost unavoidable. Given the impact that AI products have on society, it is important to keep in mind that AI has its limits. In this essay I will try to outline some of these limitations using real-life examples.
1. Biases
Companies like Google, Uber and Tesla are working on self-driving cars: cars without a driver that take you to your destination, deliver your groceries, or ship your products if you run a company. AI technology has come a long way in automating processes, making life easier and much more comfortable for humans. This development, together with other developments such as an AI system beating humans at IQ tests, has been a success. However, many AI systems have limitations, one major limitation being the bias that intelligent systems and algorithms have. Algorithms, which are sets of instructions used by intelligent systems, are written by humans. Humans have certain biases that they might not be aware of, so you could reason that these biases are unconsciously implemented in algorithms and therefore bias the decisions that intelligent systems make (Dignam, 2019).

An example of this is selecting someone for a job interview. In 2014, e-commerce company Amazon started building a recruiting tool to review job applicants' resumes. The company expected that the tool would be given data as input, for example 100 resumes, and that it would output the top 5. However, a year later, Amazon had an 'oops' moment and had to scrap the tool after realizing that the system was not rating candidates in a gender-neutral way: men were preferred over women (BBC, 2018). According to Vallverdù et al. (2019), this is due to the fact that these programs are mostly written and implemented by men.

Gender bias, however, is not the only type of bias that AI can suffer from. According to IBM (2020), more than 180 human biases have been defined and classified. Another example of bias in AI comes from AI systems that use internet data. Tay, a teenage chatbot created by Microsoft, used internet data to mimic teenagers on the social media platform Twitter. Tay began her first tweet with an innocent "Hellooo World!", a playful take on the "Hello, World!" phrase familiar from programming. Later, however, her tweets became racist and offensive, ranging from praising Adolf Hitler to promoting the mass murder of certain races (Parker, 2017).
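Both the Amazon and the Tay case show bias coming from the data a system learns from, rather than from any explicitly written rule. As a purely hypothetical sketch (not the actual Amazon or Microsoft systems; all resumes, words and outcomes below are invented), the toy scorer below shows how a model trained on historically skewed hiring decisions reproduces that skew for new candidates:

```python
# Hypothetical illustration of data-driven bias: a toy resume scorer that
# learns from past (skewed) hiring decisions instead of any explicit rule.
# All resumes and outcomes are invented for the example.
from collections import Counter

past_decisions = [
    ("captain of chess club, men's soccer team", True),     # hired
    ("software internship, men's debate society", True),    # hired
    ("software internship, women's chess club", False),     # rejected
    ("hackathon winner, women's coding society", False),    # rejected
]

hired_words, rejected_words = Counter(), Counter()
for text, hired in past_decisions:
    counter = hired_words if hired else rejected_words
    counter.update(text.replace(",", "").split())

def score(resume: str) -> int:
    """Score = how often the resume's words appeared in hired resumes,
    minus how often they appeared in rejected ones."""
    words = resume.replace(",", "").split()
    return sum(hired_words[w] - rejected_words[w] for w in words)

# Two equally strong candidates; only one gendered word differs.
print(score("hackathon winner, men's chess club"))    # 0  (ranked higher)
print(score("hackathon winner, women's chess club"))  # -4 (ranked lower)
```

The scorer never sees the word 'gender', yet the gendered word it picked up from the skewed history is enough to push one candidate below the other.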
2. AI Can't Replace Humans
Society tends to overestimate the benefits artificial intelligence offers when it comes to humans and human replacement. For example, when building robots, we give them human-like characteristics. The term anthropomorphism fits well in this context: it means attributing human traits, emotions and intentions to other entities. As Vallverdù and colleagues (2019) pointed out, labelling an if-then instruction in our code as 'fear' is fallacious and not functionally precise, because by doing so we somehow imply that the robot feels fear, anthropomorphizing its language.
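To make that point concrete, here is a small hypothetical sketch (not code from any real robot; the function names and the threshold are invented): both functions implement exactly the same if-then rule, and only the first name suggests that the robot 'feels' anything.

```python
# Hypothetical obstacle-avoidance rule for a simple robot.
# Both functions encode the identical if-then instruction; the first name
# anthropomorphizes it by suggesting an emotion, while the second describes
# only what the rule functionally does.
SAFE_DISTANCE_M = 0.5  # invented threshold for the example

def feel_fear(distance_to_obstacle_m: float) -> str:
    """Anthropomorphic label: implies the robot experiences fear."""
    if distance_to_obstacle_m < SAFE_DISTANCE_M:
        return "retreat"
    return "continue"

def retreat_if_obstacle_too_close(distance_to_obstacle_m: float) -> str:
    """Functionally precise label: states the condition and the action."""
    if distance_to_obstacle_m < SAFE_DISTANCE_M:
        return "retreat"
    return "continue"

# Identical behaviour, different framing.
assert feel_fear(0.3) == retreat_if_obstacle_too_close(0.3) == "retreat"
```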
But this goes further than just writing the code. When AI products are brought to market, the goal of some of these products is to replace a human aspect that the buyer is missing. For example, Japan is currently the world leader in the use of sex robots (Morsünbül, 2018). Dating, or having some other kind of relationship, with virtual girlfriends or holograms has also become normal (Vallverdù, 2019). It is good to