Overview of Key Concepts
Theme 1 Overview
21st century skills
Cook, 2016, CH3
• Rapid change implies less need for task-specific skills and more need for general abilities: to adapt, solve problems, define one’s own direction, and work in teams
Three trends lead to a reduction of organisational routines & practices and an increase in non-routine and interactive tasks:
• skill-based technological change
• job polarisation
• offshoring
Two 21st century skills
• complex problem solving
• collaborative problem solving
- both emphasise the importance of adapting to new situations and problems for which no routine solution is readily available
- characteristic features of complex and collaborative problems can be systematically varied in assessment, building on the construct in focus and following theoretically defined and empirically validated dimensions
Complex problem solving
• addresses individuals’ transversal skill in successfully dealing with dynamic decision-making: handling complex, dynamic and intransparent situations (i.e., those without a readily apparent solution), requiring the active acquisition and application of knowledge in various domains
• characteristics of problems targeted in CPS are:
• complexity of the problem structure (i.e., a multitude of inter-related elements)
• the dynamics of the system (i.e., changes due to time or to interacting with the problem)
• interconnectedness of elements (i.e., a change in one part of the system has repercussions in other parts)
• multiple goals requiring simultaneous consideration, and the intransparency of the problem situation requiring active investigation
Collaborative problem solving:
• addresses problem solving in group settings
• regards the assessment of similar skills as complex problem solving, but in interactive settings
• requires sharing understanding and effort to reach a solution, and pooling knowledge, skills and efforts
Psychological prediction vs Machine Learning
Liem et al., 2018
please do not share without permission PSN Exam Guide
Psychology vs Machine Learning
1. Psychology: the focus is on understanding data. Machine learning: data is used to verify that a robust model has been trained.
2. Psychology: essential is the human-interpretable meaning of x and y and the link to theory. Machine learning: the ML expert is mostly interested in understanding & improving the learning procedure; the origins of x and y are irrelevant in machine learning research.
3. Psychology: linear regression models are commonly chosen for f(x), and not typically contrasted with alternative models. Machine learning: large flexibility in choosing the model f(x); the focus is on identifying the model that obtains the most accurate predictions (often not a linear model).
4. Psychology: little attention is paid to criterion validity, considering the alignment of y with what is supposed to be measured. Machine learning: more attention is paid to criterion validity, even though certain types of measures tend to dominate.
5. Psychology: uses exploratory factor analysis to examine relationships among data. Machine learning: uses unsupervised learning techniques to examine the relations (in terms of the original input dimensions).
Correction for attenuation
• this problem relates to the unreliability of your scales (e.g. when you don’t get the same scores when assessing answers in an interview)
• attenuation: the predictive validity weakens because of unreliability
• Marise shows a formula that can be used to reach the predictive validity coefficient that corrects for an unreliable predictor and criterion
• Marise: many people say that we should not use this formula, because only in paradise are scales completely reliable. But in everyday life, things are unreliable
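The notes mention the formula without reproducing it; the standard form here is Spearman’s correction for attenuation, r_true = r_xy / sqrt(r_xx · r_yy), where r_xx and r_yy are the reliabilities of the predictor and criterion. A minimal Python sketch (function name and example values are mine, not from the lecture):

```python
import math

def correct_for_attenuation(r_xy, r_xx, r_yy):
    """Spearman's correction for attenuation.

    r_xy: observed predictor-criterion correlation
    r_xx: reliability of the predictor
    r_yy: reliability of the criterion
    Returns the estimated correlation between the true scores.
    """
    return r_xy / math.sqrt(r_xx * r_yy)

# An observed validity of .30 with predictor reliability .80 and
# criterion reliability .60 corresponds to an estimated true validity of:
print(round(correct_for_attenuation(0.30, 0.80, 0.60), 3))  # 0.433
```

This illustrates Marise’s caveat: the correction tells you what validity would be with perfectly reliable measures, which real-world scales never are.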
Theme 2 Overview
Fairness and bias
• Fairness: a social concept, without a single meaning/definition
• Bias: systematic error in a test score that differentially affects the performance of different groups of test takers
Four views on fairness
• 1) fairness as requiring equal group outcomes
• 2) fairness in terms of equitable treatment
• 3) fairness in terms of comparable access to the constructs measured in a selection procedure
• 4) fairness as a lack of bias
Four views on fairness, elaborated
• fairness as requiring equal group outcomes: equal passing rates for subgroups of interest. Some reject this view and argue that outcome differences in themselves do not indicate bias, although one should be careful that group differences are not caused by certain sources of bias
• fairness in terms of equitable treatment: of all examinees during the selection process
• fairness in terms of comparable access to the constructs measured by a selection procedure: whether factors such as age, race, ethnicity, gender, socioeconomic status, cultural background, disability, and language proficiency restrict accessibility and affect measurement of the construct of interest
• fairness as a lack of bias: subgroup differences in regression slopes or intercepts may signal predictive bias
Two types of bias
• predictive bias: the effect of irrelevant sources of variance on predictor–criterion relationships, such that the slope or intercept of the regression line relating the predictor to the criterion differs for one group versus another
• differential prediction (an intercept or slope difference) should therefore not be confused with predictive bias (there can be differential prediction without bias)
• measurement bias: the effect of irrelevant sources of variance on scores on a given variable
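Differential prediction can be made concrete by fitting the predictor–criterion regression separately per group and comparing slopes and intercepts. A toy numpy sketch (simulated data, not from the course; the data are built with an intercept difference only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the same true slope in both groups, but group B's
# criterion scores are shifted upward (an intercept difference).
x_a = rng.normal(0, 1, 200)
y_a = 0.5 * x_a + rng.normal(0, 1, 200)
x_b = rng.normal(0, 1, 200)
y_b = 0.5 * x_b + 1.0 + rng.normal(0, 1, 200)

# Ordinary least squares per group: polyfit returns (slope, intercept).
slope_a, intercept_a = np.polyfit(x_a, y_a, 1)
slope_b, intercept_b = np.polyfit(x_b, y_b, 1)

print(f"group A: slope={slope_a:.2f}, intercept={intercept_a:.2f}")
print(f"group B: slope={slope_b:.2f}, intercept={intercept_b:.2f}")
```

Similar slopes with clearly different intercepts indicate intercept-based differential prediction; per the note above, whether that constitutes predictive *bias* depends on whether irrelevant sources of variance cause the difference.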
Validity
Personal remark: read types of validity in the APA (excellent explanation!)
• (Marise) Rational validity: whether a human being is able to subjectively estimate the (predictive) value of a predictor (a predictor that is established in a statistical way) > is one able to estimate a test’s predictive validity?
Differential Validity
• Differential validity: when one group performs better on a test than another
• Cook uses this definition to refer to the slope and intercept
Difference between differential validity and adverse impact
• Adverse impact: a group (e.g. men) scores higher on a test (e.g. physical strength)
• Differential validity: strength predicts work performance for men, but not for women, or not as well for women
• Alec: differential validity has to do with an invalid test procedure, whereas adverse impact refers to the impact/outcome of the decision procedure (the acceptance of people). Hence, there can be adverse impact even when there is no differential validity
• an example: with adverse impact, two groups can be on the same regression line but have different scores, whereas with differential validity there is a difference in terms of slope/intercept
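Alec’s example — adverse impact without differential validity — can be simulated: both groups share one regression line, but one group scores lower on the predictor, so a fixed cutoff rejects more of them. A toy numpy sketch (simulated data; the four-fifths rule of thumb is a common benchmark from US selection practice, not from these notes):

```python
import numpy as np

rng = np.random.default_rng(1)

# One shared regression line (no differential validity), but group A
# scores higher on the predictor on average, so a fixed cutoff
# accepts more of group A (adverse impact against group B).
x_a = rng.normal(1.0, 1.0, 1000)                  # group A predictor scores
x_b = rng.normal(0.0, 1.0, 1000)                  # group B predictor scores
y_a = 0.5 * x_a + rng.normal(0, 1, 1000)          # same slope and
y_b = 0.5 * x_b + rng.normal(0, 1, 1000)          # intercept for both

cutoff = 1.0
pass_rate_a = np.mean(x_a >= cutoff)
pass_rate_b = np.mean(x_b >= cutoff)
print(f"pass rate A: {pass_rate_a:.2f}, pass rate B: {pass_rate_b:.2f}")

# Four-fifths rule of thumb: adverse impact is flagged when one group's
# pass rate falls below 80% of the other group's pass rate.
print("adverse impact:", pass_rate_b / pass_rate_a < 0.8)
```

The test is perfectly valid for both groups here, yet the selection outcome still disadvantages group B, which is exactly the distinction the notes draw.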
Diversity-validity dilemma
Ployhart and Holtz, 2008
• diversity-validity dilemma: some of the most valid predictors of job performance are also associated with large differences in predictor scores among racio-ethnic and sex subgroups
Five categories for strategies
• Category 1: strategies use predictors that have smaller subgroup differences than overall cognitive ability
• Category 2: strategies combine or manipulate scores to lower subgroup differences
• Category 3: strategies attempt to remove construct-irrelevant variance from predictor scores
• Category 4: strategies allow practice prior to testing, or retesting if the applicant is rejected
• Category 5: strategies attempt to foster favourable applicant reactions to assist in recruiting and performance in the selection system
Example Strategies in Categories – Overview
• Strategy in Category 1: use educational attainment or GPA as a proxy for cognitive ability
• Strategy in Category 2: explicit predictor weighting
• Strategy in Category 3: minimise verbal ability requirements to the extent supported by job analysis
• Strategy in Category 4: retesting
• Strategy in Category 5: increasing and retaining racio-ethnic minority and female applicants
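The Category 2 strategy (explicit predictor weighting) can be illustrated numerically: down-weighting the predictor with the large subgroup difference shrinks the subgroup difference of the composite. A toy numpy sketch (simulated scores with illustrative effect sizes, not values from Ployhart and Holtz):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Toy scores for two subgroups: a cognitive predictor with a large
# subgroup mean difference (d = 1.0) and a noncognitive predictor
# with no subgroup difference.
cog_a, cog_b = rng.normal(0.0, 1, n), rng.normal(-1.0, 1, n)
non_a, non_b = rng.normal(0.0, 1, n), rng.normal(0.0, 1, n)

def subgroup_d(w_cog):
    """Standardised subgroup difference (Cohen's d) of the composite."""
    comp_a = w_cog * cog_a + (1 - w_cog) * non_a
    comp_b = w_cog * cog_b + (1 - w_cog) * non_b
    pooled_sd = np.sqrt((comp_a.var() + comp_b.var()) / 2)
    return (comp_a.mean() - comp_b.mean()) / pooled_sd

for w in (1.0, 0.5, 0.25):
    print(f"cognitive weight {w}: d = {subgroup_d(w):.2f}")
# Lowering the weight of the high-difference predictor shrinks the
# composite subgroup difference, which is the point of the strategy.
```

The dilemma remains, of course: if the down-weighted predictor is also the most valid one, the composite’s validity drops along with the subgroup difference.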