Complete Lecture Notes ARMS with SPSS Output
This document covers all of the lectures and seminars given in the course, and for every statistical test the SPSS output is shown and interpreted.

  • 8 June 2022
  • 34 pages
  • 2020/2021
  • Lecture notes
  • I. Klugkist
  • All lectures
Author: morrisverholt
Advanced Research Methods & Statistics for Psychology:
Experimental Psychology
Psychology Year 2, UU
Lectures & Seminars
This file covers all of the lectures and seminars of Advanced Research Methods
and Statistics. For every statistical test, example output will be given and
interpreted, so that you are prepared as well as possible for the practical
exam of ARMS.
►Lecture 1: Multiple Linear Regression
Always try to look at the outcome of a study from the question "how did they arrive at
this?" Consider the following points within the structure of the study:
→Was a representative sample used?
→Were reliable measurement methods used to measure the variables?
→Is the analysis correct, and is the interpretation of the results correct?
Always look for other explanations for a statistical association:
→An association is not the same as a causal relationship.
→Does the effect remain if other variables are added?

■ Multiple Regression
Multiple Linear Regression is all about adding variables into your model. Simple Linear
Regression, on the other hand, involves a model with one outcome (Y) and one predictor (X).
MLR examines a model where multiple predictors are included to check their unique linear
effect on Y. It is also called an Additive linear Model because every new predictor adds to
the overall score in the model. The outcome in the model is the variable you want to predict,
and it depends on the X's in your model; this outcome is called the Dependent Variable.
The predictor is referred to as the Independent Variable in the model. The equation of a
simple linear regression is the following:

Yi = B0 + B1X1i + ei

By adding more predictors in your model, you get the equation that belongs to multiple linear:

Yi = B0 + B1X1i + B2X2i + … + ei

In both these equations there is a Slope for each independent variable and a starting point of
the function along the Y-axis, the Intercept or Constant. B0 stands for the intercept at the
front of the equation. Then there is an increasing index for every next predictor's slope
(B1X1i … B10X10i). The ei stands for the residual, the difference between the observed and
predicted score. Looking at the equation, you can distinguish a Predicted Score
(portrayed by Ŷi) and an Observed Score (portrayed by Yi):

Yi = Ŷi + ei

Ŷi = B0 + B1X1i + B2X2i … etc.

You can check the relevance of a predictor in two ways.
→ Firstly, you can look at R2, which is based on the squared sizes of the residuals. A
large R2 means that the model has small residuals and that a lot of the variation
in Y is explained.
→ You can also look at the slope of the regression line. The steeper the
regression line, the greater the effect the predictor has on the dependent variable (Y).
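Both checks can be illustrated with a small sketch. All data and variable names below are hypothetical, and the model is fitted with NumPy's least-squares routine rather than SPSS:

```python
import numpy as np

# Hypothetical data: two predictors and one outcome, generated so that the
# true model is Y = 2 + 0.1*age + 3*education + noise.
rng = np.random.default_rng(0)
n = 200
age = rng.uniform(20, 60, n)
education = rng.uniform(8, 18, n)
y = 2.0 + 0.1 * age + 3.0 * education + rng.normal(0.0, 2.0, n)

# Design matrix with a column of ones for the intercept (B0).
X = np.column_stack([np.ones(n), age, education])
coef = np.linalg.lstsq(X, y, rcond=None)[0]  # [B0, B1, B2]

y_hat = X @ coef               # predicted scores (Y-hat)
residuals = y - y_hat          # e_i = observed minus predicted
ss_res = np.sum(residuals**2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot  # large R2 = small residuals, much variation explained

print(coef)
print(r_squared)
```

The slope `coef[2]` recovers the true effect of education (about 3), and because the noise is small relative to the predictors' spread, R2 comes out high.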
▲ Types of Variables
Formally, there are four measurement levels: nominal, ordinal, interval, and ratio. For the
choice of analysis we distinguish the Categorical or Qualitative levels (nominal + ordinal)
and the Continuous, Quantitative, or Numerical levels (interval + ratio). MLR
requires a continuous outcome and continuous predictors, though categorical predictors can
be included as Dummy Variables. For that, you transform the data so that the groups are
coded numerically (e.g. male = 1, female = 0). Dummy variables take only two
values, a zero and a one. If a predictor has more than two groups, you make a table that
codes every group with ones and zeros, and you get an equation with a slope for
every separate dummy. For n categories you need n − 1 dummy variables.
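As a sketch of the n − 1 rule, a hypothetical three-group predictor needs two dummy variables; the reference group gets zeros on both:

```python
import numpy as np

# Hypothetical categorical predictor with three groups.
group = np.array(["low", "medium", "high", "medium", "low", "high"])

# Three categories -> 3 - 1 = 2 dummy variables; "low" is the reference group.
dummy_medium = (group == "medium").astype(int)
dummy_high = (group == "high").astype(int)

print(dummy_medium.tolist())  # -> [0, 1, 0, 1, 0, 0]
print(dummy_high.tolist())    # -> [0, 0, 1, 0, 0, 1]
```

A row with zeros on both dummies belongs to the reference group, so the model's intercept describes that group and each dummy slope describes the difference from it.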
▲ Types of MLR
You can have a scenario where you first want to test a model with two predictors of a construct
and thereafter want to study what effect adding additional predictors has on Y. This is an
example of a Hierarchical MLR. For the hypotheses about the two models you can check several
things.
→ For each model you can test whether the R2 differs significantly from 0, with the
alternative that it is greater than 0.
H0: R2 = 0
HA: R2 > 0
→ Check if the R2 significantly increases after adding the new predictors.
H0: R2-Change = 0 (the additional predictors do not improve the model)
HA: R2-Change > 0
→ Check for each predictor in each model if there is a unique effect.
H0: B1 = 0
HA: B1 ≠ 0
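The R2-Change test can be sketched like this. The data and coefficients are hypothetical, and the F-change statistic follows the standard formula for comparing nested OLS models, computed with NumPy rather than SPSS:

```python
import numpy as np

def r_squared(X, y):
    """Fit OLS and return the proportion of variance in y explained by X."""
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ coef
    return 1 - np.sum(resid**2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(1)
n = 150
x1, x2, x3 = rng.normal(size=(3, n))  # x3 is the predictor added in model 2
y = 1.0 + 0.8 * x1 + 0.5 * x2 + 0.7 * x3 + rng.normal(0.0, 1.0, n)

ones = np.ones(n)
r2_model1 = r_squared(np.column_stack([ones, x1, x2]), y)
r2_model2 = r_squared(np.column_stack([ones, x1, x2, x3]), y)
r2_change = r2_model2 - r2_model1  # H0: R2-Change = 0

# F-change for adding k_new predictors, with k_total predictors in model 2:
k_new, k_total = 1, 3
f_change = (r2_change / k_new) / ((1 - r2_model2) / (n - k_total - 1))
print(r2_model1, r2_model2, r2_change, f_change)
```

Because x3 has a real effect in the simulated data, the R2-Change is positive and the F-change is large; under H0 it would hover around values expected from an F distribution.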

■ Output

[SPSS output: Model Summary and Coefficients tables with the colour-coded values discussed below (not reproduced here)]
→ Blue/ R2: the proportion of variance explained by the model. The displayed
number is a proportion, so .135 means that 13.5% of the variance is explained by the
predictors.
→ Red/ R: the square root of R2, called the Multiple Correlation Coefficient.
It is the correlation between Y (observed) and Ŷ (predicted).
→ Orange/ Adjusted R2: somewhat smaller than the original R2. Your model and
results are computed over your sample, while you want to say something about the
population, and the sample R2 turns out not to be a great representation of the
population R2. The greater your number of predictors, the more optimistic the R2
gets compared to reality. To correct for this bias, we use the Adjusted R2.
→ Green/ R2-Change: compares the model to the previous one. The first
R2-Change is equal to the first R2 because there is no earlier model to compare it
to yet. The number under the next model is the R2 added on top of the previous model's R2.
Blue/ Constant: this is the intercept of the equation.
Yellow/ Slope: this is the slope of the equation. It says that if one predictor (for
example age) is the same for two hypothetical persons, then one year more of education means
an increase in life satisfaction of 3.035.
Pink/ Beta: the variables are all measured in different units (years, score, height); to
compensate for that we use standardized coefficients. The Beta tells us which variable has the
strongest contribution to the outcome: the value furthest away from 0 marks the
variable with the biggest contribution.
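The Beta can be sketched as the unstandardized slope rescaled by standard deviations. The data and variable names below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
education = rng.uniform(8, 18, n)    # measured in years
test_score = rng.uniform(0, 100, n)  # measured on a 0-100 scale
y = 5.0 + 2.0 * education + 0.1 * test_score + rng.normal(0.0, 3.0, n)

X = np.column_stack([np.ones(n), education, test_score])
b = np.linalg.lstsq(X, y, rcond=None)[0]  # unstandardized slopes (B)

# Beta_j = B_j * sd(X_j) / sd(Y): expresses every slope in standard
# deviations, so slopes measured in different units become comparable.
beta_education = b[1] * education.std() / y.std()
beta_score = b[2] * test_score.std() / y.std()
print(beta_education, beta_score)
```

The raw B for test_score (0.1) looks tiny next to the B for education (2.0) only because of the units; the Betas put both on the same scale, and here education still contributes more strongly.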

►Lecture 2: Advanced Multiple Regression: Moderation &
Mediation
■ Moderation
Moderation is the following: the effect of predictor X1 on outcome Y is different for different
levels of a second predictor X2. For example, the effect of X1 is different for males and
females (X2). This is not the same as an additive effect. Here we speak of an Interaction or
Moderation Effect.
There are two ways to look at a moderation effect. The first path model is more theoretical;
the second comes from a more statistical point of view. On the left we see the theoretical
model, and on the right the statistical representation.

[Figure: theoretical (left) and statistical (right) path model of moderation]

Both models represent the same idea. There is a second predictor in the model that influences
the existing effect between X and Y. Here, gender affects the relation between X and Y. So,
gender is a moderator for the relation between X and Y. In the statistical (right) model we see
what we must do to be able to make conclusions. We separate X1 as a predictor, gender as a
predictor and the interaction effect between gender and X1 as a standalone predictor for Y.
The general equation that comes with this model is as follows.

Ŷi = B0 + B1X1i + B2Genderi + B3X1iGenderi
In this equation B2 is the slope for gender. Gender is a dummy variable with values 1
or 0, so the equation becomes much smaller for the group coded 0, because its slopes
are multiplied by 0:

Ŷi = B0 + B1X1i + B2Genderi + B3X1iGenderi
with Genderi = 0: B2 * 0 = 0 and B3X1i * 0 = 0

As previously stated, the equation for the group coded 0, in contrast to the group coded 1,
becomes much smaller because of the zeros:

Ŷi = B0 + B1X1i
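A sketch of the same idea with simulated data (hypothetical coefficients; gender coded 0/1, and the interaction entered as a standalone product term):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
x1 = rng.normal(size=n)
gender = rng.integers(0, 2, n)  # dummy variable: 0 or 1
# True model: the slope of x1 is 0.5 for gender = 0 and 0.5 + 1.0 for gender = 1.
y = 1.0 + 0.5 * x1 + 0.3 * gender + 1.0 * x1 * gender + rng.normal(0.0, 0.5, n)

# Statistical model: x1, gender, and their product as separate predictors.
X = np.column_stack([np.ones(n), x1, gender, x1 * gender])
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]

# For the group coded 0 the gender terms drop out: Y-hat = B0 + B1*X1.
# For the group coded 1 the slope of x1 becomes B1 + B3.
print(b1, b1 + b3)
```

A significant B3 is exactly the moderation effect: the x1 slope differs between the two gender groups, which is what the interaction predictor captures.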
