Sometimes relationships can be described via a third variable Z:
Moderator = different relationship for different strengths of Z (Figure 1: Moderation)
Mediator = relationship between X and Y is explained by/through Z (Figure 2: Mediation)
In the first lecture, moderation was explained in more detail. A moderator is a variable that modifies the relationship between two other variables and is essentially conceptualized as an interaction effect. Four cases describing different measurement levels of the involved variables were discussed.
CASE 1 - MODERATION IN REGRESSION: VARIABLES X AND Z ARE BOTH INTERVAL
Figure 3: Main Effect Only Model (left) vs Main Effect + Interaction Model (right)
To assess a moderation model using regression, an interaction term needs to be added to the main-effect-only regression model, i.e., adding b3XZ so that Y = b0 + b1X + b2Z + b3XZ. The main-effect-only model does not allow for interaction: although the constants are different, the lines have the same regression weight, leading to parallel regression lines.
Adding this interaction term (i.e., the product term XZ) allows for different regression weights and non-parallel regression lines; thus, it allows for interaction. Regression lines should only be drawn for Z-values chosen a priori or for conventional Z-values (Mz - SDz, Mz, Mz + SDz).
An interaction via XZ is called a linear-by-linear interaction: it describes a linear relationship between the third variable Z and the regression weight of the X-Y relationship. Importantly, the XZ product is not the interaction on its own; it only becomes the interaction when all relevant lower-order effects (i.e., X and Z) are in the model as well (lecture 1, slide 11).
It is advised to center both X and Z to prevent multicollinearity (XZ is generally assumed to correlate with X and Z, which needs to be taken into account in the model) and to ease the interpretation of the main effects: after centering, b1 can be read as the average effect of X on Y and b2 as the average effect of Z on Y.
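As a minimal sketch of how the centering and the product term could be computed (the data file and the column names x, z, and y are hypothetical, not from the course material):

```python
# Hypothetical data frame with predictor x, moderator z, and outcome y.
import pandas as pd

df = pd.read_csv("moderation_data.csv")   # assumed data file

df["x_c"] = df["x"] - df["x"].mean()      # center X at its sample mean
df["z_c"] = df["z"] - df["z"].mean()      # center Z at its sample mean
df["xz_c"] = df["x_c"] * df["z_c"]        # product term built from the centered scores
```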
This was also covered in the first practical assignment, where centered and uncentered analyses were compared (see Table 1, copied from the practical 1 solutions).
Centering affects the values of the constant and the regression weights because they now have a different meaning. In general, a main-effect regression weight describes the relationship between Y and one predictor (X or Z) when the other predictor equals 0. If the predictors are centered, 0 represents the average; if they are uncentered, 0 represents the raw score zero (which is often a value that does not occur in the observed range of values).
Centering should not have an impact on the interaction term, because the shift in both predictors is taken into account.
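To see that centering leaves b3 untouched, here is a small illustrative sketch with simulated (made-up) data; it fits the same interaction model on raw and on centered predictors and compares the weights:

```python
# Sketch with simulated data: the interaction weight b3 is the same with and
# without centering; only the constant and the main-effect weights change.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
x = rng.normal(50, 10, n)
z = rng.normal(30, 8, n)
y = 2 + 0.4 * x - 0.3 * z - 0.02 * x * z + rng.normal(0, 5, n)

def fit(xv, zv):
    X = sm.add_constant(np.column_stack([xv, zv, xv * zv]))
    return sm.OLS(y, X).fit()

raw = fit(x, z)
cen = fit(x - x.mean(), z - z.mean())
print(raw.params[3], cen.params[3])   # b3: identical in both analyses
print(raw.params[1], cen.params[1])   # b1 differs: effect of X at Z = 0 vs. at the average Z
```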
When looking at the three regression lines for the three values of the moderator, the overall direction of the interaction effect is the same (the regression weights are identical in the centered and uncentered analyses); only the constants differ (see above).
TESTING THE MODERATION MODEL
The moderation model is tested hierarchically:
Model 1: X, Z → Y (main effects only)
Model 2: X, Z, XZ → Y (adds the interaction term b3XZ)
If b3 (the regression weight for XZ in Model 2) is not significant, we go back to the simpler Model 1, which includes the main effects only.
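A hedged sketch of this hierarchical test in Python (assuming the centered columns x_c, z_c, xz_c and outcome y from the sketch above; statsmodels is used here in place of SPSS):

```python
# Hierarchical moderation test: main effects first, then add the interaction.
import statsmodels.formula.api as smf

m1 = smf.ols("y ~ x_c + z_c", data=df).fit()          # Model 1: main effects only
m2 = smf.ols("y ~ x_c + z_c + xz_c", data=df).fit()   # Model 2: adds b3 * XZ

f_val, p_val, df_diff = m2.compare_f_test(m1)         # F-test for the R^2 change
print(m1.rsquared, m2.rsquared, f_val, p_val)
# If the xz_c weight (b3) in m2 is not significant, fall back to Model 1.
```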
SPSS EXAMPLE FOR CASE 1 (LECTURE 1, 14 - 17)
Variables:
Work stress (IV)
Social support at work (M)
Depressed mood (DV)
RQ: Does social support buffer the effect of work stress on depression?
Steps in the Analysis:
1. Compute means and standard deviation for X and Z (using descriptives from a regression analysis)
2. Centre X and Z, and compute the interaction term from the centered scores
3. Hierarchical regression analysis with the centered main effects (X and Z) in model 1 and the interaction term added in model 2.
Interpretation:
1. Check whether both the main effect model and the
main effect + interaction effect model are significant.
Look at the significance level of the F-test
(not depicted here)
Look at R² and the R² change to assess how much explained variance the interaction effect adds to the model: here the main-effects model (work stress + social support) explains 15.5% of the variance and the model that also includes the interaction explains 20.5%. The interaction effect thus adds 5% explained variance, which is significant, F(1, 76) = 4.82, p = .031.
2. Check the b values (not the betas) and the corresponding p-values to assess the individual contributions of the predictors. In the example, the first analysis (main effects only) shows that work stress is positively related to depressive mood (though not significantly) and social support is negatively related to depressive mood (significantly). The second analysis, in which the interaction term is added, shows that work stress is now significant and social support stays significant. Additionally, the regression weight of the interaction effect is negative and significant. This negative interaction weight means that the positive stress-depression relationship becomes weaker with higher levels of social support.
3. Compute stress-depression regression lines for three levels of support: average (Z = 0), one SD below average (Z = -7.81), and one SD above average (Z = 7.81).
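As a sketch of this last step (the coefficients come from the fitted interaction model m2 above; the value 7.81 is the support SD reported in the example):

```python
# Conditional (simple) regression lines of depressed mood on work stress
# at three support levels; b0..b3 are taken from the fitted Model 2 (m2).
b0, b1, b2, b3 = m2.params            # intercept, stress (x_c), support (z_c), product (xz_c)
sd_z = 7.81                           # SD of centered social support from the example

for z_val in (-sd_z, 0.0, sd_z):      # one SD below average, average, one SD above
    intercept = b0 + b2 * z_val       # constant of the line at this support level
    slope = b1 + b3 * z_val           # stress slope at this support level
    print(f"Z = {z_val:6.2f}: Y-hat = {intercept:.2f} + {slope:.2f} * stress_c")
```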
CASE 2 & 3 – MODERATION IN ANCOVA: VARIABLE X IS NOMINAL, Z IS INTERVAL, OR VARIABLE X IS INTERVAL, Z IS NOMINAL
Since interaction is symmetric, cases 2 and 3 follow the same logic (lecture 1, slide 18). Case 2 conceptually describes differences between groups (defined by X) on Y that depend on the value of the moderator (Z), whereas case 3 conceptually describes differences in the regression slopes of Y on X between the groups (defined by Z). Normally a significant interaction between factor and covariate means a violation of the parallelism (homogeneity of regression slopes) assumption; for the moderation analysis, however, we use this to our advantage. Again, we build the model hierarchically (see the sketch after this list):
- Step 1: standard ANCOVA with only main effects
- Step 2: add the covariate × factor interaction (the main effects still need to be included in the model, but their tests are ignored)
o if the interaction effect is non-significant, return to model 1
o if the interaction effect is significant, compute simple regression lines for Y on Z, i.e., compute/draw the relationship for each group
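A minimal Python sketch of this two-step ANCOVA (the data file and the column names y, group, and z are hypothetical; statsmodels stands in for SPSS):

```python
# Step 1: standard ANCOVA with main effects only; Step 2: add factor x covariate.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("ancova_data.csv")   # assumed data with columns y, group, z

step1 = smf.ols("y ~ C(group) + z", data=df).fit()
step2 = smf.ols("y ~ C(group) + z + C(group):z", data=df).fit()

print(anova_lm(step1, step2))         # nested F-test for the interaction term
# If the interaction is significant, compute the Y-on-Z regression line per group.
```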
SPSS EXAMPLE FOR CASE 2 (& 3; PRACTICAL 1 SOLUTION, PP. 4-5)
Variables: duration (of friendship; M), absence (not seen in months, IV), fondness (DV)
RQ: "Out of sight, out of mind", or "Absence makes the heart grow fonder"?
Steps in Analysis:
1. Run an ANCOVA with main effects only, i.e., fondness as DV, duration as factor (M), and absence as covariate (IV)
2. Repeat the ANCOVA and add an interaction term of covariate x factor
3. Compute the regression lines for fondness on absence for the two duration groups (using split file by duration and running the regression analysis)
4. Plot regression lines
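A sketch of steps 3 and 4 (the split-file-plus-regression step) in Python, assuming a data frame df with columns duration, absence, and fondness:

```python
# One fondness-on-absence regression per duration group (split-file equivalent).
import statsmodels.formula.api as smf

for group, grp in df.groupby("duration"):
    fit = smf.ols("fondness ~ absence", data=grp).fit()
    print(group, fit.params["Intercept"], fit.params["absence"])  # one line per group
```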
Interpretation:
1. ANCOVA without interaction term: the model is non-significant (corrected model). We can also see that both duration and absence are non-significant. The proportion of variance explained is .015 (R², or computed as SS(corrected model) / SS(total)).
2. ANCOVA with interaction term: the interaction (covariate × factor) is significant and there is an increase in the proportion of variance explained (R² = .287). Since the interaction term is included, we ignore the tests of the main effects.
3. Separate regression lines: looking at the separate regression lines, we see a difference between "short" and "long" friendships: in short friendships fondness increases with longer absence, whereas in long friendships fondness decreases.
4. Regression line plot: the same pattern is depicted in the graphical representation.
CASE 4 – MODERATION IN ANOVA: VARIABLES X AND Z ARE BOTH NOMINAL
To test moderation in ANOVA, you test the X × Z interaction effect in a standard (two-way) ANOVA, i.e., whether the group differences on Y are different for each level of the moderator (lecture 1, slide 21).
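A hedged sketch of this case (the nominal columns x and z and outcome y are hypothetical); the interaction row of the ANOVA table carries the moderation effect:

```python
# Two-way ANOVA: the X x Z interaction term tests moderation when both are nominal.
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

model = smf.ols("y ~ C(x) * C(z)", data=df).fit()
print(anova_lm(model, typ=2))   # the C(x):C(z) row is the moderation (interaction) test
```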
PROBLEMS & LIMITATIONS
There are some problems and limitations of the methods discussed in the first lecture (these also generalize to the other lectures).
1. The statistical power for testing interaction effects (i.e., moderation and mediation) is relatively low, which calls for relatively large samples (but what counts as large is hard to define). A lack of power is a plausible explanation for the inconsistent results often found in the literature.
2. Finding an interaction effect is only part of the story. If you want to explain and test the meaning of the interaction effect, more steps are involved (e.g., post-hoc probing).
3. The lecture only discussed the most basic models; these can be extended to nonlinear and higher-order moderation and to other cases, including binary or categorical dependent variables (lecture 1, slide 22).