This two-day training event aimed to help researchers affiliated with the CLOSER partner studies learn and practice new skills for comparative research (between studies and across time periods) using cohort and panel data.
Overview and aims
We hosted a training event for research staff in the CLOSER partner studies on measurement equivalence testing using structural equation modelling/confirmatory factor analysis. The training covered methods for examining and validating the comparability of assessment instruments, and the data they generate, between studies and across time periods. The event was led by Dr. Daniel Seddig, interim Professor of Empirical Social Research at the University of Passau.
The aim of the workshop was to help build capacity amongst researchers affiliated with the CLOSER partner studies to pursue new avenues of comparative research. It comprised two sessions to meet the needs of both newcomers and more advanced analysts.
About the workshop
Following the outbreak of COVID-19, CLOSER’s partner studies responded rapidly by implementing pandemic-relevant survey data collections, often in a methodologically aligned manner, and this has brought increased attention to the novel research utility these studies offer. This shift to aligned questionnaire instruments, albeit not universal and potentially temporary, facilitates data pooling and coordinated cross-study analysis. However, it stands in contrast to the heterogeneity of measurement strategies implemented across studies historically, and it also has consequences for backwards comparability within studies. Validating the equivalence of survey measurements is therefore a vital step in realising the research potential of CLOSER’s partner studies, not just for issues relevant to the pandemic (e.g. its mental health impact), but for comparative research across all disciplines.
CLOSER’s cross-study harmonisation work to date has similarly highlighted that, even where assessment tools administered in different studies or at different times appear to measure similar characteristics of participants, there can be many sources of divergence that affect the validity of any comparison or pooling of data. Instruments may use different phrasing, response scales, or even respondents (e.g. observer-rating versus self-report). Evaluating the impact of such heterogeneity is a key step in data harmonisation and in cross-study work more generally.
Outline of the agenda
The training comprised two half-day workshops:
- 1 March 2022, from 14:00 GMT: The first afternoon consisted of an introductory session on measurement equivalence testing in longitudinal research. This comprised the following:
14:00-15:30 Comparability of measurements across groups/time: multi-group confirmatory factor analysis and testing measurement invariance
15:45-17:15 Illustrations in R
- 2 March 2022, from 14:00 GMT: The second afternoon was more advanced, addressing more complex and novel analytic issues. It comprised the following:
14:00-15:30 What if strict measurement invariance is not given?
15:45-17:15 Is there something we can learn from non-invariance?
17:30-18:00 Summary and closing discussion
These workshops combined presentations with interactive elements, including hands-on analytic tasks with guidance, feedback and discussion.
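For readers unfamiliar with the approach covered in the first session, the typical workflow can be sketched in R using the lavaan package. Note that this is an illustrative sketch, not the workshop's own materials: the dataset `dat`, the items `x1`–`x3`, the latent factor name, and the grouping variable `study` are all hypothetical.

```r
# Illustrative multi-group CFA and measurement invariance testing with lavaan.
# Assumes a data frame `dat` with items x1-x3 and a grouping variable `study`.
library(lavaan)

# One latent trait measured by three observed items
model <- '
  wellbeing =~ x1 + x2 + x3
'

# Configural model: same factor structure, all parameters free across groups
fit_configural <- cfa(model, data = dat, group = "study")

# Metric (weak) invariance: factor loadings constrained equal across groups
fit_metric <- cfa(model, data = dat, group = "study",
                  group.equal = "loadings")

# Scalar (strong) invariance: loadings and intercepts constrained equal,
# a prerequisite for comparing latent means across groups
fit_scalar <- cfa(model, data = dat, group = "study",
                  group.equal = c("loadings", "intercepts"))

# Compare the nested models; a significant chi-square difference suggests
# the added equality constraints do not hold (non-invariance)
lavTestLRT(fit_configural, fit_metric, fit_scalar)
```

The same sequence applies across time points within a study by fitting the model to repeated measurements rather than groups, which is the longitudinal case the training addressed.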
About the session lead
Dr. Seddig is currently interim Professor of Empirical Social Research at the University of Passau (staff profile / publications). He has designed and delivered training courses on latent measurement issues in cross-survey and longitudinal comparative analysis at the Essex Summer School in Social Science and at GESIS (e.g., a forthcoming workshop). His experience is applicable to both panel and cohort study data.
If you have any questions or would like any further information about this workshop, please contact Jennie Blows (email@example.com).