Episode Summary

Welcome to our first episode of 2023. In this episode, we’ll discuss the paper entitled “Measurement Schmeasurement: Questionable Measurement Practices and How to Avoid Them,” authored by Prof. Jessica Kay Flake and Assoc. Prof. Eiko I. Fried. It was published in 2020 in the journal Advances in Methods and Practices in Psychological Science.

Episode Notes

You’ll hear some dismaying statistics about the validity of research papers in general and some comments on the peer review process, and then we’ll dissect each of the six questions that should be asked BEFORE you design your research.

The paper’s abstract reads:

In this article, we define questionable measurement practices (QMPs) as decisions researchers make that raise doubts about the validity of the measures, and ultimately the validity of study conclusions. Doubts arise for a host of reasons, including a lack of transparency, ignorance, negligence, or misrepresentation of the evidence. We describe the scope of the problem and focus on how transparency is a part of the solution. A lack of measurement transparency makes it impossible to evaluate potential threats to internal, external, statistical-conclusion, and construct validity. We demonstrate that psychology is plagued by a measurement schmeasurement attitude: QMPs are common, hide a stunning source of researcher degrees of freedom, and pose a serious threat to cumulative psychological science, but are largely ignored. We address these challenges by providing a set of questions that researchers and consumers of scientific research can consider to identify and avoid QMPs. Transparent answers to these measurement questions promote rigorous research, allow for thorough evaluations of a study’s inferences, and are necessary for meaningful replication studies.

Discussion Points:

  • The appeal of the foundational question, “are we measuring what we think we’re measuring?”
  • Cited studies showing that 40–93% of studies lack evidence that their measures are valid
  • Psychological research’s frequent failure to define what measures are used and whether those measures are valid
  • The peer review process – it helps, but it can’t stop bad research from being published
  • Why care about this issue? When measures lack validity, the research answer may even be the opposite of the true one
  • Designing research – like choosing different paths through a garden
  • The six main questions to ask to avoid questionable measurement practices (QMPs):
  • What is your construct?
  • Why/how did you select your measure?
  • What measure did you use to operationalize the construct?
  • How did you quantify your measure?
  • Did you modify the scale? How and why?
  • Did you create a measure on the fly?

Takeaways:

  • Expand your methods section in research papers
  • Ask these questions before you design your research
  • As research consumers, we can’t take results at face value
  • Answering our episode question: How can we get better? Transparency is the starting point.

Resources:

The Safety of Work Podcast

The Safety of Work on LinkedIn

Feedback@safetyofwork