
INTRODUCTION

Rationale

Medical and health professions educators and scholars face increasing pressure to demonstrate the impact and utility of their work in a climate that requires accountability and a culture of outcome measurement. Counts of publications, dollars of grant capture, and numbers of citations or h-index are typically used to evaluate academics. These conventional measures of productivity are often equated with or used as evidence of impact. However, within health professions education (HPE), these conventional metrics often fail to capture how our field contributes ideas and orientations that influence practice (Weiss, 1979). Increasingly, scholarship from other fields (e.g. health services and policy [Kuruvilla, 2006], management [Aguinis, 2014], research evaluation [Morton, 2015]) has called into question the ability of these conventional metrics to adequately represent their impact. Health professions education should also examine how impact is defined in the field to ensure the metrics it employs align with the values and goals of education. “Indicators change the system through the incentives they establish” (Hicks, 2015) and definitions of impact shape and constrain the types of work that one focuses on and engages in.

Recent changes to promotions criteria, such as the Creative Professional Activity (CPA) criteria at the University of Toronto, have challenged conventional metrics and created a new pathway for advancement in academia. Yet many HPE faculty need guidance to reformulate their thinking about what ‘counts’ as impact, and about how to represent the impacts of their work (e.g. education innovations and scholarship) within these new academic pathways.

This OS will guide HPE faculty and scholars in critically examining metrics currently used in academic medicine and will provide alternatives to more meaningfully track and represent impact in HPE scholarship and research. The OS will also prompt consideration of the broader implications of these measures, with the aim of ensuring that metrics appropriate for HPE scholarship are valued and used.

Objectives

1) To summarize developments in scholarship and research metrics with which HPE faculty should become familiar (e.g. conventional metrics, altmetrics, newer measures proposed by other fields)

2) To guide faculty in critically examining these developments in scholarship and research metrics for applicability to HPE scholarship and research, especially reflecting on the language and indicators used to articulate ‘impact’

3) To provide an alternative approach to conceptualizing and articulating the impact of education scholarship and research, with guiding questions to more meaningfully track and represent the impacts of HPE scholarship and research (grey metrics and impact stories) and to “translate impact into something coherent and comprehensible” (Marcella, 2015).

The OS will contribute to the conversations surrounding scholarship and research metrics in academia, with special attention to HPE researchers’ and scholars’ contexts and goals. Because the goal of education is often to expand individuals’ capacity to learn and think differently, the OS will guide HPE faculty in broadening their own definitions of impact and in valuing different representations of impact.

References

  • Weiss CH. The Many Meanings of Research Utilization. Public Adm Rev. 1979;39(5):426.
  • Kuruvilla S, Mays N, Pleasant A, Walt G. Describing the impact of health research: a Research Impact Framework. BMC Health Serv Res. 2006;6:134.
  • Aguinis H, Shapiro DL, Antonacopoulou EP, Cummings TG. Scholarly Impact: A Pluralist Conceptualization. Acad Manag Learn Educ. 2014;13(4):623-39.
  • Morton S. Progressing research impact assessment: A “contributions” approach. Res Eval. 2015;24(4):405-19.
  • Hicks D, Wouters P, Waltman L, Rijcke S de, Rafols I. The Leiden Manifesto for research metrics. Nature. 2015;520(7548):429-31.