It’s difficult to put a value on scientific research. In the last two or three decades, universities and other institutions have increasingly turned to quantitative metrics to gauge the impact of research. An individual’s h-index, for example, indicates that a scholar with an index of h has published h papers, each of which has been cited at least h times. Google Scholar also reports an i10-index, the number of a scholar’s publications that have been cited at least 10 times.
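The h-index and i10-index definitions above can be sketched as a short computation. This is a minimal illustration; the citation counts used here are hypothetical:

```python
def h_index(citations):
    """Return the largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # the paper at this rank still clears the bar
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Return the number of papers cited at least 10 times."""
    return sum(1 for cites in citations if cites >= 10)

# Hypothetical citation counts for six papers
citations = [25, 8, 5, 3, 3, 0]
print(h_index(citations))    # → 3 (three papers have 3 or more citations)
print(i10_index(citations))  # → 1 (only one paper has 10 or more citations)
```

Note that both indices depend only on the distribution of citation counts, not on where or when the papers appeared, which is part of what makes them easy to compute and, as the article discusses, easy to over-rely on.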

Journals have a citation-based metric, too: the impact factor (IF). Yet another, Altmetric, combines social media shares and likes with citations to assess the reach of a published paper. Important decisions may turn on these measures: they influence how an individual is promoted or evaluated for tenure within a university, for example, or whether a project gets funding.

To Manfred Laubichler, a biologist at Arizona State University and an SFI External Professor, the increased reliance on these metrics is a worrying trend. That dependence risks collapsing scientific judgment into a single measure of impact. “We have basically outsourced what is the core activity of science, namely to judge the future direction of science,” he says. These metrics may fail to recognize novel ideas or innovative approaches, especially in interdisciplinary fields that aren’t easily categorized.

Laubichler and SFI President David Krakauer suspect that the tools of complexity science can help. They’ve organized an April 3-4 working group at SFI, designed to explore questions about scientific value. “The goal is to basically first get some clarity about what we actually mean by impact, and what judgment means, in the context of the type of science we are pursuing at the SFI,” Laubichler says.

The workshop will bring together researchers in complexity science with institutional leaders to start a conversation about reframing the problem of measuring impact in science. Laubichler says the group will look to the tools of complexity for insights into how to improve judgment.

Laubichler hopes the discussion will spur ideas about new ways of measuring value. “We’ll be attempting to find creative new metrics that actually represent the values we advocate for in the kind of work we’re doing,” he says. 

Read more about the working group, "Judgment to Impact."