

Stochastic thermodynamics, meet information theory

July 17, 2025

Ten years ago, SFI Professor David Wolpert set out to build a bridge between two scientific fields that might seem to have nothing to say to each other — computer-science theory and a branch of physics called stochastic thermodynamics. Computer-science theorists typically study the “resource cost” of computation — the number of iterations a computer requires to complete a calculation, for example, or the amount of memory needed. But, Wolpert says, there is also an important energetic cost in computing — how much energy a computation requires — that has not been thoroughly investigated. Physicists who study stochastic thermodynamics, on the other hand, study systems far from thermal equilibrium, which means they absorb or dissipate heat.

Wolpert recognized that computers operate far from thermal equilibrium: They require energy to run, and they produce heat as they do so. The mathematical tools of stochastic thermodynamics seemed like a perfect and obvious way to probe the energy dynamics of computations. “It was such a match made in heaven,” he says. He assumed that this intersection had already been thoroughly explored, but he was wrong. So, he set out to establish the fundamentals on his own and convince others to join him. “I knew it would probably take about a decade before the engine would really start turning over.” 

A decade has passed, and he says the engine is humming. From June 16 to June 20 at SFI’s Cowan campus, Wolpert and his SFI co-organizers hosted a working group — a follow-up to one held last year — that brought together researchers to explore ideas and forge collaborations between the two fields. Participants included computer scientists and physicists from three continents, who shared progress on existing projects and ideas for new ones, and brainstormed ways to forge the new mathematics required to explore fundamental questions about the thermodynamics of computers. They are lured into the field, says Wolpert, by the possibility of developing new mathematics.

“We have no idea what’s coming next,” he says. The nascent collaborations could spin off in many possible directions. Almost every issue of concern in computer-science theory can be recast in terms of energetic costs rather than other kinds of resource costs. These issues can then be dissected, analyzed, and modeled with new mathematical tools grounded in thermodynamics. Boolean circuits, for example, are mathematical models of computation that carry out logical operations — and operate far from thermal equilibrium. Researchers at the meeting discussed using stochastic thermodynamics to better understand the energy cost of large communication networks and of chemical computers, which use chemical reactions to compute instead of conventional electronic components.
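For a rough sense of scale of the “energetic cost” of computation, one classic result connecting information theory and thermodynamics is Landauer’s principle: erasing a single bit of information dissipates at least k_B·T·ln 2 of heat. The sketch below (an illustration, not taken from the article or the working group) computes that lower bound:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_bound_joules(bits: float, temperature_k: float = 300.0) -> float:
    """Minimum heat (in joules) dissipated by erasing `bits` bits of
    information at temperature `temperature_k`, per Landauer's principle."""
    return bits * K_B * temperature_k * math.log(2)

# Lower bound for erasing one gigabyte (8e9 bits) at room temperature.
# Real hardware dissipates many orders of magnitude more than this.
energy = landauer_bound_joules(8e9)
```

The gap between this thermodynamic floor and the energy actual circuits consume is part of what makes the energetic cost of computation an open research question.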

“Basically, every chapter in computer-science-theory textbooks” is fair game, says Wolpert. “It’s all happening.” The participants are already planning the next meeting and exploring possible publications, as well as future books, on the emerging subfield.

“This is what SFI is all about,” he says. “Taking fields that never even knew one another existed and just getting them to finally bump into one another near the punch bowl. That was this meeting.”

Read more about the working group Stochastic Thermodynamics and Computer Science Theory II

Speaker

David Wolpert, Professor at SFI; External Professor at the Complexity Science Hub in Vienna





