A core concern of computer science theory is how the "resource costs" of a computation depend on the computation being performed. Canonical examples of such costs are "time" (the number of iterations needed to complete the computation) and "space" (the amount of memory required to complete it). However, one of the major resource costs of real-world computers is the energy they consume and, equivalently, the heat they generate.
This resource cost has received little attention from computer science theorists. One of the primary reasons is that real-world computers have many complicating features, even if one sets aside their precise physical details: they operate very far from thermal equilibrium, in finite time, with many quickly (co-)evolving degrees of freedom. Real-world computers also obey multiple physical constraints on how they operate. For example, all modern digital computers are periodic processes, governed by a global clock. They are also modular, hierarchical systems, with strong restrictions on the connectivity of their subsystems.
Fortunately, the rapidly developing field of stochastic thermodynamics provides the formal tools for analyzing the thermodynamic costs of precisely these kinds of real-world computers. In this meeting we will begin to investigate the vast potential synergy this opens up between computer science theory and stochastic thermodynamics.