
Phys. Rev. E 109, 014143 (2024)


Recent years have seen a surge of interest in the algorithmic estimation of stochastic entropy production (EP) from trajectory data via machine learning. A crucial element of such algorithms is the identification of a loss function whose minimization guarantees accurate EP estimation. In this study we show that there exists a host of loss functions, namely, those implementing a variational representation of the α-divergence, which can be used for the EP estimation. By fixing α to a value between −1 and 0, the α-NEEP (Neural Estimator for Entropy Production) exhibits a much more robust performance against strong nonequilibrium driving or slow dynamics, which adversely affects the existing method based on the Kullback-Leibler divergence (α = 0). In particular, the choice of α = −0.5 tends to yield the optimal results. To corroborate our findings, we present an exactly solvable simplification of the EP estimation problem, whose loss function landscape and stochastic properties give deeper intuition into the robustness of the α-NEEP.
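As a rough illustration of the idea, the sketch below implements one common variational parameterization of the α-divergence objective in PyTorch, not the paper's exact construction: an antisymmetric network output h plays the role of the per-transition EP, and the loss combines two exponentially tilted averages whose unique minimizer (for 0 < α < 1 in this convention) is the true EP, with the KL-based NEEP recovered in the α → 0 limit. The paper's sign convention for α (optimal near α = −0.5) may differ from the one used here, and the network architecture, hyperparameters, and data loader are assumptions.

```python
# Minimal sketch of an alpha-NEEP-style estimator (illustrative only; the exact
# loss parameterization and the sign convention for alpha may differ from the paper).
import torch
import torch.nn as nn

class AntisymmetricEP(nn.Module):
    """Outputs h(s, s') = f(s, s') - f(s', s), an antisymmetric estimate of the
    entropy production per transition, as in the original NEEP construction."""
    def __init__(self, state_dim, hidden=64):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(2 * state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, s, s_next):
        fwd = self.f(torch.cat([s, s_next], dim=-1))
        bwd = self.f(torch.cat([s_next, s], dim=-1))
        return (fwd - bwd).squeeze(-1)

def alpha_neep_loss(h, alpha=0.5):
    """Negative of a variational alpha-divergence objective whose maximizer is the
    true per-step entropy production (for 0 < alpha < 1 in this convention).
    The reverse-ensemble average is rewritten as a forward average of exp(-alpha*h)
    using the antisymmetry of h, so only forward trajectory data are needed.
    As alpha -> 0 this reduces (up to a constant) to the KL-based NEEP objective."""
    term_f = torch.exp(-(1.0 - alpha) * h).mean() / (1.0 - alpha)
    term_r = torch.exp(-alpha * h).mean() / alpha
    return term_f + term_r

# Hypothetical usage on steady-state transition pairs (s_t, s_{t+1}):
# model = AntisymmetricEP(state_dim=2)
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# for s, s_next in loader:                 # batches of consecutive states
#     h = model(s, s_next)
#     loss = alpha_neep_loss(h, alpha=0.5)
#     opt.zero_grad(); loss.backward(); opt.step()
# After training, h(s, s') estimates the stochastic EP of each transition, and its
# average over the data gives an estimate of the EP per time step.
```

In this convention the midpoint α = 0.5 is the self-dual choice that treats the forward and reverse ensembles symmetrically, which is the analogue of the α = −0.5 value reported as optimal in the abstract.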


  • Received 4 March 2023
  • Revised 27 November 2023
  • Accepted 5 January 2024

DOI: https://doi.org/10.1103/PhysRevE.109.014143

©2024 American Physical Society

Statistical Physics & Thermodynamics



