Recent years have seen a surge of interest in the algorithmic estimation of stochastic entropy production (EP) from trajectory data via machine learning. A crucial element of such algorithms is the identification of a loss function whose minimization guarantees accurate EP estimation. In this study we show that there exists a host of loss functions, namely those implementing a variational representation of the α-divergence, which can be used for the EP estimation. By fixing α to a value between −1 and 0, the α-NEEP (Neural Estimator for Entropy Production) exhibits a much more robust performance against strong nonequilibrium driving or slow dynamics, which adversely affect the existing method based on the Kullback–Leibler divergence (α = 0). In particular, the choice of α = −1/2 tends to yield the optimal results. To corroborate our findings, we present an exactly solvable simplification of the EP estimation problem, whose loss function landscape and stochastic properties give deeper intuition into the robustness of the α-NEEP.
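As a minimal illustration of the variational principle underlying this family of estimators, the sketch below implements the Kullback–Leibler (α = 0) baseline for a single transition: an antisymmetric estimator h is trained to maximize a variational objective whose optimum is the log ratio of forward to reverse transition probabilities, i.e. the stochastic EP of that transition. The function names, the single-transition setup, and the gradient-ascent training loop are illustrative assumptions, not the paper's implementation.

```python
import math

# For a transition x -> y observed with forward probability p and reverse
# probability q, an antisymmetric estimator h(y, x) = -h(x, y) maximizes
#     J(h) = (p - q) * h - p * exp(-h) - q * exp(h),
# whose unique maximizer is h* = log(p / q), the stochastic entropy
# production of the transition. (The alpha-NEEP replaces this KL-based
# objective with an alpha-divergence variational representation.)

def neep_objective(h, p, q):
    """KL-based variational objective; maximized at h = log(p / q)."""
    return (p - q) * h - p * math.exp(-h) - q * math.exp(h)

def estimate_entropy_production(p, q, lr=0.1, steps=2000):
    """Plain gradient ascent on J(h) for a single transition pair."""
    h = 0.0
    for _ in range(steps):
        # dJ/dh = (p - q) + p * exp(-h) - q * exp(h); J is strictly concave.
        grad = (p - q) + p * math.exp(-h) - q * math.exp(h)
        h += lr * grad
    return h

p, q = 0.7, 0.3  # hypothetical forward / reverse transition probabilities
h_star = estimate_entropy_production(p, q)
print(h_star, math.log(p / q))  # the two values should agree closely
```

In a full trajectory-based estimator, h would be a neural network over transition pairs and the expectations would be sample averages over observed trajectories; the scalar version above only exposes the shape of the loss landscape.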
- Received 4 March 2023
- Revised 27 November 2023
- Accepted 5 January 2024
DOI:https://doi.org/10.1103/PhysRevE.109.014143
©2024 American Physical Society
Statistical Physics & Thermodynamics