Efficient calculation of configurational entropy from molecular simulations by combining the mutual‐information expansion and nearest‐neighbor methods
- 27 May 2008
- Research article
- Published by Wiley in Journal of Computational Chemistry
- Vol. 29 (10), 1605-1614
- https://doi.org/10.1002/jcc.20919
Abstract
Changes in the configurational entropies of molecules make important contributions to the free energies of reaction for processes such as protein folding, noncovalent association, and conformational change. However, obtaining entropy from molecular simulations represents a long-standing computational challenge. Here, two recently introduced approaches, the nearest-neighbor (NN) method and the mutual-information expansion (MIE), are combined to furnish an efficient and accurate method of extracting the configurational entropy from a molecular simulation to a given order of correlations among the internal degrees of freedom. The resulting method takes advantage of the strengths of each approach. The NN method is entirely nonparametric (i.e., it makes no assumptions about the underlying probability distribution), its estimates are asymptotically unbiased and consistent, and it makes optimal use of a limited number of available data samples. The MIE, a systematic expansion of entropy in mutual-information terms of increasing order, provides a well-characterized approximation for lowering the dimensionality of the numerical problem of calculating the entropy of a high-dimensional system. Combining these two methods yields well-converged estimates of the configurational entropy that capture many-body correlations of higher order than is possible with the simple histogramming used in the original MIE method. The combined method is tested here on two simple systems: an idealized system represented by an analytical distribution of six circular variables, where the full joint entropy and all the MIE terms are exactly known, and the R,S stereoisomer of tartaric acid, a molecule with seven internal-rotation degrees of freedom for which the full entropy of internal rotation has already been estimated by the NN method.
For these two systems, all the expansion terms of the full MIE of the entropy are estimated by the NN method and, for comparison, the MIE approximations up to third order are also estimated by simple histogramming. The results indicate that truncating the MIE at the two-body level can be an accurate, computationally undemanding approximation to the configurational entropy of anharmonic internal degrees of freedom. If needed, higher-order correlations can be estimated reliably by the NN method without excessive demands on the molecular-simulation sample size and computing time. © 2008 Wiley Periodicals, Inc. J Comput Chem, 2008
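The combination described in the abstract can be illustrated with a minimal Python sketch. This is not the authors' implementation: a standard Kozachenko-Leonenko k-d-tree estimator stands in for the NN method (in Euclidean coordinates, ignoring the periodicity of torsional variables that the paper handles), and the hypothetical function `mie2_entropy` implements the second-order MIE truncation S2 = Σi S(xi) − Σi<j I(xi; xj), with every marginal and pairwise entropy obtained from the same NN estimator.

```python
import math

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def nn_entropy(x):
    """Kozachenko-Leonenko nearest-neighbor entropy estimate (in nats)
    for samples x of shape (n_samples, d)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # k=2 because the nearest neighbor of each point is the point itself
    r, _ = tree.query(x, k=2)
    r = r[:, 1]
    # volume of the unit d-ball
    vd = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    # small offset guards against log(0) for duplicate samples
    return digamma(n) - digamma(1) + math.log(vd) + d * np.mean(np.log(r + 1e-300))


def mie2_entropy(samples):
    """Second-order MIE: S2 = sum_i S(x_i) - sum_{i<j} I(x_i; x_j),
    with each entropy term estimated by the NN method."""
    n, d = samples.shape
    s1 = [nn_entropy(samples[:, [i]]) for i in range(d)]
    total = sum(s1)
    for i in range(d):
        for j in range(i + 1, d):
            s_ij = nn_entropy(samples[:, [i, j]])
            total -= s1[i] + s1[j] - s_ij  # subtract I(x_i; x_j)
    return total
```

As a sanity check, for independent degrees of freedom every pairwise mutual information is near zero, so `mie2_entropy` should approach the sum of the marginal entropies; correlated coordinates lower the estimate, mirroring the two-body truncation the paper evaluates.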
This publication has 15 references indexed in Scilit:
- Extraction of configurational entropy from molecular simulations via an expansion approximation. The Journal of Chemical Physics, 2007
- Nearest-neighbor nonparametric method for estimating the configurational entropy of complex molecules. Journal of Computational Chemistry, 2006
- Generalized correlation for biomolecular dynamics. Proteins, 2005
- Evaluating the Accuracy of the Quasiharmonic Approximation. Journal of Chemical Theory and Computation, 2005
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses. Journal of Nonparametric Statistics, 2005
- Estimating mutual information. Physical Review E, 2004
- Probabilistic model for two dependent circular variables. Biometrika, 2002
- Evaluation of the configurational entropy for proteins: application to molecular dynamics simulations of an α-helix. Macromolecules, 1984
- Method for estimating the configurational entropy of macromolecules. Macromolecules, 1981
- An Algorithm for Finding Best Matches in Logarithmic Expected Time. ACM Transactions on Mathematical Software, 1977