Reports: DNI6 55195-DNI6: Emergence of Entropy From the Mixing Dynamics of Complex Liquids

Jason R. Green, PhD, University of Massachusetts Boston

The mixing of matter and energy is responsible for the transport and transformation of chemicals in solution. One way to characterize the mixing process is with the thermodynamic entropy. However, predicting the thermodynamic entropy of mixing is a difficult problem for liquids composed of molecules with internal structure, dissimilar intermolecular interactions, or many components, features that are found in the complex hydrocarbon mixtures central to the petroleum field. This project aims to gain a new understanding of entropy, as it emerges from the dynamics of molecules, in liquids that are mixing. The theoretical framework we are applying uses statistical-dynamical entropy measures for molecular motion, namely the Kolmogorov-Sinai (KS) entropy rate, to disentangle the roles of these features in mixing. Because these measures derive directly from the dynamics, they are independent of the complexities of hydrocarbon mixtures and of the ideality of the mixture formed, and they remain valid during mixing, far from thermodynamic equilibrium. Nevertheless, our initial work on simple fluids shows that the statistics of the KS entropy are consistent with the thermodynamic entropy. Our initial efforts have focused on two aspects of the emergence of the KS entropy from molecular dynamics.


1. Extensivity and additivity of the Kolmogorov-Sinai entropy and Lyapunov spectra. The Lyapunov spectra and KS entropy from dynamical systems theory have long been observables of interest for statistical mechanics. Although the KS entropy is, despite its name, technically a rate, we have sought similarities and quantitative connections with the thermodynamic entropy, or the entropy production rate. Two key features of the thermodynamic entropy are its linear growth with system size, or extensivity, and its additivity. Using computer simulations of simple fluids, we have analyzed both of these properties of the KS entropy over system sizes ranging from 200 to 2000 molecules.
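
As a point of reference, a minimal sketch of how the KS entropy rate relates to the Lyapunov spectrum, assuming the Pesin identity holds for these closed, chaotic Hamiltonian systems:

    h_{KS} = \sum_{\lambda_i > 0} \lambda_i

In this notation, extensivity corresponds to h_{KS}(N) \propto N, and additivity corresponds to h_{KS}(N_1 + N_2) \approx h_{KS}(N_1) + h_{KS}(N_2) for weakly interacting subsystems of N_1 and N_2 particles.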


Leveraging recent computational progress, we showed that the KS entropy is linearly extensive in, and an additive function of, the number of particles for atomistic models of simple fluids, in agreement with the thermodynamic entropy. In our molecular dynamics simulations, these properties hold for both three-dimensional Lennard-Jones (LJ) and Weeks-Chandler-Andersen (WCA) fluids at several densities and temperatures. At sufficiently high densities, the Lyapunov spectrum has a linear structure that provides a geometrical relation justifying the linear extensivity of the KS entropy. While the extensivity is robust, the structure of the spectrum is not always linear; its exact form depends sensitively on the temperature, density, and nature of the interparticle forces. Regardless of the interparticle forces, the maximal Lyapunov exponent, λmax, is intensive for systems ranging from 200 to 2000 particles. Because λmax is intensive, this structure vanishes in the thermodynamic limit: macroscopic simple fluids will have an effectively uniform Lyapunov exponent spectrum at equilibrium.
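
To illustrate the geometrical relation, a sketch under the idealized assumption of a strictly linear positive branch of the spectrum (the simulations show the actual shape depends on temperature, density, and the interparticle forces). If the roughly 3N non-negative exponents fall on a line,

    \lambda_i \approx \lambda_{\max}\left(1 - \frac{i-1}{3N}\right), \qquad i = 1, \dots, 3N,

then summing the positive branch gives

    h_{KS} \approx \sum_{i=1}^{3N} \lambda_i \approx \frac{3N}{2}\,\lambda_{\max},

which grows linearly with N whenever λmax is intensive.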


According to the van der Waals picture, attractive and repulsive forces play distinct roles in the structure of simple fluids. We have examined their roles in the dynamics, namely in the degree of deterministic chaos as measured by the KS entropy rate and the spectrum of Lyapunov exponents. With computer simulations of three-dimensional LJ and WCA fluids, we have found that repulsive forces dictate these dynamical properties, with attractive forces reducing the KS entropy at a given thermodynamic state. Our simulations also showed that the behavior of these dynamical observables is a consequence of populations of particles sampling the interaction potential, and that the role of the interparticle forces in the chaotic dynamics agrees with the van der Waals picture of fluids: the divergence of trajectories and the magnitude of the KS entropy are dominated by repulsive forces, while attractive forces play a minor role, suppressing the divergence of trajectories.
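
For concreteness, a minimal sketch (in reduced units) of the two pair potentials being compared; the function names and default parameters are illustrative, not the project's production code:

    import numpy as np

    def lj(r, eps=1.0, sigma=1.0):
        """Full Lennard-Jones pair potential: repulsive core plus attractive tail."""
        sr6 = (sigma / np.asarray(r, dtype=float)) ** 6
        return 4.0 * eps * (sr6**2 - sr6)

    def wca(r, eps=1.0, sigma=1.0):
        """Weeks-Chandler-Andersen potential: the LJ potential truncated at its
        minimum r_c = 2**(1/6)*sigma and shifted up by eps, so that only the
        purely repulsive part remains."""
        r = np.asarray(r, dtype=float)
        rc = 2.0 ** (1.0 / 6.0) * sigma
        return np.where(r < rc, lj(r, eps, sigma) + eps, 0.0)

Comparing fluids governed by the full lj potential and by its purely repulsive wca counterpart at matched thermodynamic states is what isolates the contribution of the attractive tail to the chaotic dynamics.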

At fixed system size, a more focused study of the effects of temperature on the Lyapunov instability of these fluids shows that λmax and the KS entropy are nearly linear functions of temperature for the LJ fluid above a threshold temperature. For the WCA fluid, both quantities scale as √T. Furthermore, their monotonic increase with temperature agrees qualitatively with the thermodynamic entropy. The variation of λmax and the KS entropy with density, however, is nonmonotonic, exhibiting maxima in agreement with previous studies of two-dimensional fluids. We attribute the maxima to a competition between two effects: as particles are forced into closer proximity, there is an enhancement from the sharp curvature of the repulsive potential and a suppression from the diminishing free volume and particle mobility. This characteristic variation of the KS entropy with density contrasts with the thermodynamic entropy of simple fluids. The extensivity and additivity of the KS entropy and the intensivity of the largest Lyapunov exponent, however, hold over a range of temperatures and densities across the liquid and liquid-vapor coexistence regimes. Overall, these equilibrium properties of the KS entropy suggest that it has some features in common with the thermodynamic entropy and others that are unique to its dynamical nature.
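
As one way to make the temperature scaling quantitative, a small sketch of the kind of fit involved; the arrays stand in for simulation output and are not results reported here:

    import numpy as np

    def fit_power_law(T, lam_max):
        """Least-squares fit of lam_max(T) ~ a * T**b in log-log space.
        An exponent b near 1 indicates the roughly linear growth seen for the
        LJ fluid above threshold; b near 0.5 indicates the sqrt(T) scaling
        seen for the WCA fluid. Returns (a, b)."""
        T = np.asarray(T, dtype=float)
        lam_max = np.asarray(lam_max, dtype=float)
        b, log_a = np.polyfit(np.log(T), np.log(lam_max), 1)
        return np.exp(log_a), b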


2. Self-averaging fluctuations of Lyapunov exponents and the Kolmogorov-Sinai entropy. Another important feature of thermodynamic properties, as they emerge from time or ensemble averages in statistical mechanics, is that they are self-averaging. We have preliminary results showing that the Lyapunov spectrum and the KS entropy are self-averaging properties of simple liquids. Self-averaging behavior is important to understanding the mixing of liquids and to the use of these statistical quantities as estimates of thermodynamic observables. Self-averaging physical observables have distributions that are (for large N) sharply peaked around their average values: the variances of their distributions must vanish in the thermodynamic limit. This project is still in its early stages, but we have quantified the dependence of the fluctuations of the KS entropy and the Lyapunov exponents on the number of particles N through scaling exponents γ.
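
A standard working definition, stated here as a sketch (the symbol R_X is introduced only for illustration): an observable X, such as the KS entropy or a Lyapunov exponent, is self-averaging if its relative variance decays with system size,

    R_X(N) = \frac{\langle X^2 \rangle_N - \langle X \rangle_N^2}{\langle X \rangle_N^2} \propto N^{-\gamma}, \qquad \gamma > 0,

so that the distribution of X sharpens around its mean as N grows; γ = 1 corresponds to strong self-averaging and 0 < γ < 1 to weak self-averaging.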

The next steps in this project will focus on demonstrating the feasibility of a framework for predicting the variation of entropy changes with mixture composition from molecular simulations. Overall, this research is taking steps toward a statistical-mechanical approach to predicting the thermodynamics of nonideal, complex mixtures.
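
For reference, the ideal-solution baseline against which composition-dependent entropy changes are usually compared is the ideal entropy of mixing,

    \Delta S_{\mathrm{mix}}^{\mathrm{ideal}} = -nR \sum_i x_i \ln x_i,

where the x_i are mole fractions; for the complex, nonideal mixtures of interest here, one would expect the framework to capture the deviations from this baseline directly from the molecular dynamics.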