Liquid mixtures whose phase diagrams exhibit a miscibility gap terminating in a critical point of solution have been used as solvents for chemical reactions. The rate of the forward reaction has often been observed to slow down as the temperature enters the critical region. Theories that take the Gibbs free energy of reaction as the driving force for chemical change have been invoked to explain this behavior. Assuming that the reaction proceeds under relaxation conditions, these theories expand the free energy of reaction in a Taylor series about the position of equilibrium. Because the free energy of reaction is zero at equilibrium, the leading term in the series is proportional to its first derivative with respect to the extent of reaction. To analyze the critical behavior of this derivative, the theories exploit the principle of critical-point isomorphism, which is thought to govern all critical phenomena. They find that the derivative goes to zero in the critical region, which accounts for the observed slowing down of the reaction rate. As has been pointed out, however, most experimental rate investigations have been carried out under irreversible rather than relaxation conditions [Shen et al., J. Phys. Chem. A 2015, 119, 8784-8791]. Below, we consider a reaction governed by first-order kinetics and invoke transition state theory to take the irreversible conditions into account. We express the apparent activation energy in terms of thermodynamic derivatives evaluated under standard conditions as well as under the pseudoequilibrium conditions associated with the reactant and the activated complex. We show that these derivatives approach infinity in the critical region. The apparent activation energy follows this behavior, and its divergence accounts for the slowing down of the reaction rate.
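For orientation, the standard relations invoked above can be sketched as follows; the notation (extent of reaction $\xi$, activation free energy $\Delta G^{\ddagger}$) is a generic choice for illustration and not necessarily that of the full text:

\[
\Delta G_{\mathrm{rxn}} \equiv \left(\frac{\partial G}{\partial \xi}\right)_{T,P}, \qquad
\Delta G_{\mathrm{rxn}}(\xi) \approx \left.\frac{\partial \Delta G_{\mathrm{rxn}}}{\partial \xi}\right|_{\xi_{\mathrm{eq}}} \left(\xi - \xi_{\mathrm{eq}}\right),
\]
\[
k = \frac{k_{\mathrm{B}} T}{h}\, \exp\!\left(-\frac{\Delta G^{\ddagger}}{RT}\right), \qquad
E_{a} \equiv R T^{2}\, \frac{\partial \ln k}{\partial T}.
\]

The first line states the relaxation-theory expansion of the free energy of reaction about equilibrium, where the zeroth-order term vanishes because $\Delta G_{\mathrm{rxn}}(\xi_{\mathrm{eq}}) = 0$; the second gives the transition-state (Eyring) rate expression for a first-order reaction together with the usual definition of the apparent activation energy whose critical behavior is analyzed in the text.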