Abstract

Since its conception over 150 years ago, entropy has enlightened and confused scholars and students alike, in physics, where the concept originated, and well beyond. More recently, it has been considered within the urban context in a rather eclectic range of applications. The entropy maximization approach, as applied by Alan Wilson and others from the 1960s, contrasts with considerations from the 1990s of the city as a thermodynamic dissipative system, in the tradition of Ilya Prigogine. By reviewing the relevant mathematical theory, we draw the distinction among three interrelated definitions of entropy: the thermodynamic, the figurative, and the information statistical. The applications of these definitions to urban systems within the literature are explored, and the conflation of the thermodynamic and figurative interpretations is disentangled. We close this paper with an outlook on future uses of entropy in urban systems analysis.
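For readers unfamiliar with the entropy-maximization approach named above, the following is a minimal sketch of Wilson's doubly constrained spatial interaction model in conventional textbook notation; the symbols $O_i$, $D_j$, $c_{ij}$ and $\beta$ are the standard ones and are not taken from this paper's text. With trip counts $T_{ij}$ between origins $i$ and destinations $j$, one maximizes the number of microstates consistent with the constraints:

$$\max_{\{T_{ij}\}} \; W = \frac{T!}{\prod_{i,j} T_{ij}!} \quad \text{subject to} \quad \sum_j T_{ij} = O_i, \qquad \sum_i T_{ij} = D_j, \qquad \sum_{i,j} T_{ij}\, c_{ij} = C.$$

Maximizing $\ln W$ with Stirling's approximation and Lagrange multipliers yields the doubly constrained gravity model

$$T_{ij} = A_i\, O_i\, B_j\, D_j\, e^{-\beta c_{ij}}, \qquad A_i = \Big(\sum_j B_j D_j e^{-\beta c_{ij}}\Big)^{-1}, \qquad B_j = \Big(\sum_i A_i O_i e^{-\beta c_{ij}}\Big)^{-1},$$

where $\beta$ is the multiplier attached to the travel-cost constraint.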

Highlights

  • Oxford Dictionaries defines entropy in three categories: (1) physical: “a thermodynamic quantity representing the unavailability of a system’s thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system”; (2) figurative: “lack of order or predictability, gradual decline into disorder”; (3) information statistical: “a logarithmic measure of the rate of transfer of information in a particular message or language” [1]

  • The first definition comes from physics, and it may be argued to be equivalent to a special case of the third, information statistical, definition, applied at the microscopic level [2,3,4,5]

  • We argue that this derives primarily from conflation between the first two definitions of entropy detailed in our introduction: the thermodynamic sense and the figurative sense


Summary

Introduction

Oxford Dictionaries defines entropy in three categories: (1) physical: “a thermodynamic quantity representing the unavailability of a system’s thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system”; (2) figurative: “lack of order or predictability, gradual decline into disorder”; (3) information statistical: “a logarithmic measure of the rate of transfer of information in a particular message or language” [1]. The physical definition of entropy typically yields an interpretation of the city as a thermodynamic system, an interpretation identified within the literature through postulated links between entropy, the irreversibility of the second law of thermodynamics, and a notion of ‘sustainability’ [13,14,15,16,17,18]. This approach has its roots in both the work of Ilya Prigogine on non-equilibrium thermodynamics and that of the economist Nicolas Georgescu-Roegen, who considered entropy in relation to the economic process [19,20].
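To make the three senses concrete, they correspond to the following standard textbook expressions, stated here for orientation rather than reproduced from this paper's text. The thermodynamic definition is Clausius's

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T},$$

which statistical mechanics grounds in Boltzmann's and Gibbs's microscopic forms

$$S = k_B \ln W, \qquad S = -k_B \sum_i p_i \ln p_i,$$

while the information statistical definition is Shannon's

$$H = -\sum_i p_i \log_2 p_i.$$

The Gibbs expression is Shannon's formula taken in natural logarithms and scaled by Boltzmann's constant, which is the precise sense in which the physical definition may be argued to be a special case of the information statistical one applied at the microscopic level. The figurative sense, by contrast, has no formula, which is part of why it is so easily conflated with the thermodynamic one.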

Information and Entropy
Entropy Maximization
The Second Law and Thermodynamic Applications
Applications of Entropy to Urban Systems
Information Statistical Entropy
Thermodynamic Entropy
Discussion
Conclusions

