Laws of evolution parallel the laws of thermodynamics
Registered users only
2018
Journal article (Published version)
Elsevier
Metadata
Abstract
We hypothesize that concepts from thermodynamics and statistical mechanics can be used to define summary statistics, similar to thermodynamic entropy, to summarize the convergence of processes driven by random inputs subject to deterministic constraints. The primary example used here is biological evolution. We propose that evolution of biological structures and behaviors is driven by the ability of living organisms to acquire, store, and act on information, and that summary statistics can be developed to provide a stochastically deterministic information theory for biological evolution. The statistical concepts that are the basis of thermodynamic entropy are also true for information, and we show that adaptation and evolution have a specific deterministic direction arising from many random events. Therefore, an information theory formulated on the same foundation as the immensely powerful concepts used in statistical mechanics will provide statistics, similar to thermodynamic entropy, that summarize distribution functions for environmental properties and organism performance. This work thus establishes foundational principles for a quantitative theory that encompasses both behavioral and biological evolution and may be extended to other fields such as economics, market dynamics, and health systems.
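The parallel the abstract draws between thermodynamic entropy and information rests on the same statistical form: Boltzmann's entropy over microstates and Shannon's entropy over a probability distribution share the `-Σ p log p` structure. As a minimal illustration (not taken from the paper itself), the following sketch computes Shannon entropy as a summary statistic of a distribution; the example distributions are hypothetical.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits,
    the information-theoretic analogue of thermodynamic entropy
    over microstate probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 microstates maximizes entropy: 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A sharply peaked ("ordered") distribution carries far less entropy.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))
```

In this sense a single number summarizes how spread out (disordered) or concentrated (ordered) a distribution is, which is the role the authors propose such statistics play for environmental properties and organism performance.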
Keywords:
Information / Disorder / Order / Entropy / Statistical mechanics / Microstate / Sequence
Source:
The Journal of Chemical Thermodynamics, 2018, 124, 141-148
Publisher:
- Elsevier
Institution/group
IHTM
Hansen, L. D., Popović, M., Tolley, H. D., & Woodfield, B. F. (2018). Laws of evolution parallel the laws of thermodynamics. The Journal of Chemical Thermodynamics, 124, 141-148. https://doi.org/10.1016/j.jct.2018.05.005