WIT Press

Functional Information And Entropy In Living Systems


Free (open access)














A. C. McIntosh


In any living system one quickly becomes aware of the extraordinary complexity that organises proteins at the biochemical level so as to build, in effect, digital machinery which, since Watson and Crick's discovery of the structure of DNA, software engineers have sought to emulate. The functional complexity of these systems is heavily dependent on the material environment in which the system operates, and indeed exploits the same chemical and physical laws that are used to such good effect in man-made machines. What, though, are the laws that such organisation must inherently obey in natural systems? Can one quantify the organisational structure that sits on top of the matter and energy in any real system? In this paper, the author first considers the fundamental aspects of entropy and the second law of thermodynamics in the traditional definitions used for heat and chemical systems. Analogous representations of ‘logical entropy’ are then discussed: for a number of years scientists such as Prigogine have attempted to formalise the idea of functional complexity, chiefly by expressing self-organisation in terms of non-equilibrium thermodynamics, whence the term ‘Prigogine entropy’. Closely allied to this is the definition of information, which must go beyond the simple recipe of Shannon’s theory, since that theory deals essentially only with the transmission of existing data. The main issues at stake in any discussion of functional complexity are, first, arriving at a logical approach to describing the possible states of the system and, secondly, establishing a valid proportionality constant analogous to the Boltzmann constant of traditional thermodynamics. In this paper we discuss how the laws of thermodynamics can be understood in terms of the possible information content of molecules.
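The analogy the abstract invokes between thermodynamic entropy and information can be made concrete by setting the two classical definitions side by side. As a sketch only (these are the standard textbook formulas, with their conventional symbols, not expressions taken from this paper):

```latex
% Boltzmann's statistical entropy: W is the number of microstates
% consistent with the macrostate, k_B the Boltzmann constant.
S = k_B \ln W

% Shannon's information measure for a source emitting symbol i
% with probability p_i (in bits when the logarithm is base 2).
H = -\sum_i p_i \log_2 p_i
```

The open question the paper raises is precisely what plays the role of the proportionality constant $k_B$ when the ‘states’ being counted are functional configurations rather than microstates.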
We build on the concept of information transfer and the notion of ‘logical entropy’ to consider the application of the laws of thermodynamics to non-equilibrium chemistry. This concerns the basic question of how information is defined and connected to the fundamental laws of thermodynamics. Although the paper may raise more questions than answers, the aim is at least to move further towards a rigorous scientific treatment of the whole concept of organisation and system structure, by seeking logical laws of complexity in system states that parallel the well-known laws of thermodynamics.
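To illustrate why Shannon’s measure alone cannot capture functional information, one can compute it for short nucleotide-like strings. This is a minimal sketch (the function name and example sequences are ours, not the author’s): Shannon entropy registers only symbol statistics, so a random string scores highly while telling us nothing about biological function.

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A uniform four-letter alphabet attains the 2-bit-per-symbol maximum:
print(shannon_entropy("ACGT"))  # 2.0

# A repetitive sequence carries 0 bits per symbol, regardless of
# whether either string codes for anything functional.
print(abs(shannon_entropy("AAAA")))  # 0.0
```

The point of the example is the limitation noted in the abstract: both values describe the statistics of the existing data, not the functional organisation the paper is concerned with.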