entropy
/ˈɛntrəpi/
Noun
A thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.
According to the second law of thermodynamics, the entropy of an isolated system never decreases over time.
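A common textbook formulation of this sense, assuming the Clausius definition with δQ_rev for reversibly exchanged heat and T for absolute temperature:

```latex
% Clausius definition: entropy change equals reversible heat over temperature
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
% Second law: the entropy of an isolated system never decreases
\Delta S_{\mathrm{isolated}} \geq 0
```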
A measure of the disorder or randomness in a closed system; the tendency of systems to move from order to disorder.
In information theory, a measure of the uncertainty or unpredictability of information content.
The entropy of a message increases when its content becomes more unpredictable.
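This sense is standardly quantified by the Shannon entropy; for a discrete random variable X with outcome probabilities p(x), using a base-2 logarithm so the result is in bits:

```latex
% Shannon entropy of a discrete random variable X, measured in bits
H(X) = -\sum_{x} p(x) \log_2 p(x)
```

A fair coin toss has an entropy of 1 bit, while a heavily biased coin has entropy close to 0, matching the idea that more unpredictable content carries higher entropy.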
A gradual decline into disorder, chaos, or deterioration.
Without proper maintenance, the old building succumbed to entropy.