The English–English meaning of “entropy”

Word: entropy
Definition: entropy Physics. |ˈɛntrəpɪ|
[f. Gr. τροπή transformation (lit. ‘turning’), after the analogy of energy. First proposed by Clausius (1865) in Ger. form entropie.
Clausius (Pogg. Ann. CXXV. 390), assuming (unhistorically) the etymological sense of energy to be ‘work-contents’ (werk-inhalt), devised the term entropy as a corresponding designation for the ‘transformation-contents’ (verwandlungsinhalt) of a system.]
1. The name given to one of the quantitative elements which determine the thermodynamic condition of a portion of matter. Also transf. and fig.
In Clausius' sense, the entropy of a system is the measure of the unavailability of its thermal energy for conversion into mechanical work. A portion of matter at uniform temperature retains its entropy unchanged so long as no heat passes to or from it, but if it receives a quantity of heat without change of temperature, the entropy is increased by an amount equal to the ratio of the mechanical equivalent of the quantity of heat to the absolute measure of the temperature on the thermodynamic scale. The entropy of a system = the sum of the entropies of its parts, and is always increased by any transport of heat within the system: hence ‘the entropy of the universe tends to a maximum’ (Clausius). The term was first used in Eng. by Prof. Tait (see quot. 1868), who however proposed to use it in a sense exactly opposite to that of Clausius. In this he was followed (with an additional misunderstanding: see quot. 1875) by Maxwell and others; but subsequently Tait and Maxwell reverted to the original definition, which is now generally accepted.
1868 Tait Sketch Thermodynamics 29 We shall..use the excellent term Entropy in the opposite sense to that in which Clausius has employed it—viz., so that the Entropy of the Universe tends to zero.
1875 Maxwell Th. Heat (ed. 4) 189 note, In former editions of this book the meaning of the term Entropy as introduced by Clausius was erroneously stated to be that part of the energy which cannot be converted into work. The book then proceeded to use the term as equivalent to the available energy...In this edition I have endeavoured to use Entropy according to its original definition by Clausius.
1885 Watson & Burbury Math. Th. Electr. & Magn. I. 245 As in the working of a heat engine, the entropy of the system must be diminished by the process, that is, there must be equalisation of temperature.
1925 A. & J. Strachey tr. Freud's Coll. Papers III. v. 599 In considering the conversion of psychical energy no less than of physical, we must make use of the concept of an entropy, which opposes the undoing of what has already occurred.
1933 W. E. Orchard From Faith to Faith xi. 280 The deduction which one of our greatest physicist astronomers draws from the second law of thermodynamics: namely, that since there must be a maximum entropy, there must have been once its maximum opposite.
1955 Sci. Amer. May 124/2 Certain combinations of balls yield a greater change in entropy than others. Those combinations in which entropy change reaches maximum value lead to solutions.
1955 Ibid. June 64/1 This equilibrium state..is the thermodynamic condition of maximum entropy—the most disordered state, in which the least amount of energy is available for useful work.
1965 Financial Times 11 Aug., Moralising by those whose industrial entropy is an accepted fact of life is neither likely to persuade the workers nor assist the trade unions in the task of trying to meet the nation's difficulties.
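The quantitative relation stated in sense 1 (heat received at uniform temperature increases entropy by the ratio of that heat to the absolute temperature) can be sketched numerically. This is an illustration only, not part of the entry; the function name and the figures are assumptions chosen for the example.

```python
# Clausius' definition: for heat Q received reversibly at absolute
# temperature T, the entropy increment is dS = Q / T.

def entropy_change(heat_joules, temperature_kelvin):
    """Entropy increment dS = Q / T for an isothermal reversible transfer."""
    return heat_joules / temperature_kelvin

# Heat flowing from a hot body (400 K) to a cold body (300 K):
q = 1200.0                            # joules transferred
ds_hot = entropy_change(-q, 400.0)    # hot body loses entropy: -3.0 J/K
ds_cold = entropy_change(q, 300.0)    # cold body gains entropy: +4.0 J/K
total = ds_hot + ds_cold              # net change is positive: +1.0 J/K
```

The positive net change for any transport of heat within the system is exactly the sense in which "the entropy of the universe tends to a maximum."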
2. a. Communication Theory. A measure of the average information rate of a message or language; esp. the quantity −Σ pᵢ log pᵢ (where the pᵢ are the probabilities of occurrence of the symbols of which the message is composed), which represents the average information rate per symbol.
1948 Bell Syst. Techn. Jrnl. XXVII. 396 Consider a discrete source of the finite state type... There is an entropy Hi for each state. The entropy of the source will be defined as the average of these Hi weighted in accordance with the probability of occurrence of the states in question... This is the entropy of the source per symbol of text.
1953 S. Goldman Information Theory 329 The amount of language information (i.e., entropy) in the sequence is..a measure of the choice that was available when the sequence was selected.
1953 D. A. Bell Stat. Methods Electr. Engin. x. 145 Since entropy increases as the arrangement of a system becomes less distinguishable from other possible arrangements, while the value of a pattern for conveying information depends on its uniqueness, the information capacity of a signal is the negative of its entropy.
1960 D. Middleton Introd. Stat. Communication Theory vi. 301 H(X) is called the (communication) entropy of the X ensemble, since..it is the direct mathematical analogue of the more familiar entropy measure of statistical mechanics.
1964 Language XL. 210 The basic probability concept, ‘entropy’, and its quantum, the ‘bit’, are now part of the metalanguage of linguistics.
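The communication-theory quantity defined in sense 2a, H = −Σ pᵢ log pᵢ, is a short computation; the sketch below is illustrative only, with a hypothetical function name, and uses base-2 logarithms so that the result is expressed in the ‘bits’ mentioned in the 1964 quotation.

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum(p_i * log(p_i)): the average information rate per symbol."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform four-symbol source carries 2 bits per symbol:
h_uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# A source that always emits one symbol carries no information:
h_certain = shannon_entropy([1.0])
```

Terms with pᵢ = 0 are skipped, following the usual convention that 0 log 0 = 0.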
b. Math. In wider use: any quantity having properties analogous to those of the physical quantity; esp. the quantity −Σ xᵢ log xᵢ of a distribution {x₁, x₂, …}.
1951 Jrnl. R. Statistical Soc. B. XIII. 60 The idea of selective entropy provides us with a new and important concept in the analytical theory of probability.
1961 Proc. Cambr. Philos. Soc. (Math. & Physical Sci.) LVII. 839 The analogue of Boltzmann's H-theorem is not a statement about the monotonicity of the entropy of the chains..but a statement about the ‘entropy’ of the frequency distribution (s1,..sn).
1968 P. A. P. Moran Introd. Probability Theory i. 50 Since -x log x is a convex function the entropy of a finite set of events is a maximum when their probabilities are equal.
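The property quoted from Moran (1968), that the entropy of a finite set of events is greatest when their probabilities are equal, can be checked directly; this sketch is an illustration and its variable names are assumptions.

```python
import math

def entropy(probs):
    """The quantity -sum(x_i * log(x_i)) of a distribution, in natural log."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Among distributions over three events, the uniform one maximizes entropy,
# and its value is log(3):
h_uniform = entropy([1/3, 1/3, 1/3])
h_skewed = entropy([0.5, 0.3, 0.2])
```

Here h_uniform exceeds h_skewed, as the convexity argument in the quotation predicts.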

