Information theory structure definition
From Santa Fe Institute Events Wiki
From an information theory point of view, structure is a property of a system S with an original description X that allows it to be encoded in a simpler description, namely Y. Y is therefore a new description of X that exploits its structure.
If, given the description Y, there is a function that maps it back to the description X, then there is no loss of information in this mapping and the information in X is fully encoded in Y. In that case we can also say that Y is a better description of the original system than X.
In information theory, the measure that relates two descriptions of the same system is the mutual information:

<math>I(X;Y) = H(X) - H(X|Y)</math>
However, this kind of reduction without information loss is not always possible: usually, when we obtain the simplest version of the description X by exploiting its structure, we lose information in the mapping.
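As a small illustration (not part of the original page), the following Python sketch computes I(X;Y) for a toy system where the simplified description is a deterministic coarse-graining Y = f(X). The distribution, state names, and mapping functions are made up for the example; when f is invertible no information is lost (H(X|Y) = 0 and I(X;Y) = H(X)), while a lossy f that merges states gives I(X;Y) < H(X).

<pre>
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy in bits of a distribution given as a dict state -> probability."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def mutual_information(p_x, f):
    """I(X;Y) for Y = f(X), with X distributed according to p_x."""
    # Distribution of the simplified description Y.
    p_y = defaultdict(float)
    for x, q in p_x.items():
        p_y[f(x)] += q
    # Y is a deterministic function of X, so H(X, Y) = H(X) and therefore
    # I(X;Y) = H(X) + H(Y) - H(X, Y) = H(Y); the loss is H(X|Y) = H(X) - H(Y).
    return entropy(p_y)

# Hypothetical original description: X uniform over four states.
p_x = {s: 0.25 for s in "abcd"}

lossless = lambda x: x.upper()                            # invertible relabelling
lossy = lambda x: "vowel" if x == "a" else "consonant"    # merges states

print("H(X)             =", entropy(p_x))                       # 2.0 bits
print("I(X;Y), lossless =", mutual_information(p_x, lossless))  # 2.0 bits, H(X|Y) = 0
print("I(X;Y), lossy    =", mutual_information(p_x, lossy))     # about 0.81 bits
</pre>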