Fig. 2 | BMC Medical Informatics and Decision Making

From: Using information theory to identify redundancy in common laboratory tests in the intensive care unit

An illustration of information theory applied to two variables. This Venn diagram illustrates: the entropies of the two variables X and Y, denoted H(X) and H(Y), respectively; the mutual information (i.e., redundant information) between X and Y, denoted I(X;Y); and the expected amounts of novel information in X and Y, denoted H(X|Y) and H(Y|X), respectively. H(X) is greater than H(Y), signifying that X is associated with more randomness and less predictability than Y. H(X|Y) represents the expected amount of novel information left in X when Y is known, and H(Y|X) the expected amount left in Y when X is known.
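The quantities in the diagram can be computed directly from paired discrete observations via the identities I(X;Y) = H(X) + H(Y) − H(X,Y) and H(X|Y) = H(X) − I(X;Y). The sketch below is illustrative only: the dichotomized "normal/abnormal" lab values are hypothetical sample data, not taken from the article.

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H in bits of a discrete sample."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def joint_entropy(xs, ys):
    """Joint entropy H(X,Y) of paired samples."""
    return entropy(list(zip(xs, ys)))

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): the overlap in the Venn diagram."""
    return entropy(xs) + entropy(ys) - joint_entropy(xs, ys)

# Hypothetical paired lab results, dichotomized as normal/abnormal
x = ["abn", "abn", "norm", "norm", "abn", "norm", "abn", "norm"]
y = ["abn", "abn", "norm", "norm", "abn", "norm", "norm", "abn"]

h_x, h_y = entropy(x), entropy(y)
i_xy = mutual_information(x, y)          # redundant information shared by X and Y
h_x_given_y = h_x - i_xy                 # novel information left in X once Y is known
h_y_given_x = h_y - i_xy                 # novel information left in Y once X is known
```

When X and Y agree often, I(X;Y) is large and H(X|Y) is small: knowing one test leaves little novel information in the other, which is the redundancy the article quantifies.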
