
Computation of Mutual Information

Suppose M is a set of objects m, each having attributes X and Y. If X and Y can take only one value for a given m (i.e. X and Y are random variables with P(X=x_i|M=m_i) and P(Y=y_i|M=m_i)), it is possible to calculate the mutual information of X and Y. But what if X can have multiple outcomes at once? For example, for m_3, X={x1,x2}; in general the outcome of X is a subset of the set of all possible outcomes. Can mutual information, or some other measure of dependence, be computed in such a case?

Is it possible to split X into binary random variables X_1, X_2, etc., where X_i=1 iff X contains x_i and X_i=0 otherwise, and then compute I(X_i, Y_j) for all combinations of i and j and sum up the information in order to get I(X, Y)? (A sketch of this decomposition appears after the example below.)

Thanks.

Example:

m_1: X={a,b}, Y={x,y}; m_2: X={c}, Y={z,x}
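
Here is a minimal sketch in plain Python of the proposed decomposition (the helper name mutual_information and the data layout are mine, not from the question): each possible element x_i becomes a binary indicator X_i, likewise for Y, and I(X_i; Y_j) is computed for every pair. Note that summing these pairwise values does not in general recover a joint I(X; Y), because the indicators can be correlated with one another, so the sum over-counts shared information.

    from collections import Counter
    from math import log2

    # Toy data in the question's format: each object m has set-valued X and Y.
    data = [
        ({"a", "b"}, {"x", "y"}),   # m_1
        ({"c"},      {"z", "x"}),   # m_2
    ]

    def mutual_information(pairs):
        """I(A; B) in bits, estimated from a list of (a, b) sample pairs."""
        n = len(pairs)
        joint = Counter(pairs)
        pa = Counter(a for a, _ in pairs)
        pb = Counter(b for _, b in pairs)
        return sum(
            (c / n) * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
            for (a, b), c in joint.items()
        )

    # All element values that ever occur in X or in Y across the data set.
    x_values = sorted(set().union(*(x for x, _ in data)))
    y_values = sorted(set().union(*(y for _, y in data)))

    # Pairwise MI between each indicator X_i (= "x_i in X") and Y_j.
    for xi in x_values:
        for yj in y_values:
            samples = [(xi in x, yj in y) for x, y in data]
            print(f"I(X_{xi}; Y_{yj}) = {mutual_information(samples):.3f} bits")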


If I'm not mistaken, the premise you set is:

If M is a set of objects { m1, m2, ... },
and each mi has two attributes X and Y,
and X, Y can be subsets of { x1, x2, ... } and { y1, y2, ... } respectively,

then you want to define

I(X, Y) based on each mi's X and Y.

Well, this increases the computational complexity of the problem significantly, but you can still do the same type of correlation, except that instead of correlating two values X and Y, you are correlating two subsets X and Y.


Depending on what the sets mean and on what you want to use the mutual information for, you could just treat the sets as atomic values. Then your event space is the powerset of V_X (the set of all possible elements of X), and you can compute mutual information on that larger event space in the usual way (think of each subset as a bitstring).
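
For illustration, a minimal sketch of this atomic-value approach on the same toy data as above (a frozenset stands in for the bitstring encoding of each subset):

    from collections import Counter
    from math import log2

    data = [
        ({"a", "b"}, {"x", "y"}),   # m_1
        ({"c"},      {"z", "x"}),   # m_2
    ]

    n = len(data)
    # Each whole subset is one atomic outcome in the powerset event space.
    joint = Counter((frozenset(x), frozenset(y)) for x, y in data)
    px = Counter(frozenset(x) for x, _ in data)
    py = Counter(frozenset(y) for _, y in data)

    mi = sum(
        (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in joint.items()
    )
    print(f"I(X; Y) = {mi:.3f} bits")

With only two objects every subset pair is unique, so this estimate is degenerate; in practice you need many samples relative to the size of the powerset, which is the computational blow-up the first answer alludes to.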

There are multivariate generalizations of mutual information, such as interaction information or total correlation, but I don't think they're quite what you're looking for. You might be better off looking at other, non-information theoretic multivariate measures of correlation.
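
For completeness, a sketch of one of those generalizations, total correlation, defined as C(Z_1, ..., Z_k) = sum_i H(Z_i) - H(Z_1, ..., Z_k). Here the Z_i are the binary indicators from the earlier sketch; whether this is the right dependence measure for your application is a modelling decision the definition itself doesn't settle.

    from collections import Counter
    from math import log2

    def entropy(samples):
        """Shannon entropy in bits of a list of hashable outcomes."""
        n = len(samples)
        return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

    # One tuple of indicators (X_a, X_b, X_c, Y_x, Y_y, Y_z) per object m,
    # encoding the question's example: m_1: X={a,b}, Y={x,y}; m_2: X={c}, Y={z,x}.
    rows = [
        (1, 1, 0, 1, 1, 0),   # m_1
        (0, 0, 1, 1, 0, 1),   # m_2
    ]

    # Total correlation: sum of marginal entropies minus the joint entropy.
    marginals = sum(entropy([row[i] for row in rows]) for i in range(len(rows[0])))
    joint = entropy(rows)   # tuples are hashable, so each row is one joint outcome
    print(f"Total correlation = {marginals - joint:.3f} bits")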
