Abstract
Source coding problems are treated for Shannon's (1949) cipher system with correlated source outputs (X, Y). Several cases are considered, depending on whether both X and Y, only X, or only Y must be transmitted to the receiver; whether both X and Y, only X, or only Y must be kept secret; and whether the security level is measured by the pair $\left((1/K)H(X^K \mid W),\ (1/K)H(Y^K \mid W)\right)$ or by $(1/K)H(X^K Y^K \mid W)$, where $W$ is the cryptogram. The admissible region of cryptogram rate and key rate for a given security level is derived for each case. Furthermore, two new kinds of common information of X and Y, denoted $C_1(X;Y)$ and $C_2(X;Y)$, are considered. $C_1(X;Y)$ is defined as the rate of the attainable minimum core of $(X^K, Y^K)$, obtained by removing as much of the private information of each source as possible, while $C_2(X;Y)$ is defined as the rate of the attainable maximum core $V_C$ such that, if $V_C$ is lost, the uncertainty of each of $X^K$ and $Y^K$ becomes $H(V_C)$. It is proved that $C_1(X;Y) = I(X;Y)$ and $C_2(X;Y) = \min\{H(X), H(Y)\}$. The result for $C_1(X;Y)$ justifies the author's intuitive feeling that the mutual information represents a common information of X and Y.
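As a small numerical illustration of the closed-form values stated above, the sketch below evaluates $I(X;Y)$ and $\min\{H(X), H(Y)\}$, which the abstract identifies with $C_1(X;Y)$ and $C_2(X;Y)$ respectively, for a toy joint distribution. The distribution `P` is a hypothetical example chosen here for illustration; it does not appear in the paper.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero-mass terms ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution P(X, Y) over a 2x2 alphabet (illustration only).
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

Px = P.sum(axis=1)          # marginal distribution of X
Py = P.sum(axis=0)          # marginal distribution of Y

H_X  = entropy(Px)
H_Y  = entropy(Py)
H_XY = entropy(P.ravel())
I_XY = H_X + H_Y - H_XY     # mutual information I(X;Y)

# Closed-form values claimed in the abstract:
#   C1(X;Y) = I(X;Y)   and   C2(X;Y) = min{H(X), H(Y)}.
print(f"C1(X;Y) = I(X;Y)          = {I_XY:.4f} bits")
print(f"C2(X;Y) = min(H(X), H(Y)) = {min(H_X, H_Y):.4f} bits")
```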
