Minimal Public Communication for Maximum Rate Secret Key Generation
Himanshu Tyagi
Department of Electrical and Computer Engineering and Institute for Systems Research, University of Maryland, College Park, USA.
Secret Key Generation

[Figure: terminals observing X^n and Y^n exchange interactive communication F and output key estimates K1 and K2, with K1 ≈ K2 ≡ K.]

A secret key (SK) K generated with interactive communication F satisfies:
- Pr(K1 = K2 = K) ≈ 1 : recoverability
- (1/n) I(K ∧ F) ≈ 0 : secrecy

Rate of the secret key = (1/n) H(K).
Secret key capacity C = maximum achievable rate of a secret key.
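A toy numeric illustration of these definitions (our own Python sketch; the source X = (A, B), Y = (B, C) and all variable names are hypothetical, not from the talk): when the two terminals share a common bit B, that bit is already a perfect SK with empty communication F, and its rate equals the SK capacity I(X∧Y).

```python
from itertools import product
from math import log2

def H(pmf):
    """Shannon entropy (bits) of a pmf given as a dict outcome -> probability."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Hypothetical source: A, B, C i.i.d. uniform bits; terminal 1 observes
# X = (A, B), terminal 2 observes Y = (B, C).
joint = {}
for a, b, c in product([0, 1], repeat=3):
    x, y = (a, b), (b, c)
    joint[(x, y)] = joint.get((x, y), 0) + 1 / 8

px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

# SK capacity of a two-terminal source: C = I(X ∧ Y); here it equals H(B) = 1 bit.
I_XY = H(px) + H(py) - H(joint)

# The shared bit B is a perfect SK: K1 = K2 = B with F empty, so
# Pr(K1 = K2 = K) = 1 and I(K ∧ F) = 0 exactly; the key rate is H(B) = 1.
rate_K = 1.0
```

Here recoverability and secrecy hold exactly rather than asymptotically, since K1 = K2 = B and F is empty.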
Communication for SK Capacity

What is the min. rate of F required for achieving SK capacity?

◮ Maurer–Ahlswede–Csiszár
- Common randomness (CR) generated: X^n or Y^n.
- Rate of communication required = min{H(X|Y), H(Y|X)}.
- Decomposition:
  H(X) = H(X|Y) + I(X∧Y),  H(Y) = H(Y|X) + I(X∧Y).

◮ Csiszár–Narayan
- Common randomness generated: (X^n, Y^n).
- Rate of communication required = H(X|Y) + H(Y|X).
- Decomposition:
  H(X,Y) = H(X|Y) + H(Y|X) + I(X∧Y).
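The two decompositions above are single-letter entropy identities, so they can be checked directly on any joint pmf. A minimal sketch (the pmf below is an arbitrary illustrative choice, not from the talk):

```python
from math import log2

def H(pmf):
    """Shannon entropy (bits) of a pmf given as a dict outcome -> probability."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# A small illustrative joint pmf for (X, Y).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = {x: sum(p for (a, _), p in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in pxy.items() if b == y) for y in (0, 1)}

H_XY, H_X, H_Y = H(pxy), H(px), H(py)
H_X_given_Y = H_XY - H_Y   # H(X|Y)
H_Y_given_X = H_XY - H_X   # H(Y|X)
I = H_X + H_Y - H_XY       # I(X ∧ Y)

# Decompositions: H(X) = H(X|Y) + I(X∧Y), H(Y) = H(Y|X) + I(X∧Y),
# and H(X,Y) = H(X|Y) + H(Y|X) + I(X∧Y).
```

In both schemes the conditional-entropy terms are paid in communication, and I(X∧Y) is what remains for the key.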
Q1: What forms of CR allow agreeing on an optimum rate SK?
Q2: Does the minimum communication rate correspond to a "minimum" CR?
Interactive Form of Wyner's Common Information

◮ Wyner's Common Information
CI(X∧Y) ≡ min. rate of a function L = L(X^n, Y^n) such that (1/n) I(X^n ∧ Y^n | L) ≈ 0.
Defined in the context of source generation and source coding.
◮ Interactive Common Information
Terminals agree on a CR J using r rounds of communication F.
CI_i^r(X∧Y) ≡ min. rate of (J, F) such that (1/n) I(X^n ∧ Y^n | J, F) ≈ 0.
CI_i(X∧Y) = lim_{r→∞} CI_i^r(X∧Y).
Note: CI(X∧Y) ≤ CI_i(X∧Y) ≤ min{H(X), H(Y)}.
Minimum Communication for Optimum SK

To find the minimum rate of communication R_SK for an optimum rate SK, we characterize the CR associated with optimum rate SK.

◮ A CR J recoverable from communication F ⇒ an SK of rate (1/n) H(J|F).
◮ An optimum rate SK corresponds to a CR J recoverable from F s.t.
  (1/n) I(X^n ∧ Y^n | J, F) ≈ 0.
- For instance: J = X^n, Y^n, or (X^n, Y^n).
- Let R_CI be the min. rate of communication for such a CR.
◮ We shall show: a CR of the above form always yields an optimum rate SK.
Theorem
R_SK = R_CI = CI_i(X∧Y) − I(X∧Y).
◮ Lemma: For each r ≥ 1:
- R_SK^r ≥ R_CI^r ≥ R_SK^{r+1},
- R_CI^r ≥ CI_i^r(X∧Y) − I(X∧Y) ≥ R_CI^{r+1}.
◮ The Theorem follows by taking the limit r → ∞ in the Lemma.
Idea of the Proof

◮ The proof is based on the observation:
  I(X∧Y) ≈ (1/n) [ I(X^n ∧ Y^n | J, F) + H(J, F) − H(F|X^n) − H(F|Y^n) ].
◮ Characterization of the form of CR in optimum SK generation:
  (1/n) I(X^n ∧ Y^n | J, F) ≈ 0 if and only if
  I(X∧Y) ≈ (1/n) [ H(J, F) − ( H(F|X^n) + H(F|Y^n) ) ].
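The observation above can be checked numerically in the simplest setting of a single message F = f(X) with J = F (so J is trivially recoverable from the communication); in this case the identity holds exactly by the chain rule, I(X,F ∧ Y) = I(X∧Y) = I(F∧Y) + I(X∧Y|F). A sketch with an arbitrary illustrative pmf, taking f to be the parity of X (our choice, not the talk's):

```python
from math import log2

def H(pmf):
    """Shannon entropy (bits) of a pmf given as a dict outcome -> probability."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, idx):
    """Marginalize a pmf over tuple-valued outcomes onto the coordinates in idx."""
    out = {}
    for k, p in pmf.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0) + p
    return out

# Illustrative joint pmf of (X, Y) with X in {0,1,2,3}, Y in {0,1}.
pxy = {(0, 0): 0.15, (0, 1): 0.05, (1, 0): 0.10, (1, 1): 0.20,
       (2, 0): 0.05, (2, 1): 0.15, (3, 0): 0.20, (3, 1): 0.10}

# One message from terminal 1: F = f(X), here the parity of X; take J = F.
pxyf = {(x, y, x % 2): p for (x, y), p in pxy.items()}

pX, pY, pF = marginal(pxyf, (0,)), marginal(pxyf, (1,)), marginal(pxyf, (2,))
pXF, pYF = marginal(pxyf, (0, 2)), marginal(pxyf, (1, 2))

I_XY         = H(pX) + H(pY) - H(pxy)
I_XY_given_F = H(pXF) + H(pYF) - H(pxyf) - H(pF)
H_F_given_X  = H(pXF) - H(pX)   # = 0, since F is a function of X
H_F_given_Y  = H(pYF) - H(pY)
H_JF         = H(pF)            # J = F, so H(J, F) = H(F)

# The observation: I(X∧Y) = I(X∧Y|J,F) + H(J,F) - H(F|X) - H(F|Y).
rhs = I_XY_given_F + H_JF - H_F_given_X - H_F_given_Y
```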
Minimum Communication for Optimum SK

Theorem
R_SK = min. rate of F required for optimum rate SK generation.
R_CI = min. rate of F required for generating a CR J s.t. X^n ⊥⊥ Y^n | (J, F) (approximately).
Then,
R_SK = R_CI = CI_i(X∧Y) − I(X∧Y),
where CI_i(X∧Y) = lim_{r→∞} CI_i^r(X∧Y).
Characterization of CI_i^r

◮ Given rvs X, Y and r ≥ 1, we have
  CI_i^r(X∧Y) = min I(X, Y ∧ U_1, V_1, ..., U_r, V_r),
where the minimum is over U_1, V_1, ..., U_r, V_r satisfying (with U^i = (U_1, ..., U_i), V^i = (V_1, ..., V_i)):
  U_1 −◦− X −◦− Y,
  U_i −◦− (X, U^{i-1}, V^{i-1}) −◦− Y, 2 ≤ i ≤ r,
  V_i −◦− (Y, U^i, V^{i-1}) −◦− X, 1 ≤ i ≤ r,
  X −◦− (U^r, V^r) −◦− Y.
Noninteractive Communication for SK Capacity

◮ Communication from X^n to Y^n:
  R_SK^one = min I(X, Y ∧ U) − I(X∧Y),
  where the min is over U satisfying: U −◦− X −◦− Y, X −◦− U −◦− Y.
◮ [Problem 16.26, Csiszár–Körner]
  min over such U of I(X, Y ∧ U) = min over g(X) s.t. X −◦− g(X) −◦− Y of H(g(X)).
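The right-hand side of the Csiszár–Körner characterization can be computed concretely: the minimizing g merges the symbols x whose conditional distributions P(Y|X = x) coincide, giving the coarsest function of X with X −◦− g(X) −◦− Y (any valid g refines this partition, and refining only increases entropy). A sketch under an illustrative pmf of our own in which x = 0 and x = 1 induce the same conditional row:

```python
from math import log2

def H(pmf):
    """Shannon entropy (bits) of a pmf given as a dict outcome -> probability."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Illustrative joint pmf: x = 0 and x = 1 have the same conditional P(Y|X=x),
# so the minimal g merges them (numbers are arbitrary).
pxy = {(0, 0): 0.10, (0, 1): 0.10,
       (1, 0): 0.15, (1, 1): 0.15,
       (2, 0): 0.40, (2, 1): 0.10}

px = {}
for (x, y), p in pxy.items():
    px[x] = px.get(x, 0) + p

def cond_row(x):
    """Conditional distribution P(Y|X=x), rounded to make rows comparable."""
    return tuple(round(pxy.get((x, y), 0) / px[x], 12) for y in (0, 1))

# Group x-values by their conditional row; the induced map g is the coarsest
# function of X satisfying the Markov chain X -o- g(X) -o- Y.
groups = {}
for x in px:
    groups.setdefault(cond_row(x), []).append(x)
g = {x: i for i, (row, xs) in enumerate(sorted(groups.items())) for x in xs}

pg = {}
for x, p in px.items():
    pg[g[x]] = pg.get(g[x], 0) + p

H_gX = H(pg)   # min over valid g of H(g(X)); here H(0.5, 0.5) = 1 bit
H_X  = H(px)   # strictly larger, since the merge is nontrivial
```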
Common Information Quantities

For a pair of rvs X, Y:
CI_GC ≤ I(X∧Y) ≤ CI ≤ CI_i ≤ CI_i^r ≤ CI_i^{r-1} ≤ min{H(X), H(Y)}.
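The two ends of this chain can be computed directly for a small example (our own sketch, illustrative pmf): CI_GC is the entropy of the Gács–Körner common part, i.e., of the connected component of the support of the joint pmf viewed as a bipartite graph, which both terminals can compute without any communication.

```python
from math import log2

def H(pmf):
    """Shannon entropy (bits) of a pmf given as a dict outcome -> probability."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Illustrative joint pmf with a nontrivial common part: the support splits into
# two blocks ({0,1} x {0,1} and {2} x {2}) that never mix.
pxy = {(0, 0): 0.20, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 2): 0.40}

px, py = {}, {}
for (x, y), p in pxy.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

# Union-find over the bipartite support graph: nodes ('x', x) and ('y', y) are
# joined whenever pxy(x, y) > 0; components are the values of the common part.
parent = {}
def find(a):
    while parent.get(a, a) != a:
        a = parent[a]
    return a
for (x, y) in pxy:
    rx, ry = find(('x', x)), find(('y', y))
    if rx != ry:
        parent[rx] = ry

comp = {}
for (x, y), p in pxy.items():
    c = find(('x', x))
    comp[c] = comp.get(c, 0) + p

CI_GC = H(comp)                 # entropy of the common part
I_XY  = H(px) + H(py) - H(pxy)  # mutual information
```

For this pmf the strict inequalities CI_GC < I(X∧Y) < min{H(X), H(Y)} hold, matching the chain above.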
◮ CI_i is indeed a new quantity:
- For binary rvs X and Y: CI_i(X∧Y) = min{H(X), H(Y)}.
- For binary symmetric X and Y: CI(X∧Y) < min{H(X), H(Y)}.