ARTICLE IN PRESS. Systems & Control Letters xx (xxxx) xxx–xxx
Contents lists available at ScienceDirect
Systems & Control Letters
journal homepage: www.elsevier.com/locate/sysconle
Robust coding for a class of sources: Applications in control and reliable communication over limited capacity channels

Alireza Farhadi a,∗, Charalambos D. Charalambous b

a School of Information Technology and Engineering, University of Ottawa, Ontario, Canada
b Electrical and Computer Engineering Department, University of Cyprus, 75 Kallipoleos Avenue, Nicosia, Cyprus
a r t i c l e  i n f o

Article history:
Received 4 February 2007
Received in revised form 25 June 2008
Accepted 25 June 2008
Available online xxxx

Keywords:
Limited capacity constraint
Robust stability and observability
Shannon lower bound
a b s t r a c t

This paper is concerned with the control of a class of dynamical systems over finite capacity communication channels. Necessary conditions for reliable data reconstruction and stability of a class of dynamical systems are derived. The methodology is information theoretic. It introduces the notion of entropy for a class of sources, which is defined as a maximization of the Shannon entropy over a class of sources. It also introduces the Shannon information transmission theorem for a class of sources, which states that the channel capacity should be greater than or equal to the mini-max rate distortion (the maximization is over the class of sources) for reliable communication. When the class of sources is described by a relative entropy constraint between a class of source densities and a given nominal source density, the explicit solution to the maximum entropy is given, and its connection to Rényi entropy is illustrated. Furthermore, this solution is applied to a class of controlled dynamical systems to address necessary conditions for reliable data reconstruction and stability of such systems.
© 2008 Published by Elsevier B.V.
1. Introduction

Over the last few years there has been extensive research activity in addressing analysis and design questions associated with the control of deterministic and stochastic systems over communication channels with limited channel capacity. This line of research is motivated by applications in which the communication data rates from the channel input to the controller input are limited and feedback is available from the output of the channel to the input of the channel. A typical scenario of such a control/communication system is the block diagram of Fig. 1, in which the dynamical system (i.e., the source of information) is controlled via a limited capacity communication channel. This system can be viewed as a general communication system with feedback in which the output of the controlled system is the information source, which is transmitted over a feedback communication channel to the controller, whose output is the input to the controlled system.

The present paper is concerned with necessary conditions for reliable data reconstruction (known as observability in the literature) and stability subject to limited channel capacity and uncertainty in the source of information. Throughout, we consider the system of Fig. 1 in which the dynamical system is controlled via a limited capacity communication channel when the source statistics belong to a prescribed class.
∗ Corresponding author. Tel.: +1 613 255 0213. E-mail address: [email protected] (A. Farhadi).
Fig. 1. Control/communication system.
The control/communication system of Fig. 1 is defined on a complete probability space (Ω, F(Ω), P) with filtration {F_t}_{t≥0}, t ∈ N+ ≜ {0, 1, 2, . . .}, where Y_t, Z_t, Z̃_t, Ỹ_t, and U_t, t ∈ N+, are Random Variables (R.V.'s) denoting the source message, channel input codeword, channel output codeword, the reproduction of the source message, and the control input to the source, respectively.
0167-6911/$ – see front matter © 2008 Published by Elsevier B.V. doi:10.1016/j.sysconle.2008.06.006
Please cite this article in press as: A. Farhadi, C.D. Charalambous, Robust coding for a class of sources: Applications in control and reliable communication over limited capacity channels, Systems & Control Letters (2008), doi:10.1016/j.sysconle.2008.06.006
The objective is to find general necessary conditions for observability and stability of a class of sources which are independent of the information available to the encoder, decoder and controller (i.e., the information patterns of the encoder, decoder and controller). Here, observability means reconstruction in the sense that Ỹ_t follows Y_t with some distortion, while stability means that the state variable is bounded for all times, when there is a limited data rate constraint.

References [1–10] are representative, although not exhaustive, of the recent activity addressing necessary and sufficient conditions for stability and observability of unstable control systems subject to limited channel capacity. The necessary part is often obtained through the converse of the Information Transmission Theorem ([11], p. 72), which states that the information capacity should be at least equal to the rate at which information is generated by the source (subject to a distortion when continuous sources are invoked).
The material presented in this paper complements that found in [5,12,13] in the problem formulation, the methodology considered and the results obtained. Specifically, we wish to extend some of the concepts and results associated with control/communication systems to a class of sources or dynamical systems which are controlled over limited capacity channels. Thus, instead of considering a single source generating information which is then transmitted over the communication channel, we consider a class of sources or dynamical systems. Since the methodology is information theoretic, we introduce the notion of entropy for a class of sources, which represents the amount of information generated by this class. This is defined as a maximization of the entropy of the source over the class of admissible sources considered, which corresponds to the maximum amount of information generated by the class of sources. Then, we invoke a modified version of the information transmission theorem and the Shannon lower bound to account for the class of sources, and link them to the definitions of uniform observability and robust stability. Although the approach is information theoretic, it is evident that the methodology is reminiscent of the mini-max approach associated with the investigation of robustness of control systems [14].

This generalization is important in control applications since dynamical models are often simplified representations of the true models; hence, the robustness of the necessary conditions for reliable communication and robust stability should be investigated. This may be viewed as a first step towards introducing a robust analysis for the system of Fig. 1, in a manner analogous to what is done in robust control [15].
The general necessary conditions presented in this note for the existence of an encoder, decoder and controller for uniform observability and robust stability state that C ≥ R_r(D_v) ≥ R_{S,r}(D_v), where C is the information capacity of the channel, R_r(D_v) is the mini-max rate distortion in which the maximization is with respect to the class of sources and the minimization is over the class of reproduction kernels (between the source and its reconstruction), and R_{S,r}(D_v) is the maximization of the Shannon lower bound over the class of sources. Throughout the paper, the class of sources is described by a relative entropy constraint between a class of source densities and a given nominal source density; an explicit solution to the maximum entropy for such a class of sources is found, and its connection to Rényi entropy is shown. Furthermore, the maximum entropy is calculated for a class of partially observed Gauss Markov sources.

The paper is organized as follows. In Section 2, the robust entropy (i.e., the entropy for a class of sources) and the robust entropy rate are defined, and the explicit solution to the robust entropy and its connection to Rényi entropy are given. In Section 3, a class of partially observed Gauss Markov sources is considered, and the robust entropy rate is calculated. Finally, in Section 4, the notion of robust entropy rate is employed to derive necessary conditions for uniform observability and robust stability of the control/communication system of Fig. 1. The derivations are given in the Appendix, while the notation employed is presented in Table 1.
2. Robust entropy and entropy rate

In this section, the definitions of robust entropy and subsequently robust entropy rate for a class of sources are given. Then, by considering a class of sources described by a relative entropy constraint, the explicit solutions to the robust entropy and entropy rate are presented. These results are employed in Section 4 to derive necessary conditions for uniform observability and robust stability of a class of controlled systems over a limited capacity communication channel.
Robust entropy and entropy rate definition. Let D denote the set of all Probability Density Functions (PDF's) corresponding to an R.V. Y : (Ω, F(Ω)) → (R^d, B(R^d)), and let D^{0,T} denote the set of all joint PDF's corresponding to a sequence of such R.V.'s, Y^T ≜ {Y_t}_{t=0}^{T}, Y_t : (Ω, F(Ω)) → (R^d, B(R^d)). In real-world applications, the source statistics are not entirely known; rather, they are known to belong to a specific class of sources, known as the uncertain class. This class of sources is often characterized by the source density (resp. joint source density) with respect to a nominal fixed source density g_Y ∈ D (resp. g_{Y^T} ∈ D^{0,T}). Thus, the nominal source corresponds to the a priori knowledge in the absence of the true knowledge of the source. Suppose the true source density f_Y ∈ D (resp. f_{Y^T} ∈ D^{0,T}) belongs to the class D_{SU} ⊂ D (resp. D^{0,T}_{SU} ⊂ D^{0,T}); then the entropy associated with a class of sources is defined as follows.
Definition 2.1 (Entropy and Entropy Rate for a Class of Sources). Suppose a class of R.V.'s Y : (Ω, F) → (R^d, B(R^d)) induces a class of PDF's belonging to the class f_Y ∈ D_{SU}, D_{SU} ⊂ D. The robust entropy associated with the family D_{SU} of the sources is defined by

H_r(f_Y^*) ≜ sup_{f_Y ∈ D_{SU}} H_S(f_Y),   (1)

where H_S(f_Y) ≜ −∫_{R^d} f_Y(y) log f_Y(y) dy is the Shannon entropy [16] and f_Y^* ∈ arg sup_{f_Y ∈ D_{SU}} H_S(f_Y). Moreover, for a class of sequences Y^{T−1} of R.V.'s with length T, which induces a class of joint PDF's belonging to the class f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}, D^{0,T−1}_{SU} ⊂ D^{0,T−1}, the robust entropy rate associated with the family D^{0,T−1}_{SU} ⊂ D^{0,T−1} is defined by

H_r(Y) ≜ lim_{T→∞} (1/T) H_r(f^*_{Y^{T−1}}) = lim_{T→∞} (1/T) sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} H_S(f_{Y^{T−1}}),   (2)

provided the limit exists.
The above extension of entropy corresponds to the maximum amount of information generated by a class of sources. An application to lossless source coding for a class of finite alphabet sources is given in [17].
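As a toy illustration of Definition 2.1 (our own example, not from the paper), take the uncertainty class to be a finite family of scalar zero-mean Gaussian densities; the robust entropy is then the largest Shannon entropy in the family, attained by the largest variance:

```python
import math

# Toy illustration of Definition 2.1 (hypothetical uncertainty class):
# D_SU = {N(0, v) : v in a finite set of variances}.
# The robust entropy H_r is the sup over the class of the Shannon entropy H_S.

def gaussian_entropy(variance):
    # Shannon differential entropy of N(m, variance), in nats
    return 0.5 * math.log(2 * math.pi * math.e * variance)

variances = [0.5, 1.0, 2.0, 4.0]              # assumed uncertainty class
entropies = [gaussian_entropy(v) for v in variances]
H_r = max(entropies)                          # robust entropy, as in (1)
worst_case = variances[entropies.index(H_r)]  # density achieving the sup
print(H_r, worst_case)
```

The sup is achieved by the member with the largest variance, consistent with entropy being monotone in the variance for Gaussians.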
Class of sources described by relative entropy. Denote by g_Y ∈ D the nominal source density and let the true source density f_Y ∈ D belong to the class of sources described by the relative entropy constraint D_{SU}(g_Y) ≜ {f_Y ∈ D ; H(f_Y ‖ g_Y) ≤ R_c}, g_Y ∈ D, where H(f_Y ‖ g_Y) ≜ ∫ log(f_Y(y)/g_Y(y)) f_Y(y) dy (when log(f_Y(y)/g_Y(y)) f_Y(y) is integrable and g_Y(y) = 0 implies f_Y(y) = 0 for all such y's), log(.) is the natural logarithm and R_c ∈ [0, ∞) is fixed.
Table 1
Summary of notation

Y^T : (Y_0, Y_1, . . . , Y_T)
D : Set of all PDF's
D^{0,T} : Set of all PDF's associated with Y^T
D_{SU}, D^{0,T}_{SU} : Sets of admissible density functions
N+ : {0, 1, 2, . . .}
R^d : Set of d-dimensional real vectors
log(.) : Natural logarithm
L1(R^d, R+) : Set of integrable functions
‖.‖ : Euclidean norm
g_Y, g_{Y^T} : Nominal density functions
f_Y, f_{Y^T} : True source density functions
H_S(g_Y) : Shannon entropy
H_S(Y) : Shannon entropy rate
H_R(g_Y) : Rényi entropy
R^{SRD}_r(D_v) : Robust sequential rate distortion
H_r(f_Y) : Robust entropy
H_r(Y) : Robust entropy rate
H(f_Y ‖ g_Y) : Relative entropy
C_n : Capacity of n channel uses
C : Capacity
I(.;.) : Mutual information
R_r(D_v) : Robust rate distortion
R_{S,r}(D_v) : Robust Shannon lower bound
Var[.] : Variance
Cov(.) : Covariance
o(.) : Residue
E[.] : Expected value
Pr(A) : Probability of the event A
N(m, Γ) : Gaussian distribution
A^{−1} : Matrix inverse
I_{m×m} : (m × m) identity matrix
′ : Matrix transpose
det(.) : Determinant of a square matrix
σ{.} : Sigma algebra
D_{SU}(g_Y) : {f_Y ∈ D ; H(f_Y ‖ g_Y) ≤ R_c}
D_{SU}(g_{Y^{T−1}}) : {f_{Y^{T−1}} ; H(f_{Y^{T−1}} ‖ g_{Y^{T−1}}) ≤ T R_c}
Next, consider the robust entropy definition over the class of sources described by the relative entropy constraint:

H_r(f_Y^*) ≜ sup_{f_Y ∈ D_{SU}(g_Y)} H_S(f_Y).   (3)
In the following theorem, we derive the maximum information generated by this class of sources.

Theorem 2.2. Suppose for some s ∈ [0, ∞), (g_Y(y))^{s/(1+s)} ∈ L1(R^d, R+), where L1(R^d, R+) is the set of non-negative integrable functions defined on R^d. Then, we have the following.

(i) The robust entropy associated with the class D_{SU}(g_Y) is given by

H_r(f_Y^{*,s*}) = sup_{f_Y ∈ D_{SU}(g_Y)} H_S(f_Y) = min_{s≥0} [ s R_c + (1 + s) log ∫ (g_Y(y))^{s/(1+s)} dy ].   (4)

Moreover, if for some s ∈ [0, ∞), (g_Y(y))^{s/(1+s)} log (g_Y(y))^{−1/(1+s)} ∈ L1(R^d, R+), the supremum f_Y^{*,s} is achieved by

f_Y^{*,s}(y) = (g_Y(y))^{s/(1+s)} / ∫ (g_Y(y))^{s/(1+s)} dy.   (5)

(ii) If for some s ≥ 0, (log g_Y(y))(g_Y(y))^{s/(1+s)} ∈ L1(R^d, R+) and (log g_Y(y))^2 (g_Y(y))^{s/(1+s)} ∈ L1(R^d, R+), then the minimum in (4) with respect to s ≥ 0 is the solution of H(f_Y^{*,s} ‖ g_Y)|_{s=s*} = R_c. Moreover, H(f_Y^{*,s} ‖ g_Y) is a non-increasing function of s ≥ 0; that is, for 0 ≤ s* ≤ s_1 ≤ s_2,

0 ≤ H(f_Y^{*,s} ‖ g_Y)|_{s=s_2} ≤ H(f_Y^{*,s} ‖ g_Y)|_{s=s_1} ≤ H(f_Y^{*,s} ‖ g_Y)|_{s=s*} = R_c.   (6)

Proof. See Appendix.
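As a numerical sanity check (our own, with an assumed scalar Gaussian nominal g_Y = N(0, σ²) and an assumed radius R_c), the tilted density (5) is again Gaussian with variance σ²(1+s)/s, so both the minimum in (4) and the constraint H(f^{*,s*} ‖ g_Y) = R_c of part (ii) can be evaluated in closed form and compared against a direct grid minimization over s:

```python
import math

# Sanity check of Theorem 2.2 for an assumed scalar Gaussian nominal
# g_Y = N(0, sigma2) and relative entropy radius Rc (both values made up).
sigma2, Rc = 2.0, 0.1

def objective(s):
    # s*Rc + (1+s)*log ∫ g(y)^{s/(1+s)} dy : the bracket in (4).
    # For N(0, sigma2): ∫ g^a dy = (2*pi*sigma2)^{(1-a)/2} / sqrt(a), a = s/(1+s).
    a = s / (1.0 + s)
    log_integral = 0.5*(1.0 - a)*math.log(2*math.pi*sigma2) - 0.5*math.log(a)
    return s*Rc + (1.0 + s)*log_integral

# crude grid minimization over s > 0
grid = [0.001*k for k in range(1, 20001)]
s_star = min(grid, key=objective)
H_r = objective(s_star)

# Part (ii): the tilted density f^{*,s} = N(0, sigma2*(1+s)/s) should satisfy
# H(f^{*,s*} || g_Y) = Rc; KL between zero-mean Gaussians in closed form:
ratio = (1.0 + s_star) / s_star                  # variance ratio
kl = 0.5*(ratio - 1.0 - math.log(ratio))
# and its Shannon entropy should equal the min in (4):
H_tilted = 0.5*math.log(2*math.pi*math.e*sigma2*ratio)
print(s_star, H_r, kl, H_tilted)
```

Both checks agree to grid resolution: the maximizing density meets the relative entropy constraint with equality and its entropy equals the min-formula value.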
Remark 2.3. (i) Eq. (4) has a similar form to the error exponent formula for finite alphabet source coding problems. However, in the current setting this formula is obtained for a class of sources, via the relative entropy uncertainty, while the alphabet is general.

(ii) An alternative expression for (4), using Taylor's expansion, is given by H_r(f_Y^{*,s*}) = min_{s≥0} [ s R_c + H_S(g_Y) + (1/(2(1+s))) Var[log(1/g_Y(y))] + o(1/(1+s)) ]. This shows that the robust entropy also includes the variance of the self-information − log g_Y(y) around the entropy H_S(g_Y).

(iii) The above solution to the robust entropy is related to the Rényi entropy, defined by H_R(g_Y) ≜ (1/(1−α)) log ∫ g_Y^α(y) dy, α > 0, α ≠ 1, g_Y^α(y) ∈ L1(R^d, R+) [18], which is obtained by relaxing the mean value property of the Shannon entropy from an arithmetic to an exponential mean. Assume s* > 0 and let α = s/(1+s), s > 0; then

min_{α∈(0,1)} H_R(g_Y) ≤ H_r(f_Y^{*,s*}) = min_{α∈(0,1)} [ (α/(1−α)) R_c + H_R(g_Y) ] ≤ (α/(1−α)) R_c + H_R(g_Y).   (7)

Although it is well known that Rényi entropy gives the Shannon entropy as a special case [19], as discussed in [20] one special application of Rényi entropy is to measure the complexity of a signal through its so-called Time-Frequency Representation (TFR). The negative values taken on by most TFR's prohibit the application of the Shannon entropy. As shown in [20], for certain values of α > 0, the Rényi entropy measures the signal complexity. Note that the observation that maximization of entropy over a relative entropy constraint yields Rényi entropy was first investigated in [17] in the context of lossless source coding. Note also that in (4) the term (1 + s) log ∫ (g_Y(y))^{s/(1+s)} dy can be viewed as the Rényi entropy with α = s/(1+s), and this entropy can be computed for a large class of probability densities (see the anthology of densities given in [19]).
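The identification in (iii) can be checked numerically (our own sketch, with an assumed Gaussian nominal): the term (1+s) log ∫ g^{s/(1+s)} dy appearing in (4), evaluated by direct numerical integration, matches the closed-form Rényi entropy of the Gaussian at α = s/(1+s), since 1/(1−α) = 1+s:

```python
import math

# Numerical check of Remark 2.3(iii) for an assumed nominal N(0, sigma2):
# (1+s) * log ∫ g(y)^{s/(1+s)} dy  equals  H_R(g_Y) at alpha = s/(1+s).
sigma2, s = 1.5, 2.0
alpha = s / (1.0 + s)

def g(y):  # nominal density N(0, sigma2)
    return math.exp(-y*y / (2*sigma2)) / math.sqrt(2*math.pi*sigma2)

# Riemann sum for ∫ g^alpha over a wide grid (Gaussian tails are negligible)
lo, hi, n = -30.0, 30.0, 60001
h = (hi - lo) / (n - 1)
log_integral = math.log(sum(g(lo + i*h)**alpha for i in range(n)) * h)

term_in_4 = (1.0 + s) * log_integral                 # the term in (4)
# closed-form Rényi entropy of N(0, sigma2) at this alpha
H_R = 0.5*math.log(2*math.pi*sigma2) - math.log(alpha)/(2*(1 - alpha))
print(term_in_4, H_R)
```

The two values agree to integration accuracy, confirming that the second term of (4) is exactly a Rényi entropy of the nominal density.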
The extension of Theorem 2.2 to a class of sequences is presented in the following corollary, which is a direct consequence of Theorem 2.2.

Corollary 2.4. Consider a class of sequences Y^{T−1} of R.V.'s with corresponding joint density function f_{Y^{T−1}} ∈ D_{SU}(g_{Y^{T−1}}) ⊂ D^{0,T−1}, described by

D_{SU}(g_{Y^{T−1}}) = {f_{Y^{T−1}} ∈ D^{0,T−1} ; H(f_{Y^{T−1}} ‖ g_{Y^{T−1}}) ≤ T R_c}.   (8)

Then, the statements of Theorem 2.2 hold for the class of sequences Y^{T−1} by replacing R_c with T R_c. Consequently, the robust entropy H_r(f^{*,s}_{Y^{T−1}}) corresponding to the class (8) is given by (4), in which R_c is replaced by T R_c and g_Y by g_{Y^{T−1}}. Subsequently, the robust entropy rate corresponding to the class (8) is given by H_r(Y) = lim_{T→∞} (1/T) H_r(f^{*,s*}_{Y^{T−1}}), provided the limit exists.

Note that if the sequence Y^{T−1} is generated via a stochastic partially observed dynamical system, the class of uncertainty described by (8) can model unknown dynamics in both the state and the observation. The observations are affected by the state
dynamics. Therefore, uncertainty in the state dynamics also affects the observations, and this uncertainty in the dynamics can be described by (8).

Next, in the following lemma, the robust entropy rate for a class of sources described via the relative entropy constraint (8) and a Gaussian distributed nominal source is calculated. This lemma follows from the result of Corollary 2.4.
Lemma 2.5. Consider a class of sequences Y^{T−1} = {Y_t}_{t=0}^{T−1}, Y_t ∈ R^d, for which the nominal source joint density g_{Y^{T−1}} ∈ D^{0,T−1} is Td-dimensional Gaussian (i.e., Y^{T−1} ∼ N(m_T, Γ_T), Γ_T ≜ Cov[(Y′_0, . . . , Y′_{T−1})′]) with Shannon entropy rate H_S(Y) ≜ lim_{T→∞} (1/T) H_S(g_{Y^{T−1}}). Then,

H_r(Y) = (d/2) log((1 + s*)/s*) + H_S(Y),   (9)

where for a given R_c ∈ [0, ∞), s* > 0 is the unique solution of the following nonlinear equation:

R_c = −(d/2) log((1 + s*)/s*) + d/(2s*),   R_c ∈ [0, ∞).   (10)

Proof. See Appendix.
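Since the right-hand side of (10) is strictly decreasing in s* (tending to +∞ as s* → 0 and to 0 as s* → ∞), the equation can be solved by bisection. A minimal sketch with assumed values of d and R_c > 0:

```python
import math

# Solve the nonlinear equation (10) for s* by bisection, then evaluate the
# robust entropy rate gap of (9).  d and Rc are assumed example values.
d, Rc = 2, 0.05

def rhs(s):
    # right-hand side of (10); strictly decreasing in s on (0, infinity)
    return -(d/2.0)*math.log((1.0 + s)/s) + d/(2.0*s)

lo, hi = 1e-9, 1.0
while rhs(hi) > Rc:        # expand until the root is bracketed
    hi *= 2.0
for _ in range(200):       # bisection: rhs(lo) > Rc > rhs(hi)
    mid = 0.5*(lo + hi)
    if rhs(mid) > Rc:
        lo = mid
    else:
        hi = mid
s_star = 0.5*(lo + hi)

gap = (d/2.0)*math.log((1.0 + s_star)/s_star)   # H_r(Y) - H_S(Y) from (9)
print(s_star, gap)
```

The gap (d/2) log((1+s*)/s*) quantifies how much the uncertainty radius R_c inflates the entropy rate above the nominal Shannon entropy rate.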
3. Class of sources

In this section, the entropy rate is calculated for a class of partially observed Gauss Markov sources.

Consider the following partially observed Gauss Markov nominal source model on (Ω, F(Ω), P; {F_t}_{t≥0}):

X_{t+1} = A X_t + B W_t,  X_0 = X,
Y_t = C X_t + D G_t,  t ∈ N+,   (11)

where X_t ∈ R^q denotes the unobserved (state) process, Y_t ∈ R^d is the observed process, W_t ∈ R^m, G_t ∈ R^l, W_t is Independent Identically Distributed (i.i.d.) ∼ N(0, I_{m×m}), G_t is i.i.d. ∼ N(0, I_{l×l}), X_0 ∼ N(x_0, V_0), and X_0, G_t, W_t are mutually independent, ∀t ∈ N+. Here it is assumed that (C, A) is detectable, (A, (BB′)^{1/2}) is stabilizable, and D ≠ 0.
Denote by g_{Y^{T−1}} ∈ D^{0,T−1} the joint density function of the sequence Y^{T−1} produced by the nominal source model (11). Then, the class of source densities f_{Y^{T−1}} ∈ D^{0,T−1} corresponding to the nominal source density g_{Y^{T−1}} is described by (8). One possible uncertain class of densities is the one generated by letting A → A + ∆A and C → C + ∆C in (11), while satisfying the relative entropy constraint. This will lead to a specific constraint described by ∆A and ∆C. Next, in the following theorem, we recall the results of [13]. We shall use these results to find the robust entropy rate.
Theorem 3.1 ([13]).

(i) Let {Y_t ; t ∈ N+}, Y_t ∈ R^d, be Gaussian and define Γ_T ≜ Cov[(Y′_0, . . . , Y′_{T−1})′], Z_t ≜ Y_t − E[Y_t | σ{Y_0, . . . , Y_{t−1}}], Λ_t ≜ Cov(Z_t), and Λ_∞ = lim_{T→∞} Λ_T ≠ 0, where σ{.} denotes the sigma algebra generated by the sequence Y^{T−1}.

Then, the Shannon entropy rate of {Y_t ; t ∈ N+} in nats per time step is given by

H_S(Y) = (d/2) log(2πe) + lim_{T→∞} (1/(2T)) log det Γ_T = (d/2) log(2πe) + (1/2) log det Λ_∞.   (12)

(ii) Consider the nominal source model (11). Then, Λ_∞ = C V_∞ C′ + DD′, where V_t = E[X̃_t X̃′_t], X̃_t ≜ X_t − E[X_t | σ{Y_0, . . . , Y_{t−1}}], t ∈ N+, and V_∞ = lim_{T→∞} V_T is the unique positive semi-definite solution of the following Algebraic Riccati equation:

V_∞ = A V_∞ A′ − A V_∞ C′(C V_∞ C′ + DD′)^{−1} C V_∞ A′ + BB′.   (13)

Subsequently, the Shannon entropy rate of the observed process {Y_t ; t ∈ N+} of the nominal source model (11) is given by

H_S(Y) = (d/2) log(2πe) + (1/2) log det Λ_∞.   (14)

Proof. The proof of the first part follows from the Cholesky decomposition ([21], p. 44) and the proof of the second part follows from ([21], pp. 156–158, and [21], Theorem 4.2).
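For a scalar instance of (11) (our own assumed numbers), V_∞ can be obtained by iterating the Riccati recursion to its fixed point, after which Λ_∞ and the entropy rate (14) follow directly:

```python
import math

# Scalar example of Theorem 3.1 (assumed parameters, not from the paper):
# X_{t+1} = a X_t + b W_t,  Y_t = c X_t + d_n G_t   (d_n plays the role of D;
# it is renamed to avoid clashing with the observation dimension d = 1).
a, b, c, d_n = 1.2, 1.0, 1.0, 0.5   # unstable state, noisy observation

# Iterate the Riccati recursion until it reaches the fixed point of (13):
# V <- a^2 V - a^2 c^2 V^2 / (c^2 V + d_n^2) + b^2
V = 0.0
for _ in range(10000):
    V = a*a*V - (a*c*V)**2 / (c*c*V + d_n*d_n) + b*b

lam_inf = c*c*V + d_n*d_n                       # Λ_∞ = C V_∞ C' + D D'
H_S = 0.5*math.log(2*math.pi*math.e) + 0.5*math.log(lam_inf)  # (14), d = 1
residual = abs(V - (a*a*V - (a*c*V)**2/(c*c*V + d_n*d_n) + b*b))
print(V, lam_inf, H_S, residual)
```

Because (C, A) is detectable and (A, (BB′)^{1/2}) is stabilizable here, the recursion converges even though the state itself is unstable (|a| > 1).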
Next, in the following corollary, using the results of Lemma 2.5 and Theorem 3.1, we compute the robust entropy rate.

Corollary 3.2. The robust entropy rate of the class of sources described via the relative entropy constraint (8) with the nominal source model (11) is given by H_r(Y) = (d/2) log((1 + s*)/s*) + H_S(Y), H_S(Y) = (d/2) log(2πe) + (1/2) log det Λ_∞, where s* > 0 is the unique solution of (10) and Λ_∞ is given in Theorem 3.1.

Remark 3.3. From (10), it follows that the case R_c = 0 corresponds to s* → ∞. Letting s* → ∞ in the above result, we obtain H_r(Y) = H_S(Y). That is, the robust entropy rate is equal to the Shannon entropy rate of the nominal source. This is expected, since R_c = 0 corresponds to a single source.
4. Application in control/communication systems

In this section, the robust entropy rate is used to establish necessary conditions for uniform observability and robust stability of the control/communication system of Fig. 1, described by a class of controlled sources.

The precise definitions of uniform observability and robust stability of the system of Fig. 1 considered in this paper are as follows.
Definition 4.1 (Uniform Observability in Probability and r-Mean). Consider the block diagram of Fig. 1 described by a class of sources. Let f_{Y^{T−1}} ∈ D^{0,T−1}_{SU} ⊂ D^{0,T−1} be the joint density function of the sequence Y^{T−1} produced by the class of sources.

(i) The controlled source is called uniformly observable in probability if for a given δ ≥ 0 and D_v ∈ [0, 1), there exists (a control sequence), an encoder and a decoder such that lim_{T→∞} (1/T) sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} Σ_{k=0}^{T−1} E ρ(Y_k, Ỹ_k) ≤ D_v, where ρ(Y, Ỹ) is defined by ρ(Y, Ỹ) = 1 if ‖Y − Ỹ‖ > δ and ρ(Y, Ỹ) = 0 if ‖Y − Ỹ‖ ≤ δ.

(ii) For a given r > 0 and a finite D_v ≥ 0, the controlled source is called uniformly observable in r-mean if lim_{T→∞} (1/T) sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} Σ_{k=0}^{T−1} E‖Y_k − Ỹ_k‖^r ≤ D_v.
Definition 4.2 (Robust Stability in Probability and r-Mean). Consider the block diagram of Fig. 1 described by a class of controlled sources, in which Y_t = H_t + Υ_t, where H_t is the signal to be controlled and Υ_t is a function of the perturbation and the measurement noise. Let f_{Y^{T−1}} ∈ D^{0,T−1}_{SU} ⊂ D^{0,T−1} denote the joint density function of the sequence Y^{T−1} produced by the class of sources.
(i) The controlled source is called robustly stabilizable in probability if for a given δ ≥ 0 and D_v ∈ [0, 1), there exists a controller, an encoder, and a decoder such that lim_{T→∞} (1/T) sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} Σ_{k=0}^{T−1} E ρ(H_k, 0) ≤ D_v, where ρ(., .) is the one defined in Definition 4.1.

(ii) For a given r > 0 and a finite D_v ≥ 0, the controlled source is called robustly stabilizable in r-mean if lim_{T→∞} (1/T) sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} Σ_{k=0}^{T−1} E‖H_k − 0‖^r ≤ D_v.
In this section, by finding a connection between capacity, robust rate distortion (i.e., mini-max rate distortion) and a variant of the Shannon lower bound, necessary conditions for uniform observability and robust stability in the form of a lower bound on the capacity are derived. Since the derivation of these conditions involves capacity and rate distortion, we recall the definitions of these measures.
Definition 4.3 (Information Capacity [16]). Consider a memoryless channel without feedback and let Z^{n−1} and Z̃^{n−1} be the channel input and output sequences, respectively. Let D_{CI} denote the set of joint density functions f_{Z^{n−1}} associated with the sequence Z^{n−1} which satisfy a certain channel input power constraint. The Shannon information capacity for n channel uses is defined by

C_n ≜ sup_{f_{Z^{n−1}} ∈ D_{CI}} I(Z^{n−1}; Z̃^{n−1}),
I(Z^{n−1}; Z̃^{n−1}) = ∫ log( f_{Z̃^{n−1}|Z^{n−1}} / f_{Z̃^{n−1}} ) f_{Z̃^{n−1}|Z^{n−1}} f_{Z^{n−1}} dz^{n−1} dz̃^{n−1}.   (15)

Subsequently, the information capacity, in nats per channel use, is C ≜ lim_{n→∞} (1/n) C_n, provided the limit exists.

For Discrete Memoryless Channels (DMC's) without feedback and memoryless Additive White Gaussian Noise (AWGN) channels without feedback, the information channel capacity of Definition 4.3 represents the operational capacity [16], or simply the channel capacity. Furthermore, when feedback is employed, the mutual information in (15) is replaced by the directed information I(Z^{n−1} → Z̃^{n−1}) ≜ Σ_{i=0}^{n−1} I(Z^i; Z̃_i | Z̃^{i−1}) = Σ_{i=0}^{n−1} ∫ log( f_{Z̃_i|Z̃^{i−1},Z^i} / f_{Z̃_i|Z̃^{i−1}} ) f_{Z̃^i,Z^i} dz^i dz̃^i [22]. Under this modification the capacity of DMC's and AWGN channels with and without feedback is the same. Please note that if Z and Z̃ represent the channel input and output codewords, respectively, associated with the source message Y, the capacity is given in nats per source message, and subsequently in nats per time step.
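As a concrete instance of Definition 4.3 (our own illustration, not from the paper), the memoryless AWGN channel with input power constraint P and noise variance N has capacity C = (1/2) log(1 + P/N) nats per channel use; the sketch below evaluates it against an assumed rate requirement of the kind derived later:

```python
import math

# Illustration with assumed numbers: information capacity of a memoryless
# AWGN channel without feedback, C = (1/2) log(1 + P/N) nats per use,
# compared against an assumed required source rate R (nats per use).
P, N = 4.0, 1.0                      # input power constraint and noise variance
C = 0.5 * math.log(1.0 + P / N)      # nats per channel use

R = 0.7                              # assumed required rate (hypothetical)
print(C, C >= R)                     # the necessary condition C >= R holds here
```

By the discussion above, for this channel the same capacity value applies with or without feedback.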
Definition 4.4 (Robust Information Rate Distortion [23]). Let Y^{T−1} and Ỹ^{T−1} be sequences with length T of the source messages and the reproduction of the source messages, respectively. Let f_{Y^{T−1}} ∈ D^{0,T−1}_{SU} ⊂ D^{0,T−1} denote the probability density function of Y^{T−1}, which belongs to the class D^{0,T−1}_{SU} ⊂ D^{0,T−1}. Denote by f_{Ỹ^{T−1}|Y^{T−1}} ∈ D^{0,T−1} the conditional density function of Ỹ^{T−1} given Y^{T−1} and let D_{DC} ≜ {f_{Ỹ^{T−1}|Y^{T−1}} ∈ D^{0,T−1} ; E ρ_T(Y^{T−1}, Ỹ^{T−1}) ≤ D_v} denote the distortion constraint set, in which D_v ≥ 0 is the distortion value and ρ_T ∈ [0, ∞) is the distortion measure.

Then, the robust information rate distortion for the class D^{0,T−1}_{SU} ⊂ D^{0,T−1} is defined by

R_r(D_v) ≜ lim_{T→∞} (1/T) R_{T,r}(D_v),
R_{T,r}(D_v) ≜ inf_{f_{Ỹ^{T−1}|Y^{T−1}} ∈ D_{DC}} sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} I(Y^{T−1}; Ỹ^{T−1}),   (16)

provided the limit exists.
In [23] it is shown that for a single letter distortion measure, i.e., ρ_T(Y^{T−1}, Ỹ^{T−1}) = (1/T) Σ_{t=0}^{T−1} ρ(Y_t, Ỹ_t), and a class of memoryless sources in which D^{0,T−1}_{SU} is compact, (16) represents the minimum rate for uniform reliable communication up to the distortion value D_v. For sources which are part of the control loop (see Fig. 1), it is desirable that the time ordering for encoding and decoding be causal (as described in [6]). For such sources the minimum in (16) must be taken over the set D^{SRD}_{DC} ≜ {{f_{Ỹ_t|Ỹ^{t−1},Y^t}}_{t=0}^{T−1} ; E[ρ(Y_t, Ỹ_t)] ≤ D_v}. That is, (16) must be replaced by the following:

R^{SRD}_r(D_v) ≜ lim_{T→∞} (1/T) R^{SRD}_{T,r}(D_v),
R^{SRD}_{T,r}(D_v) ≜ inf_{{f_{Ỹ_t|Ỹ^{t−1},Y^t}}_{t=0}^{T−1} ∈ D^{SRD}_{DC}} sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} I(Y^{T−1} → Ỹ^{T−1}).   (17)

More elaboration on this issue is found in [13].
Next, in the following theorem, we find a connection between capacity and robust rate distortion. This connection is valid under the assumption that the outputs of the source, encoder, channel, and decoder of the control/communication system of Fig. 1 are subject to the conditional independence assumption. That is, for any T, n ∈ {1, 2, . . .}, the conditional independence of the source output sequence Y^{T−1}, the channel input sequence Z^{n−1} and the channel output Z̃^{n−1} is equivalent to f_{Z̃_t|Z^t,Z̃^{t−1},Y^{T−1}} = f_{Z̃_t|Z^t,Z̃^{t−1}} (t ∈ {0, 1, . . . , n − 1}), and similarly for the rest of the blocks.
Theorem 4.5 (Robust Information Transmission Theorem). Consider the block diagram of Fig. 1 subject to the conditional independence assumption and uncertainty in the source. Then:

(i) I(Z^{n−1}; Z̃^{n−1}) ≥ I(Z^{n−1} → Z̃^{n−1}) ≥ I(Y^{T−1}; Z̃^{n−1}) ≥ I(Y^{T−1}; Ỹ^{T−1}) ≥ I(Y^{T−1} → Ỹ^{T−1}), ∀T, n ∈ {1, 2, 3, . . .}.

(ii) When the control/communication system of Fig. 1 is described by DMC's or memoryless AWGN channels, a necessary condition for reproducing a sequence Y^{T−1} of the source messages up to the distortion value D_v by Ỹ^{T−1} at the decoder end (i.e., E ρ_T(Y^{T−1}, Ỹ^{T−1}) ≤ D_v, ∀f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}) using a sequence of channel inputs-outputs with length n (T ≤ n) is C_n ≥ R_{T,r}(D_v).

Proof. (i) The second inequality follows from ([22], Theorem 3). The first and the last inequalities also follow from ([22], Theorem 1), and the third inequality is a direct result of the data processing inequality. (ii) See Appendix.
Please note that under causal time-ordering, a tight necessary condition is given by C_n ≥ R^{SRD}_{T,r}(D_v), which also follows from the data processing inequality of Theorem 4.5(i). Next, we present a lower bound for the robust rate distortion in terms of the robust entropy rate of the source. We use this lemma and Theorem 4.5 to relate the capacity to the robust entropy rate for uniform observability and robust stability.
Lemma 4.6 (Robust Shannon Lower Bound). Let Y^{T−1} = {Y_t}_{t=0}^{T−1}, Y_t ∈ R^d, be a sequence with length T of the messages produced by a class of sources with corresponding joint density function f_{Y^{T−1}} ∈ D^{0,T−1}_{SU} ⊂ D^{0,T−1}. Consider the following single letter distortion measure: ρ_T(Y^{T−1}, Ỹ^{T−1}) = (1/T) Σ_{t=0}^{T−1} ρ(Y_t, Ỹ_t), where ρ(Y_t, Ỹ_t) = ρ(Y_t − Ỹ_t) : R^d → [0, ∞) is continuous.

Then, a lower bound for (1/T) R_{T,r}(D_v) is given by

(1/T) R_{T,r}(D_v) ≥ (1/T) H_r(f^*_{Y^{T−1}}) − max_{h ∈ G_D} H_S(h),   (18)

where G_D is defined by G_D ≜ {h : R^d → [0, ∞) ; ∫_{R^d} h(ξ) dξ = 1, ∫_{R^d} ρ(ξ) h(ξ) dξ ≤ D_v}, ξ ∈ R^d. Moreover, when ∫_{R^d} e^{sρ(ξ)} dξ < ∞ for all s < 0, the h*(ξ) ∈ G_D that maximizes H_S(h) is
given by h*(ξ) = e^{sρ(ξ)} / ∫_{R^d} e^{sρ(ξ)} dξ, where s < 0 satisfies ∫_{R^d} ρ(ξ) h*(ξ) dξ = D_v. Subsequently, when R_r(D_v) and H_r(Y) exist, the robust Shannon lower bound R_{S,r}(D_v) is given by

R_r(D_v) ≥ H_r(Y) − max_{h ∈ G_D} H_S(h) ≜ R_{S,r}(D_v).   (19)

Proof. See Appendix.
Note that for the distortion measure ρ_T(Y^{T−1}, Ỹ^{T−1}) = (1/T) Σ_{t=0}^{T−1} ‖Y_t − Ỹ_t‖², when R_{T,r}(D_v) = sup_{f_{Y^{T−1}} ∈ D^{0,T−1}_{SU}} R_T(D_v) (R_T(D_v) is the rate distortion function for a single source), the robust Shannon lower bound is exact for sufficiently small D_v.
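For squared-error distortion ρ(ξ) = ξ² in one dimension, the maximizer h* in Lemma 4.6 is e^{sξ²}/Z with s = −1/(2D_v), i.e., the Gaussian N(0, D_v). A small numerical sketch (our own) checks that its second moment equals D_v and that its entropy, (1/2) log(2πe D_v), exceeds that of other densities with the same second moment:

```python
import math

# Check of the maximizer in Lemma 4.6 for rho(xi) = xi^2, d = 1 (our sketch):
# h*(xi) = exp(s*xi^2)/Z with s = -1/(2*Dv) is N(0, Dv); among densities with
# E[xi^2] <= Dv it has the largest entropy, (1/2) log(2*pi*e*Dv).
Dv = 0.3
s = -1.0 / (2.0 * Dv)

lo, hi, n = -20.0, 20.0, 40001
step = (hi - lo) / (n - 1)
xs = [lo + i*step for i in range(n)]
unnorm = [math.exp(s*x*x) for x in xs]
Z = sum(unnorm) * step
second_moment = sum(x*x*u for x, u in zip(xs, unnorm)) * step / Z

H_gauss = 0.5*math.log(2*math.pi*math.e*Dv)      # entropy of the maximizer
# competitors with the same second moment Dv:
H_uniform = 0.5*math.log(12.0*Dv)                # uniform of width sqrt(12*Dv)
H_laplace = 1.0 + 0.5*math.log(2.0*Dv)           # Laplace with variance Dv
print(second_moment, H_gauss, H_uniform, H_laplace)
```

The uniform and Laplace competitors both fall short of the Gaussian entropy, as the exponential-family form of h* predicts.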
Next, by combining Theorem 4.5 and Lemma 4.6, necessary conditions for uniform observability and robust stability of the control/communication system of Fig. 1 are derived in the following theorem.
Theorem 4.7. Consider the control/communication system of Fig. 1, under the conditional independence assumption, described by a class of sources over DMCs or memoryless AWGN channels. Assume the robust entropy rate of the class of sources exists and is finite. Then:
(i) A necessary condition for uniform observability in probability is
$$C \ge H_r(Y) - \frac{1}{2}\log[(2\pi e)^d \det \Gamma_g] = R_{S,r}(D_v) \quad \text{(nats/time step)}, \qquad (20)$$
where $H_r(Y)$ is the robust entropy rate of the class of sources and $\Gamma_g$ is the covariance matrix of the Gaussian distribution $h^*(\xi) \sim N(0, \Gamma_g)$, $\xi \in \mathbb{R}^d$, which satisfies $\int_{\|\xi\| > \delta} h^*(\xi)\,d\xi = D_v$.
(ii) A necessary condition for $r$-mean uniform observability, in nats per time step, is
$$C \ge H_r(Y) - \frac{d}{r} + \log\left( \frac{r}{d V_d \Gamma(\frac{d}{r})} \left( \frac{d}{r D_v} \right)^{\frac{d}{r}} \right) = R_{S,r}(D_v), \qquad (21)$$
where $\Gamma(\cdot)$ is the gamma function and $V_d$ is the volume of the unit sphere (i.e., $V_d = \mathrm{Vol}(S_d)$, $S_d \triangleq \{\xi \in \mathbb{R}^d;\ \|\xi\| < 1\}$).
Furthermore, when $Y_t = H_t + \Upsilon_t$, (20) and (21) are also necessary conditions for robust stability in probability and $r$-mean, respectively.
Proof. See Appendix.
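As a sanity check on (21) (a sketch under the assumption $\rho(\xi) = \|\xi\|^r$, which is the usual reading of the $r$-mean criterion; function names are my own): for $r = 2$ the constraint in $G_D$ is a second-moment constraint, so the subtracted term must reduce to the Gaussian maximum entropy $\frac{d}{2}\log(2\pi e D_v/d)$.

```python
import math

def unit_ball_volume(d):
    """Volume of the unit sphere S_d = {xi in R^d : ||xi|| < 1}."""
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1)

def max_entropy_term(d, r, Dv):
    """The term subtracted from H_r(Y) in (21), i.e. max_{h in G_D} H_S(h)
    for rho(xi) = ||xi||^r, in nats."""
    Vd = unit_ball_volume(d)
    return d / r - math.log((r / (d * Vd * math.gamma(d / r)))
                            * (d / (r * Dv)) ** (d / r))

def gaussian_entropy(d, Dv):
    """Maximum entropy under E||xi||^2 <= Dv: isotropic N(0, (Dv/d) I)."""
    return 0.5 * d * math.log(2 * math.pi * math.e * Dv / d)

# For r = 2 the two expressions must coincide:
for d in (1, 2, 3):
    print(d, max_entropy_term(d, 2, 0.1), gaussian_entropy(d, 0.1))
```

This agreement for $r = 2$ is exactly why (20) (the Gaussian bound) and (21) are consistent special cases of the same robust Shannon lower bound.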
In Theorem 4.7, $H_r(Y)$ can be a function of the control signal. Nevertheless, in the following proposition it is shown that the robust entropy rate of a class of controlled sources is bounded below by the robust entropy rate of the analogous uncontrolled sources, no matter what the information patterns for the encoder and decoder are. Subsequently, for such uncertain controlled sources, the robust entropy rate of the analogous uncontrolled sources can be used in Theorem 4.7.
Proposition 4.8. Consider a class of controlled sources described via (8), in which the nominal source model is a controlled version of the nominal source model (11), described by the following state space model:
$$(\Omega, \mathcal{F}(\Omega), P; \{\mathcal{F}_t\}_{t \ge 0}): \quad X_{t+1} = A X_t + B W_t + N U_t, \quad X_0 = X, \qquad Y_t = H_t + D G_t, \quad H_t = C X_t, \quad t \in \mathbb{N}_+, \qquad (22)$$
where $U_t \in \mathbb{R}^o$, $N \in \mathbb{R}^{q \times o}$, $(C, A)$ is detectable, $(A, (BB')^{\frac{1}{2}})$ is stabilizable, and $D \ne 0$.
Then the robust entropy rate of this class of controlled sources is bounded below by $H_r(Y) \ge \frac{d}{2}\log(\frac{1+s^*}{s^*}) + \frac{d}{2}\log(2\pi e) + \frac{1}{2}\log\det\Lambda_\infty$, where $s^* > 0$ is the unique solution of (10) and $\Lambda_\infty$ is given in Theorem 3.1.
Proof. Follows from the standard chain rule inequality for Shannon entropy ([16], p. 239), the property that conditioning reduces entropy ([16], p. 232), and Gaussianity.
We have the following remarks regarding the results of Theorem 4.7.
Remark 4.9. (i) The lower bounds (20) and (21) given in Theorem 4.7 hold for any observed process, no matter what the information patterns for the encoder, decoder and controller are.
(ii) When Theorem 4.7 is applied to controlled sources, the entropy rate of the outputs of such sources, which depends on the control sequence, must be used. Nevertheless, from Proposition 4.8 we deduce that the bounds (20) and (21) also hold when the robust entropy rate is replaced by the robust entropy rate of the output process of the analogous uncontrolled sources.
(iii) Uniform observability and robust stability can be defined using single letter criteria, as done in [5], rather than for sequences. For single letter criteria the definition of rate distortion should also be replaced by its single letter analogue (i.e., $T = 1$). Subsequently, in (Lemma 4.6, (18)) and Theorem 4.7, the robust entropy of the sequences is replaced by the robust entropy of the single letter density $f_Y$ over the family $\mathcal{D}_{SU}(g_Y)$ defined in Theorem 2.2, which is related to the Rényi entropy of a random variable. Thus, the lower bounds given in Lemma 4.6 and Theorem 4.7 can be computed for a large class of sources from [19].
(iv) For a class of controlled sources described via (8) with the nominal source model (22) over memoryless AWGN channels, a sufficient condition for reliable communication and control can be derived (by proposing an encoder/decoder and controller) via generalizations of [13], which is done for a single source. The method is based on implementing a source-channel matching technique [24], which results in a joint source-channel encoding/decoding scheme and controller design.
5. Conclusion

In this paper, for the class of sources described by a constraint on the relative entropy, the explicit solution to the robust entropy was found and its connection to Rényi entropy was illustrated. Subsequently, an application of the robust entropy rate to uniform observability and robust stability of a control/communication system subject to a limited capacity constraint was presented. As a future direction, it would be interesting to establish the sufficiency of the conditions found in this paper by constructing actual encoders, decoders and controllers that achieve the obtained lower bounds.
Appendix

Proof of Theorem 2.2. Using ([25], p. 224), it can be shown that the constrained problem (3) is equivalent to the following unconstrained problem:
$$H_r(f_Y^{*,s^*}) = \sup_{f_Y \in \mathcal{D}_{SU}(g_Y)} H_S(f_Y) = \min_{s \ge 0} \sup_{f_Y \in \mathcal{D}} L(s, f_Y),$$
$$L(s, f_Y) \triangleq H_S(f_Y) - s(H(f_Y \| g_Y) - R_c) = s R_c + (1+s)\left[ \int \log\left( (g_Y(y))^{-\frac{1}{1+s}} \right) f_Y(y)\,dy - H(f_Y \| g_Y) \right]. \qquad (23)$$
Applying the calculus of variations to $L(s, f_Y)$ yields
$$\sup_{f_Y \in \mathcal{D}} \left( \int \log\left( (g_Y(y))^{-\frac{1}{1+s}} \right) f_Y(y)\,dy - H(f_Y \| g_Y) \right) = \log \int (g_Y(y))^{\frac{s}{1+s}}\,dy, \qquad (24)$$
where the supremum in (24) is attained at $f_Y^{*,s}(y) = \frac{(g_Y(y))^{\frac{s}{1+s}}}{\int (g_Y(y))^{\frac{s}{1+s}}\,dy}$. This completes the proof of part (i).
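The variational step (24) can be checked numerically (a sketch; the Gaussian nominal density $g_Y$ and all names below are illustrative assumptions): the tilted density $f_Y^{*,s} \propto g_Y^{s/(1+s)}$ attains the value $\log \int g_Y^{s/(1+s)}\,dy$, while any other density, e.g. $g_Y$ itself, gives a strictly smaller value of the functional.

```python
import math

# Sketch: for g = N(0, SIGMA^2), evaluate by quadrature the functional
#   J(f) = int f(y) log( g(y)^{s/(1+s)} / f(y) ) dy
# at the tilted maximizer f^{*,s} = g^{s/(1+s)} / Z and at f = g.
# The maximum value is log Z = log int g^{s/(1+s)} dy, as in (24).

S, SIGMA = 2.0, 1.0
A = S / (1.0 + S)                      # tilting exponent s/(1+s)
LO, HI, N = -30.0, 30.0, 60001
DX = (HI - LO) / (N - 1)

def g(y):
    return math.exp(-y * y / (2 * SIGMA ** 2)) / math.sqrt(2 * math.pi * SIGMA ** 2)

def integral(fn):
    return sum(fn(LO + i * DX) for i in range(N)) * DX

Z = integral(lambda y: g(y) ** A)      # int g^{s/(1+s)} dy

def f_star(y):
    return g(y) ** A / Z

def J(f):
    def integrand(y):
        fy = f(y)
        return 0.0 if fy <= 0 else fy * math.log(g(y) ** A / fy)
    return integral(integrand)

print(J(f_star), math.log(Z), J(g))    # J(f_star) equals log Z; J(g) is smaller
```

For a Gaussian $g$ the normalizer also has the closed form $Z = (2\pi\sigma^2)^{(1-a)/2} a^{-1/2}$ with $a = s/(1+s)$, which the quadrature reproduces.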
(ii) It can easily be shown that
$$\frac{d}{ds} L(s, f_Y^{*,s}) = R_c - H(f_Y^{*,s} \| g_Y),$$
$$\frac{d^2}{ds^2} L(s, f_Y^{*,s}) = \frac{1}{(1+s)^3}\left( \int f_Y^{*,s}(y) \log^2 g_Y(y)\,dy - \left( \int f_Y^{*,s}(y) \log g_Y(y)\,dy \right)^2 \right) \triangleq \frac{1}{(1+s)^3} \mathrm{Var}_{f_Y^{*,s}}[\log g_Y(y)] \ge 0, \quad \forall s \ge 0. \qquad (25)$$
Thus, the function $L(s, f_Y^{*,s})$ is a convex function of $s \ge 0$. The minimum over $s \ge 0$ is found by $\frac{dL(s, f_Y^{*,s})}{ds}\big|_{s=s^*} = 0$. This yields a minimizing $s^* \ge 0$ which is the solution of $H(f_Y^{*,s} \| g_Y)\big|_{s=s^*} = R_c$. Further, it is evident that $\frac{dH(f_Y^{*,s} \| g_Y)}{ds} = -\frac{d^2 L(s, f_Y^{*,s})}{ds^2} \le 0$, which implies that $H(f_Y^{*,s} \| g_Y)$ is a non-increasing function of $s \ge 0$.
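The monotonicity claim can be made concrete (a sketch; taking the nominal density Gaussian is an illustrative assumption, and the function names are my own): for $g_Y = N(0, \sigma^2)$ in one dimension, $f_Y^{*,s}$ is $N(0, \sigma^2(1+s)/s)$, so $H(f_Y^{*,s} \| g_Y) = \frac{1}{2s} - \frac{1}{2}\log\frac{1+s}{s}$, which is positive and strictly decreasing in $s$.

```python
import math

# Sketch: for g = N(0, sigma2), the tilted density f^{*,s} = g^{s/(1+s)}/Z
# is N(0, sigma2 (1+s)/s); H(f^{*,s} || g) then follows from the Gaussian
# relative-entropy formula and is non-increasing in s, as the proof asserts.

def kl_gaussians(var1, var2):
    """H(N(0, var1) || N(0, var2)) in nats."""
    ratio = var1 / var2
    return 0.5 * (ratio - 1.0 - math.log(ratio))

def H_tilted(s, sigma2=1.0):
    """H(f^{*,s} || g) for g = N(0, sigma2)."""
    return kl_gaussians(sigma2 * (1.0 + s) / s, sigma2)

def closed_form(s):
    """1/(2s) - (1/2) log((1+s)/s), the d = 1 case of Lemma 2.5's expression."""
    return 1.0 / (2.0 * s) - 0.5 * math.log((1.0 + s) / s)

grid = [0.25, 0.5, 1.0, 2.0, 4.0, 8.0]
values = [H_tilted(s) for s in grid]
print(values)  # positive and decreasing in s
```
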
Proof of Lemma 2.5. Eq. (9) results from direct substitution of $g_{Y^{T-1}} \sim N(m_T, \Gamma_T)$ into Theorem 2.2. Further, by computing $f_{Y^{T-1}}^{*,s}$ from (5), we have $H(f_Y^{*,s^*} \| g_Y) = -\frac{Td}{2}\log(\frac{1+s^*}{s^*}) + \frac{Td}{2s^*}$. Consequently, by Theorem 2.2(ii), the minimizing $s^* > 0$ is the solution of $R_c = \frac{1}{T} H(f_Y^{*,s^*} \| g_Y) = -\frac{d}{2}\log(\frac{1+s^*}{s^*}) + \frac{d}{2s^*}$. Note that the above expression can be written in the form $e^{-2R_c} = e^{-\frac{d}{s^*}}\left(\frac{1+s^*}{s^*}\right)^d \triangleq M(s^*)$. But $\frac{dM(s^*)}{ds^*} > 0$; subsequently, $M(s^*)$ is a strictly increasing function of $s^* > 0$. Further, since $M(s^*)$ is a continuous function of $s^* > 0$, its infimum and supremum are obtained as $s^* \to 0$ and $s^* \to \infty$, respectively: $\lim_{s^* \to 0} M(s^*) = 0$ and $\lim_{s^* \to \infty} M(s^*) = 1$. Subsequently, $M(s^*) \in (0, 1)$. On the other hand, for all $R_c > 0$, $e^{-2R_c} \in (0, 1)$. Consequently, as $M(s^*)$ is a continuous and strictly increasing function of $s^* > 0$ whose range covers that of $e^{-2R_c}$, $\forall R_c > 0$, there exists a unique $s^* > 0$ such that $e^{-2R_c} = M(s^*)$, or equivalently Eq. (10).
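The argument above translates directly into a numerical recipe for Eq. (10) (a sketch; the function names and the bisection scheme are my own): since $M(s) = e^{-d/s}((1+s)/s)^d$ increases strictly from $0$ to $1$ on $s > 0$, bisection recovers the unique $s^*$ with $M(s^*) = e^{-2R_c}$.

```python
import math

# Sketch: solve M(s*) = e^{-2 Rc} for s* > 0, where
#   M(s) = e^{-d/s} ((1+s)/s)^d
# is continuous and strictly increasing with range (0, 1) on s > 0.

def M(s, d):
    return math.exp(-d / s) * ((1.0 + s) / s) ** d

def solve_s_star(Rc, d, lo=1e-6, hi=1e6, iters=200):
    """Geometric bisection over several orders of magnitude of s."""
    target = math.exp(-2.0 * Rc)
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if M(mid, d) < target:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

s_star = solve_s_star(Rc=0.5, d=1)
print(s_star, M(s_star, 1), math.exp(-1.0))
```

Geometric (log-scale) bisection is used because $s^*$ can range over many orders of magnitude as $R_c$ varies.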
Proof of Theorem 4.5. Consider the case without a feedback channel. If the encoding scheme yields an average distortion $E\rho_T(Y^{T-1}, \hat{Y}^{T-1}) \le D_v$ for a class of sources, then from the data processing inequality it follows that
$$I(Z^{n-1}; \hat{Z}^{n-1}) \ge I(Y^{T-1}; \hat{Y}^{T-1}), \quad \forall f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1},$$
$$\sup_{f_{Z^{n-1}};\, f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} I(Z^{n-1}; \hat{Z}^{n-1}) \ge \sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} I(Y^{T-1}; \hat{Y}^{T-1}),$$
$$\sup_{f_{Z^{n-1}} \in \mathcal{D}_{CI}} I(Z^{n-1}; \hat{Z}^{n-1}) \ge \sup_{f_{Z^{n-1}};\, f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} I(Z^{n-1}; \hat{Z}^{n-1}) \ge \sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} I(Y^{T-1}; \hat{Y}^{T-1}),$$
$$C_n \triangleq \sup_{f_{Z^{n-1}} \in \mathcal{D}_{CI}} I(Z^{n-1}; \hat{Z}^{n-1}) \ge \inf_{f_{\hat{Y}^{T-1}|Y^{T-1}} \in \mathcal{D}_{DC}} \sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} I(Y^{T-1}; \hat{Y}^{T-1}) \triangleq R_{T,r}(D_v), \qquad (26)$$
where the second inequality follows by taking the supremum over the class of sources (note that the channel input distribution is affected by the source distribution), and the third inequality follows since the class of channel input power constraints must include those channel input distributions which are affected by the class of sources; otherwise, reliable communication for the class of sources is not possible.
Thus $C_n \ge R_{T,r}(D_v)$ under the assumption that there exists an encoding scheme that yields an average distortion $E\rho_T(Y^{T-1}, \hat{Y}^{T-1}) \le D_v$ for a class of sources. This means that $C_n \ge R_{T,r}(D_v)$ is a necessary condition for the existence of such an encoding scheme.
For the case of a feedback channel, in (26) $I(Z^{n-1}; \hat{Z}^{n-1})$ must be replaced by the directed information $I(Z^{n-1} \to \hat{Z}^{n-1})$ and the supremum must be taken over $f_{Z^{n-1}|\hat{Z}^{n-2}} = \prod_{i=0}^{n-1} f_{Z_i | Z^{i-1}, \hat{Z}^{i-1}} \in \mathcal{D}_{CI}$, in which, for DMCs or memoryless AWGN channels with feedback, the capacity $\sup_{f_{Z^{n-1}|\hat{Z}^{n-2}} = \prod_{i=0}^{n-1} f_{Z_i|Z^{i-1},\hat{Z}^{i-1}} \in \mathcal{D}_{CI}} I(Z^{n-1} \to \hat{Z}^{n-1})$ is the same as the capacity of those channels without feedback, i.e., $C_n$ [16].
Proof of Lemma 4.6. From [11] it follows that $\forall f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1} \subset \mathcal{D}^{0,T-1}$, we have $\frac{1}{T} R_T(D_v) \triangleq \frac{1}{T} \inf_{f_{\hat{Y}^{T-1}|Y^{T-1}} \in \mathcal{D}_{DC}} I(Y^{T-1}; \hat{Y}^{T-1}) \ge \frac{1}{T} H_S(f_{Y^{T-1}}) - \max_{h \in G_D} H_S(h)$, where $R_T(D_v)$ is the rate distortion function for the density function $f_{Y^{T-1}}$; and when $\int_{\mathbb{R}^d} e^{s\rho(\xi)}\,d\xi < \infty$, $\forall s < 0$, the maximizer $h^*(\xi) \in G_D$ is given by $h^*(\xi) = \frac{e^{s\rho(\xi)}}{\int_{\mathbb{R}^d} e^{s\rho(\xi)}\,d\xi}$, where $s < 0$ satisfies $\int_{\mathbb{R}^d} \rho(\xi) h^*(\xi)\,d\xi = D_v$.
Consequently,
$$\sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} \frac{1}{T} R_T(D_v) \ge \sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} \frac{1}{T} H_S(f_{Y^{T-1}}) - \max_{h \in G_D} H_S(h). \qquad (27)$$
On the other hand, since
$$R_{T,r}(D_v) \triangleq \inf_{f_{\hat{Y}^{T-1}|Y^{T-1}} \in \mathcal{D}_{DC}} \sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} I(Y^{T-1}; \hat{Y}^{T-1}) \ge \sup_{f_{Y^{T-1}} \in \mathcal{D}_{SU}^{0,T-1}} \inf_{f_{\hat{Y}^{T-1}|Y^{T-1}} \in \mathcal{D}_{DC}} I(Y^{T-1}; \hat{Y}^{T-1}), \qquad (28)$$
we deduce the result.
Proof of Theorem 4.7 (Uniform Observability). Assume there exist an encoder and decoder such that uniform observability in probability, in the sense of Definition 4.1, is obtained. This implies that for a given $\delta \ge 0$ and $D_v \in [0,1)$, there exists $T(\delta, D_v)$ such that, $\forall t \ge T(\delta, D_v)$, $\sup_{f_{Y^{t-1}} \in \mathcal{D}_{SU}^{0,t-1}} \frac{1}{t} \sum_{k=0}^{t-1} \Pr(\|Y_k - \hat{Y}_k\| > \delta) \le D_v$. Subsequently, $\frac{1}{t}\sum_{k=0}^{t-1} \Pr(\|Y_k - \hat{Y}_k\| > \delta) \le D_v$, $\forall f_{Y^{t-1}} \in \mathcal{D}_{SU}^{0,t-1}$. Next, define the following single letter distortion measure: $\rho_t(Y^{t-1}, \hat{Y}^{t-1}) = \frac{1}{t}\sum_{k=0}^{t-1} \rho(Y_k, \hat{Y}_k)$, where $\rho(\cdot,\cdot)$ is given in Definition 4.1. Then, for $t \ge T(\delta, D_v)$, $E\rho_t(Y^{t-1}, \hat{Y}^{t-1}) = \frac{1}{t}\sum_{k=0}^{t-1} \Pr(\|Y_k - \hat{Y}_k\| > \delta) \le D_v$, $\forall f_{Y^{t-1}} \in \mathcal{D}_{SU}^{0,t-1} \subset \mathcal{D}^{0,t-1}$. That is, uniform reconstructability up to the distortion value $D_v$ is obtained for $t \ge T(\delta, D_v)$. Then, by Theorem 4.5 and Lemma 4.6, the capacity and robust rate distortion must, for all $t \ge T(\delta, D_v)$, satisfy
$$\lim_{t \to \infty} \frac{1}{t} C_t \ge \lim_{t \to \infty} \frac{1}{t} H_r(f^{*}_{Y^{t-1}}) - \max_{h \in G_D} H_S(h),$$
$$C \ge H_r(Y) - \max_{h \in G_D} H_S(h). \qquad (29)$$
Since, among all distributions with the same covariance, the Gaussian distribution has the largest entropy, the $h^*(\xi) \in G_D$ that maximizes $H_S(h)$ is a Gaussian distribution which satisfies the boundary condition of $G_D$. That is, $h^*(\xi) \sim N(0, \Gamma_g)$, in which $\Gamma_g$ satisfies $\int_{\|\xi\| > \delta} h^*(\xi)\,d\xi = D_v$. Consequently, substituting $\max_{h \in G_D} H_S(h) = H_S(h^*) = \frac{1}{2}\log[(2\pi e)^d \det \Gamma_g]$ in (29), the lower bound (20) is obtained. That is, (20) holds under the assumption that there exist an encoder and decoder that yield uniform observability in probability in the sense of Definition 4.1. This means that (20) is a necessary condition for the existence of such an encoder and decoder.
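For the scalar case $d = 1$ (an illustrative assumption; the function names below are my own), the tail condition that pins down $\Gamma_g = \sigma^2$ reads $2(1 - \Phi(\delta/\sigma)) = D_v$ for $h^* \sim N(0, \sigma^2)$, after which $H_S(h^*) = \frac{1}{2}\log(2\pi e \sigma^2)$ gives the subtracted term in (20).

```python
import math

# Sketch: solve the tail condition  P(|xi| > delta) = Dv  for sigma when
# h* ~ N(0, sigma^2), then evaluate H_S(h*) = (1/2) log(2 pi e sigma^2).

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tail_prob(sigma, delta):
    """P(|xi| > delta) for xi ~ N(0, sigma^2); increasing in sigma."""
    return 2.0 * (1.0 - std_normal_cdf(delta / sigma))

def solve_sigma(delta, Dv, lo=1e-9, hi=1e6, iters=200):
    """Geometric bisection for sigma with tail_prob(sigma, delta) = Dv."""
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if tail_prob(mid, delta) < Dv:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

delta, Dv = 1.0, 0.1
sigma = solve_sigma(delta, Dv)
max_entropy = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)  # H_S(h*) in (20)
print(sigma, tail_prob(sigma, delta), max_entropy)
```
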
The necessary condition for uniform observability in $r$-mean is obtained along the same lines as the above proof. The only difference is that from [26] it follows that in this case $\max_{h \in G_D} H_S(h) = \frac{d}{r} - \log\left( \frac{r}{d V_d \Gamma(\frac{d}{r})} \left( \frac{d}{r D_v} \right)^{\frac{d}{r}} \right)$ nats per time step.
(Robust stability). Follows similarly by considering the rate distortion between $Y^{t-1}$ and $\Upsilon^{t-1}$.
References

[1] A.V. Savkin, I.R. Petersen, Set-valued state estimation via a limited capacity communication channel, IEEE Transactions on Automatic Control 48 (4) (2003) 676–680.
[2] V. Malyavej, A.V. Savkin, The problem of optimal robust Kalman state estimation via limited capacity digital communication channels, Systems and Control Letters 54 (3) (2005) 283–292.
[3] G.N. Nair, S. Dey, R.J. Evans, Infimum data rates for stabilizing Markov jump linear systems, in: Proceedings of the 42nd IEEE Conference on Decision and Control, Hawaii, December 2003, pp. 1176–1181.
[4] G.N. Nair, R.J. Evans, I.M.Y. Mareels, W. Moran, Topological feedback entropy and nonlinear stabilization, IEEE Transactions on Automatic Control 49 (9) (2004) 1585–1597.
[5] S. Tatikonda, S. Mitter, Control over noisy channels, IEEE Transactions on Automatic Control 49 (7) (2004) 1196–1201.
[6] S. Tatikonda, A. Sahai, S. Mitter, Stochastic linear control over a communication channel, IEEE Transactions on Automatic Control 49 (9) (2004) 1549–1561.
[7] N. Elia, When Bode meets Shannon: Control-oriented feedback communication schemes, IEEE Transactions on Automatic Control 49 (9) (2004) 1477–1488.
[8] K. Li, J. Baillieul, Robust quantization for digital finite communication bandwidth (DFCB) control, IEEE Transactions on Automatic Control 49 (9) (2004) 1573–1584.
[9] D. Liberzon, J.P. Hespanha, Stabilization of nonlinear systems with limited information feedback, IEEE Transactions on Automatic Control 50 (6) (2005) 910–915.
[10] N.C. Martins, M.A. Dahleh, N. Elia, Feedback stabilization of uncertain systems in the presence of a direct link, IEEE Transactions on Automatic Control 51 (3) (2006) 438–447.
[11] T. Berger, Rate Distortion Theory: A Mathematical Basis for Data Compression, Prentice-Hall, 1971.
[12] S. Yuksel, T. Basar, Coding and control over discrete noisy forward and feedback channels, in: Proceedings of the 44th IEEE Conference on Decision and Control and the European Control Conference, Seville, December 12–15, 2005, pp. 2517–2522.
[13] C.D. Charalambous, A. Farhadi, LQG optimality and separation principle for general discrete time partially observed stochastic systems over finite capacity communication channels, Automatica (in press).
[14] T. Basar, P. Bernhard, H-infinity Optimal Control and Related Minimax Design Problems: A Dynamic Game Approach, 2nd edition, Birkhauser, 1995.
[15] C.D. Charalambous, F. Rezaei, Stochastic uncertain systems subject to relative entropy constraints: Induced norms and monotonicity properties of minimax games, IEEE Transactions on Automatic Control 52 (4) (2007) 647–663.
[16] T.M. Cover, J.A. Thomas, Elements of Information Theory, John Wiley and Sons, 1991.
[17] F. Rezaei, C.D. Charalambous, Robust coding for uncertain sources: A minimax approach, in: Proceedings of the 2005 IEEE International Symposium on Information Theory, Adelaide, Australia, September 4–9, 2005, pp. 1539–1548.
[18] A. Rényi, On measures of entropy and information, in: Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. I, Berkeley, CA, 1961, pp. 547–561.
[19] K. Zografos, S. Nadarajah, Survival exponential entropies, IEEE Transactions on Information Theory 51 (3) (2005).
[20] R.G. Baraniuk, P. Flandrin, A.J.E.M. Janssen, O.J.J. Michel, Measuring time-frequency information content using the Rényi entropies, IEEE Transactions on Information Theory 47 (4) (2001) 1391–1409.
[21] P.E. Caines, Linear Stochastic Systems, John Wiley and Sons, 1988.
[22] J. Massey, Causality, feedback and directed information, in: Proceedings of the 1990 Symposium on Information Theory and its Applications, 1990, pp. 303–305.
[23] D.J. Sakrison, The rate distortion function for a class of sources, Information and Control 15 (1969) 165–195.
[24] M. Gastpar, B. Rimoldi, M. Vetterli, To code, or not to code: Lossy source-channel communication revisited, IEEE Transactions on Information Theory 49 (5) (2003) 1147–1158.
[25] D.G. Luenberger, Optimization by Vector Space Methods, John Wiley, 1969.
[26] T. Linder, R. Zamir, On the asymptotic tightness of the Shannon lower bound, IEEE Transactions on Information Theory 40 (6) (1994) 2026–2031.