Lossy Source Coding and Rate-Distortion Theory

N. Merhav and M. Feder,
``On the cost of universality of block codes for individual sequences,''
Proc. 1994 IEEE Int. Symp. on Information Theory (ISIT '94), p. 263,
Trondheim, Norway, June 1994.

N. Merhav,
``A comment on `A rate of convergence result for
a universal $D$-semifaithful code',''
IEEE Trans. Inform. Theory, vol. 41, no. 4, pp. 1200-1202, July 1995.

N. Merhav,
``On list size exponents in rate-distortion coding,''
IEEE Trans. Inform. Theory,
vol. 43, no. 2, pp. 765-769, March 1997.

N. Merhav and J. Ziv,
``On the
amount of statistical side information
required for lossy data compression,''
IEEE Trans. Inform. Theory,
vol. 43, no. 4, pp. 1112-1121, July 1997.

E. Arikan and N. Merhav,
``Guessing subject to distortion,''
IEEE Trans. Inform. Theory, vol. 44,
no. 3, pp. 1041-1056, May 1998.

N. Merhav, R. M. Roth, and E. Arikan,
``Hierarchical guessing with a
fidelity criterion,''
IEEE Trans. Inform. Theory, vol. 45,
no. 1, pp. 330-337, January 1999.

T. Weissman and N. Merhav,
``Tradeoffs between the excess codelength exponent
and the excess distortion exponent in lossy source coding,''
IEEE Trans. Inform. Theory, vol.
48, no. 2, pp. 396-415, February 2002.

T. Weissman and N. Merhav,
``On limited-delay lossy coding and filtering of
individual sequences,''
IEEE Trans. Inform. Theory, vol. 48, no. 3, pp. 721-733, March 2002.

N. Merhav and I. Kontoyiannis,
``Source coding exponents for zero-delay coding with finite memory,''
IEEE Trans. Inform. Theory, vol. 49, no. 3, pp. 609-625, March 2003.

T. Weissman and N. Merhav,
``On competitive predictability and its relation to rate-distortion
theory and to channel capacity theory,''
IEEE Trans. Inform.
Theory, vol. 49, no. 12, pp. 3185-3194, December 2003.

Y. Steinberg and N. Merhav,
``On successive refinement for the Wyner-Ziv problem,''
IEEE Trans. Inform.
Theory, vol. 50, no. 8, pp. 1636-1654, August 2004.

I. Hen and N. Merhav,
``On the error exponent of trellis source coding,''
IEEE Trans. Inform. Theory, vol. 51, no. 11, pp. 3734-3741,
November 2005.

T. Weissman and N. Merhav,
``On causal source codes with side information,''
IEEE Trans. Inform. Theory, vol. 51, no. 11, pp. 4003-4013,
November 2005.

N. Merhav and J. Ziv,
``On the Wyner-Ziv problem for individual sequences,''
IEEE Trans. Inform. Theory, vol. 52, no. 3, pp. 867-873, March 2006.

N. Merhav,
``The generalized random energy model of spin glasses and its
application to the statistical physics of code ensembles with
hierarchical structures,''
IEEE Trans. Inform. Theory, vol. 55, no. 3, pp. 1250-1268, March
2009.

N. Merhav,
``On the statistical physics of directed polymers in a random medium
and their relation to tree codes,''
IEEE Trans. Inform. Theory, vol. 56, no. 3, pp. 1345-1350, March 2010.

A. Reani and N. Merhav,
``Efficient online schemes for encoding
individual sequences with side information at the decoder,''
Proc. ISIT 2009, Seoul, Korea, June-July 2009.
Full version:
IEEE Trans. Inform. Theory, vol. 57, no. 10, pp. 6860-6876,
October 2011.

N. Merhav,
``Another look at the physics of large deviations with application to
rate-distortion theory,'' Technical Report, CCIT Pub. no. 742,
EE Pub. no. 1699, August 2009. Also available on arXiv.
The conference version appears in Proc. ISIT 2010, Austin,
Texas, U.S.A., June 2010.

Y. Kaspi and N. Merhav,
``Structure theorem for real-time variable-rate lossy source encoders
and memory-limited decoders with side information,''
Proc. ISIT 2010, Austin, Texas, U.S.A., June 2010.
Full version:
IEEE Trans. Inform. Theory, vol. 58, no. 12,
pp. 7135-7153,
December 2012.

N. Merhav,
``Rate-distortion function via minimum mean square error estimation,''
IEEE Trans. Inform. Theory, vol. 57, no. 6, pp. 3196-3206, June
2011. Comment: There is a slight problem in Theorem 1. The parametric
representation therein is guaranteed to hold
for the optimal output distribution q. For a
general distribution, it gives a lower bound on R_q(D), so the method
proposed can still be used to generate lower bounds. Most of the examples
(all but the last one) are fine, since they are defined with the optimal q.
The last example holds as well if the
input pdf is the convolution between q and the corresponding generalized
Gaussian, so that q is optimal for that pdf. Thanks to Jon Scarlett for
drawing my attention to this.

N. Merhav,
``A statistical-mechanical view on source coding: physical compression and
data compression,''
Journal of Statistical Mechanics: Theory and
Experiment, P01029, January 2011.
doi: 10.1088/1742-5468/2011/01/P01029
[With a certain overlap to no. 18, but with a
different emphasis and some other results.]

A. Reani and N. Merhav,
``Data processing lower bounds for scalar lossy source codes with side
information at the decoder,''
in ISIT 2012, Cambridge, MA, USA, July 2012.
Full version in
IEEE Trans. Inform. Theory, vol. 59, no. 7, pp. 4057-4070,
July 2013.

Y. Kaspi and N. Merhav,
``On zero-delay lossy source coding with side information at the
encoder,'' presented at the 2012 IEEE 27th Convention of
Electrical and Electronics Engineers in Israel, November 14-17, 2012.
Full version (with a slightly different title) appeared in
IEEE Trans. Inform. Theory,
vol. 60, no. 11, pp. 6931-6942, November 2014.

Y. Kaspi and N. Merhav,
``Zero-delay and causal secure source coding,''
IEEE Trans. Inform. Theory,
vol. 61, no. 11, pp. 6238-6250,
November 2015.

A. Reani and N. Merhav,
``Universal quantization for separate encodings and joint decoding of
correlated sources,'' Proc. ISIT 2014, pp. 761-765, Honolulu,
Hawaii, June-July 2014. Full version in
IEEE Trans. Inform. Theory,
vol. 61, no. 12, pp. 6465-6474,
December 2015.