Lossy Source Coding and Rate-Distortion Theory
-
N. Merhav and M. Feder,
``On the cost of universality of block codes for individual sequences,''
Proc. 1994 IEEE Int. Symp. on Information Theory (ISIT '94), p. 263,
Trondheim, Norway, June 1994.
-
N. Merhav,
``A comment on `A rate of convergence result for
a universal $D$-semifaithful code','' 
IEEE Trans. Inform. Theory, vol. 41, no. 4, pp. 1200-1202, July 1995.
-
N. Merhav,
``On list size exponents in rate-distortion coding,''
IEEE Trans. Inform. Theory,
vol. 43, no. 2, pp. 765-769, March 1997.
-
N. Merhav and J. Ziv,
``On the
amount of statistical side information
required for lossy data compression,''
IEEE Trans. Inform. Theory,
vol. 43, no. 4, pp. 1112-1121, July 1997.
-
E. Arikan and N. Merhav,
``Guessing subject to distortion,''
IEEE Trans. Inform. Theory, vol. 44,
no. 3, pp. 1041-1056, May 1998.
-
N. Merhav, R. M. Roth, and E. Arikan,
``Hierarchical guessing with a
fidelity criterion,''
IEEE Trans. Inform. Theory, vol. 45,
no. 1, pp. 330-337, January 1999.
-
T. Weissman and N. Merhav,
``Tradeoffs between the excess code-length exponent
and the excess distortion exponent in lossy source coding,''
IEEE Trans. Inform. Theory, vol.
48, no. 2, pp. 396-415, February 2002.
-
T. Weissman and N. Merhav,
``On limited-delay lossy coding and filtering of
individual sequences,''
IEEE Trans. Inform. Theory, vol. 48, no. 3, pp. 721-733, March 2002.
-
N. Merhav and I. Kontoyiannis,
``Source coding exponents for zero-delay coding with finite memory,''
IEEE Trans. Inform. Theory, vol. 49, no. 3, pp. 609-625, March 2003.
-
T. Weissman and N. Merhav,
``On competitive predictability and its relation to rate-distortion
theory and to channel capacity theory,''
IEEE Trans. Inform.
Theory, vol. 49, no. 12, pp. 3185-3194, December 2003.
-
Y. Steinberg and N. Merhav,
``On successive refinement for the Wyner-Ziv problem,''
IEEE Trans. Inform.
Theory, vol. 50, no. 8, pp. 1636-1654, August 2004.
-
I. Hen and N. Merhav,
``On the error exponent of trellis source coding,''
IEEE Trans. Inform. Theory, vol. 51, no. 11, pp. 3734-3741,
November 2005.
-
T. Weissman and N. Merhav,
``On causal source codes with side information,''
IEEE Trans. Inform. Theory, vol. 51, no. 11, pp. 4003-4013,
November 2005.
-
N. Merhav and J. Ziv,
``On the Wyner-Ziv problem for individual sequences,''
IEEE Trans. Inform. Theory, vol. 52, no. 3, pp. 867-873, March 2006.
-
N. Merhav,
``The generalized random energy model of spin glasses and its
application to the statistical physics of code ensembles with
hierarchical structures,''
IEEE Trans. Inform. Theory, vol. 55, no. 3, pp. 1250-1268, March
2009.
-
N. Merhav,
``On the statistical physics of directed polymers in a random medium
and their relation to tree codes,''
IEEE Trans. Inform. Theory, vol. 56, no. 3, pp. 1345-1350, March 2010.
-
A. Reani and N. Merhav,
``Efficient on-line schemes for encoding
individual sequences with side information at the decoder,''
Proc. ISIT 2009, Seoul, Korea, June-July 2009.
Full version:
IEEE Trans. Inform. Theory, vol. 57, no. 10, pp. 6860-6876,
October 2011.
-
N. Merhav,
``Another look at the physics of large deviations with application to
rate-distortion theory,'' Technical Report, CCIT Pub. no. 742,
EE Pub. no. 1699, August 2009. Also available on arXiv.
Conference version in Proc. ISIT 2010, Austin, Texas, U.S.A., June 2010.
-
Y. Kaspi and N. Merhav,
``Structure theorem for real-time variable-rate lossy source encoders
and memory-limited decoders with side information,''
Proc. ISIT 2010, Austin, Texas, U.S.A., June 2010.
Full version: IEEE Trans. Inform. Theory, vol. 58, no. 12,
pp. 7135-7153, December 2012.
-
N. Merhav,
``Rate-distortion function via minimum mean square error estimation,''
IEEE Trans. Inform. Theory, vol. 57, no. 6, pp. 3196-3206, June
2011. Comment: There is a slight problem in Theorem 1. The parametric
representation therein is guaranteed to hold
for the optimal output distribution q. For a
general distribution, it gives a lower bound on R_q(D), so the method
proposed can still be used to generate lower bounds. All of the examples
except the last one are fine, since they are defined with the optimal q.
The last example holds as well if the
input pdf is the convolution of q with the corresponding generalized
Gaussian, so that q is optimal for that pdf. Thanks to Jon Scarlett for
drawing my attention to this point.
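For orientation, the following is the standard dual (Lagrangian) lower
bound on the rate-distortion function of a memoryless source with
distribution p and distortion measure d; it illustrates how a parametric
expression evaluated with a suboptimal output distribution q still yields
a valid lower bound. This is the classical textbook bound, given here only
as an illustration and not necessarily the exact statement of Theorem 1:
\[
  R(D) \;\ge\; -sD \;-\; \sum_{x} p(x)\,
  \ln\!\Big[\sum_{\hat{x}} q(\hat{x})\, e^{-s\, d(x,\hat{x})}\Big],
  \qquad \text{for all } s \ge 0 \text{ and all distributions } q,
\]
with equality, for a suitable choice of s, when q is the optimal output
distribution at distortion level D.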
-
N. Merhav,
``A statistical-mechanical view on source coding: physical compression and
data compression,''
Journal of Statistical Mechanics: Theory and
Experiment, P01029, January 2011.
doi: 10.1088/1742-5468/2011/01/P01029
[With a certain overlap with ``Another look at the physics of large
deviations with application to rate-distortion theory'' (no. 18 above),
but with a different emphasis and some additional results.]
-
A. Reani and N. Merhav,
``Data processing lower bounds for scalar lossy source codes with side
information at the decoder,''
Proc. ISIT 2012, Cambridge, MA, USA, July 2012.
Full version: IEEE Trans. Inform. Theory, vol. 59, no. 7, pp. 4057-4070,
July 2013.
-
Y. Kaspi and N. Merhav,
``On zero-delay lossy source coding with side information at the
encoder,'' presented at the 2012 IEEE 27th Convention of
Electrical and Electronics Engineers in Israel, November 14-17, 2012.
Full version (with a slightly different title):
IEEE Trans. Inform. Theory,
vol. 60, no. 11, pp. 6931-6942, November 2014.
-
Y. Kaspi and N. Merhav,
``Zero-delay and causal secure source coding,''
IEEE Trans. Inform. Theory,
vol. 61, no. 11, pp. 6238-6250,
November 2015.
-
A. Reani and N. Merhav,
``Universal quantization for separate encodings and joint decoding of
correlated sources,'' Proc. ISIT 2014, pp. 761-765, Honolulu,
Hawaii, June-July 2014. Full version:
IEEE Trans. Inform. Theory,
vol. 61, no. 12, pp. 6465-6474,
December 2015.
-
A. Cohen and N. Merhav,
``Universal randomized guessing subjected to distortion,''
IEEE Trans. Inform.
Theory, vol. 68, no. 12, pp.
7714-7734, December 2022.
-
N. Merhav,
``D-semifaithful codes that are universal
over both memoryless sources and
distortion measures,''
IEEE Trans. Inform. Theory,
vol. 69, no. 7, pp. 4746-4757, July 2023.
-
N. Merhav,
``A universal ensemble for sample-wise lossy compression,''
Entropy, vol. 25, no. 8, art. 1199, August 2023;
https://doi.org/10.3390/e25081199
-
N. Merhav,
``Lossy compression of individual sequences revisited: fundamental limits of
finite-state encoders,'' Entropy, vol. 26, art. 116, January 2024;
https://doi.org/10.3390/e26020116