

Conference Talks (Aug. 2012 - Present)
  1. I. Sason, ``Entropy-based proofs of combinatorial results on bipartite graphs,'' Proceedings of the 2021 IEEE International Symposium on Information Theory, pp. 3225-3230, July 12-20, 2021 (an online virtual conference). See the paper, slides, and recorded short and full presentations.

  2. N. Merhav and I. Sason, ``Exact expressions in source and channel coding problems using integral representations,'' Proceedings of the 2020 IEEE International Symposium on Information Theory, pp. 2361-2366, June 21-26, 2020 (an online virtual conference). See the paper, slides and recorded talk.

  3. I. Sason, ``On data-processing and majorization inequalities for f-divergences,'' Proceedings of the 2020 International Zurich Seminar on Information and Communication, pp. 101-105, Zurich, Switzerland, February 26-28, 2020. See the paper and presentation.

  4. I. Sason, ``Tight bounds on the Renyi entropy via majorization with applications to guessing and compression,'' Information Theory and Applications Workshop (ITA 2020), San Diego, California, USA, February 2-7, 2020. See the presentation.

  5. I. Sason, ``Entropy and guessing: old and new results,'' 2019 Workshop on Mathematical Data Science, Dürnstein, Austria, October 13-15, 2019. See the presentation.

  6. I. Sason, ``Tight bounds on the Renyi entropy via majorization with applications to guessing and lossless compression,'' Proceedings of Prague Stochastics 2019, p. 13, Institute of Information Theory and Automation, Czech Academy of Sciences, Prague, Czech Republic, August 19-23, 2019. See the presentation.

  7. I. Sason and S. Verdu, ``Improved bounds on guessing moments via Renyi measures,'' Proceedings of the 2018 IEEE International Symposium on Information Theory, pp. 566-570, Vail, Colorado, USA, June 17-22, 2018. See the conference paper and presentation.

  8. I. Sason and S. Verdu, ``Non-asymptotic bounds for optimal fixed-to-variable lossless compression without prefix constraints,'' Proceedings of the 2018 IEEE International Symposium on Information Theory, pp. 2211-2215, Vail, Colorado, USA, June 17-22, 2018. See the conference paper and presentation.

  9. I. Sason, ``On Csiszar's f-divergences and informativities with applications,'' Conference on Channels, Statistics, Information, Secrecy and Randomness, celebrating the 80th birthday of Imre Csiszar, the Alfred Renyi Institute of Mathematics, Hungarian Academy of Sciences, Budapest, Hungary, June 4-5, 2018. See the abstract.

  10. I. Sason and S. Verdu, ``Arimoto-Renyi conditional entropy and Bayesian M-ary hypothesis testing,'' seminar talk, Nov. 2017, Department of Electrical Engineering, Technion - Israel Institute of Technology, Haifa, Israel.

  11. I. Sason and S. Verdu, ``Arimoto-Renyi conditional entropy and Bayesian M-ary hypothesis testing,'' Proceedings of the 2017 IEEE International Symposium on Information Theory (ISIT 2017), pp. 2975-2979, Aachen, Germany, June 25-30, 2017. See the conference paper and presentation.

  12. I. Sason, ``On f- and Renyi divergences,'' seminar talk (part of this talk relies on joint work with S. Verdu), Dec. 2016, Department of Electrical Engineering, Technion - Israel Institute of Technology, Haifa, Israel.

  13. I. Sason and S. Verdu, ``f-divergence inequalities via functional domination,'' Proceedings of the 2016 IEEE International Conference on the Science of Electrical Engineering, Eilat, Israel, November 16-18, 2016. See the conference paper and presentation.

  14. E. Ram and I. Sason, ``On Renyi entropy power inequalities,'' Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT 2016), pp. 2289-2293, Barcelona, Spain, July 10-15, 2016. See the conference paper and presentation.

  15. M. A. Kumar and I. Sason, ``On projections of the Renyi divergence on generalized convex sets,'' Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT 2016), pp. 1123-1127, Barcelona, Spain, July 10-15, 2016. See the conference paper and presentation.

  16. I. Sason and S. Verdu, ``Upper bounds on the relative entropy and Renyi divergence as a function of total variation distance for finite alphabets,'' Proceedings of the 2015 IEEE Information Theory Workshop (ITW 2015), pp. 214-218, Jeju Island, Korea, October 11-15, 2015. See the conference paper and presentation.

  17. M. Raginsky and I. Sason, ``Concentration of Measure and Its Applications in Information Theory, Communications and Coding,'' a three-hour invited tutorial at the 2015 IEEE International Symposium on Information Theory (ISIT 2015), Hong Kong, June 2015. Slides: Part 1 and Part 2. See also our monograph (also available on IEEE Xplore), whose third edition was published in Foundations and Trends in Communications and Information Theory in March 2019.

  18. I. Sason, ``On the Renyi divergence, the joint range of relative entropies and a channel coding theorem,'' Proceedings of the 2015 IEEE International Symposium on Information Theory (ISIT 2015), pp. 1610-1614, Hong Kong, June 14-19, 2015. See the conference paper and presentation. See also an extended version of this talk.

  19. I. Sason, ``Tight bounds on symmetric divergence measures and a new inequality relating f-divergences,'' Proceedings of the 2015 IEEE Information Theory Workshop (ITW 2015), Jerusalem, Israel, April 26-May 1, 2015. See the conference paper and presentation.

  20. I. Sason, ``On the corner points of the capacity region of a Gaussian interference channel,'' Proceedings of the 2014 IEEE International Symposium on Information Theory, pp. 2739-2743, Honolulu, Hawaii, USA, July 2014. See the conference paper and presentation.

  21. M. Mondelli, S. H. Hassani, I. Sason and R. Urbanke, ``Achieving Marton's region for broadcast channels using polar codes,'' Proceedings of the 2014 IEEE International Symposium on Information Theory, pp. 306-310, Honolulu, Hawaii, USA, July 2014. See the conference paper and presentation.

  22. M. Raginsky and I. Sason, ``Refined bounds on the empirical distribution of good channel codes via concentration inequalities,'' Proceedings of the 2013 IEEE International Symposium on Information Theory, pp. 221-225, Istanbul, Turkey, July 2013. See the conference paper and presentation.

  23. I. Sason, ``Entropy bounds for discrete random variables via coupling,'' Proceedings of the 2013 IEEE International Symposium on Information Theory, pp. 414-418, Istanbul, Turkey, July 2013. See the conference paper and presentation.

  24. I. Sason, ``New lower bounds on the total variation distance and relative entropy for the Poisson approximation,'' Proceedings of the 2013 Information Theory and Applications Workshop (ITA 2013), pp. 1-4, San Diego, California, USA, February 2013. See the conference paper and presentation.

  25. I. Sason, a series of three talks on information-theoretic aspects of concentration of measure and the Chen-Stein method, ETH Zurich - Swiss Federal Institute of Technology, Zurich, Switzerland, August 20-23, 2012. See Talk 1 and Talks 2-3.