## Bayesian Estimation with Partial Knowledge
## Overview
A common problem in signal processing is to recover a signal x from corrupted measurements y. The Bayesian approach assumes that x is a realization of a random vector X with prior density f_X(x),
and that the statistical relation between y and x is
characterized by a likelihood function f_{Y|X}(y|x). The prior f_X(x) can typically be
learned from a set of examples of clean signals. Indeed, a large variety of databases
of all kinds of signals are available online, including facial images,
fingerprints, iris scans and speech signals, to name a few. The likelihood f_{Y|X}(y|x),
on the other hand, is associated with the degradation mechanism in a specific
application and thus cannot be learned from databases of this sort. One
possibility for obtaining f_{Y|X}(y|x)
is to assume a known degradation model such as additive white Gaussian noise.
However, in many situations this assumption is over-simplistic since the
degradation includes complicated effects, which are hard to model and
sometimes not even known. Nonlinear distortion in CCD sensors, unknown blur
and signal dependent noise are a few examples of such phenomena. An
alternative approach for obtaining f_{Y|X}(y|x)
is to learn it by collecting a paired set of examples of clean and degraded
signals. Unfortunately, constructing such a database requires a complicated
experimental setting in which our sensor is co-calibrated with some
high-quality sensor, and is therefore usually impractical. Specifically, it
is typically quite simple to obtain a set of clean signals from a
high-grade sensor (or from an existing database). Similarly, it is also easy
to collect a set of degraded signals taken with our low-grade sensor. But these
two sets are unpaired. How, then, can we estimate x from y in such situations?

In many practical scenarios, there exists a third signal z that is statistically related to both x and y and for which paired examples are easy to obtain. For example, suppose we wish to denoise video taken with a cellular-phone camera. We can collect paired examples {y_n, z_n} of the noisy video and its associated audio (taken with the
cellular phone), as well as paired examples {x_n, z_n}
of clean video sequences with their audio (taken from a high-quality
camcorder). The training sets in this situation can be used to learn
the densities f_{XZ}(x,z) and f_{YZ}(y,z),
but are generally insufficient for determining f_{Y|X}(y|x). The difference
between the MSE of an estimator and the lowest possible MSE that could be
achieved if f_{Y|X}(y|x) were known is termed the regret.
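A sketch of this definition in symbols (with $\hat{x}(y)$ denoting an arbitrary estimator; the second term is the MSE of the MMSE estimator $\mathbb{E}[X \mid Y]$, which requires knowing the true likelihood):

```latex
R\bigl(f_{Y|X}, \hat{x}\bigr)
  = \mathbb{E}\bigl[\|X - \hat{x}(Y)\|^{2}\bigr]
  - \mathbb{E}\bigl[\|X - \mathbb{E}[X \mid Y]\|^{2}\bigr]
```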
Our approach in this work is to design an estimator whose regret for the
worst-case likelihood, among all those consistent with our knowledge of f_{XZ}(x,z)
and f_{YZ}(y,z), is minimal. We call this
technique the partial knowledge minimax regret estimator.
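To see why a worst-case treatment is needed, here is a minimal discrete sketch (illustrative only, not from the papers): two different joint distributions of (X, Y, Z) can agree on f_{XZ} and f_{YZ} yet induce different likelihoods f_{Y|X}.

```python
import numpy as np

# Binary X, Y, Z. Fix p(z), p(x|z), p(y|z); these fully determine
# f_XZ and f_YZ, but not how X and Y are coupled given Z.
pz = np.array([0.5, 0.5])
px_z = np.array([[0.8, 0.2],   # p(x | z=0)
                 [0.3, 0.7]])  # p(x | z=1)
py_z = np.array([[0.6, 0.4],   # p(y | z=0)
                 [0.1, 0.9]])  # p(y | z=1)

def indep(px, py):
    # X and Y conditionally independent given Z
    return np.outer(px, py)

def comonotone(px, py):
    # maximally positively correlated coupling with the same margins
    c = np.zeros((2, 2))
    c[0, 0] = min(px[0], py[0])
    c[1, 1] = min(px[1], py[1])
    c[0, 1] = px[0] - c[0, 0]
    c[1, 0] = px[1] - c[1, 1]
    return c

def joint(coupling):
    # p(x, y, z) = p(z) * c(x, y | z), indexed [x, y, z]
    p = np.zeros((2, 2, 2))
    for z in range(2):
        p[:, :, z] = pz[z] * coupling(px_z[z], py_z[z])
    return p

p1, p2 = joint(indep), joint(comonotone)

# Both joints agree on the pairwise densities f_XZ and f_YZ ...
assert np.allclose(p1.sum(axis=1), p2.sum(axis=1))  # f_XZ
assert np.allclose(p1.sum(axis=0), p2.sum(axis=0))  # f_YZ

# ... yet induce different likelihoods f_{Y|X}
fyx1 = p1.sum(axis=2); fyx1 /= fyx1.sum(axis=1, keepdims=True)
fyx2 = p2.sum(axis=2); fyx2 /= fyx2.sum(axis=1, keepdims=True)
print(fyx1)
print(fyx2)
```

Since the data cannot distinguish between such joints, any estimator must cope with a whole family of candidate likelihoods, which is exactly what the minimax regret formulation does.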

## References
T. Michaeli and Y. C. Eldar, "Hidden relationships: Bayesian estimation with partial knowledge," to appear in IEEE Transactions on Signal Processing.

T. Michaeli and Y. C. Eldar, "A minimax approach to Bayesian estimation with partial knowledge of the observation model," in Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP 2010).

## Software
The minimax regret estimator is implemented in
the Matlab function MinimaxRegretEstimator.m, which is available for download. A simple example of its use on synthetic data can be
found in TestMinimax.m.
In this example, the estimator is constructed from training sets {x_n, z_n}
and {y_n, z_n}, and its MSE is compared with
that of the MMSE estimator, which knows the true joint distribution of X
and Y.
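For intuition about the baseline in such a comparison, here is a sketch (not the authors' Matlab code; the model and parameters are made up) of the MMSE estimator for jointly Gaussian scalar X and Y, where the estimator is linear and its MSE has a closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: (X, Y) jointly Gaussian, zero mean, with
# covariances Sxx, Sxy, Syy. The MMSE estimator is then linear:
#   x_hat(y) = mu_x + (Sxy / Syy) * (y - mu_y)
Sxx, Sxy, Syy = 1.0, 0.8, 1.5
mu_x, mu_y = 0.0, 0.0

# Draw correlated samples via a Cholesky factor of the covariance
n = 200_000
L = np.linalg.cholesky(np.array([[Sxx, Sxy], [Sxy, Syy]]))
x, y = L @ rng.standard_normal((2, n))

x_hat = mu_x + (Sxy / Syy) * (y - mu_y)
mse = np.mean((x - x_hat) ** 2)
print(mse, Sxx - Sxy**2 / Syy)  # empirical MSE vs. theoretical Sxx - Sxy^2/Syy
```

The empirical MSE should match the theoretical value Sxx - Sxy^2/Syy; an estimator that lacks the true joint distribution (as in the partial-knowledge setting) can only do worse, and the gap is its regret.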