Adel Mohammadpour1,2, Ali Mohammad-Djafari2
1School of Intelligent Systems (IPM) and
Amirkabir University of Technology (Dept. of Stat.), Tehran, Iran.
2LSS (CNRS-Supélec-Univ. Paris 11),
Supélec, Plateau de Moulon, 91192 Gif-sur-Yvette, France.
E-mail: firstname.lastname@example.org, email@example.com
Received: 14 February 2006 / Accepted: 9 June 2006 / Published: 13 June 2006
Abstract: We consider the problem of inference on one of the two parameters of a probability distribution when we have some prior information on the other, nuisance, parameter. When a prior probability distribution on this nuisance parameter is given, the marginal distribution is the classical tool to account for it. If the prior distribution is not given, but we have partial knowledge such as a fixed number of its moments, we can use the maximum entropy principle to assign a prior law and thus reduce to the previous case. In this work, we consider the case where we only know the median of the prior and propose a new tool for it. This new inference tool resembles a marginal distribution. It is obtained by first noting that the marginal distribution can be viewed as the mean value of the original distribution with respect to the prior probability law of the nuisance parameter, and then by using the median in place of the mean.
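The construction in the abstract can be sketched numerically: the classical marginal is a Monte Carlo mean of the likelihood over prior draws of the nuisance parameter, and the proposed tool replaces that mean with a median. The sketch below uses an illustrative normal likelihood with a nuisance scale parameter and an exponential prior on it; these particular distributional choices are assumptions for illustration, not the paper's specific model.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(x, mu, sigma):
    """Normal density f(x | mu, sigma); sigma plays the nuisance role."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Draw nuisance values from an assumed prior (exponential, illustrative only)
sigmas = rng.exponential(scale=1.0, size=100_000)

x, mu = 0.5, 0.0
values = likelihood(x, mu, sigmas)

# Classical marginal: mean of the likelihood w.r.t. the prior law of sigma
marginal = values.mean()

# Proposed tool: the median of the likelihood w.r.t. the same prior law
median_based = np.median(values)
```

In general the two summaries differ, which is precisely why the median-based tool is a distinct inference object when only the median of the prior is known.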
Keywords: nuisance parameter; maximum entropy; marginalization; incomplete knowledge.
MSC 2000 codes: 62F30