Path: sparky!uunet!dtix!darwin.sura.net!wupost!micro-heart-of-gold.mit.edu!news.media.mit.edu!news.media.mit.edu!popat
From: popat@image.mit.edu (Kris Popat)
Newsgroups: sci.math.stat
Subject: min ave self-information inference
Message-ID: <POPAT.92Sep14212000@image.mit.edu>
Date: 15 Sep 92 02:20:00 GMT
Sender: news@news.media.mit.edu (USENET News System)
Organization: MIT Advanced Television Research Program
Lines: 25


Suppose you have a parametric model for the probability mass function
(pmf) of a discrete random variable, and a set of observed values.
Call the model pmf p(x), and call the observed values x_1,...,x_N.
The goal is to find parameter values that make the model pmf p(x)
approximate the "true" but unknown pmf q(x).

One way to fit the model to the data is to find parameter values
that minimize

    -sum_i log p(x_i)

i.e., to minimize the total self-information of the observed points
with respect to the model (equivalently, to maximize the
likelihood).  This "works" because for a given true pmf q and for
all valid model pmfs p, the expected self-information

    E[-log p(X)] = -sum_x q(x) log p(x)

is minimized over p when p = q (Gibbs' inequality).

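A quick numeric check of that claim, as a sketch in Python; the
three-symbol alphabet and the pmf q below are made up for
illustration, not taken from any real data:

```python
import math

# Assumed "true" pmf q over a hypothetical 3-symbol alphabet.
q = [0.5, 0.3, 0.2]

def cross_entropy(p, q):
    """-sum_x q(x) log p(x): expected self-information of model p under q."""
    return -sum(qx * math.log(px) for qx, px in zip(q, p))

# Candidate model pmfs: the true pmf itself and two perturbations of it.
candidates = [
    [0.5, 0.3, 0.2],   # p = q
    [0.4, 0.4, 0.2],
    [0.6, 0.2, 0.2],
]
values = [cross_entropy(p, q) for p in candidates]

# The smallest cross-entropy is attained by the first candidate (p = q),
# consistent with Gibbs' inequality.
best = min(range(len(candidates)), key=values.__getitem__)
```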
I'd like a pointer to any papers that discuss this or similar
approaches to fitting a parametric pmf.

Kris Popat
MIT Rm E15-391 Cambridge, MA 02139