[R] PDF with computationally expensive normalizing constant

From: Robin Hankin <r.hankin_at_noc.soton.ac.uk>
Date: Mon, 11 Feb 2008 11:57:16 +0000


I am writing some functionality for a multivariate PDF.

One problem is that evaluating the normalizing constant (NC) is massively computationally intensive (one recent example took 4 hours, and bigger examples would take much, much longer), and it would be good to allow for this somehow in the design of the package.

For example, the likelihood function doesn't need the NC but (eg) the moment generating function does.

So a user wanting a maximum-likelihood estimate shouldn't have to evaluate the NC, but a user wanting a mean does. Some simple forms of the PDF have an easily evaluated analytical expression for the NC.

And once the NC is evaluated, it would be good to store it somehow.

I thought perhaps I could define an S4 class with a slot for the parameters and a slot for the NC; and if the NC is unknown this would have an "NA" entry.

Then a user could execute something like

a <- CalculateNormalizingConstant(a)

and after this, object "a" would then have the numerically computed NC in place.
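A minimal sketch of this idea, assuming a hypothetical class "mvpdf" with a numeric NC slot defaulting to NA; the class name, slot names, and the stand-in computation are all illustrative, not from any existing package:

```r
library(methods)

## Hypothetical S4 class: parameters plus a cached normalizing constant.
## NC = NA_real_ means "not yet computed".
setClass("mvpdf", representation(params = "numeric", NC = "numeric"))

## Constructor with the NC slot left unevaluated.
mvpdf <- function(params) new("mvpdf", params = params, NC = NA_real_)

## Fill in the NC slot only if it is still NA, and return the
## updated object (S4 objects have copy semantics, so reassign).
CalculateNormalizingConstant <- function(obj) {
  if (is.na(obj@NC)) {
    ## Stand-in for the expensive numerical integration.
    obj@NC <- sum(exp(-obj@params))
  }
  obj
}

a <- mvpdf(c(1, 2, 3))
a <- CalculateNormalizingConstant(a)  # now a@NC holds the cached value
```

A second call to CalculateNormalizingConstant(a) then returns immediately, since the NA check skips the expensive step; functions that need the NC (e.g. a moment generating function) could test is.na(obj@NC) and either stop with an informative error or trigger the computation themselves.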

Is this a Good Idea?

Are there any PDFs implemented in R in which this is an issue?

Robin Hankin
Uncertainty Analyst and Neutral Theorist,
National Oceanography Centre, Southampton
European Way, Southampton SO14 3ZH, UK
  tel  023-8059-7743

R-help_at_r-project.org mailing list
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
Received on Mon 11 Feb 2008 - 12:06:09 GMT

