May 7th, 2009| 11:14 am | Posted by hlee

One of the [ArXiv] papers from yesterday whose title might attract a lot of attention from astronomers. Furthermore, it’s a short paper.

[arxiv:math.CO:0905.0483] by Harmany, Marcia, and Willett.

Continue reading ‘[ArXiv] Sparse Poisson Intensity Reconstruction Algorithms’ »

Tags: compressed sensing, decomposition, EM algorithm, intensity, MPLE, multiscale, penalty, Poisson, Poisson Intensity, Sparsity, wavelet
Category: Algorithms, arXiv, Astro, Cross-Cultural, Data Processing, High-Energy, Imaging, Jargon | Comment
Apr 10th, 2009| 03:16 pm | Posted by vlk

Probability density functions are another way of summarizing the consequences of assuming a Gaussian error distribution when the true distribution is Poisson. We can compute the posterior probability of the intensity of a source, when some number of counts are observed in a source region, and the background is estimated using counts observed in a different region. We can then compare it to the equivalent Gaussian.

The figure below (AAS 472.09) compares the pdfs for the Poisson intensity (red curves) and the Gaussian equivalent (black curves) for two cases: when the number of counts in the source region is 50 (top) and 8 (bottom) respectively. In both cases a background of 200 counts collected in an area 40x the source area is used. The hatched region represents the 68% equal-tailed interval for the Poisson case, and the solid horizontal line is the ±1σ width of the equivalent Gaussian.

Clearly, for small counts, the support of the Poisson distribution is bounded below at zero, but that of the Gaussian is not. This introduces a visibly large bias in the interval coverage as well as in the normalization properties. Even at high counts, the Poisson is skewed such that larger values are slightly more likely to occur by chance than in the Gaussian case. This skew can be quite critical for marginal results. Continue reading ‘Poisson vs Gaussian, Part 2’ »
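The interval comparison above is easy to sketch numerically. A minimal version of my own (assuming a flat prior on the intensity and ignoring the background subtraction for simplicity, so the posterior for the intensity given N observed counts is a Gamma(N+1) distribution):

```python
import numpy as np
from scipy.stats import gamma

N = 8  # counts observed in the source region (the low-count case above)

# Posterior for the Poisson intensity under a flat prior: Gamma(N+1, 1).
lo, hi = gamma.ppf([0.16, 0.84], N + 1)   # 68% equal-tailed interval

# "Equivalent" Gaussian: mean N, sigma sqrt(N), at the same 68% level.
glo, ghi = N - np.sqrt(N), N + np.sqrt(N)

print(f"Poisson:  [{lo:.2f}, {hi:.2f}]")
print(f"Gaussian: [{glo:.2f}, {ghi:.2f}]")
```

The Poisson interval stays strictly positive and extends further above N than below it, which is the skew visible in the red curves of the figure.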

Apr 9th, 2009| 07:01 pm | Posted by vlk

We astronomers are rather fond of approximating our counting statistics with Gaussian error distributions, and a lot of ink has been spilled justifying and/or denigrating this habit. But just how bad is the approximation anyway?

I ran a simple Monte Carlo based test to compute the expected bias between a Poisson sample and the “equivalent” Gaussian sample. The result is shown in the plot below.

The jagged red line is the fractional expected bias relative to the true intensity. The typical recommendation in high-energy astronomy is to bin up events until there are about 25 or so counts per bin. This leads to an average bias of about 2% in the estimate of the true intensity. The bias drops below 1% for counts >50. Continue reading ‘Poisson vs Gaussian’ »
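The post doesn’t spell out the simulation, but a plausible toy reconstruction (my own sketch, not necessarily what was actually run) is to fit a constant intensity to Poisson-distributed bins using the usual Gaussian χ^{2} weights σ_i = √N_i, and measure the fractional bias of the estimate:

```python
import numpy as np

def fractional_bias(mu, nbins=100, ntrials=2000, seed=42):
    """Fractional bias of the sqrt(N)-weighted (chi-square style)
    estimate of a constant Poisson intensity mu."""
    rng = np.random.default_rng(seed)
    biases = []
    for _ in range(ntrials):
        n = rng.poisson(mu, nbins)
        w = 1.0 / np.maximum(n, 1)        # 1/sigma^2 with sigma = sqrt(N)
        est = np.sum(w * n) / np.sum(w)   # weighted mean of the bins
        biases.append(est - mu)
    return np.mean(biases) / mu

# The bias shrinks as the counts per bin grow, echoing the ~25-count rule.
for mu in (5, 25, 100):
    print(mu, fractional_bias(mu))
```

The exact numbers depend on the estimator used, so this toy need not reproduce the 2% figure quoted above; the qualitative behavior (bias shrinking with counts per bin) is the point.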

Jul 2nd, 2008| 01:00 pm | Posted by vlk

Astrophysics, especially high-energy astrophysics, is all about counting photons. And this, it is said, naturally leads to all our data being generated by a Poisson process. True enough, but most astronomers don’t know exactly how it works out, so this derivation is for them. Continue reading ‘Poisson Likelihood [Equation of the Week]’ »
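For reference, the standard result that such a derivation arrives at (the familiar form, not a substitute for the full derivation in the post): if photons arrive independently at a constant rate λ, the probability of counting N of them in a given exposure is

$$ p(N \mid \lambda) = \frac{\lambda^{N} e^{-\lambda}}{N!} \,, $$

so for binned data with model-predicted counts λ_i the log-likelihood is

$$ \ln \mathcal{L} = \sum_i \left( N_i \ln \lambda_i - \lambda_i - \ln N_i! \right) \,. $$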

May 26th, 2008| 02:59 pm | Posted by hlee

Tags: clustering, high dimension, LF, maximum likelihood, multivariate, Poisson, Schechter, zero count
Category: arXiv, Bayesian, Fitting, MCMC, Methods, Stat | Comment
May 6th, 2008| 06:12 pm | Posted by vlk

The *gamma* distribution [not the Gamma function -- note the upper-case G -- which is related to the factorial] is one of those insanely useful distributions that, after one finds out about it, one wonders “why haven’t we been using this all the time?” It is defined only on the ~~positive~~ non-negative real line, is highly flexible and can emulate almost any kind of skewness, and is a perfect complement to the Poisson likelihood. In fact, it is the conjugate prior to the Poisson likelihood, and is therefore a natural choice for a prior in all cases that start off with counts. Continue reading ‘*gamma* function (Equation of the Week)’ »
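Conjugacy here means that a Gamma prior updated by a Poisson likelihood is again a Gamma: a Gamma(α, β) prior plus N observed counts gives a Gamma(α + N, β + 1) posterior. A quick numerical check of this (with an arbitrary α, β, and N of my choosing):

```python
import numpy as np
from scipy.stats import gamma, poisson

alpha, beta, N = 2.0, 1.0, 7   # prior shape/rate and observed counts

lam = np.linspace(0.01, 30, 2000)
prior = gamma.pdf(lam, alpha, scale=1.0 / beta)
like = poisson.pmf(N, lam)

post = prior * like
post /= np.sum(post) * (lam[1] - lam[0])   # normalize on the grid

# Conjugacy: the posterior should be Gamma(alpha + N, rate beta + 1).
closed = gamma.pdf(lam, alpha + N, scale=1.0 / (beta + 1))
print(np.max(np.abs(post - closed)))       # tiny grid-level discrepancy
```

The grid posterior and the closed-form Gamma agree to numerical precision, which is exactly what makes the gamma so convenient for count data.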

Apr 29th, 2008| 02:24 am | Posted by hlee

Skimming arXiv:astro-ph abstracts for almost a year has never offered me an occasion where the fit of the Poisson distribution is actually tested; instead, it is taken for granted by plugging data and a (source) model into a (modified) χ^{2} function. If any doubts about the Poisson assumption occur, the following paper might be useful: Continue reading ‘tests of fit for the Poisson distribution’ »
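For readers who want something runnable right away, one classical such check is Fisher’s dispersion test: under the Poisson hypothesis the variance equals the mean, so the scaled sum of squared deviations is approximately χ^{2} distributed. (This is a generic textbook test, not the one in the paper above.)

```python
import numpy as np
from scipy.stats import chi2

def dispersion_test(counts):
    """Fisher's dispersion test for Poisson-ness.

    Under H0, sum_i (n_i - nbar)^2 / nbar is approximately
    chi-square distributed with k - 1 degrees of freedom."""
    n = np.asarray(counts, dtype=float)
    k, nbar = n.size, n.mean()
    stat = np.sum((n - nbar) ** 2) / nbar
    return stat, chi2.sf(stat, k - 1)

rng = np.random.default_rng(0)
stat_p, p_pois = dispersion_test(rng.poisson(5, 500))
stat_o, p_over = dispersion_test(rng.negative_binomial(2, 0.2, 500))
print(p_pois, p_over)   # the overdispersed sample should be flagged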

Nov 2nd, 2007| 05:59 pm | Posted by hlee

To be exact, the title of this posting should contain *5th week, Oct*, which seems to have been the week of EGRET. In addition to astro-ph papers, I include a few statistics papers which, although not directly related to astrostatistics, may be profitable for astronomical data analysis. Continue reading ‘[ArXiv] 1st week, Nov. 2007’ »

Tags: bootstrap, EGRET, Fisher information, Laplace transform, maximum likelihood, PCA, PDF, Poisson, Ratio, Uncertainty, variance
Category: arXiv | 1 Comment
Aug 17th, 2007| 06:15 pm | Posted by hlee

From arxiv/math.st:0708.2153v1

**Estimating the number of classes** by Mao and Lindsay

This study could be linked to identifying the number of lines in x-ray count data of a Poisson nature, one of the key interests for astronomers. However, as pointed out by the authors, estimating the number of classes is a difficult statistical problem. I.J. Good^{[1]} said that

I don’t believe it is usually possible to estimate the number of species, but only an appropriate lower bound to that number. This is because there is nearly always a good chance that there are a very large number of extremely rare species.
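Good’s remark is exactly the spirit of the classic nonparametric lower bounds on the number of classes. As a runnable illustration, here is the standard Chao1 lower bound (a textbook estimator, not the method of Mao and Lindsay’s paper):

```python
def chao1(counts):
    """Chao1 lower bound on the number of classes.

    counts: observed frequency of each detected class.
    Singletons (f1) and doubletons (f2) -- classes seen exactly once
    or twice -- carry the information about classes not seen at all."""
    counts = [c for c in counts if c > 0]
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)
    f2 = sum(1 for c in counts if c == 2)
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0   # bias-corrected variant
    return s_obs + f1 * f1 / (2.0 * f2)

# Five classes detected; two singletons and one doubleton
# suggest at least two more classes remain unseen.
print(chao1([1, 1, 2, 3, 5]))
```

Note that, per Good’s caveat, this is a lower bound: a long tail of extremely rare classes can always push the true number far above it.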

Continue reading ‘[ArXiv] Poisson Mixture, Aug. 16, 2007’ »

Aug 16th, 2007| 04:36 pm | Posted by hlee

I’ve heard so much, without knowing the fundamental reasons (most likely physics), about coverage problems from astrostat/phystat groups. This paper might be of interest to them: Interval Estimation in Exponential Families by Brown, Cai, and DasGupta; Statistica Sinica (2003), **13**, pp. 19-49

*Abstract summary:*

The authors investigated issues in interval estimation of the mean in exponential families, such as the binomial, Poisson, negative binomial, normal, gamma, and a sixth distribution. The poor performance of the Wald interval has been known not only for discrete cases but also for nonnormal continuous cases, where it shows significant negative bias. Their computations suggested that the equal-tailed Jeffreys interval and the likelihood ratio interval are the best alternatives to the Wald interval. Continue reading ‘Coverage issues in exponential families’ »
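The coverage problem is easy to see by simulation. A small sketch for the Poisson case, comparing the Wald interval with an equal-tailed Jeffreys-style interval (using the Gamma(N + 1/2) posterior; the details here are my choices, not taken from the paper):

```python
import numpy as np
from scipy.stats import gamma, norm

def wald(n, conf=0.95):
    # n +/- z * sqrt(n), the usual Gaussian-style interval
    z = norm.ppf(0.5 + conf / 2)
    s = np.sqrt(n)
    return n - z * s, n + z * s

def jeffreys(n, conf=0.95):
    # Jeffreys prior for the Poisson mean gives a Gamma(n + 1/2) posterior
    a = (1 - conf) / 2
    return gamma.ppf(a, n + 0.5), gamma.ppf(1 - a, n + 0.5)

def coverage(mu, interval, ntrials=20000, seed=1):
    """Fraction of simulated intervals that contain the true mean mu."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(mu, ntrials)
    lo, hi = interval(n)
    return np.mean((lo <= mu) & (mu <= hi))

for mu in (3, 10):
    print(mu, coverage(mu, wald), coverage(mu, jeffreys))
```

At small means the Wald interval falls well short of its nominal 95% (the n = 0 interval is empty, and small-n intervals sit too low), while the Jeffreys-style interval stays much closer to nominal.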

Tags: bias, binomial, coverage, Edgeworth expansion, exponential family, gamma, Gehrels, interval, Jeffreys, likelihood ratio, negative binomial, normal, Poisson, Rao score, Wald
Category: arXiv, Stat, Uncertainty | Comment