Comments on: Kaplan-Meier Estimator (Equation of the Week)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/
Weaving together Astronomy+Statistics+Computer Science+Engineering+Instrumentation, far beyond the growing borders
Last updated: Fri, 01 Jun 2012 18:47:52 +0000

By: vlk (Wed, 16 Jul 2008 18:27:10 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-302

No, not Bayesian. Frequentist. Like simulating large numbers of independent observations non-parametrically.

By: hlee (Wed, 16 Jul 2008 14:26:56 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-301

This comment-within-a-comment cannot be edited. There's something called Bootstrap after Bootstrap and Jackknife after Bootstrap, which I want to distinguish from Monte Carlo Bootstrap.

By: hlee (Wed, 16 Jul 2008 14:19:45 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-300

You mean Bayesian, right? According to your description, "jiggle them" corresponds to Monte Carlo or "data augmentation", and the bootstrap corresponds to estimation or "parameter draws", if I make an analogy with Gibbs sampling. I don't think a frequentist would do the bootstrap and Monte Carlo at the same time. They complement each other in some sense, i.e. combining them is redundant. Bootstrap describes a methodology for statistical inference, and Monte Carlo implies a random number generating mechanism. I was asking for references with sensible results, since there's almost no chance that someone would just throw out a Monte Carlo Bootstrap without justification (proofs or empirical results).

I understand you don't like the bootstrap, and I don't like methods with heavy computations (Monte Carlo Bootstrap sounds wasteful to me). Unless someone makes a case for why this Monte Carlo Bootstrap works (theoretical) or clearly shows the results (empirical), we cannot discuss whether the results are sensible or not.

By: vlk (Tue, 15 Jul 2008 17:32:51 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-299

Not that I know of.

From an empirical standpoint, I haven't seen it give nonsense results. And why should it? It is a frequentist universe on your desktop.

By: hlee (Tue, 15 Jul 2008 17:03:24 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-297

Are there proofs or empirical studies from a mathematics/statistics viewpoint that this Monte Carlo Bootstrap works?

By: brianISU (Fri, 11 Jul 2008 01:46:56 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-289

Another note, just for completeness. One thing that I recommend doing with the K-M estimate in hand is to plot it on some sort of "probability paper" (figured I'd reference a term from an earlier topic). This is a simple and effective way to test distributional fit. It is especially effective if the CDF has been linearized (that way, if the distribution is a good fit, it will appear as a straight line on the plot, which is much easier to recognize than a curve).
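[A sketch of the linearization described above, assuming a Weibull candidate distribution and a complete (uncensored) sample; all names and values here are illustrative. For a Weibull CDF F(t) = 1 - exp(-(t/eta)^beta), plotting ln(-ln(1-F)) against ln(t) should give a straight line with slope beta.]

```python
import numpy as np

# Hypothetical complete sample drawn from a Weibull distribution
rng = np.random.default_rng(0)
beta_true, eta_true = 2.0, 5.0
t = np.sort(eta_true * rng.weibull(beta_true, size=200))

# Empirical CDF via median ranks (a common plotting position)
n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Linearize: for a Weibull, ln(-ln(1-F)) vs ln(t) is a straight line
x = np.log(t)
y = np.log(-np.log(1.0 - F))

# Fit a line; the slope estimates the shape parameter beta,
# and exp(-intercept/slope) estimates the scale parameter eta
slope, intercept = np.polyfit(x, y, 1)
```

[On probability paper one would eyeball the straightness of (x, y); the regression line just quantifies it.]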

By: vlk (Thu, 10 Jul 2008 21:14:00 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-288

I suspect that Monte Carlo Bootstrap is a term and technique only astronomers use, where if you have measurements y_i, i=1..N, you draw N times from {y_i} with replacement (the bootstrap part), and then jiggle them according to sigma(y), usually N(y_i, s_i) (the Monte Carlo part), before computing whatever summary statistic you are interested in. There may be other variations on that.
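[A minimal sketch of the procedure described above, in Python with NumPy; the function name and all data are illustrative, not from any paper.]

```python
import numpy as np

rng = np.random.default_rng(42)

def monte_carlo_bootstrap(y, sigma, stat=np.mean, n_boot=1000, rng=rng):
    """Sketch of the 'Monte Carlo Bootstrap' as described: resample the
    measurements with replacement (bootstrap), perturb each draw by its
    Gaussian measurement error N(y_i, s_i) (Monte Carlo), then collect
    the summary statistic over many replicates."""
    y = np.asarray(y, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    n = len(y)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # draw N times with replacement
        jiggled = rng.normal(y[idx], sigma[idx])  # jiggle according to N(y_i, s_i)
        stats[b] = stat(jiggled)
    return stats

# Example: spread of the mean of 20 noisy measurements
y = rng.normal(10.0, 1.0, size=20)
s = np.full(20, 0.5)
draws = monte_carlo_bootstrap(y, s)
lo, hi = np.percentile(draws, [16, 84])  # roughly a 1-sigma interval
```

[Note the spread of `draws` reflects both the sampling scatter of the y_i and their measurement errors, which is presumably the point of combining the two steps.]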

By: hlee (Thu, 10 Jul 2008 17:42:47 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-287

Thanks, both! Your discussion helped guide me through survival analysis. Vinay's post particularly helped me relate the survival curve to the luminosity function. Once I retrieve the relevant statistics papers I saw years back (which I couldn't comprehend at the time), I'll post some references, and probably examples if there are some similarities.

I'd like to ask an off-the-track question about the bootstrap and Monte Carlo. Whether parametric, nonparametric, or some other modification, bootstrapping involves resampling from the data; whereas Monte Carlo, regardless of whether the parameters are known, simulates samples from a distribution with known or estimated parameters (the latter uses the data to estimate parameters and brings up testing issues, like the goodness-of-fit test). One impression I also got from reading astro-ph preprints is that sometimes "bootstrap" is used in place of "Monte Carlo method".

No one taught me what Monte Carlo is; I just picked it up through books and papers. Sometimes I have felt that the concept of Monte Carlo simulation differs across disciplines (based on conversations with college students in engineering).
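[The distinction drawn above can be made concrete with a small sketch; both snippets estimate the sampling variability of a mean, but from different ingredients. The data and parameter values are made up for illustration.]

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=100)  # the observed sample

# Bootstrap: resample from the data itself, no distributional assumption
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(2000)
])

# Monte Carlo: simulate fresh samples from a fitted parametric model
# (here an exponential with the estimated mean -- a parametric choice)
scale_hat = data.mean()
mc_means = np.array([
    rng.exponential(scale=scale_hat, size=data.size).mean()
    for _ in range(2000)
])
```

[When the parametric model is right, the two spreads agree; when it is wrong, they can diverge, which is one reason the terms should not be used interchangeably.]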

By: vlk (Thu, 10 Jul 2008 03:07:28 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-286

PS: I notice now that Feigelson and Nelson (1985) actually discuss the Greenwood formula in their paper!
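[For readers following along: the Greenwood formula mentioned here gives the variance of the Kaplan-Meier estimate, Var[S(t)] ~ S(t)^2 * sum over t_i <= t of d_i / (n_i (n_i - d_i)). A minimal sketch with made-up right-censored data; all values are illustrative.]

```python
import numpy as np

# Hypothetical right-censored data: observation times and event flags
times  = np.array([3, 5, 5, 8, 10, 12, 15])
events = np.array([1, 1, 0, 1, 1, 0, 1])  # 1 = event observed, 0 = censored

# Kaplan-Meier estimate with Greenwood variance at each event time
km = []
S = 1.0
greenwood_sum = 0.0
for t in np.unique(times[events == 1]):
    n_i = int(np.sum(times >= t))                     # at risk just before t
    d_i = int(np.sum((times == t) & (events == 1)))   # events at t
    S *= 1.0 - d_i / n_i
    if n_i > d_i:
        greenwood_sum += d_i / (n_i * (n_i - d_i))
        var = S**2 * greenwood_sum                    # Greenwood's formula
    else:
        var = 0.0  # S has dropped to zero; the Greenwood sum is undefined here
    km.append((t, S, var))
```

[Packages like R's survival or Python's lifelines compute exactly this; the loop is only to show where each term of the formula comes from.]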

By: vlk (Thu, 10 Jul 2008 03:01:57 +0000)
http://hea-www.harvard.edu/AstroStat/slog/2008/eotw-kaplan-meier/comment-page-1/#comment-285

Take a look at Appendix G of Schmitt 1985, linked to above. That paper is very detailed, but the later papers that cite it (usually because they have done some kind of bootstrap K-M estimate of X-ray luminosity functions) largely skip over the methodology.
