Hyunsook, could you explain how quantile regression helps to generate smooth curves? I was under the impression that they are just another way to fit straight lines.

Alex, you bring up another bugaboo: when one bootstraps loess curves, it is easy to get them braided up like a frayed rope. In such cases, a density plot tells only half the story. What kind of strategies do statisticians use to deal with that?

By the way, thank you, Nick, for pointing out a technical improvement for the blog. I'm not sure whether it's due to WordPress, the current theme, or my laziness that I wasn't able to find a plugin. I'll definitely look into it and will do my best to include a preview button.

You might check out Gelman's post on it, but he says that there are no Bayesian versions of it. The comments back in 2005 do mention some Bayesian alternatives.

One such alternative, I would think, is Gaussian processes. If you google Gaussian processes, you'll see that there is even a dedicated webpage on them. The difficult part is choosing a prior for the covariance function. This choice could give a wide range of alternatives (you could even get ARIMA/ARMA-type fits, or probably a wide range of splines). It's extremely general. Since posteriors only give confidence intervals in parameter space, I guess I'd use predictive distributions to get the confidence intervals in data space.
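A minimal sketch of that idea, assuming scikit-learn's GP implementation (any GP library would do): the kernel plays the role of the prior on the covariance function, and the predictive distribution gives a band in data space rather than parameter space.

```python
# Sketch: Gaussian-process regression with a predictive band in data space.
# scikit-learn and the toy sine data are illustrative assumptions, not from the post.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

# The kernel is the prior over covariance functions; swapping it out
# changes the whole family of fits you can get.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.04)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x[:, None], y)

xs = np.linspace(0, 10, 200)
mean, std = gp.predict(xs[:, None], return_std=True)   # predictive distribution
lower, upper = mean - 1.96 * std, mean + 1.96 * std    # ~95% band in data space
```

Swapping the RBF kernel for, say, a Matérn or a periodic kernel is how you'd reach the spline-like or ARMA-like behavior mentioned above.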

BTW, it would be great to have a “preview” button for comments on this blog.

Rather, people tend to use the bootstrap to find standard errors (just as they use cross-validation to find "best fits"). For an example of bootstrapped standard errors in loess, check out the link: toward the middle of the page, under the heading "Curve Fitting Example, Efron & Tibshirani, 7.3".
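The resampling recipe behind that example can be sketched as follows, assuming statsmodels' lowess as the smoother (the specific smoother and toy data are my assumptions; the pairs-bootstrap idea itself is the Efron & Tibshirani one referenced above):

```python
# Sketch: bootstrapped pointwise standard errors for a lowess curve.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 80))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

xs = np.linspace(0, 10, 100)            # common grid for every replicate curve
fits = []
for _ in range(200):                     # B bootstrap replicates
    idx = rng.integers(0, x.size, x.size)            # resample (x, y) pairs
    fits.append(lowess(y[idx], x[idx], frac=0.4, xvals=xs))
fits = np.asarray(fits)

se = fits.std(axis=0, ddof=1)            # pointwise bootstrap standard error
```

Evaluating every replicate on the same grid `xs` is what keeps the curves comparable; without it the bootstrapped curves "braid up" at mismatched x-locations, as Alex put it above.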

I suppose it is always possible to run a thousand Monte Carlo simulations based on the measured data errors, but I was looking for a faster, hopefully analytical, way to get the confidence band on the curve.
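For the record, the Monte Carlo route mentioned above can be sketched like this, assuming known per-point measurement errors and a simple polynomial as the fitted curve (both are illustrative assumptions, not from the comment):

```python
# Sketch: Monte Carlo confidence band from known measurement errors.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 5, 30)
sigma = np.full(x.size, 0.15)                  # measured data errors (assumed known)
y = 0.5 * x**2 - x + rng.normal(scale=sigma)

xs = np.linspace(0, 5, 100)
curves = []
for _ in range(1000):                          # "a thousand Monte Carlo simulations"
    y_sim = y + rng.normal(scale=sigma)        # perturb data by its stated errors
    coef = np.polyfit(x, y_sim, deg=2)         # refit the curve to each replicate
    curves.append(np.polyval(coef, xs))
band_lo, band_hi = np.percentile(curves, [2.5, 97.5], axis=0)   # ~95% band
```

It is indeed slow compared with an analytical band, but it only needs the per-point errors and whatever fitting routine you already have.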
