Presentations |
Hyunsook Lee (Penn State) 7 Sep 2006 |
- A Convex Hull Peeling Depth Approach to Nonparametric
Massive Multivariate Data Analysis with Applications
- Abstract:
We explore the convex hull peeling process to develop
empirical tools for statistical inference on massive multivariate data.
The convex hull and its peeling process have intuitive appeal for robust
location estimation. We define the convex hull peeling depth,
which enables us to order multivariate data. This ordering process
provides a way to obtain multivariate quantiles, including the
median. Based on the generalized quantile process, we define
a convex hull peeling central region, a convex hull level set, and
a volume functional, which lead to one-dimensional mappings
describing the shapes of multivariate distributions along data depth.
We define empirical skewness and kurtosis measures
based on the convex hull peeling process.
In addition to these empirical descriptive statistics, we present
several methods for separating multivariate outliers in massive data sets.
These outlier detection algorithms
are (1) estimating multivariate quantiles up to the level $\alpha$,
(2) detecting changes in a measure sequence of convex hull level sets,
and (3) constructing a balloon to exclude outliers. The convex hull peeling
depth is a robust estimator, so
the presence of outliers does not affect the properties of inner convex hull
level sets. Overall, we illustrate all these characteristics and algorithms
of the convex hull peeling process on bivariate synthetic data sets.
We show that these empirical procedures are applicable to
real massive data sets, using quasars and galaxies from the Sloan
Digital Sky Survey.
- Presentation [.pdf]
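- A minimal illustrative sketch of the peeling construction (not code from
the talk; the helper name, the use of scipy.spatial.ConvexHull, and the
depth-2 outlier cut below are assumptions made for this example):

    # Sketch: convex hull peeling depth for bivariate data.
    # Depth 1 = outermost hull layer; larger depths lie closer to the
    # multivariate median in the sense of the peeling process.
    import numpy as np
    from scipy.spatial import ConvexHull

    def convex_hull_peeling_depth(points):
        points = np.asarray(points, dtype=float)
        depth = np.zeros(len(points), dtype=int)
        remaining = np.arange(len(points))
        layer = 0
        # Repeatedly peel the current outer hull until too few points remain.
        while len(remaining) >= 3:
            layer += 1
            hull = ConvexHull(points[remaining])
            on_hull = remaining[hull.vertices]
            depth[on_hull] = layer
            remaining = np.setdiff1d(remaining, on_hull)
        depth[remaining] = layer + 1  # innermost leftover points
        return depth

    # Example: flag points on the two outermost layers as candidate outliers.
    rng = np.random.default_rng(0)
    data = rng.normal(size=(1000, 2))
    d = convex_hull_peeling_depth(data)
    candidate_outliers = data[d <= 2]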
|
Alanna Connors, et al. (CHASC) 19 Sep 2006 |
- A Sense of Motion:
California-Harvard Astronomy and Statistics Collaboration in 2006-2007
- Abstract:
Join us for an overview of interdisciplinary problem solving
hosted by the Harvard Statistics Dept. and the Harvard-Smithsonian Center for
Astrophysics, 1997-2006.
In the first Ph.D. in Astronomy ever awarded by Harvard, Cecilia
Payne applied new theory and calculational techniques from one field
(Physics) to the new large survey astronomy data of its time (visible
energy spectra of many stars).
(www.harvardmagazine.com/on-line/030236.html)
Eight decades later, we have new data, from satellites and
ground-based observatories, now at many wavelengths beyond the
visible. It is becoming increasingly high-resolution, even at previously
inaccessible wavebands. The amount of data is large and slated to
become much larger as new missions are launched. In astrophysics, we
use the knowledge gained in the Sun/Earth environment as a strong
physics-based prior to infer the behavior of all objects in the
Universe.
But there is a gap: even with all these astonishing leaps in
measurement, basic statistical problems remain in data analysis
techniques that require a deep statistical background to
solve. These problems need the perspective of statisticians rather
than astronomers.
Hence there are many problems, both simple and complex, for
which even a beginning student in statistics can make a serious
contribution to modern astronomy.
In our opening talk, we will give a general historical overview,
including a summary of some of our work so far. Our group has tended
to concentrate on astronomy data that cannot be well described by a
Gauss-normal distribution: especially UV, X-ray, and gamma-ray
data. However, all kinds of problems are welcome. We will report on
progress and challenges from last spring's widely attended SAMSI
Special Workshop in Astrostatistics. We will also introduce a contest
in the related field of high-energy physics! We invite all to come
and comment.
- The High Energy Groove [.mov]
|
Pavlos Protopapas (CfA) 14 Nov 2006 |
- New Challenge: Time Series Center
- Abstract:
The Time Series Center is a new center hosted at the Initiative in
Innovative Computing (IIC) at Harvard. Among other data, the center
will collect a large set of astronomical time series (light curves)
from various surveys.
I will describe a few projects underway, with emphasis on unsolved
statistical questions.
- Presentation: [.pdf] [.ppt]
|
Jonathan McDowell (CXC/CfA) 12 Dec 2006 |
- Astrophysics for Mathematicians
- Abstract:
Astronomical image, spectroscopic, and variability data challenges for Statisticians and Mathematicians.
- Presentation: [.ppt] [.pdf]
|
Rima Izem (Harvard) 30 Jan 2007 |
- Reducing the dimensionality of RMFs
|
Stephane Blondin (CfA/OIR) 13 Mar 2007 |
- How to Classify Spectra of Exploding Stars(?)
- Abstract:
Professional and amateur astronomers discover several hundred
exploding stars (supernovae) each year. These are classified into four
main types based on characteristic features in their optical
spectra. However, the supernova classification scheme poses
several conceptual problems, all of which point towards the need for
an unbiased and automated classification system. I will present a
simple cross-correlation algorithm adapted to this problem,
although the aim of this seminar is to seek advice from the
statistics community as to which tools are best suited to classifying
supernova spectra.
- Presentation: [.pdf]
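- A minimal illustrative sketch of classification by cross-correlation against
type templates (not the algorithm from the talk; the function names, the
assumption of a common wavelength grid, and the peak-correlation decision rule
are placeholders for this example):

    # Sketch: pick the supernova type whose template correlates best with
    # the observed spectrum, all spectra resampled onto one wavelength grid.
    import numpy as np

    def xcorr_peak(spectrum, template):
        # Peak of the normalized cross-correlation between two 1-D spectra.
        s = (spectrum - spectrum.mean()) / spectrum.std()
        t = (template - template.mean()) / template.std()
        return np.correlate(s, t, mode="full").max() / len(s)

    def classify(spectrum, templates):
        # `templates` maps type labels (e.g. 'Ia', 'Ib', 'Ic', 'II') to
        # template spectra on the same grid as `spectrum`.
        scores = {label: xcorr_peak(spectrum, tmpl)
                  for label, tmpl in templates.items()}
        return max(scores, key=scores.get), scores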
|
Hyunsook Lee (CfA) 10 Apr 2007 |
- Introducing BLoCXS and using it to estimate calibration uncertainties
- Abstract: The BLoCXS flowchart is presented for discussion among
participants, with the aim of refining the proposed methods for handling
calibration uncertainties in high-energy astrophysics.
- Presentation: [.pdf]
Flowcharts: [.ps] [.ps]
[.ps]
Updated flowchart: [.pdf]
|
Andreas Zezas & Hyunsook Lee (CfA) 24 Apr 2007 |
- Developing an alternative method to measure the history of stellar populations
- Abstract:
Optical photometry, and in particular the color-magnitude diagram, is a
standard tool for studies of stellar populations. In this presentation
we will discuss the basic astrophysical background, describe the most
commonly used methods, and contrast them with a new, more robust method
which is currently under development.
- Andreas' Presentation: [.pdf]
[.ppt]
- Hyunsook's Presentation: [.pdf]
|
Jing Chen Liu (Harvard), with Xiao-Li Meng, Michael Ratner, and Irwin Shapiro 08 May 2007 |
- An Exploratory Statistical Study of the Measured Motion
of the Guide Star Used in a New Test of Einstein's Universe
- Abstract:
General relativity predicts that the phenomenon of "frame-dragging"
slowly alters the direction of spin of Earth-orbiting gyroscopes. NASA's
Gravity Probe B satellite measured the orientation of such freely
falling gyroscopes with respect to a guide star. This star's motion with
respect to "fixed points" in the distant universe thus needed to be
measured independently. VLBI measurements spread over 14 years yielded
39 positions of this radio-emitting guide star. We use Bayesian methods
and a linear regression model of the star's motion to study the effects
of various assumptions involving the parameters of both the physical
model and the measurement noise model. We find that a wide range of
different assumptions leads to a narrow range of point estimates and
error distributions.
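- A minimal illustrative sketch of a Bayesian linear fit of position versus
time with known Gaussian measurement errors and flat priors, in the spirit of
the regression model described above (not the Gravity Probe B analysis itself;
the synthetic epochs, positions, and noise level are placeholders):

    # Sketch: with Gaussian noise of known standard deviation and flat priors,
    # the posterior for (intercept, rate) in y = a + b*t is Gaussian with mean
    # equal to the weighted least-squares solution and covariance (X'WX)^{-1}.
    import numpy as np

    def linear_posterior(t, y, sigma):
        X = np.column_stack([np.ones_like(t), t])
        W = np.diag(1.0 / np.asarray(sigma) ** 2)
        cov = np.linalg.inv(X.T @ W @ X)
        mean = cov @ X.T @ W @ y
        return mean, cov

    # Illustrative use: 39 synthetic epochs spread over 14 years.
    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 14.0, 39)
    y = 1.0 + 0.5 * t + rng.normal(0.0, 0.1, t.size)  # positions, arbitrary units
    mean, cov = linear_posterior(t, y, sigma=np.full(t.size, 0.1))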
|
David van Dyk (UC Irvine) 14 May 2007 |
- CHASC meeting
|
John Rice (UC Berkeley) 08 June 2007 Rm 403 @ 60 Oxford |
- Event Weighted Tests for Periodicity in a Sequence of Photon Arrival Times:
Detecting Gamma-ray Pulsars
|