Presentations 

Workshop 20 Sep 2017 10am-4pm EDT Phillips Auditorium at CfA 
 AstroStat Day
 10:00am - 12:10pm : Siemiginowska, Vikhlinin, Finkbeiner, Portillo, Daylan, Speagle, B. Johnson
 12:30pm - 1:30pm : Reeves, Winter
(SolarStat, in conjunction with HEAD Lunch Talks)
 2:00pm - 4:00pm : Grindlay, M. Johnson, Blackburn, Bouman, Avelino, Zucker
 4:00pm - 5:00pm : Discussion


Josh Speagle (CfA) 26 Sep 2017 1:07pm EDT Pratt (PerkinG, CfA) 
 Dynamic Nested Sampling
 Nested Sampling is a relatively new method for estimating the Bayesian evidence (with the posterior estimated as a byproduct) that integrates over the posterior by sampling in nested "shells" of constant likelihood. Its ability to sample from complex, multimodal distributions in a flexible yet efficient way, combined with several available sampling packages, has contributed to its growing popularity in (astro)physics. In this talk I will outline the basic motivation and theory behind Nested Sampling, derive various statistical properties associated with the method, and discuss how it is applied in practice. I will then talk about how the overall framework can be extended in Dynamic Nested Sampling to accommodate adding samples "dynamically" during the course of a run. These samples can be allocated to maximize arbitrary objective functions, allowing Dynamic Nested Sampling to function as a posterior-oriented sampling method such as MCMC but with the added benefit of well-defined stopping criteria. I will end by applying Dynamic Nested Sampling to a variety of synthetic and real-world problems using an open-source Python package I've been developing (dynesty).
 Presentation slides [.pdf]
 See also: MultiNest ; PolyChord [.url]
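The shell-based evidence integral described above can be sketched in a few lines of NumPy. This is a toy illustration only, not dynesty: the Gaussian likelihood, uniform prior, and all tuning numbers are invented, and real samplers replace the naive rejection step with smarter likelihood-constrained proposals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: Gaussian likelihood (sigma = 0.3) under a U(-1, 1) prior,
# so the evidence Z = (1/2) * integral of N(0, 0.3^2) over [-1, 1] ~ 0.5.
def loglike(theta):
    return -0.5 * (theta / 0.3) ** 2 - 0.5 * np.log(2 * np.pi * 0.3 ** 2)

nlive = 200
live = rng.uniform(-1.0, 1.0, nlive)         # live points drawn from the prior
live_logl = loglike(live)

logz, logx = -np.inf, 0.0                    # running evidence, log prior volume
for it in range(1000):
    worst = int(np.argmin(live_logl))
    logx_new = -(it + 1) / nlive             # expected shrinkage per iteration
    logw = np.log(np.exp(logx) - np.exp(logx_new))       # width of this "shell"
    logz = np.logaddexp(logz, live_logl[worst] + logw)
    logx = logx_new
    # Replace the worst point with a prior draw above its likelihood threshold
    # (naive rejection sampling; real samplers use smarter proposals).
    while True:
        cand = rng.uniform(-1.0, 1.0)
        if loglike(cand) > live_logl[worst]:
            live[worst], live_logl[worst] = cand, loglike(cand)
            break

# Add the contribution of the final live points, then report Z.
logz = np.logaddexp(logz, logx + np.log(np.mean(np.exp(live_logl))))
z = np.exp(logz)
```

The dynamic variant discussed in the talk then adds further samples wherever a chosen objective function (e.g. posterior mass) calls for them, rather than fixing the number of live points in advance.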


Gabriel Collin (MIT) 21 Nov 2017 1:07pm EST Pratt (PerkinG, CfA) 
 Searching for the origin of astrophysical neutrinos using a non-Poissonian statistical method
 Abstract: The IceCube neutrino observatory was designed to detect astrophysical neutrinos, which originate from outside of our solar system. IceCube has detected candidate astrophysical events, and measured a diffuse flux, but the source of these neutrinos so far remains unknown. Current approaches look for "hot spots" of neutrino events in the sky. It is also possible to describe a population of sources in terms of the number of observed events, forming a non-Poissonian statistical distribution. This distribution was used to show that the excess of gamma rays measured by Fermi-LAT around the Galactic center was likely due to point sources rather than decaying dark matter. In this talk, I will present the application of this statistical method to the search for point sources in IceCube.
 Evidence for Unresolved Gamma-Ray Point Sources in the Inner Galaxy, Lee et al. arXiv:1506.05124 [.url]
 NPTFit github.com/bsafdi/NPTFit [.url]
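A quick way to see why a population of unresolved point sources is non-Poissonian: if the per-pixel rate is itself random (rare bright sources), the counts follow a mixed Poisson distribution and are over-dispersed. The sketch below is illustrative only (all rates and source numbers are invented) and is not the generating-function machinery NPTFit actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)
npix = 10000

# Smooth (diffuse) emission: the same mean everywhere -> Poisson counts.
diffuse = rng.poisson(5.0, npix)

# Point-source population: rare bright sources scattered over pixels.
# The average per-pixel mean is still 5, but it is concentrated in sources.
nsrc = rng.poisson(0.05, npix)       # ~1 source per 20 pixels
ps = rng.poisson(nsrc * 100.0)       # each source contributes ~100 counts

# Same mean, very different variance: the point-source map is
# over-dispersed, which is what the non-Poissonian template fit exploits.
```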


Katy McKeough & Shihao Yang (Harvard) 28 Nov 2017 1:07pm EST SciCen 706 
 Defining regions that contain X-ray jets in high-redshift quasars
 Abstract:
Using only the X-ray observation of a quasar and a jet, we are interested in creating an outline around an extended source (the jet). Astronomers are interested in delineating jets from their quasar source and from background radiation. This is particularly difficult in X-ray images of high-redshift jets, where there are a limited number of pixel counts. McKeough et al. 2016 and Stein et al. 2015 propose a method where jets are detected using previously defined regions of interest (ROI). However, we do not always have supplementary information with which to predetermine these ROI, and their size and shape can greatly affect flux/luminosity measurements and the power of detection. Low Count Image Reconstruction and Analysis (LIRA) has been tremendously successful in analyzing low-count images and extracting structure smeared out by the PSF. However, the intensities derived using it are pixelated; that is, LIRA is unaware of correlations that may exist between adjacent pixels in the real image. In order to group pixels of a similar nature, we impose a successor, or post-, model on the output of LIRA. We adopt the Ising model, which has been used extensively in condensed matter physics to model electron spin states, as a prior on assigning the pixels to either the background or the ROI.
 Presentation slides [.pdf]
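A minimal sketch of the Ising-prior idea (not the authors' actual LIRA post-model): binary background/ROI labels are updated by Gibbs sampling, combining each pixel's Poisson likelihood ratio with agreement among its four neighbours. The image, the rates, and the coupling beta are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy image: a bright square "jet" (ROI) on a faint Poisson background.
n = 20
truth = np.zeros((n, n))
truth[6:14, 6:14] = 1.0
mu0, mu1 = 0.5, 3.0                      # background / ROI Poisson rates
counts = rng.poisson(mu0 + (mu1 - mu0) * truth)

# Gibbs sampling for binary labels z under an Ising prior:
# p(z) ~ exp(beta * number of agreeing neighbour pairs).
beta = 1.0
z = (counts > 1).astype(int)             # rough initial labelling
# Poisson log likelihood ratio log[P(k|mu1) / P(k|mu0)] per pixel
loglr = counts * np.log(mu1 / mu0) - (mu1 - mu0)

for sweep in range(30):
    for i in range(n):
        for j in range(n):
            nb = [z[x, y] for x, y in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                  if 0 <= x < n and 0 <= y < n]
            # log odds for z_ij = 1: likelihood ratio + neighbour agreement
            logit = loglr[i, j] + beta * (2 * sum(nb) - len(nb))
            z[i, j] = int(rng.random() < 1.0 / (1.0 + np.exp(-logit)))

# z now outlines the ROI: interior pixels label 1, background pixels 0.
```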


Katy McKeough & Luis Campos (Harvard) 12 Dec 2017 12:37pm EST CfA Library 
 Ask A Statistician: An opportunity for astronomers at the CfA to ask statistics questions of statisticians; from the mundane to the philosophical, bring your statistics problems to be discussed by the panel
 We will go through several applications of statistics in astronomy. Each application will serve as the backdrop for discussing a different statistical technique. We will suggest partial solutions or new directions for each of these proposed issues, which we hope will stimulate further questions and discussion.
The examples are:
 Propagating asymmetrical error bars via parametric bootstrap.
 Correlation between two time series observations.
 Using external information as a prior in Bayesian inference.
 Explanation of shrinkage.
 Detection significance with multiple hypothesis testing.
 Presentation Slides [url]
 James-Stein Estimator R model [.rmd]
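For the first example, a parametric bootstrap for asymmetric error bars can be sketched as follows: model the quoted +/- errors with a split-normal distribution, draw many samples, push them through the function of interest, and read off quantiles. The measurement, the split-normal choice, and the function f(x) = x^2 below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy measurement with asymmetric errors: x = 10 (+2.0 / -1.0).
def draw_split_normal(center, sig_lo, sig_hi, size, rng):
    # Draw from N(center, sig_hi) above center and N(center, sig_lo) below,
    # choosing the side in proportion to its width (continuous density).
    side = rng.random(size) < sig_hi / (sig_lo + sig_hi)
    mag = np.abs(rng.standard_normal(size))
    return np.where(side, center + sig_hi * mag, center - sig_lo * mag)

x = draw_split_normal(10.0, 1.0, 2.0, 100000, rng)
y = x ** 2                                    # propagate through f(x) = x**2
lo, med, hi = np.percentile(y, [16, 50, 84])
# Report f(x) with asymmetric error bars from the propagated quantiles.
print(f"y = {med:.0f} (+{hi - med:.0f} / -{med - lo:.0f})")
```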


Michelle Ntampaka (DSI/CfA) 23 Jan 2018 1:07pm EST SciCen 706 
 Constraining Sigma_8 and Omega_M with the Velocity Distribution Function
 Abstract: I will present the Velocity Distribution Function (VDF), a new approach for quantifying the abundance of galaxy clusters and constraining cosmological parameters using dynamical measurements. In this new method, the probability distributions of velocities for each cluster in the sample are summed to create a new test statistic, which can be measured more directly and precisely than the more standard halo mass function, and which can be robustly predicted with cosmological simulations that capture the dynamics of subhalos or galaxies. I will present preliminary constraints on sigma_8 and Omega_M from spectroscopic observations of the HeCS-SZ clusters.
 Presentation slides [.pdf]


Herman Marshall (MIT) 06 Feb 2018 1:07pm EST CGIS South S153 
 Computational Challenges from Imaging X-ray Polarimetry
 Abstract: I will provide an overview of an approved NASA astrophysics mission called IXPE, the Imaging X-ray Polarimetry Explorer, scheduled for launch in 2021. IXPE will obtain X-ray polarization measurements for a wide variety of astrophysical objects. While many targets will be point-like, such as most active galaxies, X-ray binaries in the Galaxy, and isolated neutron stars, others will be resolved, such as scattering clouds, supernova remnants, and pulsar wind nebulae. I will outline three interesting problems where advanced statistical and numerical methods may be beneficial. The first problem relates to how X-ray events are measured to yield polarization information; we have begun a project to apply machine learning techniques to improve the result. The second problem involves the multidimensionality of the data, with each event carrying time, energy, sky position, and direction information, so Bayesian methods may help in the analysis by bringing in external data as priors. The third problem is devising efficient ways to test models against the multidimensional event list.
 Presentation slides [.pdf]
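For the first problem, the quantity recovered from the measured event angles is a modulation curve, f(phi) proportional to 1 + mu*p*cos(2*(phi - phi0)), with modulation factor mu, polarization fraction p, and polarization angle phi0. A toy sketch using Stokes-style estimators (all parameters invented; this is not IXPE's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(5)

mu, p, phi0 = 0.3, 0.5, np.pi / 6        # invented instrument/source values

# Draw event angles from the modulation curve by rejection sampling.
n = 200000
phi = rng.uniform(0.0, 2 * np.pi, n)
u = rng.uniform(0.0, 1.0, n)
accept = u < (1 + mu * p * np.cos(2 * (phi - phi0))) / (1 + mu * p)
phi = phi[accept]

# Stokes-style estimators recover p and phi0 from the event angles:
# E[2 cos 2phi] = mu p cos 2phi0 and E[2 sin 2phi] = mu p sin 2phi0.
Q = 2 * np.mean(np.cos(2 * phi))
U = 2 * np.mean(np.sin(2 * phi))
p_hat = np.hypot(Q, U) / mu
phi0_hat = 0.5 * np.arctan2(U, Q)
```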


Daniela Huppenkothen (University of Washington) 06 Mar 2018 1:07pm EST M340, 160 Concord, CfA 
 Fun Statistics with Fourier Spectra
 Abstract: In recent years, the cross spectrum has received considerable attention as a means of characterising the variability of astronomical sources as a function of wavelength. While much has been written about the statistics of time and phase lags, the cospectrum (the real part of the cross spectrum) has only recently been understood as a means of mitigating instrumental effects that depend on temporal frequency in astronomical detectors, as well as a method of characterizing the coherent variability in two wavelength ranges on different time scales. In this talk, I will present research that started as a small exercise to answer what seemed like a simple question (what is the statistical distribution of the cospectrum?) and instead took us on an adventure through 80 years of statistical research.
I will show recent advances made in understanding the statistical properties of cospectra, leading to much improved inferences for periodic and quasi-periodic signals. I will also present a new method to reliably mitigate instrumental effects such as dead time in X-ray detectors, and show how we can use the cospectrum to model highly variable sources such as X-ray binaries or Active Galactic Nuclei.
 Presentation slides: [.pdf]
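Computing a cospectrum from two simultaneous light curves takes only a few lines. In this toy sketch (invented data), the two detectors share a 5 Hz signal but have independent noise, which averages out of the cospectrum while the coherent signal remains:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two simultaneous light curves of the same source from two detectors:
# a shared 5 Hz sinusoid plus independent (e.g. instrumental) noise.
n, dt = 4096, 1.0 / 512
t = np.arange(n) * dt
signal = np.sin(2 * np.pi * 5.0 * t)
lc1 = signal + rng.standard_normal(n)
lc2 = signal + rng.standard_normal(n)

f1, f2 = np.fft.rfft(lc1), np.fft.rfft(lc2)
freqs = np.fft.rfftfreq(n, dt)

cospectrum = (f1 * np.conj(f2)).real        # real part of the cross spectrum
peak = freqs[np.argmax(cospectrum[1:]) + 1] # frequency of the shared signal
```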

Yu Xixi (Imperial) 3 Apr 2018 1:07pm EDT SciCen 706 
 Statistical methods in Solar Spectral Analyses with Uncertain Atomic Physical Models
 Abstract: Information about the physical properties of astrophysical objects cannot be measured directly but is inferred by interpreting spectroscopic observations in the context of atomic physics calculations. A critical component of this analysis is understanding how uncertainties in the underlying atomic physics propagate to uncertainties in the inferred plasma parameters. We previously introduced a fully Bayesian method to address this problem; in that model setting, we allowed the observed data to update the atomic data uncertainties, with only 1000 equally likely emissivity curves available a priori. Following that work, we now assume that these emissivity curve samples come from a high-dimensional distribution, which we summarize with a Normal distribution via principal component analysis (PCA) to efficiently represent the uncertainty in the emissivity curves.
In the first model, fit with Hamiltonian Monte Carlo (HMC) via Stan, the posterior distribution has one mode per emissivity curve with nothing in between, and owing to the insufficient number of emissivity curves we could not correctly estimate the relative sizes of these modes. We proposed a computational trick: adding a few linear combinations of the selected emissivity curves to the original set. These linear combinations act as a bridge that lets the sampler jump between the modes and measure their relative sizes accurately. However, we cannot rely completely on the linear combinations, since the true emissivity curves may lie away from all of them and we do not really know where the mass of their distribution is. In the second model, PCA aggregates all these possibilities and better represents the full ensemble of emissivity curves; HMC via Stan can then be used to sample the corresponding posterior distribution.
 Presentation slides [.pdf]
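The PCA summarization step can be sketched with mock curves (illustrative only; the ensemble below is invented, not real emissivity data): center the ensemble, take an SVD, and keep the leading components, whose score variances define the Normal approximation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Mock ensemble: 1000 "emissivity curves" on a 50-point grid, generated
# from a smooth mean plus two random smooth perturbation modes.
grid = np.linspace(0.0, 1.0, 50)
mean_curve = np.exp(-(grid - 0.5) ** 2 / 0.02)
modes = np.vstack([np.sin(np.pi * grid), np.cos(np.pi * grid)])
coeffs = rng.standard_normal((1000, 2)) * [0.2, 0.05]
curves = mean_curve + coeffs @ modes

# PCA via SVD of the centered ensemble.
centered = curves - curves.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
var = s ** 2 / (len(curves) - 1)
explained = var / var.sum()
k = np.searchsorted(np.cumsum(explained), 0.99) + 1   # components for 99% variance
# New curves can then be drawn as mean + sum_j N(0, var_j) * Vt[j],
# a Normal summary of the full ensemble.
```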


Rosanne DiStefano (CfA), Jennifer Yee (CfA), and Hyungsuk Tak (SAMSI) 17 Apr 2018 1:07pm EDT SciCen 706 
 [RDS] On Microlensing
 [JY] The Microlensing Challenge
 [HT] Two data analytic challenges of gravitational lensing: From Micro to Macro
 Abstract:
Light emitted by distant objects, ranging from stars to quasars, is deflected when it moves through the gravitational field of an intervening object, such as a planet or a galaxy. Depending on the strength of the gravitational field, we observe various phenomena in the sky, e.g., microlensing and macrolensing. Astronomers have been interested in this gravitational deflection of light for various purposes, e.g., using microlensing as a tool to search for exoplanets (planets outside our solar system) and using macrolensing to probe the current expansion rate of the Universe (the Hubble constant). Two data analytic blind competitions are currently ongoing in an effort to improve existing computational and statistical tools and to encourage the development of new methods. We introduce these data analytic challenges, describing the basics of microlensing and macrolensing, their data types (time series and image data), and challenging issues in analyzing these data.
 Data Challenge [url]
 Presentation slides:
Rosanne DiStefano  [.pptx]
Jennifer Yee  [.pdf] ; [.pptx]
Hyungsuk Tak  [.pdf] ; [video1.mp4] ; [video2.mp4]
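On the microlensing side, a single-lens light curve follows the standard point-source point-lens ("Paczynski") magnification formula, A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)), where u is the lens-source separation in Einstein radii. A short sketch (the event parameters t0, tE, and u0 below are invented):

```python
import numpy as np

# Point-source point-lens microlensing magnification.
def magnification(t, t0, tE, u0):
    # Separation in Einstein radii for a straight-line source trajectory
    # with impact parameter u0, peak time t0, and Einstein timescale tE.
    u = np.hypot(u0, (t - t0) / tE)
    return (u ** 2 + 2) / (u * np.sqrt(u ** 2 + 4))

t = np.linspace(-50.0, 50.0, 1001)      # days
A = magnification(t, t0=0.0, tE=20.0, u0=0.1)
# A peaks at closest approach (t = t0) and tends to 1 in the wings.
```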


Arturo Avelino (CfA) 24 Apr 2018 1:15pm EDT Phillips Auditorium, CfA 
 Near-infrared Type Ia Supernovae as standard candles
 Abstract:
Explosions of Type Ia supernovae (SNe Ia) observed at near-infrared (NIR) wavelengths are very good "standard candles" for measuring relative distances in the Universe. The investigation of NIR SNe Ia as standard candles is crucial for reconstructing the expansion history of the Universe and determining its properties more accurately than with traditional methods based on optical data.
In this talk I will describe how to estimate the relative distances (the distance moduli) of NIR SNe Ia using a sample of photometric time-series observations in the NIR, Gaussian-process regression, and a simple hierarchical Bayesian model. At the end, I will use the estimated distance moduli to quantify how good the NIR SNe Ia are as standard candles compared with optical SNe Ia observations.
 Presentation slides [.pdf]
 YouTube Video [url]
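The Gaussian-process regression step can be sketched with toy data (the kernel choice, its hyperparameters, and the light-curve points below are all invented, not the talk's actual sample): a squared-exponential GP interpolates the sparsely sampled light curve and supplies uncertainties that grow away from the observations.

```python
import numpy as np

# GP regression with a squared-exponential kernel, used to interpolate
# a sparsely sampled (toy) supernova light curve.
def sq_exp(x1, x2, amp=1.0, scale=5.0):
    return amp ** 2 * np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / scale ** 2)

t_obs = np.array([-5.0, 0.0, 4.0, 12.0, 25.0])   # days from peak brightness
mag = np.array([18.6, 18.3, 18.5, 19.1, 19.9])   # apparent magnitude
err = 0.05                                        # photometric uncertainty

t_new = np.linspace(-10.0, 30.0, 200)
K = sq_exp(t_obs, t_obs) + err ** 2 * np.eye(len(t_obs))
Ks = sq_exp(t_new, t_obs)
mean0 = mag.mean()
pred = mean0 + Ks @ np.linalg.solve(K, mag - mean0)           # posterior mean
cov = sq_exp(t_new, t_new) - Ks @ np.linalg.solve(K, Ks.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))               # posterior std
```

In the talk's setting, quantities such as the peak NIR magnitude read off the GP fit would then feed into the hierarchical Bayesian model for the distance moduli.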





