The AstroStat Slog » Information
http://hea-www.harvard.edu/AstroStat/slog
Weaving together Astronomy+Statistics+Computer Science+Engineering+Instrumentation, far beyond the growing borders

Wapedia
http://hea-www.harvard.edu/AstroStat/slog/2008/wapedi/
Tue, 09 Dec 2008, by hlee

I do not rely much on my cell phone; it functions as a tool for confronting emergencies. People, on the other hand, seem to do lots of things with their smart phones, so I would like to add one item to your “what I do with my phone” list.

According to Wikipedia, Wapedia brings the contents of Wikipedia to mobile devices like mobile phones and PDAs. I might say that Wapedia is an optimized version of Wikipedia for mobile phones.

It’s hard to imagine life without Wikipedia these days. I use it for getting summaries of stellar clusters (Messier or NGC). I use it for learning theories in astrophysics. I use it for reminding myself of the functional forms of various distributions and their moments. I use it for getting definitions of scientific and engineering jargon. And I use it for many, many other reasons.

Anyone who gets along well with electronic gadgets probably uses Wapedia already, so this may be a cliché. Then, can you help me with something? I would like to hear how educators handle cheating with cell phones, now that one can retrieve almost any information through Wapedia during an exam. I believe some sloggers are in education.

An anecdote on entropy
http://hea-www.harvard.edu/AstroStat/slog/2008/anecdote-entrophy/
Sat, 06 Sep 2008, by hlee

My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.”

When I first learned the laws of thermodynamics[1] and met entropy, I felt exactly what von Neumann said about it: “nobody knows what entropy really is.” Having taken only basic physics courses many years back, I don’t know how the epistemology of this subject has evolved, but it was a very surprising moment when I confronted the book by Cover and Thomas and Shannon’s paper (see my second comment on the slog post Model vs. Model for links to these references). I have been wondering what philosophy lies behind different disciplines (physics, information theory, and statistics, in chronological order) sharing the same lexicon, and this quote somewhat alleviates my burden of curiosity.
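To see why the shared name fits, note that Shannon’s uncertainty measure and the Gibbs entropy of statistical mechanics have the same functional form, differing only by Boltzmann’s constant and the base of the logarithm:

    \begin{align*}
    H(X) &= -\sum_i p_i \log_2 p_i  && \text{(Shannon's uncertainty, in bits)} \\
    S    &= -k_B \sum_i p_i \ln p_i && \text{(Gibbs entropy, in J/K)}
    \end{align*}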

Wikiquote says that Shannon’s words were given in “Energy and Information” by Tribus and McIrvine, Scientific American, Vol. 224, pp. 178–184 (1971). I didn’t know until now that Wikiquote exists.

  1. Wikipedia link
A lecture note of great utility
http://hea-www.harvard.edu/AstroStat/slog/2008/a-lecture-note-of-great-utility/
Wed, 27 Aug 2008, by hlee

I didn’t realize this post had been sitting for a month, during which I almost neglected the slog. Just as there are great books on probability and information theory for statisticians and engineers, I believe there are great statistical physics books for physicists. On the other hand, relatively few exist that introduce one subject to the other audience. In this regard, I thought the lecture note below could be useful.

[arxiv:physics.data-an:0808.0012]
Lectures on Probability, Entropy, and Statistical Physics by Ariel Caticha
Abstract: These lectures deal with the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information. Is there a general method for handling uncertainty? Or, at least, are there rules that could in principle be followed by an ideally rational mind when discussing scientific matters? What makes one statement more plausible than another? How much more plausible? And then, when new information is acquired how do we change our minds? Or, to put it differently, are there rules for learning? Are there rules for processing information that are objective and consistent? Are they unique? And, come to think of it, what, after all, is information? It is clear that data contains or conveys information, but what does this precisely mean? Can information be conveyed in other ways? Is information physical? Can we measure amounts of information? Do we need to? Our goal is to develop the main tools for inductive inference–probability and entropy–from a thoroughly Bayesian point of view and to illustrate their use in physics with examples borrowed from the foundations of classical statistical physics.
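To give a flavor of the entropy-as-inference theme of these lectures, here is a minimal sketch of my own (not code from the lecture note) of Jaynes’ classic dice problem: among all distributions over the faces {1,...,6} with a prescribed mean, the maximum-entropy solution is exponential in the face value, and finding it reduces to root-solving for a single Lagrange multiplier.

    import numpy as np
    from scipy.optimize import brentq

    # Jaynes' dice problem: maximize entropy over faces {1,...,6}
    # subject to a mean constraint.  The MaxEnt solution has the form
    # p_i proportional to exp(lam * i); solve numerically for lam.
    faces = np.arange(1, 7)
    target_mean = 4.5

    def mean_gap(lam):
        # mean of the distribution p_i ~ exp(lam * i), minus the target
        w = np.exp(lam * faces)
        return (faces * w).sum() / w.sum() - target_mean

    lam = brentq(mean_gap, -5.0, 5.0)   # bracket is wide enough for a sign change
    p = np.exp(lam * faces)
    p /= p.sum()
    print("MaxEnt probabilities:", np.round(p, 4))
    print("Shannon entropy (nats):", round(-(p * np.log(p)).sum(), 4))

As a quick sanity check, setting target_mean = 3.5 recovers the uniform distribution (lam = 0).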

On-line Machine Learning Lectures and Notes
http://hea-www.harvard.edu/AstroStat/slog/2008/on-line-machine-learning-lectures/
Thu, 03 Jan 2008, by hlee

I found this website a while ago but hadn’t checked it until now. The lectures are quite useful in their content (the pages of the lecture notes are even flipped for you as the lecture proceeds). With the increasing popularity of machine learning among astronomers, such lectures will find more use. If you have time to learn machine learning and other related subjects, please visit http://videolectures.net/. Links to interesting subjects, classified by topic, are just a click away:

Mathematics:
Mathematics>Operations Research (lectures by Gene Golub, Professor at Stanford, and Lieven Vandenberghe, one of the authors of Convex Optimization, with a link to the PDF of the book)
Mathematics>Statistics (including Peter Bickel, Professor at UC Berkeley).

Computer Science:
Computer Science>Bioinformatics
Computer Science>Data Mining
Computer Science>Data Visualisation
Computer Science>Image Analysis
Computer Science>Information Extraction
Computer Science>Information Retrieval
Computer Science>Machine Learning
Computer Science>Machine Learning>Bayesian Learning
Computer Science>Machine Learning>Clustering
Computer Science>Machine Learning>Neural Networks
Computer Science>Machine Learning>Pattern Recognition
Computer Science>Machine Learning>Principal Component Analysis
Computer Science>Machine Learning>Semi-supervised Learning
Computer Science>Machine Learning>Statistical Learning
Computer Science>Machine Learning>Unsupervised Learning

Physics:
Physics (You’ll see Randall Smith)

[In the near future, selected lectures with summary notes may be suggested; in the meantime, your recommendations are most welcome.]

[ArXiv] 2nd week, Oct. 2007
http://hea-www.harvard.edu/AstroStat/slog/2007/arxiv-2nd-week-oct-2007/
Fri, 12 Oct 2007, by hlee

Frankly, there were no astrostatistically interesting papers on astro-ph this week, but profitable papers from the statistics side were posted.

In terms of detecting sources, my personal view is that the topics of test size and power, false discovery rates, and multiple testing call for astronomers’ attention. To my understanding, the power of observations has increased, but the source detection methodology still cites tools from two to three decades ago, when assuming normality, i.i.d. sampling, or homogeneity of data and instruments was acceptable.
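As a concrete illustration of the multiple-testing point, here is a minimal sketch of my own (with made-up toy data, not tied to any particular detection pipeline) of the Benjamini-Hochberg step-up procedure, which controls the false discovery rate when each source candidate yields a p-value:

    import numpy as np

    def benjamini_hochberg(pvals, alpha=0.05):
        """Benjamini-Hochberg step-up procedure.

        Returns a boolean mask of rejected hypotheses, controlling the
        false discovery rate at level alpha (for independent tests).
        """
        p = np.asarray(pvals)
        m = p.size
        order = np.argsort(p)                     # ranks of the p-values
        thresholds = alpha * np.arange(1, m + 1) / m
        below = p[order] <= thresholds            # p_(k) <= k * alpha / m ?
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = np.nonzero(below)[0].max()        # largest rank passing the test
            reject[order[:k + 1]] = True          # reject all candidates up to rank k
        return reject

    # Toy data: 95 "background" candidates with uniform p-values and
    # 5 "sources" with very small p-values.
    rng = np.random.default_rng(0)
    pvals = np.concatenate([rng.uniform(size=95), rng.uniform(0, 1e-4, size=5)])
    print(benjamini_hochberg(pvals).sum(), "detections at FDR level 0.05")

Unlike a Bonferroni correction, which guards against even one false detection and loses power as the number of candidates grows, this procedure only bounds the expected fraction of spurious sources among the detections.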
