AST 475/875  Astronomical Observations
Clemson University, Fall 2004

Course Info
Syllabus:  PDF  PS

Homework
HW #1   Due Th 8/26     PDF  PS
HW #2   Due T  8/31     PDF  PS
HW #3   Due F  9/03     PDF  PS
HW #4   Due F  9/17     PDF  PS
HW #5   Due Th 9/23     PDF  PS
HW #6   Due T  9/28     PDF  PS
HW #7   Due Th 10/21    PDF  PS
HW #8   Due T  11/16    PDF  PS

Exercises
Exercise #1     (due T 09/07)    PDF  PS
Exercise #2     (due F 09/10)    PDF  PS
Exercise #3     (due F 09/17)    PDF  PS   radioflat.dat  radiosteep.dat
Exercise #4     (due F 10/15)    PDF  PS
Exercise #5                      PDF  PS
Exercise #6&7                    PDF  PS

Readings
For Th 8/26, give a qualitative, non-detailed read to the 1990 paper on linear regression in astronomy by Isobe et al. (a PDF version is linked below).  As usual, the fine mathematical details are not particularly important.  I want you coming away with the general idea behind ordinary least squares, a sense of when the approach is valid, and an appreciation that this is a muddier area than you probably realized, one that requires some thought before you use it.
PDF file
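
A quick, optional illustration (my own sketch, not part of the reading, using Python/NumPy and made-up data): ordinary least squares of y on x and of x on y give different best-fit slopes for the same scattered points, which is exactly the sort of ambiguity that makes the choice of regression method worth some thought.

    import numpy as np

    # Made-up data: a linear relation with scatter added in y.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 50)
    y = 2.0 * x + 1.0 + rng.normal(scale=3.0, size=x.size)

    # OLS(Y|X): minimize the y residuals.  polyfit returns [slope, intercept].
    slope_yx, _ = np.polyfit(x, y, 1)

    # OLS(X|Y): minimize the x residuals, then invert to express it as a slope in y vs. x.
    slope_xy, _ = np.polyfit(y, x, 1)
    slope_xy = 1.0 / slope_xy

    print("OLS(Y|X) slope:", slope_yx)   # close to the input slope of 2
    print("OLS(X|Y) slope:", slope_xy)   # noticeably different when the scatter is large

    # Which slope (or which compromise between them) is appropriate depends on
    # the scientific question; deciding that is the point of the paper.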

For T 8/24, read J. Patrick Harrington's (U MD) basic statistics primer/review.  This will either remind you of, or introduce you to, the key probability distributions in astronomy and some of the salient results.  Note in particular the use of moment equations to define the mean and variance.  As we've said, moments are used in other areas of physics (especially statistical physics), and we may see more of them when we take moments of the radiative transfer equation in a future stellar atmospheres course.  I also want to be sure you don't drift off when reading the section on least squares fitting.  First, note that while the math and derivations look downright ugly at a quick glance, I think even a small amount of attention makes them easy to follow.  Second, note that this is a general technique: you could apply a "least squares" approach to fitting any function.  Third, note that this is a so-called "maximum likelihood estimate" of the best-fit line.  For data with Gaussian errors, maximizing the likelihood is equivalent to minimizing the sum of the squares of the residuals (between the data y values and the fitted value at each x) with respect to the relevant parameters of the fitting function (for a line, the slope and zero-point).
PDF file   PS file 
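
As a concrete anchor for the least-squares discussion (my own summary of the standard straight-line result; the primer's notation may differ), minimizing the sum of squared residuals

    S(a,b) = \sum_{i=1}^{N} \left[ y_i - (a + b x_i) \right]^2

with respect to the zero-point a and slope b (set \partial S/\partial a = \partial S/\partial b = 0) gives

    b = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
    \qquad
    a = \bar{y} - b\,\bar{x}.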

For the first class, Th 8/19, read sections 1 and 2 of this introductory review of Bayesian inference by Tom Loredo.  Your focus should be on a (perhaps initially nebulous) qualitative understanding of Bayes' theorem and how Bayesian inference differs from the "frequentist" approach.  You are encouraged to look at some of the sample applications in the later sections of the article (they might be useful for homework), particularly the subtle art of establishing prior probabilities, but we will try to establish "ok, so how do I actually use this?" in class.
PDF file    PS file
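
For reference while reading (this is just the standard statement, written in my own notation rather than Loredo's): Bayes' theorem for a hypothesis H given data D is

    P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)},

where P(H) is the prior, P(D \mid H) the likelihood, and P(H \mid D) the posterior.  The key contrast with the frequentist approach is that here probabilities are assigned to hypotheses themselves, not only to repeatable data outcomes.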