Study Guide for Ph.D. Qualifying Exam
Statistics Portion of Qualifying Exam (Not in order of presentation)
Winter 2014-2015
---------------------------------------------------------------------------

1. Random elements and variables defined on an underlying probability space
   a. measurable functions of random variables
   b. expectation
   c. variance/covariance
   d. higher moments
   e. moments and cumulants

2. Probability inequalities and techniques - know the ones used to prove the WLLN.

3. Distributions - know the basic distributions by heart
   a. Univariate distributions
   b. Multivariate normal, multinomial, ...
   c. Exponential class - important properties (sufficient statistics, completeness, MLR, properties of the scaling function, etc.)
   d. Transform domains of density functions: MGF, c.f., CGF, PGF, etc. NOTE that any use of an MGF requires an explicit statement about the region of convergence of the transform. Failure to do this will result in points deducted.

4. Transformations of random variables - discrete and continuous, bijective and non-bijective, univariate and multivariate.

5. Understand the different modes of convergence: a.s., Lp mean, complete, in probability, in distribution (law).

6. Laws of Large Numbers (weak, strong, and uniform; certainly the differences among the three).

7. Importance of concepts such as independence, unbiasedness, consistency, efficiency, asymptotic efficiency, sufficiency, [minimal] sufficiency, [maximal] ancillarity, and completeness.

8. Order statistics and their distributions - you might limit yourself to the min and max, since these are easy to derive from scratch if you forget them; in that case, it is also a good idea to recall their means.

9. Regularity conditions - see the handy cheatsheets at http://tinyurl.com/7a6e6a3. Needed for everything below and much of the material above.

10. Estimation
    a. Properties of a random sample
    b. Maximum likelihood, method of moments, basic Bayesian approach
    c. Score function, expected and observed Fisher information
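As a quick self-check for the order-statistics item (8), the sketch below verifies by Monte Carlo that for n i.i.d. Uniform(0,1) draws, E[min] = 1/(n+1) and E[max] = n/(n+1). The sample size, replication count, seed, and function name are illustrative choices, not part of the guide.

```python
# Monte Carlo check of the means of the min and max order statistics
# for n i.i.d. Uniform(0,1) variables: E[min] = 1/(n+1), E[max] = n/(n+1).
import random

def minmax_means(n, reps, seed=0):
    """Average the sample min and max over many replications."""
    rng = random.Random(seed)
    tot_min = tot_max = 0.0
    for _ in range(reps):
        xs = [rng.random() for _ in range(n)]
        tot_min += min(xs)
        tot_max += max(xs)
    return tot_min / reps, tot_max / reps

m_min, m_max = minmax_means(n=5, reps=20000)
# Theory for n = 5: E[min] = 1/6, E[max] = 5/6
```

Deriving these from the CDFs of the min and max (F_min = 1 - (1-F)^n, F_max = F^n) is exactly the "from scratch" exercise item 8 recommends.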
    d. Sufficient statistics, MSS, complete SS
    e. Rao-Blackwellization

11. Basic asymptotic techniques
    a. Continuous mapping theorem
    b. Component extension theorems (state, do not prove), i.e., convergence in R^1 lets you conclude convergence in R^d
    c. Slutsky theorems
    d. CLT: IID, independent, and partial-sum versions (and certainly know how to spell Lindeberg, Lévy, and Lyapunov)
    e. Delta method(s)
    f. Asymptotics of the MLE and MOM estimators
    g. Asymptotic hypothesis tests
    h. Asymptotic confidence intervals
    i. Accuracy of the normal approximation (Berry-Esseen theorem) and asymptotic expansions (e.g., 2nd-order Hermite expansions, etc.)

12. Hypothesis testing - power of a test
    a. Neyman-Pearson Lemma (simple hypotheses) and extensions
    b. Likelihood ratio tests
    c. Other techniques
    d. UMP tests
    e. Karlin-Rubin and MLR theorems
    f. Asymptotic techniques

13. Confidence sets
    a. Inverting a hypothesis test
    b. Pivotal quantities
    c. Asymptotic techniques

14. You should be familiar with the proofs of the key theorems. NOTE: This is one reason we have provided the regularity conditions above (item 9). Most of these you should be able to handle in R^d, although some are more easily done in R^1 (e.g., the IID CLT). Remember, most are Taylor-series expansions in some form or other, except the Berry-Esseen theorem and asymptotic expansions, which are based on inversion of characteristic functions and were beyond the scope of our course.
    o WLLN
    o Delta method (1st order)
    o Cramér-Rao variance bound
    o IID central limit theorem
    o Neyman-Pearson Lemma
    o Asymptotic efficiency of the MLE

15. Bootstrapping
    o Parametric
    o Non-parametric

16. Linear regression
    o Base assumptions required for LS fits
    o Fitting parameters for single-dimension X or k-dimension X
    o Distributional assumptions required for inference
      - How least squares parameter estimation is maximum likelihood
      - Test statistics with normal, uncorrelated errors

If you have any questions, please contact J.A. Dobelman at x5681, or dobelman@stat.rice.edu

=============================
These might also be on the Exam:
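For the non-parametric bootstrap (item 15), a minimal sketch: resample the data with replacement, recompute the statistic on each resample, and take the empirical standard deviation of the replicates as a standard-error estimate. The data values, B, and seed below are illustrative.

```python
# Non-parametric bootstrap standard error of a statistic.
import random
import statistics

def bootstrap_se(data, stat, B=2000, seed=0):
    """Bootstrap SE: stdev of the statistic over B resamples."""
    rng = random.Random(seed)
    n = len(data)
    reps = [stat([rng.choice(data) for _ in range(n)]) for _ in range(B)]
    return statistics.stdev(reps)

data = [2.1, 3.4, 1.7, 4.0, 2.8, 3.1, 2.5, 3.9, 1.9, 3.3]
se_hat = bootstrap_se(data, statistics.mean)
# For the mean, se_hat should be close to the analytic SE, s/sqrt(n).
```

The parametric version differs only in the resampling step: draw from the fitted parametric model instead of from the empirical distribution.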
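For single-predictor least squares (item 16), the closed-form estimates are b_hat = Sxy/Sxx and a_hat = ybar - b_hat*xbar; a minimal sketch with made-up data:

```python
# Least-squares fit of y = a + b*x + error for one predictor.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
Sxx = sum((x - xbar) ** 2 for x in xs)
Sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
b_hat = Sxy / Sxx            # slope estimate
a_hat = ybar - b_hat * xbar  # intercept estimate
```

Under the normal, uncorrelated-error model these LS estimates coincide with the maximum-likelihood estimates, which is the connection the last item asks you to know.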
8. Finite Markov chains - probably will not be covered on the Exam
   1. irreducibility, aperiodicity
   2. ergodicity, stationary distribution
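For the Markov-chain concepts above, a minimal sketch (the two-state transition matrix is illustrative): for an irreducible, aperiodic finite chain, iterating pi <- pi P from any starting distribution converges to the stationary distribution, i.e., the pi solving pi P = pi.

```python
# Power iteration to the stationary distribution of a finite chain.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(pi, P):
    """One step of the chain: returns pi P."""
    k = len(P)
    return [sum(pi[i] * P[i][j] for i in range(k)) for j in range(k)]

pi = [1.0, 0.0]          # arbitrary starting distribution
for _ in range(200):
    pi = step(pi, P)
# Balance equation 0.1*pi0 = 0.5*pi1 gives pi = (5/6, 1/6).
```

Convergence regardless of the start is exactly the ergodicity property listed in sub-item 2; it fails without irreducibility and aperiodicity.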