Based on a course in the theory of statistics, this text concentrates on what can be achieved using the likelihood/Fisherian method of taking account of uncertainty when studying a statistical problem. It takes the concept of the likelihood as providing the best methods for unifying the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated by realistic examples, which are not compromised by computational problems. Examples range from a simple comparison of two accident rates to complex studies that require generalised linear or semiparametric modelling. The emphasis is that the likelihood is not simply a device to produce an estimate, but an important tool for modelling. The book generally takes an informal approach, where most important results are established using heuristic arguments and motivated with realistic examples. With the currently available computing power, examples are not contrived to allow a closed analytical solution, and the book can concentrate on the statistical aspects of data modelling. In addition to classical likelihood theory, the book covers many modern topics such as generalized linear models and mixed models, nonparametric smoothing, robustness, the EM algorithm and empirical likelihood.
1: Introduction
2: Elements of likelihood inference
3: More properties of the likelihood
4: Basic models and simple applications
5: Frequentist properties
6: Modelling relationships: regression models
7: Evidence and the likelihood principle
8: Score function and Fisher information
9: Large sample results
10: Dealing with nuisance parameters
11: Complex data structure
12: EM algorithm
13: Robustness of likelihood specification
14: Estimating equation and quasi-likelihood
15: Empirical likelihood
16: Likelihood of random parameters
17: Random and mixed effects models
18: Nonparametric smoothing