STA422/2162 - Theory of Statistical Inference (2026)
Announcements
- Jan. 5 - Details on organizing a Recognized Study Group for the course can be found at:
https://sidneysmithcommons.artsci.utoronto.ca/recognized-study-groups/
Students may find this helpful in studying the course material.
- Jan. 15 - I've renumbered my lectures to make my numbering a bit more logical. So far we have covered Lecture I and Lecture II.1.
- Jan. 19 - I've been asked to set up a Piazza group for the course, so I have done this, and you should receive an invitation to join. I don't have much experience with this, so let me know if there is anything else I should do.
- Jan. 19 Tentative Schedule for Course (to be discussed in class)
- Feb. 12 - Midterm 1 - first hour of class
- Feb. 17-20 Reading Week (no class)
- March 26 - Midterm 2 - first 1.5 hours of class
- April 2 - last class
- April 20 - project handed in via email (not sure but regulations may force an earlier date)
- Jan. 19 Project details
For this, pick a topic that relates to something that has been discussed in the course (fairly broadly interpreted) and get it approved by me (email is fine). Look for some published resources on the topic. Summarize what you have learned, mostly the pros and cons that have been raised, with some critical commentary by you. You don't have to take a position in favor of or against, but you are welcome to do so, provided you present a good argument for the position you take. The final paper should be about 10-12 pages using something like LaTeX (12 pt). Here are some possible topics, but you are free to choose others.
- the likelihood ordering as a necessary part of any valid inference
- all inference methodology must be frequentist
- p-values, Bayes factors as measures of evidence
- all inference methodology must be Bayesian
- the concept of utility (loss) and its relevance to statistics
- evidence-based versus behavioristic-based theory
- fiducial inference
- improper priors as objective inference
- the use of default priors
- machine learning renders a theory of statistical inference irrelevant
- Jan. 30 - Definition III.5.2 of a maximal ancillary is correct (I thought I had written it down wrong). A maximal ancillary induces a partition such that no subpartition is ancillary.
- Jan. 30 - I updated Lecture III.4 and simplified Example III.4.2.
- Feb. 1 - I've added Lectures III.6 and III.7 and there are videos to accompany them. See these lectures below for the links. These contain the material I would have lectured on Feb. 5. The midterm is on Feb. 12 and it will cover all the material up to the end of Lecture III.7. I will supply the solutions to the rest of the Exercises by the end of the week, and I will post some office hours (probably via Zoom) next weekend or early next week.
- Feb. 4 - I added the solution to Exercise III.2.1, which I had previously missed. I've also added the solutions to all the Exercises through Lecture III.7.
- Feb. 4 - The midterm is Feb. 12, 5-6pm and it is online. It covers the material up to and including Lecture III.7. I will send you the midterm via email just before 5 and you email me back your solutions just after 6. I think I have the emails for everybody but just to be sure, please send me an email indicating that you will be writing. I will try to add some office hours online several times between now and the midterm. This depends on my ability to speak which, at the moment, isn't good. I'm hoping it will improve enough to do this by the weekend and I will post the times of the office hours here. If you have questions, however, you can email these to me.
- Feb. 6 - The Midterm - I've put this together now.
Format: Basically there is one model presented with a finite parameter space and a finite sample space (a 3x4 set). You are then asked, in a series of 7 questions (each worth 10 marks), to implement various concepts that we have discussed in the course such as likelihood, sufficiency, ancillarity, etc. There is one bonus question (worth 10 marks) that involves a bit of extrapolation, in the sense that we didn't really discuss it fully. The point of this form of the midterm is to make sure that we are all clear about the basic concepts. There are no questions that involve some of the more qualitative (philosophical?) things that I've discussed about the various approaches so far. I'll say more about this in the next class and why I don't view any of that material as "philosophy". In any case, those aspects can't be clearly grasped unless we are clear and precise about the concepts. The discussion I gave about Birnbaum's Theorem is a good example of that as, without being precise, the vagueness leads to conclusions that just don't stand up.
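For the finite-model computations mentioned above, a minimal sketch may help fix ideas. The probability table here is made up for illustration (it is not the midterm model): 3 parameter values, 4 sample points, and the likelihood function obtained by reading the table along the column of the observed data point.

```python
# Illustrative finite model: rows indexed by theta in {1, 2, 3},
# columns giving P_theta(x) for x in {1, 2, 3, 4}.
# These numbers are invented for the example.
model = {
    1: [0.1, 0.2, 0.3, 0.4],
    2: [0.25, 0.25, 0.25, 0.25],
    3: [0.4, 0.3, 0.2, 0.1],
}

def likelihood(x):
    """Return the likelihood function theta -> P_theta(x) as a dict."""
    return {theta: probs[x - 1] for theta, probs in model.items()}

L = likelihood(4)                     # suppose x = 4 is observed
mle = max(L, key=L.get)               # maximum likelihood estimate of theta
rel = {t: L[t] / L[mle] for t in L}   # relative likelihood (max-normalized)

print(L)    # {1: 0.4, 2: 0.25, 3: 0.1}
print(mle)  # 1
```

The same table lets you check other concepts by hand, e.g. a statistic is sufficient when data points it identifies have proportional likelihood functions.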
Timing: You should receive the email about 5pm on Thursday. If you don't receive the email then email me at mevansthree.evans@utoronto.ca. At 6pm you need to start putting the exam (you write your answers on the sheets) into a format so that it can be emailed to me. I think I should have them all back by about 6:15pm but if you run into a problem let me know. I may be able to use Crowdmark to do all of this, we'll see.
Let me know of any questions you may have about the process or the exam. My speaking is still a bit too rough to offer online office hours but I can answer emails with no problems.
- Feb. 10 - I sent out a test email to the entire class Feb. 10 at around 10:45am. If you did not get this, please email me so that I can add you to the list; otherwise no action is required.
- Feb. 10 - I've put up my notes for Lecture III.8, which is the lecture I would have given after the midterm. I will make a video to accompany this as soon as I'm able. This material is not on the midterm. This lecture completes our look at what I would call likelihood-based and pure frequentist inference. The next sequence of lectures will be on decision theory, followed by fiducial and then Bayesian inference.
- Feb. 10 - Over Reading Week you should try to come up with a topic for the Project. I've given some suggestions (under Jan. 19) and here are some more:
- confidence distributions
- e-values
- how neural nets and/or LLM's are making inferences
- causal inference
- Feb. 12 - I will send out the midterm via email just after 4:55pm today. If you haven't received it by 5pm, then email me. You can work at it until 6pm and then prepare to email me your solutions. I should have them all by 6:15pm unless there is a problem. I will acknowledge receipt via a return email. If you don't receive the acknowledgement, email me. Don't worry too much about getting the arithmetic right (although there are a few places where it matters); basically, you will be marked correct by showing a clear understanding of the concept in question. Good luck and I truly hope everybody gets 100! The notes for the lecture for today (Lecture III.8) are up and I will make a video to accompany it over the next few days, as my speaking improves. Based on my progress so far, I'm pretty confident that in-person classes can resume after Reading Week.
- Feb. 18 - I will send you an email today with your marked midterm and the Solutions to the midterm. I have put up a video to accompany Lecture III.8. Near the end of the video there are some comments about the project. In-class lectures resume on Feb. 26.
- March 6 - I updated Example IV.3.1 in Lecture IV.3 to make it clearer.
Instructor
Professor Michael Evans
Office:Ontario Power Building, 700 University Avenue, 9th floor, Room 9110
email: mevansthree.evans@utoronto.ca
Time and Place
Three hours of lectures per week every Thursday.
First class is Thursday, January 8, 5-8pm in WB119.
Website
http://www.utstat.utoronto.ca/mikevans/sta422/sta4222026.html
Office Hours
The in-person office hours will be on Thursdays 2-4pm.
Course Description
Statistical inference is concerned with using the evidence, available from observed data,
to draw inferences about aspects of an unknown probability measure. A variety of
theoretical approaches have been developed to address this problem and these can lead to
quite different inferences. A natural question then is how one
determines and validates appropriate statistical methodology in a given problem.
The course considers this larger statistical question. This involves a discussion
of topics such as model specification and checking, the likelihood function and likelihood
inferences, repeated sampling criteria, loss (utility) functions and optimality, prior specification
and checking, Bayesian inferences, principles and axioms, etc. The overall goal of the course is to leave students
with an understanding of the different approaches to the theory of statistical inference while developing
a critical point-of-view.
The following topics will be covered.
- the meaning of probability
- the evidential versus behavioristic approaches to statistical theory
- pure and frequentist likelihood theory
- decision theory - frequentist and Bayesian
- Birnbaum's theorem
- fiducial theory and close associates such as structural inference
- relative belief and the definition of statistical evidence
Textbook
There is no textbook but several references will be helpful at different points in the course.
Some material will also be drawn from particular papers whose references will be provided.
- Berger, J. (2006) Statistical Decision Theory and Bayesian Analysis. Springer.
- Casella, G. and Berger, R. (1990) Statistical Inference. Duxbury.
- Cox, D.R.(2006) Principles of Statistical Inference. Cambridge.
- Evans, M. (2015) Measuring Statistical Evidence Using Relative Belief. Chapman & Hall. Available online through the U. of Toronto Library.
- Evans, M. and Rosenthal, J. (2010) Probability and Statistics: The Science of Uncertainty. Available online at
book.
- Fraser, D.A.S. (1979) Inference and Linear Models. McGraw-Hill.
- Robert, C. (2001) The Bayesian Choice. Springer.
- Royall, R. (1997) Statistical Evidence: A Likelihood Paradigm. Chapman & Hall.
Evaluation
There will be 2 midterms held during class, each worth 25%, and a final project worth 50%.
If a midterm is missed, then there will be a make-up.
Class Notes
I will post my class notes here before each class. There will be some Exercises associated with the notes that
help to prepare for the midterms.