Past Lectures

Computational Thinking, Inferential Thinking and Data Science

by Michael I. Jordan (UC Berkeley)

Wednesday, March 29, 2016, 5:30pm
Computer Science 104

Abstract: The phenomenon of Data Science is creating a need for research perspectives that blend computational thinking (with its focus on, e.g., abstractions, algorithms and scalability) with inferential thinking (with its focus on, e.g., underlying populations, sampling patterns, error bars and predictions). I present several examples of such blending, in domains such as distributed inference, asynchronous optimization and private data analysis. I also discuss the design of a new freshman-level course at Berkeley in which this blend is being taught successfully to a wide range of students.

Wilks Lecture
Michael I. Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Sciences and the Department of Statistics at the University of California, Berkeley. His research interests bridge the computational, statistical, cognitive and biological sciences, and have focused in recent years on Bayesian nonparametric analysis, probabilistic graphical models, spectral methods, kernel machines and applications to problems in distributed computing systems, natural language processing, signal processing and statistical genetics. Prof. Jordan is a member of the National Academy of Sciences, the National Academy of Engineering and the American Academy of Arts and Sciences. He is a fellow of the American Association for the Advancement of Science. He has been named a Neyman Lecturer and a Medallion Lecturer by the Institute of Mathematical Statistics. He received the David E. Rumelhart Prize in 2015 and the ACM/AAAI Allen Newell Award in 2009.

The Expanding Realm of Statistical Shrinkage

by Lawrence D. Brown (University of Pennsylvania)

Monday, April 13, 2015, 5:00pm
Computer Science 104

Abstract: The broad concept of shrinkage as statistically desirable was already present in Galton’s 1889 description of the regression effect: “However paradoxical it may appear at first sight, it is theoretically a necessary fact … that the Stature of the adult offspring must [on average] be more mediocre [= closer to the mean] than the stature of their parents.” Contemporary use of shrinkage as a general technique, not tied to linear regression, stems from Stein’s remarkable discovery in 1956 of a better estimator in three or more dimensions than the customary, intuitive sample mean. Since then, concepts of shrinkage have permeated contemporary statistical methodologies - often in the guise of random effects models, empirical and hierarchical Bayes modeling, and via regularization and other high-dimensional estimation techniques. I will survey some of the justifications for shrinkage, including a version of the geometrical argument in Stein (1956) and a version of Stigler’s (1990) argument to show that regression is a form of shrinkage. In 1962 Stein realized that the deservedly popular James-Stein estimator (1961) could be statistically justified by a hierarchical Bayes argument. I will also review this justification and discuss some more recent developments and ramifications.
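As an illustrative aside (not part of the lecture), the Stein phenomenon mentioned above can be seen in a few lines of simulation: for a multivariate normal mean in three or more dimensions, the James-Stein estimator shrinks the raw observation toward the origin and achieves lower total squared error than the observation itself. This is a minimal sketch assuming NumPy; the dimension, number of trials, and true mean below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_trials = 10, 10_000               # dimension >= 3; number of simulated datasets
theta = rng.normal(size=d)             # arbitrary true mean vector

# One observation X ~ N(theta, I_d) per trial.
x = theta + rng.normal(size=(n_trials, d))

# James-Stein estimator: shrink X toward the origin by a data-dependent factor.
norm_sq = np.sum(x**2, axis=1, keepdims=True)
js = (1 - (d - 2) / norm_sq) * x

risk_mle = np.mean(np.sum((x - theta)**2, axis=1))   # risk of the raw observation (sample mean / MLE)
risk_js = np.mean(np.sum((js - theta)**2, axis=1))   # risk of the James-Stein estimator
print(f"MLE risk: {risk_mle:.3f}   James-Stein risk: {risk_js:.3f}")
```

Running the sketch shows the James-Stein risk falling below the MLE risk of roughly d, which is the numerical face of the dominance result the abstract refers to.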

Wilks Lecture
Lawrence D. Brown is Miers Busch Professor and Professor of Statistics at the Wharton School of the University of Pennsylvania in Philadelphia, Pennsylvania. He was educated at the California Institute of Technology and Cornell University, where he earned his Ph.D. in 1964. He has earned numerous honors, including election to the United States National Academy of Sciences, and has published widely. He served as president-elect, president, and past-president of the Institute of Mathematical Statistics from 1991 to 1994 and as co-editor of The Annals of Statistics from 1997 to 2000. He was elected to the American Academy of Arts and Sciences in 2013. After serving as assistant professor at the University of California, Berkeley, associate professor at Cornell University, and professor at Cornell University and Rutgers University, he was invited to join the Department of Statistics at the Wharton School of the University of Pennsylvania.

From Fisher to “Big Data”: Continuities and Discontinuities

by Peter J. Bickel (UC Berkeley)

Wednesday, March 26, 2014, 5:00pm
Computer Science 104

Abstract: In two major papers in 1922 and 1925, Fisher introduced many of the ideas (parameters, sufficiency, efficiency, maximum likelihood) which, when coupled with Wald’s decision-theoretic point of view of 1950, underlay the structure of statistics until the 1980s. That period coincided, not accidentally, with the beginnings of the widespread introduction of computers and our ability to use them to gather “big data” and implement methods to analyze such data. In this lecture I will try to see how the Fisherian concepts have evolved in response to the new environment, and to isolate and study new ideas that have been brought in and where they have come from. Thus, I will argue that “sufficiency” has evolved to “data compression”; “efficiency” has had to include computational considerations and issues of scale; and “parameters” and procedures such as “maximum likelihood” have had to be considered in the context of larger semi- and nonparametric models and in robustness. The steady rise in computational capability during the last 30-40 years has enabled the implementation of the older Bayesian point of view and computer-intensive methods such as Efron’s “bootstrap”, as well as the introduction of the “machine learning” point of view and methods from computer science. I will try to support my argument from the literature, some of my own work, and my experience with ENCODE, a “Big Data” project in biology.

Wilks Lecture
Peter Bickel is Professor of Statistics at the University of California, Berkeley. He was awarded an honorary doctorate from the Hebrew University of Jerusalem in 1986. He is a past president of the Bernoulli Society and of the Institute of Mathematical Statistics, a MacArthur Fellow, a COPSS prize winner, and a member of the American Academy of Arts and Sciences and of the National Academy of Sciences.

A 250-Year Argument (Belief, Behavior, and the Bootstrap)

by Bradley Efron (Stanford University)

Wednesday, April 17, 2013, 5:30pm
Friend 101

Abstract: The year 2013 marks the 250th anniversary of Bayes’ rule. The rule has been influential over the entire period, and controversial over most of it. Its reliance on prior beliefs has been challenged by frequentist methods, which focus instead on the behavior of specific estimates and tests. The bootstrap helps connect the two philosophies, particularly when Bayes inference is based on “uninformative” priors. Some examples will be used to illustrate the connection, without much in the way of theory.
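As an illustrative aside (not drawn from the lecture), the nonparametric bootstrap mentioned in the abstract works by resampling the data with replacement many times, recomputing the statistic of interest, and reading a standard error and interval off the resulting distribution. This is a minimal sketch assuming NumPy; the data, statistic, and number of resamples are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=50)   # arbitrary sample; true median is 2*ln(2)

def bootstrap_replicates(sample, stat, n_boot=5000, rng=rng):
    """Return bootstrap replicates of `stat`, each computed on a resample drawn with replacement."""
    n = len(sample)
    idx = rng.integers(0, n, size=(n_boot, n))          # resample indices, one row per replicate
    return np.array([stat(sample[row]) for row in idx])

reps = bootstrap_replicates(data, np.median)
se = reps.std(ddof=1)                                   # bootstrap standard error
lo, hi = np.percentile(reps, [2.5, 97.5])               # simple percentile interval
print(f"sample median = {np.median(data):.3f}, bootstrap SE = {se:.3f}, 95% interval = ({lo:.3f}, {hi:.3f})")
```

The same machinery applies to almost any statistic, which is part of why the bootstrap serves as a bridge between frequentist behavior and (approximately uninformative-prior) Bayesian belief in the lecture’s framing.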

Wilks Lecture
Bradley Efron is the Max H. Stein Professor and Professor of Statistics and of Health Research and Policy at Stanford University. He is one of the world's most often-cited mathematical scientists and is best known for proposing the bootstrap resampling technique, which has had a major impact on the field of statistics and virtually every area of statistical application. He earned his doctorate in statistics from Stanford in 1964 and joined the Stanford faculty in 1965. Winner of a 1983 MacArthur Prize, he has served as president of the American Statistical Association and of the Institute of Mathematical Statistics. He has made important contributions to many areas of statistics. Efron's work has spanned both theoretical and applied topics, including empirical Bayes analysis (with Carl Morris), applications of differential geometry to statistical inference, the analysis of survival data, and inference for microarray gene expression data. He is the author of the classic monograph "The Jackknife, the Bootstrap and Other Resampling Plans" (1982) and has co-authored (with R. Tibshirani) the text "An Introduction to the Bootstrap" (1994). In 2010, he published the monograph "Large-Scale Inference: Empirical Bayes Methods for Estimation, Testing, and Prediction".

Statistics in Service to the Nation: Personal Reflections

by Stephen E. Fienberg (Carnegie Mellon University)

Monday, April 23, 2012, 5:30pm
Computer Science 104

Abstract: Statistics and statisticians spend much of their time and effort working at the interface of other fields and dealing with problems arising in the course of public policy. In this presentation, I describe several of my own experiences and how such work, especially that associated with problems arising at the national level, has shaped my statistical research and activities. What I and others have done in bringing statistics in service to the nation is rooted in a tradition set in motion many years ago by statistical leaders such as Sam Wilks at Princeton University.

Wilks Lecture
Stephen E. Fienberg is the Maurice Falk University Professor of Statistics and Social Science at Carnegie Mellon University, with appointments in the Department of Statistics, the Machine Learning Department, the Heinz College, and CyLab. He is the Carnegie Mellon co-director of the Living Analytics Research Centre, a joint project with Singapore Management University. He is the author or editor of over 20 books and 400 papers and related publications. His 1975 book on categorical data analysis with Bishop and Holland, Discrete Multivariate Analysis: Theory and Practice, and his 1980 book The Analysis of Cross-Classified Categorical Data are both Citation Classics. He is a member of the U.S. National Academy of Sciences and a fellow of the Royal Society of Canada and the American Academy of Arts and Sciences.