Abbas Khalili, McGill University

New Estimation and Feature Selection Methods in Mixture-of-Experts Models with a Diverging Number of Parameters
Date
Nov 23, 2016, 12:30 pm - 1:30 pm
Location
101 - Sherrerd Hall

Event Description

Recent advances in medical and other fields of scientific research have allowed scientists to collect data of unprecedented size and complexity. A common statistical problem in these applications is to model a response variable of interest as a function of a small subset of a large number of covariates (features). The problem becomes even more complex when the population under study is made up of hidden sub-populations and the relationship between the response variable and the covariates varies across sub-populations. Mixture-of-experts (MOE) models provide a flexible statistical tool for studying such relationships. In this talk, we discuss new developments in estimation and feature selection methods for MOE models with a diverging number of parameters.
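For context, a standard mixture-of-experts specification (a generic form, not necessarily the exact model treated in the talk) writes the conditional density of the response y given covariates x as a covariate-dependent mixture with softmax gating weights:

$$
f(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k(x;\alpha)\, f_k(y \mid x;\beta_k),
\qquad
\pi_k(x;\alpha) \;=\; \frac{\exp(\alpha_k^{\top} x)}{\sum_{l=1}^{K} \exp(\alpha_l^{\top} x)}.
$$

Feature selection in such models is commonly carried out by maximizing a penalized log-likelihood, for example

$$
\ell_n(\alpha,\beta) \;-\; \sum_{k=1}^{K} \sum_{j} \Big\{ p_{\lambda_n}\!\big(|\alpha_{kj}|\big) + p_{\lambda_n}\!\big(|\beta_{kj}|\big) \Big\},
$$

where $p_{\lambda_n}$ is a sparsity-inducing penalty (e.g., LASSO or SCAD); the challenge addressed here is that the number of parameters is allowed to grow with the sample size n.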

Event Category
S. S. Wilks Memorial Seminar in Statistics