Course Announcement (A Short Course on Information Theory and Statistical Inference)


A Short Course on Information Theory and Statistical Inference

Instructor: Prof. Lizhong Zheng (郑立中), Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology

Intended audience: master's and doctoral students, and young faculty in related fields.

Time: June 6 to June 8; mornings 9:00-12:00, afternoons 14:30-17:30

Venue: Room C502, Zhixin Building (知新楼)

How to register:

1. Faculty who plan to attend should register by sending "Department + Name + Mobile number" to duanqikecheng@163.com.

2. Master's students of the class of 2014 in Communication and Information Systems and in Signal and Information Processing should send "Student ID + Name + Major + Mobile number" to Fang Haiteng (房海腾) at 964072324@qq.com.

3. Master's and doctoral students of other years and majors should register by sending "Student ID + Name + Major + Mobile number" to Gao Limei (高丽梅) at 517961818@qq.com.

Registration closes on the evening of May 29, 2016. Course lecture notes will be distributed at sign-in according to the registration list.

About the instructor: Lizhong Zheng received the B.S. and M.S. degrees, in 1994 and 1997 respectively, from the Department of Electronic Engineering, Tsinghua University, China, and the Ph.D. degree, in 2002, from the Department of Electrical Engineering and Computer Sciences, University of California, Berkeley. Since 2002, he has been with the Department of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, where he is currently a Professor of Electrical Engineering and Computer Science. His research interests include information theory, wireless communications, and statistical inference. He received the IEEE Information Theory Society Paper Award in 2003, the NSF CAREER Award in 2004, and the AFOSR Young Investigator Award in 2007.

Course content:

The goal of the course is to give an introduction to some information theoretic metrics and their applications in the context of statistical data analysis, including decision making, model learning, and the general mathematical frameworks for data analysis. The specific contents covered within this short course include:
1. Introduction to information metrics, including entropy, mutual information, and the Kullback-Leibler (K-L) divergence; the basic properties and operations of these quantities and their operational meanings in statistics (a small numerical illustration follows this list).
2. Introduction to the concept of information geometry, including divergence optimization, Sanov's theorem, i-projection and m-projection, and their applications.
3. The EM algorithm as a special case of iterative projections.
4. The local geometric framework and its relation to efficient estimation, model selection, least informative priors, and conjugate prior families.
5. Recent results on information decomposition, dimension reduction, and feature selection.
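As a small illustration of the information metrics named in item 1 (a minimal sketch, not part of the course materials; the toy joint distribution p_xy below is made up purely for illustration), the following Python snippet computes entropy, mutual information, and K-L divergence for a discrete distribution:

import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """K-L divergence D(p || q) in bits; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(p_xy):
    """Mutual information I(X;Y) = D(p_xy || p_x p_y) for a joint pmf."""
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    return kl_divergence(p_xy.ravel(), np.outer(p_x, p_y).ravel())

# Toy joint distribution of two binary random variables (illustrative only).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

print("H(X)    =", entropy(p_xy.sum(axis=1)))        # marginal entropy, 1 bit here
print("I(X;Y)  =", mutual_information(p_xy))         # equals H(X) - H(X|Y)
print("D(p||u) =", kl_divergence(p_xy.ravel(),
                                 np.full(4, 0.25)))  # divergence from uniform

Writing mutual information as the divergence between the joint distribution and the product of its marginals is exactly the viewpoint that item 2's geometric treatment builds on.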
Students who wish to take this class are assumed to be familiar with the basics of probability theory and to be fluent in the classical treatments of hypothesis testing and parameter estimation; they are also encouraged to seek opportunities to apply this knowledge after the course in order to retain it. The course will be carried out at a fast pace. Some of the materials come with written notes that will be distributed in class; we ask that these notes not be further copied or distributed. Other, less well developed materials will be presented only on the blackboard, so please be prepared to take notes.
