2024-03-21

Machine Learning and Data Science PhD Student Forum Series (Session 68): Mathematical Perspectives of Neural Networks

Abstract:
With the remarkable success of deep learning in practical applications in recent years, researchers have become increasingly interested in whether neural networks can be explained by mathematical theory. When solving practical problems with neural networks and deep learning algorithms, three types of error arise naturally: approximation error, optimization error, and generalization error. In this seminar, we will introduce the sources of these three errors and present some representative theoretical analyses from recent years. These theories still fall far short of what we actually want to explain: they offer little practical guidance for deep learning algorithms, and for practical problems the design of specific algorithms and the choice of hyperparameters still rely heavily on empirical experience and meta-learning. Perhaps researchers have not yet found valuable theoretical directions for explaining neural networks; for now, the implementation of concrete algorithms remains the more valuable line of work.
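For context, the three-way error split mentioned in the abstract is commonly formalized by the following standard excess-risk decomposition; the notation below ($R$, $f^*$, $f_{\mathcal{H}}$, $\hat f_n$, $\tilde f_n$) is illustrative and not taken from the talk itself.

\[
R(\tilde f_n) - R(f^*)
= \underbrace{R(\tilde f_n) - R(\hat f_n)}_{\text{optimization error}}
+ \underbrace{R(\hat f_n) - R(f_{\mathcal{H}})}_{\text{generalization error}}
+ \underbrace{R(f_{\mathcal{H}}) - R(f^*)}_{\text{approximation error}},
\]

where $R$ is the population risk, $f^*$ minimizes $R$ over all measurable functions, $f_{\mathcal{H}}$ is the best function within the chosen network class $\mathcal{H}$, $\hat f_n$ is the empirical risk minimizer on $n$ training samples, and $\tilde f_n$ is the model actually returned by the training algorithm. The identity is a simple telescoping sum; the theoretical work surveyed in the seminar aims to bound each of the three terms.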

About the forum: This online forum is organized by Professor Zhihua Zhang's (张志华) machine learning lab and is held once every two weeks (except public holidays). Each session invites a PhD student to give a relatively systematic and in-depth introduction to a frontier topic, including but not limited to machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.
