Top PhD Student Forum in Computational and Applied Mathematics: The Expressive Power of Neural Networks: A View from the Width
Speaker: Lu Zhou (Peking University)
Time: 2018-06-08 12:00-13:30
Venue: Room 1560, Sciences Building No. 1
12:00-12:30 Lunch; 12:30-13:30 Talk
Abstract: In this paper, we study how width affects the expressiveness of neural networks. Classical results state that depth-bounded (e.g., depth-2) networks with suitable activation functions are universal approximators. We establish a universal approximation theorem for width-bounded ReLU networks: width-(n + 4) ReLU networks, where n is the input dimension, are universal approximators. Moreover, except for a measure-zero set, no function can be approximated by width-n ReLU networks, which exhibits a phase transition. We also show that there exist classes of wide networks that cannot be realized by any narrow network whose depth is no more than a polynomial bound. On the other hand, we demonstrate through extensive experiments that narrow networks whose size exceeds the polynomial bound by a constant factor can approximate wide, shallow networks with high accuracy.
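To make the width bound concrete, the following is a minimal illustrative sketch (not code from the talk) of the architecture the abstract refers to: a deep but narrow fully connected ReLU network whose hidden layers all have width n + 4, where n is the input dimension. The weights, depth, and initialization here are arbitrary placeholders for demonstration only.

```python
import numpy as np

def narrow_relu_net(x, depth=6, seed=0):
    """Forward pass of a deep ReLU network whose hidden width is n + 4.

    x: input vector of shape (n,); returns a scalar output.
    Weights are random placeholders, not trained parameters.
    """
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    width = n + 4                       # hidden width fixed at n + 4
    h = x
    in_dim = n
    for _ in range(depth):
        W = rng.standard_normal((width, in_dim)) * 0.5
        b = rng.standard_normal(width) * 0.1
        h = np.maximum(W @ h + b, 0.0)  # ReLU activation
        in_dim = width
    w_out = rng.standard_normal(width)
    return w_out @ h                    # scalar readout

y = narrow_relu_net(np.array([0.3, -1.2, 0.7]))
print(y)
```

The universal approximation result in the abstract says that, by increasing `depth`, networks of exactly this shape can approximate any Lebesgue-integrable function arbitrarily well, whereas shrinking the hidden width to n makes almost every function unapproximable.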