Shanghai University of Finance and Economics, School of Information Management and Engineering · Lecture Announcement | On the Key Problems of GANs

School of Information Management and Engineering, Shanghai University of Finance and Economics
2020-04-13 20:55

Title

On the Key Problems of GANs

Speaker

Zhiming Zhou is currently a Ph.D. candidate in Computer Science at Shanghai Jiao Tong University. He received his B.S. degree in Computer Science from ACM Honors Class at Shanghai Jiao Tong University in 2014.

Zhiming’s research is mainly focused on generative adversarial networks (GANs). Leading a group in the Apex lab at Shanghai Jiao Tong University, he also works on first-order optimization, optimal transport, and other related problems.

Prior to that, in the early years of his Ph.D., Zhiming worked on computer graphics, focusing on surface reflectance acquisition and related topics such as sparsity, low-rank modeling, and deblurring.

Zhiming has a broad interest in machine learning and deep learning, and he prefers fundamental and theoretical research. Currently, he holds a special interest in the optimization and generalization of deep neural networks and GANs.

Time

April 15 (Wednesday), 10:00

Venue

Zoom ID: 925693818

Meeting link: https://zoom.com.cn/j/925693818

Abstract

Generative adversarial networks (GANs) are among the most promising generative models. However, they were long known to be hard to train, suffering from high training instability and low sample quality. These issues have now been largely solved, and we will introduce our contributions towards addressing them. We will also present new challenges in GANs, including the unreachable optimum and the generalization issue, and share some of our thoughts on them.

Three of our existing works are related and will be introduced therein. For the training instability issue, we studied the convergence guarantee from the perspective of the optimal discriminative function, showing its superiority over the divergence perspective. Under a generalized formulation of GANs, we show that GANs with an unrestricted discriminative function space generally do not guarantee convergence (which is the source of training instability), and that Lipschitz regularization on the discriminative function can be a general solution to this issue (leading to a new family of GANs, named Lipschitz GANs). For the sample quality issue, we studied how class labels, when introduced, interact with GAN training and how they improve the sample quality of GANs. Based on this analysis, we proposed an improved method for leveraging class labels in GANs (AM-GAN).
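As background for the Lipschitz regularization idea above: one common way in practice to constrain the Lipschitz constant of a discriminator is spectral normalization of its weight matrices, since the Lipschitz constant of a linear map x ↦ Wx equals the spectral norm of W. The sketch below is only an illustration of that general principle, not the specific regularization analyzed in the Lipschitz GANs paper; `spectral_norm` and `lipschitz_normalize` are illustrative names.

```python
import numpy as np

def spectral_norm(W, n_iters=50):
    """Estimate the largest singular value of W via power iteration."""
    u = np.random.RandomState(0).randn(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    # Rayleigh-quotient-style estimate of the dominant singular value.
    return float(u @ W @ v)

def lipschitz_normalize(W):
    """Rescale W so the linear layer x -> W @ x is 1-Lipschitz."""
    return W / spectral_norm(W)
```

Applying this normalization to every linear layer bounds the Lipschitz constant of the whole discriminator by the product of the (now unit) per-layer constants.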

In a study aimed at solving the unreachable optimum issue, we examined the convergence problem of the Adam optimizer, one of the most popular optimizers in deep learning and heavily used for GANs. With the proposed accumulated step size perspective, we showed that the key issue in Adam lies in its biased adaptive learning rate, caused by the correlation between the adaptive term v_t and the current gradient g_t, and we proposed a temporal shift operation to solve this issue (AdaShift). Our new understanding of the role of v_t also frees v_t from its traditional update rule, leading to more interesting variants. In particular, with a dimension reduction operation in v_t, we achieve a so-called adaptive-learning-rate SGD, which removes the global gradient scale but keeps the relative scales.
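The temporal shift idea can be sketched in a few lines: update the second-moment term v_t with a gradient delayed by n steps, so that v_t is independent of the current gradient g_t. The following is a simplified scalar-parameter sketch of that decorrelation idea, not the full AdaShift algorithm (which also includes a spatial/block operation over past gradients); the function name and defaults are illustrative.

```python
import numpy as np
from collections import deque

def adashift_scalar(grads, lr=0.1, beta2=0.9, n=5, eps=1e-8, theta0=0.0):
    """Simplified AdaShift-style update on a scalar parameter.

    v is updated with a gradient delayed by n steps, decorrelating it
    from the current gradient g (the bias source identified in Adam).
    """
    theta = theta0
    v = 0.0
    window = deque(maxlen=n)  # holds the last n gradients
    for g in grads:
        window.append(g)
        if len(window) < n:
            continue  # warm-up: no sufficiently delayed gradient yet
        g_delayed = window[0]  # gradient from n-1 steps ago
        v = beta2 * v + (1 - beta2) * g_delayed ** 2
        theta -= lr * g / (np.sqrt(v) + eps)
    return theta
```

Because v_t no longer depends on g_t, the expected update direction is no longer biased by the correlation between the adaptive term and the current gradient, which is the convergence fix the talk describes.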

[1] Zhou, Zhiming, Jiadong Liang, Yuxuan Song, Lantao Yu, Hongwei Wang, Weinan Zhang, Yong Yu, and Zhihua Zhang. "Lipschitz Generative Adversarial Nets." In International Conference on Machine Learning (ICML), 2019.

[2] Zhou, Zhiming, Han Cai, Shu Rong, Yuxuan Song, Kan Ren, Weinan Zhang, Jun Wang, and Yong Yu. "Activation Maximization Generative Adversarial Nets." In International Conference on Learning Representations (ICLR), 2018.

[3] Zhou, Zhiming*, Qingru Zhang*, Guansong Lu, Hongwei Wang, Weinan Zhang, and Yong Yu. "AdaShift: Decorrelation and Convergence of Adaptive Learning Rate Methods." In International Conference on Learning Representations (ICLR), 2019.

Editor: Liu Ye
