
[Academic Lecture] Notice for Session 560 of the Graduate "Lingxi Academic Forum" (靈犀學術殿堂): Lecture by Professor Yangyang Xu

Published: September 18, 2020    Source: Graduate School

To all faculty and students:

The university will hold a Graduate "Lingxi Academic Forum" lecture by Professor Yangyang Xu on September 26, 2020. The details are as follows:

1. Lecture Overview

Speaker: Professor Yangyang Xu

Time: 10:00, Saturday, September 26, 2020

Venue: Tencent Meeting (Meeting ID: 688 912 696)

Title: Accelerating stochastic gradient methods

Abstract: Stochastic gradient methods have been extensively used to train machine learning models, in particular for deep learning. Various techniques have been applied to accelerate them, numerically or theoretically, such as momentum acceleration and adaptive learning rates. In this talk, I will present two ways to accelerate stochastic gradient methods. The first is to accelerate the popular adaptive (Adam-type) stochastic gradient method by asynchronous (async) parallel computing. Numerically, async-parallel computing can achieve significantly higher parallelization speed-up than its sync-parallel counterpart. Several previous works have studied async-parallel non-adaptive stochastic gradient methods; however, a non-adaptive stochastic gradient method often converges significantly more slowly than an adaptive one. I will show that our async-parallel adaptive stochastic gradient method can achieve near-linear speed-up on top of the fast convergence of an adaptive stochastic gradient method. In the second part, I will present a momentum-accelerated proximal stochastic gradient method, which has provably faster convergence than a standard proximal stochastic gradient method. I will also show experimental results demonstrating its superiority in training a sparse deep learning model.
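The abstract does not spell out the update rules, so as a reference point, here is a minimal sketch of the standard Adam update that "Adam-type" adaptive methods build on. All names are illustrative; the async-parallel scheme discussed in the talk would run such updates concurrently across workers on possibly stale gradients, which is not modeled below.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam-type update of parameters `theta` from a stochastic gradient `grad`.

    `m`, `v` are running moment estimates; `t` is the 1-based iteration count.
    """
    m = beta1 * m + (1 - beta1) * grad            # first moment: momentum on the gradient
    v = beta2 * v + (1 - beta2) * grad ** 2       # second moment: per-coordinate scaling
    m_hat = m / (1 - beta1 ** t)                  # bias corrections for the zero init
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptively scaled step
    return theta, m, v

# Illustrative loop on the toy objective 0.5*||x||^2, whose gradient is x:
theta, m, v = np.ones(4), np.zeros(4), np.zeros(4)
for t in range(1, 101):
    theta, m, v = adam_step(theta, theta, m, v, t)
```

The per-coordinate scaling by the second-moment estimate is what makes the method "adaptive"; this is the property the abstract contrasts with non-adaptive stochastic gradient methods.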
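For the second part of the talk, a common form of momentum-accelerated proximal stochastic gradient descent is sketched below for an l1-regularized problem, which matches the sparse-model setting the abstract mentions. This is one standard extrapolation-based scheme under assumed names, not necessarily the exact method presented in the lecture.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1; shrinks entries toward zero (sparsity)
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def momentum_prox_sgd(x0, stoch_grad, lam, lr=0.1, beta=0.9, iters=200):
    """Momentum-accelerated proximal stochastic gradient sketch for
    min_x f(x) + lam * ||x||_1, with f accessed only through `stoch_grad`."""
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)              # momentum extrapolation
        g = stoch_grad(y)                        # stochastic gradient at the extrapolated point
        x_prev, x = x, soft_threshold(y - lr * g, lr * lam)  # prox step handles the l1 term
    return x

# Illustrative use on a small least-squares problem (hypothetical data):
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
sg = lambda x: A.T @ (A @ x - b) / len(b)        # full gradient as a stand-in for a mini-batch
x_sparse = momentum_prox_sgd(np.zeros(50), sg, lam=0.1)
```

The proximal step keeps the nonsmooth regularizer exact while the momentum extrapolation provides the acceleration over a plain proximal stochastic gradient step.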

2. Faculty and students from all schools are welcome to attend. Please turn off mobile phones or set them to silent mode during the lecture.

Student Affairs Department of the Party Committee, Northwestern Polytechnical University

School of Mathematics and Statistics

MIIT Key Laboratory of Dynamics and Control of Complex Systems

September 18, 2020

About the Speaker

Dr. Yangyang Xu (徐揚揚) is a tenure-track assistant professor in the Department of Mathematical Sciences at Rensselaer Polytechnic Institute. He received his B.S. in Computational Mathematics from Nanjing University in 2007, his M.S. in Operations Research from the Chinese Academy of Sciences in 2010, and his Ph.D. from the Department of Computational and Applied Mathematics at Rice University in 2014. His research interests are in optimization theory and methods and their applications, such as in machine learning, statistics, and signal processing. He has developed optimization algorithms for compressed sensing, matrix completion, and tensor factorization and learning. Recently, his research has focused on first-order methods, operator splitting, stochastic optimization methods, and high-performance parallel computing. He has published over 30 papers in prestigious journals and conference proceedings, and he was awarded a gold medal at the 2017 International Consortium of Chinese Mathematicians.
