2018 International Symposium on Nonlinear and Variational Analysis


To promote academic exchange in nonlinear and variational analysis and its applications, deepen mutual understanding, and strengthen cooperation, the International Symposium on Nonlinear and Variational Analysis will be held on April 12–13, 2018 at the Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China. Experts and scholars are warmly welcome to register and attend. The theme of the symposium is the exchange of recent advances in optimization algorithms and their applications; topics include, but are not limited to, nonlinear functional analysis, differential equations, nonlinear programming, conic optimization, global optimization, variational inequalities and complementarity problems, and nonsmooth optimization. Well-known experts in nonlinear and variational analysis and its applications will be invited to give academic talks.

 


Keynote Speaker: Nguyen Dong Yen (Vietnam Academy of Science and Technology, Vietnam)

Title: Second-order Subdifferentials and Optimality Conditions for C^1-smooth Optimization Problems

Time: 09:10 – 10:00, April 12, 2018

Location: Room 725, Communication Building, Shahe Campus

Profile: Professor Nguyen Dong Yen is a professor at the Institute of Mathematics, Vietnam Academy of Science and Technology, and has held visiting professor positions at several world-renowned universities. He received his B.Sc. and Ph.D. degrees in Vietnam and Poland, respectively. His research interests include optimization theory, nonsmooth analysis, set-valued analysis, variational inequalities, numerical analysis, and scientific computing. He serves on the editorial boards of several international journals, including the SIAM Journal on Optimization. To date, he has published more than 100 papers in international peer-reviewed journals, and his work has received considerable attention from the mathematical community.

Abstract: We investigate the possibility of using the Fréchet and limiting second-order subdifferentials to characterize locally optimal solutions of C^1-smooth unconstrained minimization problems. We prove that, for a C^1-smooth function of one real variable or a C^1-smooth function on a Banach space whose derivative is calm at the reference point, the positive semi-definiteness of its Fréchet second-order subdifferential at the point in question is a necessary optimality condition, while the same is not true for the limiting counterpart. However, the limiting second-order subdifferential of a C^1,1-smooth function on R^n at a local minimizer is positive semi-definite along some of its selections. We also show that, for a C^1-smooth function on an Asplund space, the positive semi-definiteness of its Fréchet second-order subdifferential around a stationary point is sufficient for this point to be a local minimizer of the function. Besides, a sufficient condition via the Fréchet second-order subdifferential for a point to be a tilt-stable minimizer is given.
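For readers unfamiliar with the notation, here is a minimal sketch of what positive semi-definiteness of the Fréchet second-order subdifferential means, assuming the standard coderivative-based construction (the talk's exact definitions may differ):

    \[
      \widehat{\partial}^{2} f(\bar{x})(u) := \widehat{D}^{*}(\nabla f)(\bar{x})(u),
      \qquad
      \langle z, u \rangle \ge 0
      \quad \text{for all } u \text{ and all } z \in \widehat{\partial}^{2} f(\bar{x})(u).
    \]

For f of class C^2 this reduces to the classical necessary condition that the Hessian \(\nabla^2 f(\bar{x})\) be positive semi-definite.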


Keynote Speaker: Shih-Sen Chang (China Medical University, Taiwan)

Title: The modified proximal point algorithm in Hadamard spaces

Time: 14:30 – 15:20, April 12, 2018

Location: Room 725, Communication Building, Shahe Campus

Profile: Professor Shih-Sen Chang is currently a visiting chair professor at China Medical University, Taiwan, and a professor at Sichuan University, Chengdu. His research centers on functional analysis, in particular fixed points of nonlinear operators, and on optimization theory, in particular solutions of monotone variational inequalities. Professor Chang has received several natural science awards and has been supported by the National Natural Science Foundation of China several times. To date, he has published more than 500 papers in international peer-reviewed journals and six books with international publishers. He was included in Clarivate Analytics' list of Highly Cited Researchers in 2016.

Abstract: The purpose of this paper is to propose a modified proximal point algorithm for solving minimization problems in Hadamard spaces. We then prove that the sequence generated by the algorithm converges strongly (i.e., in the metric) to a minimizer of the convex objective function. These results extend several results in Hilbert spaces, Hadamard manifolds, and metric spaces of non-positive curvature.
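As a small illustration of the proximal point template, the following is a Euclidean sketch only: R^n with the usual metric is the simplest Hadamard space, and the Halpern-style anchor step shown is one standard way to force strong convergence; it is an assumption for illustration, not necessarily the exact modification of the talk. The objective ||x||_1 and its proximal operator are likewise illustrative choices.

    import numpy as np

    def prox_l1(x, lam):
        # proximal operator of f(x) = lam * ||x||_1 (soft-thresholding)
        return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

    def modified_ppa(prox, x0, anchor, lam=1.0, iters=500):
        # x_{k+1} = a_k * anchor + (1 - a_k) * prox(x_k); in a general
        # Hadamard space the convex combination is taken along the geodesic.
        x = x0.copy()
        for k in range(1, iters + 1):
            a = 1.0 / (k + 1)                  # diminishing anchor weight
            x = a * anchor + (1.0 - a) * prox(x, lam)
        return x

    x_min = modified_ppa(prox_l1, np.random.randn(5), np.zeros(5))
    print(np.round(x_min, 4))                  # approaches 0, the minimizer of ||x||_1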


Keynote Speaker: Hong-Kun Xu (Hangzhou Dianzi University, Hangzhou)

Title: Convergence Analysis of the Frank-Wolfe Algorithm in Banach Spaces under Hölder Continuous Gradients

Time: 09:10 – 10:00, April 13, 2018

Location: Room 725, Communication Building, Shahe Campus

Profile: Professor Hong-Kun Xu is currently a distinguished professor at Hangzhou Dianzi University in Hangzhou, China. He received his B.S., M.S., and Ph.D. degrees from Zhejiang Normal University, Zhejiang University, and Xi'an Jiaotong University, respectively. Professor Xu has held visiting positions at many institutions in several countries and was a Japan JSPS Invitational Fellow at the Tokyo Institute of Technology from December 2003 to January 2004. In 2014 he was selected for the Zhejiang “1000 Talents” program. Professor Xu is the winner of several awards, including the 2004 South African Mathematical Society Award for Research Distinction. He was elected a fellow of the Academy of Science of South Africa in 2005 and of TWAS, the World Academy of Sciences, in 2012, and he has been a Thomson Reuters Highly Cited Researcher since 2013. His research areas include nonlinear functional analysis, differential and integral equations, optimization algorithms for big data problems, and mathematical finance. To date, he has published more than 100 papers in international peer-reviewed journals.

Abstract: The Frank-Wolfe algorithm (FWA), also known as the conditional gradient algorithm, was introduced by Marguerite Frank and Philip Wolfe in 1956. Owing to its simple linear subproblems, FWA has recently attracted much attention for solving constrained optimization problems over closed, bounded, convex sets. The convergence of FWA depends on how the sequence of stepsizes is chosen. In this talk, we will report some recent convergence results on FWA in Banach spaces under Hölder continuous gradients, using two ways of choosing the stepsizes: one by one-dimensional line minimization and the other by the open-loop rule. In addition, we will discuss the sublinear rate of convergence obtained by introducing the concept of a curvature constant of order bigger than one, which covers the case where the Fréchet derivative of the objective function satisfies a Hölder continuity condition, in particular the Lipschitz continuity condition.
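To make the open-loop stepsize rule concrete, here is a minimal Euclidean sketch of FWA with gamma_k = 2/(k+2); the line-minimization variant would instead pick gamma_k by a one-dimensional search. The l1-ball feasible set and least-squares objective are illustrative assumptions, not taken from the talk.

    import numpy as np

    def lmo_l1(grad, radius=1.0):
        # linear minimization oracle over the l1 ball:
        # argmin_{||s||_1 <= radius} <grad, s> is attained at a signed vertex
        i = np.argmax(np.abs(grad))
        s = np.zeros_like(grad)
        s[i] = -radius * np.sign(grad[i])
        return s

    def frank_wolfe(grad_f, x0, lmo, iters=200):
        x = x0.copy()
        for k in range(iters):
            s = lmo(grad_f(x))
            gamma = 2.0 / (k + 2)              # open-loop stepsize rule
            x = (1.0 - gamma) * x + gamma * s  # stay inside the convex set
        return x

    # Least-squares objective: its gradient is Lipschitz continuous,
    # hence Hoelder continuous with exponent 1.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
    x = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(10), lmo_l1)
    print(np.round(x, 4))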


Keynote Speaker: Yunhai Xiao (Henan University, Kaifeng)

Title: A Generalized ADMM with Semi-Proximal Terms for Convex Composite Conic Programming

Time: 14:30 – 15:20, April 13, 2018

Location: Room 725, Communication Building, Shahe Campus

Profile: Yunhai Xiao is a professor at the School of Mathematics and Statistics, Henan University. He received his Ph.D. from the College of Mathematics and Econometrics at Hunan University in 2007. He carried out postdoctoral research at the Department of Mathematics, Nanjing University, from 2009 to 2010, and was a postdoctoral research fellow at National Cheng Kung University (Taiwan) in 2011. He was a visiting scholar at the Department of Mathematics, National University of Singapore, from 2015 to 2016. His research interests include optimization theory, algorithms, and their applications in image processing and statistics.

Abstract: In this paper, we propose a generalized alternating direction method of multipliers (ADMM) with semi-proximal terms for solving a class of convex composite conic optimization problems, some of which are high-dimensional, to moderate accuracy. Our primary motivation is that this method, together with properly chosen semi-proximal terms, such as those generated by the recent advances in the block symmetric Gauss-Seidel technique, is capable of tackling these problems. Moreover, the proposed method, which relaxes both the primal and the dual variables in a natural way with a common relaxation factor in the interval (0, 2), has the potential to enhance the performance of the classic ADMM. Extensive numerical experiments on various doubly non-negative semidefinite programming problems, with or without inequality constraints, are conducted. The corresponding results show that all these multi-block problems can be successfully solved, and the advantage of using the relaxation step is apparent.
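The role of the relaxation factor can already be seen in the classical two-block setting. Below is a minimal sketch for the lasso problem min 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z, with the standard over-relaxation step (factor alpha in (0, 2)); the semi-proximal terms and conic structure of the talk are omitted, and the problem, penalty sigma, and alpha = 1.6 are illustrative assumptions.

    import numpy as np

    def soft(v, t):
        # soft-thresholding, the proximal operator of t * ||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def relaxed_admm_lasso(A, b, lam, sigma=1.0, alpha=1.6, iters=300):
        n = A.shape[1]
        x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
        M = A.T @ A + sigma * np.eye(n)             # x-subproblem normal equations
        Atb = A.T @ b
        for _ in range(iters):
            x = np.linalg.solve(M, Atb + sigma * (z - u))
            x_hat = alpha * x + (1.0 - alpha) * z   # relaxation step, alpha in (0, 2)
            z = soft(x_hat + u, lam / sigma)
            u = u + x_hat - z                       # scaled dual update
        return z

    rng = np.random.default_rng(1)
    A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
    print(np.round(relaxed_admm_lasso(A, b, lam=0.5), 4))

Taking alpha = 1 recovers the classic ADMM; values above 1 (over-relaxation) often speed up convergence in practice, which is the advantage the numerical experiments in the abstract refer to.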
