To promote academic exchange in nonlinear and variational analysis and its applications, deepen mutual understanding, and strengthen cooperation, the International Workshop on Nonlinear and Variational Analysis will be held on April 12–13, 2018 at the Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China. Experts and scholars are warmly welcome to register and attend. The theme of the workshop is the exchange of recent results on optimization algorithms and their applications; topics include (but are not limited to) nonlinear functional analysis, differential equations, nonlinear programming, conic optimization, global optimization, variational inequalities and complementarity problems, and nonsmooth optimization. Well-known experts in nonlinear and variational analysis and its applications will be invited to give lectures on related topics.

**Keynote Speaker:** Nguyen Dong Yen (Vietnam Academy of Science and Technology, Vietnam)

**Title:** Second-order Subdifferentials and Optimality Conditions for C^1-smooth Optimization Problems

**Time:** 09:10 – 10:00, April 12, 2018

**Location:** Room 725, Communication Building, Shahe Campus

**Profile:** Professor Nguyen Dong Yen is a professor at the Institute of Mathematics, Vietnam Academy of Science and Technology. He also holds visiting professorships at several world-renowned universities. He received his B.Sc. and Ph.D. degrees in Vietnam and Poland, respectively. His research interests are optimization theory, nonsmooth analysis, set-valued analysis, variational inequalities, numerical analysis, and scientific computing. Professor Nguyen Dong Yen serves on the editorial boards of several international journals, including the SIAM Journal on Optimization. To date, he has published more than 100 papers in international peer-reviewed journals, which have received much attention from the mathematical community.

**Abstract:** We investigate the possibility of using the Fréchet and limiting second-order subdifferentials to characterize locally optimal solutions of C^1-smooth unconstrained minimization problems. We prove that, for a C^1-smooth function of one real variable, or a C^1-smooth function on a Banach space whose derivative is calm at the reference point, the positive semi-definiteness of its Fréchet second-order subdifferential at the point in question is a necessary optimality condition, while the same is not true for the limiting counterpart. However, the limiting second-order subdifferential of a C^{1,1}-smooth function on R^n at a local minimizer is positively semi-definite along some of its selections. We also show that, for a C^1-smooth function on an Asplund space, the positive semi-definiteness of its Fréchet second-order subdifferential around a stationary point is sufficient for this point to be a local minimizer of the function. In addition, a sufficient condition via the Fréchet second-order subdifferential for a point to be a tilt-stable minimizer is given.
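For orientation (our addition, not part of the abstract): for a C^2 function the Fréchet and limiting second-order subdifferentials both reduce to the Hessian, so the necessary condition above recovers the classical second-order optimality condition:

```latex
% Assuming f is C^2 near \bar{x}, so that
% \partial^2 f(\bar{x})(u) = \{ \nabla^2 f(\bar{x})\, u \}:
\bar{x}\ \text{a local minimizer of}\ f
\;\Longrightarrow\;
\nabla f(\bar{x}) = 0
\quad\text{and}\quad
\langle \nabla^2 f(\bar{x})\, u,\ u \rangle \ge 0
\quad \text{for all } u .
```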

**Keynote Speaker:** Shih-Sen Chang (China Medical University, Taiwan)

**Title: **The modified proximal point algorithm in Hadamard spaces

**Time:** 14:30 – 15:20, April 12, 2018

**Location:** Room 725, Communication Building, Shahe Campus

**Profile:** Professor Shih-Sen Chang is currently a visiting chair professor at China Medical University, Taiwan, and a professor at Sichuan University, Chengdu. His research centers on functional analysis, in particular fixed points of nonlinear operators, and on optimization theory, in particular solutions of monotone variational inequalities. He has received natural science awards and has been granted National Natural Science Foundation of China projects several times. To date, he has published more than 500 papers in international peer-reviewed journals and 6 books with international publishers. He was included in the list of Highly Cited Researchers by Clarivate Analytics in 2016.

**Abstract:** The purpose of this paper is to propose a modified proximal point algorithm for solving minimization problems in Hadamard spaces. We then prove that the sequence generated by the algorithm converges strongly (convergence in metric) to a minimizer of convex objective functions. The results extend several results in Hilbert spaces, Hadamard manifolds, and non-positive-curvature metric spaces.
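For orientation (our sketch, not part of the talk): in a Euclidean space, the simplest example of a Hadamard space, the classic proximal point iteration reads x_{k+1} = argmin_y { f(y) + ||y - x_k||^2 / (2*lam) }, i.e. x_{k+1} = prox_{lam*f}(x_k). For f = ||.||_1 the proximal map has the closed soft-thresholding form; all names below are our own.

```python
import numpy as np

# Minimal proximal point algorithm in Euclidean space (a Hadamard space).
# Each step applies the proximal map of f; here f(x) = ||x||_1, whose
# proximal map is soft-thresholding, available in closed form.

def prox_l1(x, lam):
    # prox_{lam * ||.||_1}(x): shrink each coordinate toward 0 by lam.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def proximal_point(x0, lam=0.5, n_iters=20):
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = prox_l1(x, lam)          # one proximal step
    return x

x = proximal_point([3.0, -2.0], lam=0.5, n_iters=20)
print(x)  # both coordinates reach the minimizer 0 of ||x||_1
```

The talk's modified algorithm works in general Hadamard spaces, where the squared distance term above is replaced by the metric of the space; this Euclidean special case only illustrates the shape of one iteration.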

**Keynote Speaker:** Hong-Kun Xu (Hangzhou Dianzi University, Hangzhou)

**Title:** Convergence Analysis of the Frank-Wolfe Algorithm in Banach Spaces under Hölder Continuous Gradients

**Time:** 09:10 – 10:00, April 13, 2018

**Location:** Room 725, Communication Building, Shahe Campus

**Profile:** Professor Hong-Kun Xu is currently a distinguished professor at Hangzhou Dianzi University in Hangzhou, China. He received his B.S., M.S., and Ph.D. degrees from Zhejiang Normal University, Zhejiang University, and Xi'an Jiaotong University, respectively. Professor Xu has held visiting positions at many institutions in several countries and was a Japan JSPS Invitational Fellow at the Tokyo Institute of Technology from December 2003 to January 2004. In 2014 he was selected for the Zhejiang "1000 Talents" program. Professor Xu has won several awards, including the 2004 South African Mathematical Society Award for Research Distinction. He was elected a fellow of the Academy of Science of South Africa in 2005 and of TWAS, the World Academy of Sciences, in 2012. He has been a Thomson Reuters Highly Cited Researcher since 2013. Professor Xu's research areas include nonlinear functional analysis, differential and integral equations, optimization algorithms for big-data problems, and mathematical finance. To date, he has published more than 100 papers in international peer-reviewed journals.

**Abstract:** The Frank-Wolfe algorithm (FWA), also known as the conditional gradient algorithm, was introduced by Marguerite Frank and Philip Wolfe in 1956. Owing to its simple linear subproblems, the FWA has recently attracted much attention for solving constrained optimization problems over closed convex bounded sets. The convergence of the FWA depends on how the sequence of step sizes is chosen. In this talk, we report some recent convergence results for the FWA in Banach spaces under Hölder continuous gradients, using two ways of choosing the step sizes: one-dimensional line minimization and the open-loop rule. In addition, we discuss the sublinear rate of convergence by introducing the concept of a curvature constant of order bigger than one, which covers the case where the Fréchet derivative of the objective function satisfies a Hölder continuity condition, in particular the Lipschitz continuity condition.
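To make the step-size rules concrete (our illustrative sketch, not the speaker's material): the finite-dimensional Frank-Wolfe iteration with the open-loop rule gamma_k = 2/(k+2), applied to a quadratic over the probability simplex, looks as follows. The matrix `A`, vector `b`, and function names are our own choices.

```python
import numpy as np

# Frank-Wolfe (conditional gradient) for f(x) = 0.5*||A x - b||^2 over
# the probability simplex. The linear subproblem
#   argmin_{s in simplex} <grad f(x), s>
# is solved exactly by the vertex with the smallest gradient coordinate.

def frank_wolfe(A, b, n_iters=200):
    n = A.shape[1]
    x = np.ones(n) / n                   # start at the simplex center
    for k in range(n_iters):
        grad = A.T @ (A @ x - b)         # gradient of f at x
        s = np.zeros(n)
        s[np.argmin(grad)] = 1.0         # vertex minimizing the linear model
        gamma = 2.0 / (k + 2)            # open-loop step-size rule
        x = x + gamma * (s - x)          # convex combination: stays feasible
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
b = A @ np.array([0.2, 0.3, 0.1, 0.25, 0.15])  # target reachable in the simplex
x = frank_wolfe(A, b)
print(x.sum(), x.min())  # feasible: sums to 1, nonnegative
```

Note that every iterate is a convex combination of feasible points, so no projection is ever needed; this is the structural advantage over projected gradient methods that the abstract alludes to.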

**Keynote Speaker:** Yunhai Xiao (Henan University, Kaifeng)

**Title:** A Generalized ADMM with Semi-Proximal Terms for Convex Composite Conic Programming

**Time:** 14:30 – 15:20, April 13, 2018

**Location:** Room 725, Communication Building, Shahe Campus

**Profile:** Yunhai Xiao is a professor at the School of Mathematics and Statistics, Henan University. He received his Ph.D. from the College of Mathematics and Econometrics at Hunan University in 2007. He did postdoctoral research at the Department of Mathematics, Nanjing University, from 2009 to 2010, and was a postdoctoral research fellow at National Cheng Kung University (Taiwan) in 2011. As a visiting scholar, he was at the Department of Mathematics, National University of Singapore, from 2015 to 2016. His research interests include optimization theory, algorithms, and applications in image processing and statistics.

**Abstract:** In this paper, we propose a generalized alternating direction method of multipliers (ADMM) with semi-proximal terms for solving a class of convex composite conic optimization problems, some of which are high-dimensional, to moderate accuracy. Our primary motivation is that this method, together with properly chosen semi-proximal terms, such as those generated by the recent block symmetric Gauss-Seidel technique, is capable of tackling these problems. Moreover, the proposed method, which relaxes both the primal and the dual variables in a natural way with a common relaxation factor in the interval $(0,2)$, has the potential to enhance the performance of the classic ADMM. Extensive numerical experiments on various doubly non-negative semidefinite programming problems, with or without inequality constraints, are conducted. The results show that all these multi-block problems can be successfully solved and that the advantage of using the relaxation step is apparent.
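For readers unfamiliar with the relaxation factor, here is a minimal sketch (ours, not the paper's method) of the textbook over-relaxed ADMM on a lasso problem, where a factor `alpha` in $(0,2)$ plays the role described above. The talk's generalized semi-proximal scheme for conic programming is considerably more general; all names below are our own.

```python
import numpy as np

# Over-relaxed ADMM for the lasso
#   min_x 0.5*||A x - b||^2 + mu*||z||_1   s.t.  x = z,
# illustrating a relaxation factor alpha in (0, 2). alpha = 1 recovers
# the classic ADMM; alpha > 1 (over-relaxation) often speeds it up.

def soft(v, t):
    # Proximal map of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, mu=0.1, rho=1.0, alpha=1.6, n_iters=300):
    n = A.shape[1]
    # Factor A^T A + rho*I once; it is symmetric positive definite.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    z = np.zeros(n)
    u = np.zeros(n)                              # scaled dual variable
    for _ in range(n_iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update
        x_hat = alpha * x + (1 - alpha) * z      # over-relaxation step
        z = soft(x_hat + u, mu / rho)            # z-update (l1 proximal step)
        u = u + x_hat - z                        # dual (multiplier) update
    return z

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
z = admm_lasso(A, b)
f = lambda v: 0.5 * np.sum((A @ v - b) ** 2) + 0.1 * np.sum(np.abs(v))
print(f(z) < f(np.zeros(10)))  # → True: the iterate improves on the zero vector
```

In the generalized scheme of the abstract, semi-proximal terms are added to the x- and z-subproblems so that they remain well-posed and cheap even for multi-block conic constraints; the simple relaxation step shown here survives in that setting with the same $(0,2)$ interval.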