arXiv daily

Optimization and Control (math.OC)

Mon, 29 May 2023

1. Adaptive Localized Cayley Parametrization for Optimization over Stiefel Manifold

Authors: Keita Kume, Isao Yamada

Abstract: We present an adaptive parametrization strategy for optimization problems over the Stiefel manifold that uses generalized Cayley transforms to exploit powerful Euclidean optimization algorithms efficiently. The generalized Cayley transform translates an open dense subset of the Stiefel manifold into a vector space, where the subset is determined by a tunable parameter called a center point. Using this transform, we recently proposed the naive Cayley parametrization, which reformulates the optimization problem over the Stiefel manifold as one over the vector space. Although this reformulation lets us transplant powerful Euclidean optimization algorithms, their convergence may become slow under a poor choice of center point. To avoid such slow convergence, in this paper we propose to adaptively estimate 'good' center points so that the reformulated problem can be solved faster. We also present a unified convergence analysis, in terms of the gradient, for cases where fairly standard Euclidean optimization algorithms are employed within the proposed adaptive parametrization strategy. Numerical experiments demonstrate that (i) the proposed strategy escapes the slow convergence observed with the naive Cayley parametrization, and (ii) it outperforms the standard strategy that employs a retraction.
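
A minimal sketch of the idea (under our own assumptions, not the authors' code): anchor a Cayley chart at an orthogonal center point S, map a skew-symmetric parameter to the Stiefel manifold St(p, n), and run a plain Euclidean gradient method in the parameter space. The Rayleigh-quotient cost and the finite-difference gradient are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 3
C = rng.standard_normal((n, n)); C = C + C.T    # symmetric test matrix

def cayley(A):
    # Cayley transform: skew-symmetric A -> orthogonal (I - A/2)^{-1}(I + A/2)
    I = np.eye(A.shape[0])
    return np.linalg.solve(I - A / 2, I + A / 2)

def to_stiefel(A, S):
    # Chart centered at the orthogonal matrix S: the first p columns of
    # S Cay(A) are orthonormal, i.e. they form a point on St(p, n).
    return (S @ cayley(A))[:, :p]

def cost(A, S):
    U = to_stiefel(A, S)
    return -np.trace(U.T @ C @ U)               # seek the top-p eigenspace of C

def num_grad(A, S, h=1e-6):
    # Finite-difference gradient over the skew-symmetric entries (demo only,
    # not the authors' gradient computation).
    G = np.zeros_like(A)
    for i in range(n):
        for j in range(i + 1, n):
            E = np.zeros_like(A)
            E[i, j], E[j, i] = h, -h
            G[i, j] = (cost(A + E, S) - cost(A - E, S)) / (2 * h)
            G[j, i] = -G[i, j]
    return G

S = np.eye(n)                # 'naive' center point; the paper adapts this choice
A = np.zeros((n, n))
for _ in range(200):         # any Euclidean optimizer works in this chart
    A -= 0.05 * num_grad(A, S)

U = to_stiefel(A, S)
print(np.allclose(U.T @ U, np.eye(p)))          # True: feasibility comes for free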

2. Communication Efficient Distributed Newton Method with Fast Convergence Rates

Authors: Chengchang Liu, Lesi Chen, Luo Luo, John C. S. Lui

Abstract: We propose a communication- and computation-efficient second-order method for distributed optimization. Each iteration of our method requires only $\mathcal{O}(d)$ communication complexity, where $d$ is the problem dimension. We also provide theoretical analysis showing that the proposed method has a convergence rate similar to classical second-order optimization algorithms. Concretely, our method finds $\big(\epsilon, \sqrt{dL\epsilon}\,\big)$-second-order stationary points for nonconvex problems within $\mathcal{O}\big(\sqrt{dL}\,\epsilon^{-3/2}\big)$ iterations, where $L$ is the Lipschitz constant of the Hessian. Moreover, it enjoys local superlinear convergence under the strong-convexity assumption. Experiments on both convex and nonconvex problems show that the proposed method performs significantly better than baselines.
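
As a rough illustration of why $\mathcal{O}(d)$ communication per round can suffice for second-order information (a generic distributed Newton-CG pattern under our own assumptions, not the paper's algorithm): workers never ship $d \times d$ Hessians; each round they return only $d$-dimensional vectors, namely local gradients and the local Hessian-vector products requested by a conjugate-gradient inner solver. The quadratic local losses are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
d, m = 20, 4                                    # dimension, number of workers
As = [rng.standard_normal((30, d)) for _ in range(m)]
bs = [rng.standard_normal(30) for _ in range(m)]

def local_grad(k, x):
    # Worker k returns a d-vector: O(d) uplink per round.
    return As[k].T @ (As[k] @ x - bs[k]) / len(bs[k])

def local_hvp(k, v):
    # Hessian-vector product, also a d-vector: no d x d matrix is ever sent.
    return As[k].T @ (As[k] @ v) / len(bs[k])

def newton_cg_step(x, cg_iters=10):
    g = sum(local_grad(k, x) for k in range(m)) / m
    # Inexact Newton direction p solving (average Hessian) p = -g by CG; each
    # CG iteration costs one O(d) round trip per worker (v out, H_k v back).
    p, r = np.zeros(d), -g.copy()
    v = r.copy()
    for _ in range(cg_iters):
        if r @ r < 1e-20:
            break
        Hv = sum(local_hvp(k, v) for k in range(m)) / m
        alpha = (r @ r) / (v @ Hv)
        p = p + alpha * v
        r_new = r - alpha * Hv
        v = r_new + ((r_new @ r_new) / (r @ r)) * v
        r = r_new
    return x + p

x = np.zeros(d)
for _ in range(15):
    x = newton_cg_step(x)
print(np.linalg.norm(sum(local_grad(k, x) for k in range(m)) / m))  # ~ 0
```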

3. A Parameter-Free Conditional Gradient Method for Composite Minimization under Hölder Condition

Authors: Masaru Ito, Zhaosong Lu, Chuan He

Abstract: In this paper we consider a composite optimization problem that minimizes the sum of a weakly smooth function and a convex function with either a bounded domain or a uniformly convex structure. In particular, we first present a parameter-dependent conditional gradient method for this problem, whose step sizes require prior knowledge of the parameters associated with the Hölder continuity of the gradient of the weakly smooth function, and establish its rate of convergence. Given that these parameters could be unknown, or known but possibly conservative, such a method may suffer from implementation issues or slow convergence. We therefore propose a parameter-free conditional gradient method whose step size is determined by a constructive local quadratic upper approximation and an adaptive line-search scheme, without using any problem parameters. We show that this method achieves the same rate of convergence as the parameter-dependent conditional gradient method. Preliminary experiments are also conducted; they illustrate the superior performance of the parameter-free conditional gradient method over methods with some other step-size rules.
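
A minimal sketch of the parameter-free idea (an illustrative stand-in using a standard backtracked curvature estimate, not necessarily the authors' exact scheme): the step size minimizes a local quadratic upper model whose curvature constant M is adjusted by a line search, so no Hölder parameters are supplied up front. The least-squares objective and the l1-ball domain are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 15))
b = rng.standard_normal(40)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)

def lmo_l1(g, radius=1.0):
    # Linear minimization oracle over the l1 ball: a signed coordinate vertex.
    i = np.argmax(np.abs(g))
    v = np.zeros_like(g)
    v[i] = -radius * np.sign(g[i])
    return v

x, M = np.zeros(15), 1.0          # M: running estimate of local curvature
for t in range(200):
    g = grad(x)
    d = lmo_l1(g) - x             # conditional gradient (Frank-Wolfe) direction
    gap = -g @ d                  # FW gap: certifies near-optimality when small
    if gap < 1e-10:
        break
    M /= 2                        # start optimistic, then backtrack upward
    while True:
        gamma = min(1.0, gap / (M * (d @ d)))
        # Accept gamma once the quadratic model with curvature M upper-bounds f.
        if f(x + gamma * d) <= f(x) - gamma * gap + 0.5 * M * gamma**2 * (d @ d):
            break
        M *= 2
    x = x + gamma * d
print(f(x), gap)
```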

4. Necessary and sufficient conditions for unique solvability of absolute value equations: A Survey

Authors: Shubham Kumar, Deepmala

Abstract: In this survey, we focus on necessary and sufficient conditions for the unique solvability and unsolvability of absolute value equations (AVEs) developed over the last twenty years (2004 to 2023). We discuss unique-solvability conditions for various types of AVEs, such as the standard absolute value equation (AVE), the generalized AVE (GAVE), the new generalized AVE (NGAVE), the triple AVE (TAVE), and a class of NGAVEs, based on interval matrices, P-matrices, singular-value conditions, the spectral radius, and the $\mathcal{W}$-property. Building on the unique solvability of AVEs, we also discuss unique-solvability conditions for linear complementarity problems (LCPs) and horizontal linear complementarity problems (HLCPs).
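
As a concrete illustration of one classical singular-value condition in this line of work: for the standard AVE $Ax - |x| = b$, if the smallest singular value of $A$ exceeds 1, the equation has a unique solution for every $b$. The sketch below builds such an $A$ by construction and solves the AVE with the fixed-point iteration $x \leftarrow A^{-1}(|x| + b)$, which is a contraction precisely because $\|A^{-1}\|_2 = 1/\sigma_{\min}(A) < 1$ (our choice of demo solver, not prescribed by the survey).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
# Build A with all singular values in (1.5, 3), so sigma_min(A) > 1 holds by
# construction and the AVE Ax - |x| = b is uniquely solvable for every b.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(rng.uniform(1.5, 3.0, n)) @ V.T
b = rng.standard_normal(n)

# Fixed-point iteration x <- A^{-1}(|x| + b): a contraction, since
# ||A^{-1}||_2 = 1/sigma_min(A) < 1 and | |x| - |y| | <= |x - y| entrywise.
x = np.zeros(n)
for _ in range(200):
    x_new = np.linalg.solve(A, np.abs(x) + b)
    if np.allclose(x_new, x, atol=1e-12):
        break
    x = x_new

print(np.linalg.norm(A @ x - np.abs(x) - b))   # residual ~ 0
```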