Fast Algorithms for Optimal Damping in Mechanical Systems
arXiv:2601.05404v1 Announce Type: new
Abstract: Optimal damping aims to determine a vector of damping coefficients $\nu$ that maximizes the decay rate of a mechanical system's response. This problem can be formulated as the minimization of the trace of the solution of a Lyapunov equation whose coefficient matrix depends on $\nu$. For physical relevance, the damping coefficients must be nonnegative and the resulting system must be asymptotically stable. We identify conditions under which the system is never stable or may lose stability for certain choices of $\nu$. In the latter case, we propose replacing the constraint $\nu \ge 0$ with $\nu \ge d$, where $d$ is a nonzero nonnegative vector chosen to ensure stability.
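As a rough illustration of this formulation, the following sketch evaluates an objective of the form $f(\nu) = \operatorname{trace}(X(\nu))$, where $X(\nu)$ solves a Lyapunov equation for the first-order system matrix $A(\nu)$. The rank-one damping model $C(\nu) = C_{\mathrm{int}} + \sum_i \nu_i c_i c_i^T$, the identity right-hand side, and all matrix names are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch: f(nu) = trace(X(nu)), with A(nu)^T X + X A(nu) = -I (assumed right-hand side).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def system_matrix(nu, M, K, C_int, damper_cols):
    """First-order matrix A(nu) of M q'' + C(nu) q' + K q = 0 (assumed damping model)."""
    C = C_int + sum(v * np.outer(c, c) for v, c in zip(nu, damper_cols))
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    return np.block([[np.zeros((n, n)), np.eye(n)],
                     [-Minv @ K,        -Minv @ C]])

def objective(nu, M, K, C_int, damper_cols):
    """Trace of the Lyapunov solution; finite only if A(nu) is asymptotically stable."""
    A = system_matrix(nu, M, K, C_int, damper_cols)
    X = solve_continuous_lyapunov(A.T, -np.eye(A.shape[0]))
    return np.trace(X)
```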
We derive explicit expressions for the gradient and Hessian of the objective function and show that the Karush–Kuhn–Tucker conditions are equivalent to the vanishing of a nonlinear residual function at an optimal solution. To compute such a solution, we propose a Barzilai–Borwein residual minimization algorithm (BBRMA), which is simple and efficient but not globally convergent, and a spectral projected gradient (SPG) method, which is globally convergent. By exploiting the structure of the problem, we show how to compute the objective function and its gradient efficiently, with eigenvalue decompositions constituting the dominant cost in execution time. Numerical experiments show that both methods require fewer eigenvalue decompositions than the fast optimal damping algorithm (FODA), and that, although SPG may incur additional decompositions due to line search, it often converges faster than BBRMA, resulting in lower overall computational cost.
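For orientation, here is a generic spectral projected gradient iteration on the constraint set $\{\nu : \nu \ge d\}$ with a Barzilai–Borwein step length and a nonmonotone backtracking line search. The objective and gradient handles, the parameter values, and the stopping test are illustrative assumptions; the paper's SPG variant may differ in its safeguards and line-search details.

```python
# Sketch of SPG on {nu >= d} with a Barzilai-Borwein step length (assumed variant).
import numpy as np

def spg(f, grad_f, nu0, d, max_iter=200, tol=1e-8, memory=10,
        gamma=1e-4, alpha_min=1e-10, alpha_max=1e10):
    project = lambda v: np.maximum(v, d)              # projection onto nu >= d
    nu = project(nu0)
    g = grad_f(nu)
    alpha = 1.0
    f_hist = [f(nu)]
    for _ in range(max_iter):
        direction = project(nu - alpha * g) - nu      # projected gradient direction
        if np.linalg.norm(direction, np.inf) <= tol:
            break
        f_ref = max(f_hist[-memory:])                 # nonmonotone reference value
        lam = 1.0
        while lam > 1e-12 and f(nu + lam * direction) > f_ref + gamma * lam * (g @ direction):
            lam *= 0.5                                # backtracking line search
        nu_new = nu + lam * direction
        g_new = grad_f(nu_new)
        s, y = nu_new - nu, g_new - g
        sy = s @ y
        alpha = np.clip(s @ s / sy, alpha_min, alpha_max) if sy > 0 else alpha_max
        nu, g = nu_new, g_new
        f_hist.append(f(nu))
    return nu
```

Each line search step above calls the objective, and hence (in the paper's setting) triggers an extra eigenvalue decomposition, which is the trade-off against BBRMA noted in the experiments.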