Description
Finding the unique stabilizing solution $X = X^H$ of a large-scale continuous-time algebraic Riccati equation (CARE) $A^HX + XA + C^HC - XBB^HX = 0$ with a large, sparse $n \times n$ matrix $A$, an $n \times m$ matrix $B$ and a $p \times n$ matrix $C$ is of interest in a number of applications. Here, $B$ and $C^H$ are assumed to have full column and row rank, respectively, with $m, p \ll n.$ The unique stabilizing solution $X = X^H$ is positive semidefinite and makes the closed-loop matrix $A - BB^HX$ stable. Even though $A$ is large and sparse, the solution $X$ will in general still be a dense matrix. However, our assumptions on $B$ and $C$ often imply that the sought-after solution $X$ has a low numerical rank (that is, its rank is $\ll n$). This allows for the construction of iterative methods that approximate $X$ by a sequence of low-rank matrices $X_j$ stored in low-rank factored form. That is, the Hermitian low-rank approximations $X_j$ to $X$ are of the form $X_j = Z_jY_jZ_j^H,$ where $Z_j$ is an $n \times k_j$ matrix with only a few columns and $Y_j$ is a small square $k_j \times k_j$ Hermitian matrix. There are several methods which produce such a low-rank approximation.
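To illustrate the low-rank factored format, here is a minimal sketch in Python with NumPy (the sizes and random data are purely illustrative, and a real factor is taken for simplicity, so that $Z^H = Z^T$): it applies $X_j = Z_jY_jZ_j^H$ to a vector without ever forming the dense $n \times n$ matrix.

    import numpy as np

    n, k = 100_000, 20                # illustrative sizes with k << n
    rng = np.random.default_rng(0)
    Z = rng.standard_normal((n, k))   # tall, skinny low-rank factor Z_j
    Y = rng.standard_normal((k, k))
    Y = (Y + Y.T) / 2                 # small Hermitian factor Y_j
    v = rng.standard_normal(n)

    # Apply X_j = Z Y Z^H to v using only O(nk) storage and O(nk) work:
    w = Z @ (Y @ (Z.T @ v))           # never forms the n-by-n matrix X_j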
In this talk, we consider a class of (block) rational Krylov-subspace-based projection methods for solving CAREs. The CARE is projected onto a block rational Krylov subspace $\mathcal{K}_j$ spanned by blocks of the form $(A^H - s_kI)^{-1}C^H$ for some shifts $s_k,\ k = 1, \ldots, j.$ The projections considered need not be orthogonal and are built from the matrices appearing in the block rational Arnoldi decomposition associated with $\mathcal{K}_j.$ The resulting projected Riccati equation is solved for the small square Hermitian matrix $Y_j.$ Then the Hermitian low-rank approximation $X_j = Z_jY_jZ_j^H$ to $X$ is set up, where the columns of $Z_j$ span $\mathcal{K}_j.$ The residual norm $\|R(X_j)\|_F$ can be computed efficiently via the norm of a readily available $2p \times 2p$ matrix. We suggest reducing the rank of the approximate solution $X_j$ even further by truncating small eigenvalues from $X_j.$ This truncated approximate solution can be interpreted as the solution of the Riccati equation projected onto a subspace of $\mathcal{K}_j,$ which gives us a way to efficiently evaluate the norm of the resulting residual. Numerical examples are presented.
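The following sketch shows the basic projection idea in Python with NumPy/SciPy. It is not the talk's algorithm: it uses an orthogonal Galerkin projection (whereas the talk allows non-orthogonal ones), ad hoc shifts, and a small dense test problem, and it forms the residual densely for checking, which the talk's $2p \times 2p$ formula is designed to avoid at large scale.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    rng = np.random.default_rng(1)
    n, m, p = 400, 2, 2                       # small test sizes (illustrative)
    A = -np.diag(rng.uniform(1.0, 100.0, n))  # stable diagonal test matrix
    B = rng.standard_normal((n, m))
    C = rng.standard_normal((p, n))

    # Basis of the block rational Krylov subspace K_j: orthonormalize the
    # blocks (A^H - s_k I)^{-1} C^H for a few (here ad hoc) shifts s_k.
    shifts = [1.0, 10.0, 100.0]
    blocks = [np.linalg.solve(A.T - s * np.eye(n), C.T) for s in shifts]
    Z, _ = np.linalg.qr(np.hstack(blocks))

    # Galerkin projection of the CARE onto K_j; solve the small, dense
    # projected Riccati equation for Y_j.
    Aj, Bj, Cj = Z.T @ A @ Z, Z.T @ B, C @ Z
    Y = solve_continuous_are(Aj, Bj, Cj.T @ Cj, np.eye(m))

    # Low-rank approximation X_j = Z Y Z^H; at this small n the CARE residual
    # can be formed densely as a sanity check.
    X = Z @ Y @ Z.T
    R = A.T @ X + X @ A + C.T @ C - X @ B @ B.T @ X
    print(np.linalg.norm(R, "fro"))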
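Continuing the sketch above, the eigenvalue truncation step can be realized via an eigendecomposition of the small factor: since $Z$ has orthonormal columns there, the nonzero eigenvalues of $X_j = ZYZ^H$ are exactly the eigenvalues of $Y$. The function name and tolerance below are illustrative, not from the talk.

    # Drop eigenvalues of Y below a relative tolerance and shrink the factors.
    def truncate(Z, Y, tol=1e-8):
        d, V = np.linalg.eigh(Y)                  # Y = V diag(d) V^H
        keep = np.abs(d) > tol * np.abs(d).max()  # significant eigenvalues only
        return Z @ V[:, keep], np.diag(d[keep])   # truncated factors Z_t, Y_t

    Zt, Yt = truncate(Z, Y)   # X_t = Zt Yt Zt^H has (typically) lower rank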