The discretization of time-dependent high-dimensional PDEs suffers from an undesired effect, the so-called curse of dimensionality: the amount of data to be stored and processed grows exponentially with the dimension and exceeds the capacity of standard computational devices. In this setting, time-dependent model order reduction techniques are desirable. In the present seminar, we give a broad overview of dynamical low-rank approximation together with recent developments on robust numerical integrators for it. Dynamical low-rank approximation for matrices is presented first, and a numerical integrator with two remarkable properties is introduced: the matrix projector-splitting integrator. Based upon this integrator, we construct two equivalent extensions for tensors (multi-dimensional arrays) in Tucker format, a high-order generalization of the singular value decomposition (SVD) of matrices. These extensions are proven to preserve the excellent qualities of the matrix integrator. Then, via a novel compact formulation of the Tucker integrator, we further extend the matrix and Tucker projector-splitting integrators to the general class of tree tensor networks. Important examples in this class that are of interest for applications include, but are not restricted to, tensor trains.
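To make the matrix projector-splitting integrator mentioned above concrete, the following is a minimal NumPy sketch of one step of the first-order splitting (the K-, S-, and L-substeps, with the characteristic backward S-step). The explicit-Euler substeps, the function name `ksl_step`, and the rank-2 test problem are illustrative assumptions for this sketch, not details taken from the seminar.

```python
import numpy as np

def ksl_step(U, S, V, F, h):
    """One step of a first-order matrix projector-splitting integrator
    for dY/dt = F(Y), with the rank-r iterate factored as Y = U @ S @ V.T.
    Each substep is integrated with a single explicit Euler step
    (a simplification; in general the substeps are small ODEs)."""
    # K-step: evolve U with S absorbed, then re-orthonormalize
    K = U @ S + h * F(U @ S @ V.T) @ V
    U1, S_hat = np.linalg.qr(K)
    # S-step: integrated with a MINUS sign -- the hallmark of this splitting
    S_tilde = S_hat - h * U1.T @ F(U1 @ S_hat @ V.T) @ V
    # L-step: evolve V with S absorbed, then re-orthonormalize
    L = V @ S_tilde.T + h * F(U1 @ S_tilde @ V.T).T @ U1
    V1, S1T = np.linalg.qr(L)
    return U1, S1T.T, V1

# Illustrative usage: A(t) = U0 @ (S0 + t*D) @ V0.T stays rank 2 for small t,
# and dA/dt = U0 @ D @ V0.T is constant; the integrator reproduces A(h) exactly.
rng = np.random.default_rng(1)
U0, _ = np.linalg.qr(rng.standard_normal((8, 2)))
V0, _ = np.linalg.qr(rng.standard_normal((6, 2)))
S0 = np.diag([3.0, 1.0])
D = np.diag([0.5, 0.2])
F = lambda Y: U0 @ D @ V0.T  # constant right-hand side
h = 0.1
U1, S1, V1 = ksl_step(U0, S0, V0, F, h)
Y1 = U1 @ S1 @ V1.T
```

This toy run illustrates one of the integrator's "remarkable properties" alluded to in the abstract: when the exact solution itself has rank at most r, the low-rank step reproduces it.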
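As a small illustration of the Tucker format as a higher-order generalization of the SVD, here is a sketch of the truncated higher-order SVD (HOSVD) in NumPy: one orthonormal factor per mode plus a small core tensor. The helper names (`unfold`, `hosvd`) and the toy tensor are choices made for this sketch only.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-k unfolding: bring axis `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor, ranks):
    """Truncated higher-order SVD: returns a Tucker core and factor matrices."""
    factors = []
    for mode, r in enumerate(ranks):
        # leading left singular vectors of each mode-k unfolding
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = tensor
    for mode, U in enumerate(factors):
        # contract mode `mode` with U^T, keeping the axis order intact
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, factors

# Illustrative usage: a 4x5x6 tensor of multilinear rank (2, 2, 2)
rng = np.random.default_rng(0)
T = (rng.standard_normal((4, 2)) @ rng.standard_normal((2, 30))).reshape(4, 5, 6)
core, factors = hosvd(T, ranks=(2, 2, 2))

# Reconstruct from the Tucker format: exact here, since the ranks match
T_rec = core
for mode, U in enumerate(factors):
    T_rec = np.moveaxis(np.tensordot(U, T_rec, axes=(1, mode)), 0, mode)
```

Storage drops from the full 4*5*6 entries to a 2x2x2 core plus three thin factors, which is the mechanism by which Tucker-format representations mitigate the curse of dimensionality for moderate dimensions.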
The present seminar is based upon joint work with Ch. Lubich, H. Walach, J. Kusch, and D. Sulz.