Five publications were recently submitted to the CDS Technical Report Series.
- A short note on solving partial differential equations using convolutional neural networks
https://nbn-resolving.org/urn:nbn:de:hbz:38-642271
Physics-based machine learning using convolutional neural networks (CNNs) to solve PDEs. Finite difference stencils are used to incorporate the residual of the partial differential equation into the loss function.
- Efficient Adaptive Elimination Strategies in Nonlinear FETI-DP Methods in Combination with Adaptive Spectral Coarse Spaces
http://nbn-resolving.de/urn:nbn:de:hbz:38-641880
Convergence of Nonlinear FETI-DP (a nonlinear nonoverlapping domain decomposition method) is strongly influenced by the choice of the second level and of the set of degrees of freedom that are eliminated nonlinearly before linearization. In this article, an adaptive coarse space is combined with a problem-dependent and residual-based choice of the elimination set.
- Three-level BDDC for Virtual Elements
http://nbn-resolving.de/urn:nbn:de:hbz:38-641875
Application of three-level BDDC (a nonoverlapping domain decomposition method) to the Virtual Element Method (a discretization method for the solution of partial differential equations that allows for the use of nearly arbitrary polygonal/polyhedral grids).
- Adaptive Three-level BDDC Using Frugal Constraints
http://nbn-resolving.de/urn:nbn:de:hbz:38-641234
Adaptive coarse spaces are used to obtain robust domain decomposition methods for heterogeneous problems. These coarse spaces are usually expensive to compute and can lead to large coarse problems. In this article, three-level BDDC (a nonoverlapping domain decomposition method) is combined with a provably robust adaptive coarse space and a computationally cheaper frugal coarse space.
- Learning Adaptive FETI-DP Constraints for Irregular Domain Decompositions
http://nbn-resolving.de/urn:nbn:de:hbz:38-641226
Adaptive coarse spaces are used to obtain robust domain decomposition methods for heterogeneous problems. These coarse spaces are usually expensive to compute, since many eigenvalue problems have to be set up and solved. It has been shown [peer-reviewed, report] that a neural network can be trained to decide which eigenvalue problems can be discarded, and that it can learn approximations of the relevant eigenvectors [report]. Here, the results are extended to unstructured domain decompositions in two dimensions.
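The finite-difference loss idea in the first report above can be sketched as follows. This is a minimal illustrative sketch, not code from the report: a plain NumPy array stands in for the CNN's predicted solution, the function name is invented, and the model problem (the Poisson equation with a 5-point stencil) is an assumption.

```python
import numpy as np

def laplacian_residual_loss(u, f, h):
    """Mean squared PDE residual for -Laplace(u) = f on a uniform grid,
    evaluated with the 5-point finite difference stencil at interior
    points. In the report's setting, u would be the CNN output; here it
    is just a NumPy array (illustrative sketch only)."""
    # 5-point stencil approximation of the Laplacian at interior nodes
    lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
           - 4.0 * u[1:-1, 1:-1]) / h**2
    residual = -lap - f[1:-1, 1:-1]
    return np.mean(residual**2)

# Sanity check with a manufactured solution u(x,y) = x^2 + y^2,
# for which -Laplace(u) = -4 and the stencil is exact up to rounding.
n = 65
h = 1.0 / (n - 1)
x, y = np.meshgrid(np.linspace(0.0, 1.0, n),
                   np.linspace(0.0, 1.0, n), indexing="ij")
u = x**2 + y**2
f = -4.0 * np.ones_like(u)
loss = laplacian_residual_loss(u, f, h)
```

In a training loop, this scalar would serve as (part of) the loss that is minimized with respect to the network parameters, so that no labeled solution data is needed.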