Guest lectures and seminars - Page 27
I will present my PhD work at DTU on the development of a fully nonlinear, finite-difference-based potential flow solver that imposes all of the fluid boundary conditions via an immersed boundary method. The convergence and stability of this approach are first established for various linear and nonlinear wave propagation problems. For the wave-body interaction problem, careful attention is paid to the intersection point between the free surface and the body surface, and the scheme that best meets the accuracy and stability requirements is selected from several candidates. With this scheme, piston-type wavemaker and forced heaving cylinder cases with high oscillation frequencies have been simulated successfully.
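The abstract does not show the solver itself. As a minimal, generic illustration of the finite-difference core of a potential flow computation (my own sketch, assuming nothing about the DTU code, and without the immersed boundary treatment that is the actual contribution), one can relax the Laplace equation for a potential on a small square grid; the grid size and boundary data below are hypothetical choices:

```python
import numpy as np

# Solve the Laplace equation for a potential phi on the unit square with
# simple Dirichlet data, using Jacobi iteration on the second-order
# five-point finite-difference stencil.
N = 41                      # grid points per side (hypothetical choice)
phi = np.zeros((N, N))
phi[0, :] = 1.0             # prescribed potential on one wall, 0 elsewhere

for _ in range(5000):
    # Each interior point is replaced by the average of its 4 neighbours;
    # the fixed point of this update satisfies the discrete Laplace equation.
    phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                              + phi[1:-1, :-2] + phi[1:-1, 2:])

# By symmetry, the converged centre value is close to 1/4 (the average of
# the four wall potentials), consistent with the mean-value property of
# harmonic functions.
print(phi[N // 2, N // 2])
```

A production wave solver replaces the Jacobi sweep with a fast solver and, as in the talk, imposes curved free-surface and body boundaries that do not align with the grid, which is exactly what the immersed boundary method handles.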
Internal solitary waves (ISWs) are large-amplitude underwater waves that propagate horizontally in the stratified ocean. The waves induce a velocity field that is felt at the ocean surface, throughout the water column, and at the bottom. At large amplitude, the waves induce a vortex wake in the bottom boundary layer behind the wave and transport water vertically, displacing, e.g., sediments from the bottom. The vertical mixing and movement of particles, e.g., biological materials, is a fundamental mechanism in the ocean ecosystem. In this talk, we present numerical simulations of large-amplitude ISWs of depression, replicating a laboratory experiment. Furthermore, we discuss the dynamics of ISW-sediment interactions and illustrate particle movements, trajectories, and particle distributions in the water column under the influence of large-amplitude ISWs.
C*-algebra seminar by Ole Brevig (University of Oslo)
Why is deep learning so successful in many applications of modern AI? This question has puzzled the AI community for more than a decade, and many attribute the success of deep learning to the implicit regularization imposed by the Neural Network (NN) architectures and the gradient descent algorithm. In this talk we will investigate the implicit regularization of so-called linear NNs in the simplified setting of linear regression. Furthermore, we will show how this theory meets fundamental computational boundaries imposed by the phenomenon of generalized hardness of approximation. That is, the phenomenon where certain optimal NNs can be proven to exist, but any algorithm will fail to compute these NNs to an accuracy below a certain approximation threshold. Thus, paradoxically, there will exist deep learning methods that are provably optimal, but that can only be computed to a certain accuracy.
Vegard Antun is a postdoctoral fellow at the Department of Mathematics, University of Oslo.