Guest lectures and seminars - Page 27

Time and place: , Niels Henrik Abels hus, 9th floor

Internal solitary waves (ISWs) are underwater waves of large amplitude that move horizontally in the layered ocean. The waves induce a velocity field that is felt at the ocean surface, throughout the entire water column, and at the bottom. At large amplitude, the waves induce a vortex wake in the bottom boundary layer behind the wave and transport water in the vertical direction, displacing, e.g., sediments from the bottom. A fundamental mechanism in the ocean ecosystem is the vertical mixing and movement of particles, e.g., biological material. In this talk, we present numerical simulations of large-amplitude ISWs of depression that replicate a laboratory experiment. Furthermore, we discuss the dynamics of ISW-sediment interactions and illustrate particle movements, trajectories, and particle distributions in the water column under the influence of large-amplitude ISWs.

Time and place: , NHA B1120

Plücker already knew that a smooth complex plane quartic curve has exactly 28 bitangents. Bitangents of quartic curves are related to a variety of mathematical problems. They appear in one of Arnold's trinities, together with the 27 lines in a cubic surface and the 120 tritangent planes of a sextic space curve. In this talk, we review known results about counts of bitangents under variation of the ground field. Special focus will be on counting in the tropical world and its relations to real and arithmetic counts. We end with new results concerning the arithmetic multiplicity of tropical bitangent classes, based on joint work in progress with Sam Payne and Kris Shaw.
Time and place: , NHA107

C*-algebra seminar by Ole Brevig (University of Oslo)

Time and place: , Niels Henrik Abels hus, 9th floor

Why is deep learning so successful in many applications of modern AI? This question has puzzled the AI community for more than a decade, and many attribute the success of deep learning to the implicit regularization imposed by Neural Network (NN) architectures and the gradient descent algorithm. In this talk, we will investigate the implicit regularization of so-called linear NNs in the simplified setting of linear regression. Furthermore, we will show how this theory meets fundamental computational boundaries imposed by the phenomenon of generalized hardness of approximation: the phenomenon whereby certain optimal NNs can be proven to exist, yet any algorithm will fail to compute these NNs to an accuracy below a certain approximation threshold. Thus, paradoxically, there will exist deep learning methods that are provably optimal, but that can only be computed to a certain accuracy.
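As a small illustration of implicit regularization in the linear-regression setting the talk refers to (this sketch is not from the talk itself, and the setup is an assumption for illustration): plain gradient descent on an underdetermined least-squares problem, initialized at zero, does not converge to an arbitrary interpolating solution but to the minimum-norm one, which is the solution given by the pseudoinverse.

```python
import numpy as np

# Underdetermined least-squares problem: 5 equations, 20 unknowns,
# so infinitely many vectors x satisfy Ax = b exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 20))
b = rng.standard_normal(5)

# Plain gradient descent on ||Ax - b||^2, started at x = 0.
# The iterates stay in the row space of A, which is what biases
# the method toward the minimum-norm solution.
x = np.zeros(20)
for _ in range(20000):
    x -= 0.01 * A.T @ (A @ x - b)

# The Moore-Penrose pseudoinverse gives the minimum-norm solution.
x_min_norm = np.linalg.pinv(A) @ b

print(np.allclose(x, x_min_norm, atol=1e-6))
```

Among all interpolating solutions, gradient descent implicitly selects the one with smallest Euclidean norm, even though no explicit regularization term appears in the objective; deeper linear networks select solutions with different implicit biases, which is part of what the theory above analyzes.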

Vegard Antun is a postdoctoral fellow at the Department of Mathematics, University of Oslo.

Time and place: , NHA107

QOMBINE seminar by Snorre Bergan (UiO)