Previous talks at the SCCS Colloquium

Onur Eker: Scalable Sampling of Deep Neural Operators

Maps between functions are important in many areas of science, serving as a fundamental tool for understanding complex relationships in data. Fourier Neural Operators (FNOs) are a popular recent machine learning technique for encoding such maps, offering a powerful and efficient approach to learning and generalizing across diverse function spaces in a way traditional neural networks cannot. This thesis investigates the scalability of two-dimensional Fourier Neural Operators (FNO2D) within the Sampling Where It Matters (SWIM) framework. It primarily focuses on improving the efficiency and accuracy of the FNO2D model by comparing it with FNO1D. The research systematically explores various hyperparameters, including the number of Fourier modes, the number of hidden channels, and the layer width, to optimize the performance of FNO2D models, particularly on complex partial differential equations. A series of experiments is conducted to assess the impact of these hyperparameters on model scalability and performance. The study demonstrates that while FNO2D models show great promise in handling multi-dimensional data, careful hyperparameter tuning is crucial for balancing computational efficiency against predictive accuracy. Through its experimental results, this thesis contributes insights into the scalability and applicability of FNO2D models.
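To make the key building block concrete: the core of an FNO layer is a spectral convolution that transforms the input to Fourier space, keeps only a truncated set of low-frequency modes (the "number of modes" hyperparameter mentioned above), multiplies them by learned complex weights, and transforms back. The following is a minimal NumPy sketch of this idea; the function name, array shapes, and the simplified mode truncation are illustrative assumptions, not the implementation used in the thesis.

```python
import numpy as np

def spectral_conv2d(x, weights, modes1, modes2):
    """Illustrative 2D spectral convolution, the core of an FNO2D layer.

    x       : (c_in, H, W) real-valued input field on a regular grid
    weights : (c_in, c_out, modes1, modes2) learned complex weights
              (hypothetical shape chosen for this sketch)
    """
    c_in, H, W = x.shape
    c_out = weights.shape[1]

    # Transform to Fourier space (real FFT over the last two axes).
    x_ft = np.fft.rfft2(x)  # shape (c_in, H, W//2 + 1), complex

    # Keep only the lowest modes1 x modes2 frequencies and mix channels
    # there; a full FNO also keeps the negative-frequency corner along
    # the first axis, omitted here for brevity.
    out_ft = np.zeros((c_out, H, W // 2 + 1), dtype=complex)
    out_ft[:, :modes1, :modes2] = np.einsum(
        "ixy,ioxy->oxy", x_ft[:, :modes1, :modes2], weights
    )

    # Transform back to physical space on the original grid.
    return np.fft.irfft2(out_ft, s=(H, W))
```

Because the learned weights act only on a fixed number of Fourier modes, the layer's parameter count is independent of the grid resolution, which is what makes operator learning with FNOs resolution-invariant and is central to the scalability questions the thesis studies.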

Master's thesis presentation. Onur is advised by Iryna Burak and Dr. Felix Dietrich.