
# SCCS Colloquium

The SCCS Colloquium is a forum giving students, guests, and members of the chair the opportunity to present their research insights, results, and challenges. Do you need ideas for your thesis topic? Do you want to meet your potential supervisor? Do you want to discuss your research with a diverse group of researchers, rehearse your conference talk, or simply cheer for your colleagues? Then this is the right place for you (and you are also welcome to bring your friends along).

# Muhammad Waleed Bin Khalid: Efficient Kernel Flows Optimization for Neural Network Induced Gaussian Process Kernels

SCCS Colloquium

A ubiquitous task in data-driven learning involves using available information to construct accurate models that generalize to unseen input. Kernel-based learning methods are one such technique, but they rely on a good prior choice of the kernel: knowledge is often required regarding which type of kernel suits the available data and which kernel hyper-parameters yield optimal performance. The Kernel Flows method is a learning technique that adapts the kernel to a given dataset without requiring as much expert knowledge. It rests on the simple rationale that if a kernel is good, the predictions should not change much even when the input data is reduced. The algorithm comes in two flavors: a parametric version, where the kernel hyper-parameters are adapted, and a non-parametric one, where the input data itself is transformed to suit the base kernel.
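This rationale can be made concrete. The parametric Kernel Flows loss ρ compares the RKHS norm of the minimal-norm interpolant on a batch with that on a random half of the batch: ρ = 1 − ‖u_c‖² / ‖u_f‖², which is small when removing half the data barely changes the prediction. A minimal NumPy sketch, using an RBF kernel with a single lengthscale as a stand-in for whichever parametric kernel is being optimized:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale):
    """Squared-exponential kernel matrix (a stand-in for any parametric kernel)."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-d2 / (2 * lengthscale**2))

def kernel_flows_rho(X, y, lengthscale, rng, reg=1e-8):
    """One evaluation of the Kernel Flows loss rho = 1 - ||u_c||^2 / ||u_f||^2.

    A small rho means the interpolant barely changes when half the
    batch is removed, i.e. the kernel generalizes well. The RKHS norm
    of the minimal-norm interpolant is the quadratic form y^T K^{-1} y.
    """
    n = len(X)
    batch = rng.choice(n, size=n, replace=False)       # "fine" sample
    sub = batch[: n // 2]                              # "coarse" half-sample
    Kf = rbf_kernel(X[batch], X[batch], lengthscale) + reg * np.eye(n)
    Kc = rbf_kernel(X[sub], X[sub], lengthscale) + reg * np.eye(n // 2)
    norm_f = y[batch] @ np.linalg.solve(Kf, y[batch])  # ||u_f||^2
    norm_c = y[sub] @ np.linalg.solve(Kc, y[sub])      # ||u_c||^2
    return 1.0 - norm_c / norm_f
```

In the parametric algorithm, this sampled loss is minimized over the kernel hyper-parameters (here, the lengthscale) by stochastic gradient descent, with a fresh random batch drawn at every step.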

On the same spectrum of learning techniques is the Gaussian process, more specifically the Gaussian process induced by Neural Networks. It can be shown that a fully connected (dense) Neural Network (NN) in the limit of infinite width, and a Convolutional Neural Network (CNN) in the limit of infinitely many filters, are equivalent to a Gaussian process with a kernel that depends on the respective architecture. For such a network, the kernel is parameterized by the variances of the learnable parameters, i.e., the weight variance and the bias variance, hence only two numbers per layer of the original architecture.
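For a fully connected ReLU network, this NNGP kernel has a closed-form layer-by-layer recursion based on the arc-cosine kernel of Cho and Saul. A minimal NumPy sketch, assuming ReLU activations and, for brevity, weight and bias variances shared across all layers:

```python
import numpy as np

def nngp_kernel(X1, X2, depth, sigma_w2=1.0, sigma_b2=0.1):
    """NNGP kernel of an infinitely wide fully connected ReLU network.

    Each layer contributes only two hyper-parameters: the weight
    variance sigma_w2 and the bias variance sigma_b2 (shared across
    layers here for simplicity).
    """
    d = X1.shape[1]
    # Layer 0: linear kernel scaled by the input dimension.
    K12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d
    K11 = sigma_b2 + sigma_w2 * np.sum(X1**2, 1) / d
    K22 = sigma_b2 + sigma_w2 * np.sum(X2**2, 1) / d
    for _ in range(depth):
        norm = np.sqrt(np.outer(K11, K22))
        cos_t = np.clip(K12 / norm, -1.0, 1.0)
        theta = np.arccos(cos_t)
        # Closed-form ReLU expectation E[phi(u) phi(v)] (arc-cosine formula).
        K12 = sigma_b2 + sigma_w2 * norm * (
            np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi)
        K11 = sigma_b2 + sigma_w2 * K11 / 2  # diagonal case: theta = 0
        K22 = sigma_b2 + sigma_w2 * K22 / 2
    return K12
```

Note that evaluating this kernel requires one pass of the recursion per layer, which is why deep architectures make the optimization expensive.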

In this thesis, we will elaborate on both concepts and subsequently combine them by utilizing the Kernel Flows algorithm to optimize Neural Network induced Gaussian process (NNGP) kernels for kernel ridge regression tasks. We will explore the parametric version of the Kernel Flows algorithm with NNGP kernels as the base kernels we wish to optimize. While the Kernel Flows algorithm can provide appreciable results with only a few data points, it is nevertheless computationally expensive when kernels from deep Neural Networks are involved. Hence, we will also provide efficient implementations of the proposed method, leading to variants of the parametric Kernel Flows algorithm that utilize different optimization techniques compared to the original. Subsequently, we will compare the results of these optimized kernels along with the computational complexity involved in achieving them.

We will also explore the non-parametric version of the Kernel Flows algorithm. Particularly, we will explore the problem of unnatural perturbations of data points and poor convergence that have been highlighted in previous works to understand why such abnormalities exist, propose solutions that aim to remedy them, and test our proposed solutions.

You don't want to miss a talk? Subscribe to our mailing list and our Colloquium calendar.

## Contribute a talk

To register and schedule a talk, fill in the Colloquium Registration form at least two weeks before your earliest preferred date. Keep in mind that we only have a limited number of slots, so please plan your presentation early. In special cases, contact colloquium@mailsccs.in.tum.de.

Colloquium sessions are now on-campus. We have booked room MI 00.13.054 for WS22/23. You can either bring your own laptop or send us the slides as a PDF ahead of time. The projector only has an HDMI connection, so please bring your own adapters if necessary.

Do you want to attend but cannot make it in person? We now have a hybrid option. Simply join us through this BBB room: https://bbb.in.tum.de/ger-wtc-qmp

We invite students doing their Bachelor's or Master's thesis, as well as IDP, Guided Research, or similar projects at SCCS to give one 20min presentation to discuss their results and potential future work. The time for this is typically after submitting your final text. Check also with your study program regarding any requirements for a final presentation of your project work.

New: In regular times, we will now have slots for presenting early-stage projects (talk time 2-10min). This is an optional opportunity to get additional feedback early, and there is no strict timeline.

Apart from students, we also welcome doctoral candidates and guests to present their projects.

During the colloquium, things usually go as follows:

• 10min before the colloquium starts, the speakers set up their equipment with the help of the moderator. The moderator currently is Irene López. Make sure to use an easily identifiable name in the online session's waiting room.
• The colloquium starts with an introduction to the agenda and the moderator asks the speaker's advisor/host to put the talk into context.
• Your talk starts. The scheduled time for your talk is normally 20min with additional 5-10min for discussion.
• The moderator keeps track of the time and will signal 2min before the end of time (e.g. by turning on their video).
• During the discussion session, the audience can ask questions, which are meant for clarification or for putting the talk into context. The audience can also ask questions in the chat.
• Congratulations! Your talk is over and it's now time to celebrate! Have you already tried the parabolic slides that bring you from the third floor to the Magistrale?

Do you remember a talk that made you feel very happy for attending? Do you also remember a talk that confused you? What made these two experiences different?

Here are a few things to check if you want to improve your presentation:

• What is the main idea that you want people to remember after your presentation? Do you make it crystal-clear? How quickly do you get to it?
• Which aspects of your work can you cover in the given time frame, with a reasonable pace and good depth?
• What can you leave out (but maybe keep as back-up slides) so as not to confuse or overwhelm the audience?
• How are you investing the crucial first two minutes of your presentation?
• How much content do you have on your slides? Is all of it important? Will the audience know which part of a slide to look at? Will somebody from the last row be able to read the content? Will somebody with limited experience in your field have time to understand what is going on?
• Are the figures clear? Are you explaining the axes or any other features clearly?

In any case, make sure to start preparing your talk early enough so that you can potentially discuss it, rehearse it, and improve it.

Here are a few good videos to find out more:

Did you know that the TUM English Writing Center can also help you with writing good slides?

## Work with us!

Do your thesis/student project in Informatics / Mathematics / Physics: Student Projects at the SCCS.