2022: Continuous Optimization

The 22nd edition of the Geilo Winter School will again take place online, from Monday January 24 to Friday January 28, 2022. The topic of the school will be continuous optimization.

Abstract

Continuous optimization is the study of minimizing or maximizing functions of continuous variables. Such problems are generally intractable without additional conditions and constraints, but for many specific cases, theoretical research has led to the development of very useful algorithms for practical applications. Today, such algorithms constitute the workhorse in a diverse range of applications, from robotics and machine learning to economics. Recent successes in the field include progress in compressed sensing and advances in large-scale optimization.

The 2022 Geilo winter school features an exciting weeklong program with an introduction to continuous optimization and deep dives into various applications, including shape optimization, optimization of physical systems constrained by partial differential equations, and the relationship between machine learning and optimization. The goal of the school is not only to teach the basics of the field, but also to provide a practical understanding of how and when continuous optimization can be applied.

Program

Anton Evgrafov: Introduction to Continuous Optimization

In this introductory part of the school we will recall the fundamental concepts of continuous optimization: terminology, optimality conditions, and descent algorithms. We will also discuss convexity and its implications for optimization problems. These concepts will be illustrated by examples arising in engineering and the natural sciences.
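
To make the notion of a descent algorithm concrete, here is a minimal Python sketch of fixed-step gradient descent on a convex quadratic, with the first-order optimality condition used as the stopping test. The objective, step size, and tolerance are illustrative choices, not material from the lectures.

    import numpy as np

    def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
        """Fixed-step gradient descent; stops once the first-order
        optimality condition ||grad f(x)|| ~ 0 is (nearly) satisfied."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:  # stationary point reached
                break
            x = x - step * g             # step along the descent direction
        return x

    # Convex example: f(x) = 0.5 x^T A x - b^T x with A symmetric positive
    # definite, so the unique stationary point is the global minimizer.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x_star = gradient_descent(lambda x: A @ x - b, np.zeros(2))
    print(x_star, np.linalg.solve(A, b))  # both approximately [0.2, 0.4]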

Stein Krogstad: Adjoint methods for PDE-constrained optimization using MRST

Adjoint methods for efficient computation of gradients and sensitivities in PDE-constrained optimization are used in a variety of engineering disciplines. Once a hassle to implement, adjoints now come (in principle) almost for free in modern simulation codes based on automatic differentiation. In these two sessions we introduce the adjoint method and its use in optimization, with a lean towards subsurface flow applications. The sessions will be accompanied by numerical experiments and small participant tasks using the Matlab Reservoir Simulation Toolbox (MRST). The Matlab code (which can also be run in Octave) will be made available to the participants beforehand.
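
As a small taste of the method, the following Python sketch (the lectures themselves use Matlab/Octave and MRST) computes the gradient of a misfit functional for a generic discrete state equation A(m)u = b with a single adjoint solve, and checks it against finite differences. The matrix A(m) = K + diag(m), the data, and the misfit are illustrative assumptions, not taken from the course material.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    # Illustrative "stiffness" matrix K and data; the state equation is
    # A(m) u = b with A(m) = K + diag(m).
    K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    u_obs = rng.normal(size=n)

    def J(m):
        """Misfit J(m) = 0.5 ||u(m) - u_obs||^2 with A(m) u(m) = b."""
        u = np.linalg.solve(K + np.diag(m), b)
        return 0.5 * np.sum((u - u_obs) ** 2)

    def grad_J(m):
        """Gradient of J via one forward and one adjoint solve,
        regardless of the number of parameters in m."""
        A = K + np.diag(m)
        u = np.linalg.solve(A, b)                 # forward (state) solve
        lam = np.linalg.solve(A.T, -(u - u_obs))  # adjoint solve
        # Since dA/dm_i = e_i e_i^T, we get dJ/dm_i = lam_i * u_i.
        return lam * u

    # Sanity check against central finite differences.
    m = np.ones(n)
    eps = 1e-6
    fd = np.array([(J(m + eps * e) - J(m - eps * e)) / (2 * eps)
                   for e in np.eye(n)])
    print(np.max(np.abs(grad_J(m) - fd)))  # close to zero: gradients agree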

Julien Mairal: Large-scale optimization for machine learning

Continuous optimization has long been central to machine learning. In these lectures, we are interested in continuous problems with a particular "large-scale" structure that prevents us from using generic optimization toolboxes or plain vanilla first- or second-order gradient descent methods. In such a context, all of these tools suffer from too high a per-iteration cost, too slow convergence, or both, which has motivated the machine learning community to develop dedicated algorithms. We will introduce several such techniques, focusing in particular on stochastic optimization, which plays a crucial role in applying machine learning to large datasets.
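
For a flavour of stochastic optimization, here is a minimal stochastic gradient descent (SGD) sketch for least-squares regression. The synthetic data, mini-batch size, and decaying step-size schedule are illustrative assumptions rather than material from the lectures; the point is that each iteration touches only a small mini-batch, so the per-iteration cost is independent of the dataset size.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 10_000, 20
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.1 * rng.normal(size=n)  # synthetic regression data

    w = np.zeros(d)
    batch = 32
    for t in range(1, 2001):
        idx = rng.integers(0, n, size=batch)          # draw a mini-batch
        g = X[idx].T @ (X[idx] @ w - y[idx]) / batch  # unbiased gradient estimate
        w -= g / t                                    # decaying step size
    print(np.linalg.norm(w - w_true))  # should be small after a few passes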

Alberto Paganini: Shape Optimization

Lecture 1 - Introduction to shape optimization: In this lecture we learn the fundamentals of shape optimization. In the lab session we develop a solver for a class of shape optimization problems using standard Python libraries.
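
As an appetizer, here is a sketch of a toy shape optimization problem that can be solved with standard Python libraries (NumPy and SciPy): among star-shaped domains with boundary r(theta) = 1 + sum_k a_k cos(k*theta), find the shape of minimal perimeter with (approximately) fixed area. The parameterization, penalty weight, and optimizer are illustrative assumptions; by the isoperimetric inequality the expected minimizer is the circle.

    import numpy as np
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize

    theta = np.linspace(0.0, 2.0 * np.pi, 400)
    modes = 5  # number of Fourier modes describing the shape

    def objective(a):
        """Perimeter plus a quadratic penalty keeping the area near pi."""
        r = 1.0 + sum(a[k] * np.cos((k + 1) * theta) for k in range(modes))
        dr = np.gradient(r, theta)
        perimeter = trapezoid(np.sqrt(r**2 + dr**2), theta)
        area = 0.5 * trapezoid(r**2, theta)
        return perimeter + 100.0 * (area - np.pi) ** 2

    # Start from a visibly non-circular shape; the optimizer should drive
    # the Fourier coefficients towards zero, recovering the circle.
    res = minimize(objective, 0.2 * np.ones(modes), method="BFGS")
    print(res.x)  # all coefficients close to zero: the circle is optimal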

Lecture 2 - PDE-constrained shape optimization with finite elements: In this lecture we learn how to solve partial differential equations (PDEs) using finite elements and how to tackle shape optimization problems constrained by PDEs. In the lab session we develop a solver for PDE-constrained shape optimization using the finite element library Firedrake.

Lecture 3 - Developing Fireshape, a Firedrake toolbox that automates PDE-constrained shape optimization: In this lecture we discuss how to develop a library for automating shape optimization, using the Firedrake-based toolbox Fireshape as a case study. In the lab session we experiment with Fireshape.

Schedule

All the lectures will take place online via Zoom, supported by Zulip chat. Participants will receive more information by email.

Lecturers

Anton Evgrafov

Anton Evgrafov graduated with a PhD in Applied Mathematics from Chalmers University of Technology in 2004. He has held several research and faculty positions in Denmark, Norway, and the US, and is currently an associate professor at Aalborg University. His research interests include analysis and numerical methods for control-in-the-coefficients/topology optimization problems, especially for systems governed by nonlocal equations.

Stein Krogstad

Dr. Stein Krogstad is a senior research scientist at SINTEF Applied Mathematics and Cybernetics. He holds a PhD on structure-preserving numerical methods for ODEs, but for the last 15 years at SINTEF he has mostly been involved in research projects related to numerical simulation of flow in porous media, including discretization schemes, development of fully implicit solvers, reduced-order modelling, upscaling, and multiscale methods. His current research activities largely revolve around model-based optimization of subsurface flow problems by the use of adjoints, model reduction, and proxies. He has been an active developer of the Matlab Reservoir Simulation Toolbox (MRST) since its first release in 2009.

Julien Mairal

Julien Mairal is a research scientist at Inria Grenoble, where he leads the Thoth research team. He joined Inria Grenoble in 2012, after a post-doc in the statistics department of UC Berkeley. He received his PhD from Ecole Normale Superieure, Cachan. His research interests include machine learning, computer vision, mathematical optimization, and statistical image and signal processing. In 2016, he received a Starting Grant from the European Research Council. He was awarded the Cor Baayen prize in 2013, the IEEE PAMI young researcher award in 2017, and the test-of-time award at ICML 2019.

Alberto Paganini

Dr. Alberto Paganini is a Lecturer in Applied Mathematics at the University of Leicester and a Royal Society Short Industry Fellow. Dr. Paganini obtained a PhD in Mathematics at ETHZ in 2016 and, before joining the University of Leicester, worked as a postdoc and a lecturer at the University of Oxford and at Pembroke College.

Dr. Paganini's research interests revolve around PDE-constrained optimization, with a particular emphasis on the development of automated solutions that are inherently compatible with standard finite element methods. Dr. Paganini also leads the development of the open-source shape optimization toolbox Fireshape.

Important Information

See the About page for general information about the winter school.

Registering for the winter school

To register, please fill out this form.

Cost of participating

There is no registration fee for the winter school.

Available spots

The winter school is limited only by the number of available spots in the online tools used to host it. Admittance is on a first-come, first-served basis, with priority given to students and postdocs.

Posters

All posters are welcome, and we will make space in the program for a poster session in which participants can present their work to colleagues and others. The session is an informal event, and its aim is to help you make new contacts and share your research. Please indicate in your registration whether you want to present a poster during the poster session.

Organizing Committee

The organizing committee for the Geilo Winter School consists of

  • Torkel Andreas Haufmann, Research Scientist (Department of Mathematics and Cybernetics, SINTEF). 
  • Øystein Klemetsdal, Research Scientist (Department of Mathematics and Cybernetics, SINTEF).
  • Signe Riemer-Sørensen, Research Scientist (Department of Mathematics and Cybernetics, SINTEF).

To get in touch with the committee, send an email.