Postgraduate Diploma Programme

An intensive pre-PhD programme for talented students


Quantitative Life Sciences

The QLS Diploma programme provides students with the quantitative and theoretical background needed to access postgraduate programmes in a broad range of disciplines, including biophysics, quantitative biology and neuroscience, theoretical systems ecology, economics, data science, and machine learning.

The diploma is organized around five main courses:

  • Probability and Information Theory
  • Quantitative Biology
  • Neuroscience
  • Ecology and Evolution
  • Machine Learning and Artificial Intelligence

Each is aimed at giving a core theoretical background for analyzing and modeling different phenomena in the life sciences. These are complemented by shorter topical and advanced courses, seminars, and journal clubs.

The course descriptions are listed below. In addition, students are required to attend the "Spring College in Complex Systems" and take its exams.

QLS Diploma Course Descriptions:

  • Probability Theory (M. Marsili)
    The first part of the course deals with classical probability and aims primarily at developing a solid understanding of how to turn a problem expressed in common language into a probabilistic model (e.g. urn models, balls and boxes, random walks, branching processes) and how to go from that to a quantitative answer (via combinatorial arguments, generating functions, etc.). The second part deals with sequences of many random variables and their asymptotic behaviour in probability. We discuss typical behaviour (law of large numbers, limit theorems) and atypical behaviour (large deviation theory). This part also treats probability from the point of view of information theory; statistical mechanics and statistical learning are discussed from this perspective as well.
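    As an illustrative sketch of the kind of model treated in the first part (a hypothetical exercise, not official course material; all parameter values are invented), a symmetric random walk can be simulated in a few lines of Python, and its diffusive scaling checked empirically:

```python
import random

def simple_random_walk(n_steps, seed=0):
    """Symmetric random walk on the integers: step +1 or -1 with equal probability."""
    rng = random.Random(seed)
    position = 0
    trajectory = [0]
    for _ in range(n_steps):
        position += rng.choice((-1, 1))
        trajectory.append(position)
    return trajectory

def mean_abs_displacement(n_steps, n_walks=2000, seed=0):
    """Average |final position| over many walks; diffusion predicts growth ~ sqrt(n_steps)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_walks):
        final = sum(rng.choice((-1, 1)) for _ in range(n_steps))
        total += abs(final)
    return total / n_walks
```

    For instance, since the mean absolute displacement grows like the square root of the number of steps, quadrupling the walk length roughly doubles it.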

  • Biophysics (E. Roldan)
    This course is conceived as a pedagogical journey towards the physical principles of biological phenomena. We will use statistical physics and elements of stochastic theory to describe the fluctuating signatures of active matter, with special emphasis on both the dynamics and the thermodynamics of nonequilibrium biological phenomena. The lectures will set out a theoretical background that enables students to understand recent experimental advances in biophysics, active matter, and stochastic thermodynamics. We will start by describing biological processes with well-known physical laws. To describe the fluctuations of biological processes, we will discuss the theory of stochastic processes for discrete and continuous systems, introducing e.g. Markov processes and Langevin equations describing diffusion. This knowledge will be applied to quantify the dynamics and thermodynamics of a collection of biological processes of key relevance for sustaining life, including enzymes, molecular motors, biopolymers, and biological sensors.
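    As a small illustration of the stochastic tools mentioned above (a minimal sketch under simplifying assumptions, not part of the syllabus; the time step, diffusion constant, and sample sizes are invented), free diffusion described by an overdamped Langevin equation can be integrated with the Euler-Maruyama scheme, and the mean squared displacement compared with the theoretical value 2Dt:

```python
import math
import random

def langevin_trajectory(n_steps, dt=1e-3, diffusion=1.0, seed=0):
    """Euler-Maruyama integration of free Brownian motion, dx = sqrt(2 D) dW."""
    rng = random.Random(seed)
    x = 0.0
    xs = [x]
    sigma = math.sqrt(2.0 * diffusion * dt)  # noise amplitude per step
    for _ in range(n_steps):
        x += sigma * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def mean_squared_displacement(n_steps, n_traj=500, **kwargs):
    """Estimate <x(t)^2> over independent trajectories; theory gives 2 D t."""
    total = 0.0
    for i in range(n_traj):
        xs = langevin_trajectory(n_steps, seed=i, **kwargs)
        total += xs[-1] ** 2
    return total / n_traj
```

    With D = 1 and t = n_steps * dt = 1, the estimated mean squared displacement fluctuates around the theoretical value 2.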

  • Ecology and Evolution (J. Grilli)
    The goal of this class is both to teach students fundamental concepts in ecology and evolution and to provide them with basic notions and tools in dynamical systems and stochastic processes. The class is divided into two main parts. The first part is devoted to introducing students to classic models in population and community dynamics. At the end of this part, students are expected to be able to quantitatively model community dynamics in the presence of different interaction types, identifying and justifying the important assumptions. They should also be able to perform stability analysis, identify the presence of bifurcations, and have basic notions of limit cycles and chaos. The second part focuses on evolutionary theory and population genetics. We will discuss classic and modern experimental evidence of Darwinian (micro)evolution and the observational evidence of macroevolution. Students will learn the effects of selection, mutation, and drift on the fixation of neutral, beneficial, and deleterious mutations in the context of simple stochastic models of population genetics. The last week will be devoted to studying coevolution in the context of an eco-evolutionary model of phage-bacteria interactions.
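    To illustrate the kind of stochastic population-genetics model covered in the second part (a minimal sketch under textbook assumptions, not course material; population size and run counts are invented), neutral genetic drift can be simulated with the Wright-Fisher model, for which the fixation probability of a neutral allele equals its initial frequency:

```python
import random

def wright_fisher_fixation(pop_size, init_count, n_runs=2000, seed=0):
    """Neutral Wright-Fisher drift: fraction of runs in which the allele fixes.
    Theory predicts a fixation probability equal to the initial frequency."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(n_runs):
        count = init_count
        while 0 < count < pop_size:
            p = count / pop_size
            # binomial resampling of the next generation
            count = sum(1 for _ in range(pop_size) if rng.random() < p)
        if count == pop_size:
            fixed += 1
    return fixed / n_runs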

  • Hands-on Scientific Storytelling and Critical Thinking (J. Grilli)
    The goal of this class is to teach students to write and present scientific results in a rigorous, clear, and efficient way. The class is designed with a fully hands-on, case-study-based approach. The students will start from a scientific paper that will be “deconstructed” into its fundamental building blocks (e.g., figures, the overall structure of the text, results, etc.). The students will then work on their own to replicate figures of the paper, starting from data analysis. The aim of this part is both to learn techniques in data analysis, modeling, and visualization, and to question the choices that the authors made in the (graphical) presentation of their results, exploring possible alternatives. A third important goal of the class is to stimulate critical thinking: by questioning the authors' choices, the students will learn to check the soundness of the results and their limits. Lectures will be discussion based. We will discuss the choices that the authors made in presenting their results, how they organized the text, and how they decided to frame and visualize the results. A lecture will be devoted to discussing good practices for giving a scientific talk. The final project will consist of preparing a referee report and a 15-minute presentation on the paper studied and deconstructed in class.

  • Data Science: Machine Learning and Advanced Inference (J. Barbier)
    This course aims at providing students with basic Machine Learning skills, at both the theoretical and applied levels, with coding sessions based on the Python programming language. Among the methods presented are regression and classification techniques (linear and logistic regression, least squares); clustering; and dimensionality-reduction techniques such as PCA, SVD, and matrix factorization. More advanced methods such as generalized linear models, neural networks, and variational and Bayesian inference using graphical models are also introduced. The course is self-contained in terms of the necessary mathematical tools (mostly probability) and coding techniques. At the end of the course, students will be able to formalize an ML or inference task, choose the appropriate method to tackle it, assess its performance, and implement these algorithms in Python.
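    As a minimal example of the least-squares technique listed above (a from-scratch sketch with made-up data; the course itself may rely on standard Python libraries), one-dimensional ordinary least squares has a closed-form solution:

```python
def fit_linear_regression(xs, ys):
    """Ordinary least squares for y = slope * x + intercept,
    via the closed-form normal equations in one dimension."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept
```

    On data generated exactly by y = 2x + 1 the fit recovers slope 2 and intercept 1.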

  • Machine Intelligence (A. Celani)
    We will introduce the basic notions of artificial intelligence, with emphasis on its connections with optimal control theory, both deterministic and stochastic. Reinforcement Learning will be a central concept of the course. We will discuss how to obtain algorithms that learn efficient strategies using both model-based and model-free approaches, under conditions of full or partial observability. Applications to simple tasks from robotics and computer science will be considered. If time permits, we will also introduce multi-agent Reinforcement Learning and some of its applications.
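    As a toy illustration of the model-free approach mentioned above (a hypothetical example; the chain environment and all hyperparameters are invented for this sketch), tabular Q-learning on a one-dimensional chain learns to move toward a rewarded terminal state:

```python
import random

def q_learning_chain(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a chain of states 0..n_states-1.
    Actions: 0 = step left (floor at 0), 1 = step right.
    Reward 1 for entering the rightmost (terminal) state, 0 otherwise."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection (random on ties)
            if rng.random() < eps or q[s][0] == q[s][1]:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s_next = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s_next == n_states - 1 else 0.0
            best_next = 0.0 if s_next == n_states - 1 else max(q[s_next])
            # one-step temporal-difference update
            q[s][a] += alpha * (r + gamma * best_next - q[s][a])
            s = s_next
    return q
```

    After training, the greedy action in every non-terminal state is "right", and the Q-value of the state adjacent to the goal converges to the immediate reward of 1.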

  • Introduction to Neuroscience (C. Mathys)
    We will start by looking at neurons and the way they communicate with each other. Basic concepts such as dendrites, axons, action potentials, synapses, neurotransmitters, ion channels, etc. will be introduced. From there we will go on to look at models of spiking neurons and models of information processing in the brain as a whole. We will end by looking at methods of data collection in neuroscience, such as intra- and extracellular recordings, local field potentials, electroencephalography, magnetoencephalography, (functional) magnetic resonance imaging, and more.
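    As a minimal illustration of the spiking-neuron models mentioned above (a sketch with illustrative, dimensionless parameter values, not course material), the leaky integrate-and-fire neuron integrates its input and emits a spike whenever its membrane potential crosses a threshold:

```python
def lif_spike_times(i_input, duration=0.5, dt=1e-4, tau=0.02,
                    v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: tau dV/dt = -(V - v_rest) + I.
    Emits a spike and resets whenever V reaches v_thresh.
    Returns the list of spike times (forward-Euler integration)."""
    v = v_rest
    spikes = []
    for k in range(int(duration / dt)):
        v += dt / tau * (-(v - v_rest) + i_input)
        if v >= v_thresh:
            spikes.append(k * dt)
            v = v_reset
    return spikes
```

    Subthreshold input (steady-state potential below threshold) produces no spikes, while stronger input yields repetitive firing at a rate that grows with the input.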

  • Evolution of Neural Computation (A. Treves)
    Neural network theories often claim to be inspired by the brain, but they reduce it to an undifferentiated generic “neural network” that has never been observed in a living nervous system. Conversely, proponents of large-scale computational neuroscience approaches often emphasize a seemingly infinite diversity, e.g. of neuron types in the cerebral cortex, arguing that only a pedantic compilation of thousands of relevant parameters will enable realistic simulations of actual brain circuits. Both communities are misguided, if the aim is to understand neural computation. From a statistical point of view, evolution has produced in the mammalian brain a limited number of organizing principles, different from those observed in other classes of animals and from each other. Among them, the course will briefly touch on those informing neural networks in the retina, the basal ganglia, and the cerebellum, and then focus on the cerebral cortex and the hippocampus, discussing their structure and their yet-to-be-fully-understood relation to computation.

  • Mathematical Methods (R. Belousov)
    A review of basic notions on:
  1. Functions of a Real Variable
  2. Complex Numbers and their Properties
  3. Functions of a Complex Variable
  4. Contour Integration in the Complex Plane
  5. Real Integrals by Complex Analysis
  6. Linear Vector Spaces and Linear Operators
  7. Fourier Series and the Fourier Transform
  8. Ordinary Differential Equations
  9. Green Function Method for Partial Differential Equations

  • Statistical Mechanics with Python (J. Barbier, A. Mazzolini, A. Roy)
    In this course, students will review the basic notions of statistical mechanics through exercises in Python. Theory: the canonical ensemble, the perfect gas, the Ising model, and notions of phase transitions. Numerical exercises: Monte Carlo/Metropolis algorithms and simulated annealing. At the end of the course, students will undertake a numerical project in small groups.
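    The Metropolis exercise mentioned above can be sketched as follows (a minimal illustration with invented lattice size and temperatures, not the course's own material): sampling the 2-D Ising model and comparing the magnetization at low and high temperature.

```python
import math
import random

def metropolis_ising(size=10, beta=0.6, n_sweeps=400, seed=0):
    """Metropolis sampling of the 2-D ferromagnetic Ising model (J = 1,
    periodic boundaries), starting from the all-up configuration.
    Returns the mean |magnetization| per spin over the second half of the run."""
    rng = random.Random(seed)
    spins = [[1] * size for _ in range(size)]
    mags = []
    for sweep in range(n_sweeps):
        for _ in range(size * size):
            i, j = rng.randrange(size), rng.randrange(size)
            nn = (spins[(i + 1) % size][j] + spins[(i - 1) % size][j]
                  + spins[i][(j + 1) % size] + spins[i][(j - 1) % size])
            delta_e = 2 * spins[i][j] * nn  # energy cost of flipping spin (i, j)
            if delta_e <= 0 or rng.random() < math.exp(-beta * delta_e):
                spins[i][j] *= -1
        if sweep >= n_sweeps // 2:
            mags.append(abs(sum(map(sum, spins))) / size ** 2)
    return sum(mags) / len(mags)
```

    At beta = 0.6 (well inside the ordered phase) the system stays strongly magnetized, while at beta = 0.1 the magnetization drops to a small finite-size value.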

  • Introduction to Mathematical Economics and Game Theory (M. Marsili)