# Course Descriptions

Example topics covered in this course are the following:

- Mortality models in life and pension insurance:
  - Deterministic models
  - Stochastic models
  - Age-Period-Cohort models
  - Multifactor models with parameter uncertainty
  - etc.
- Parametric and non-parametric approaches in non-life insurance pricing:
  - Modeling non-life insurance claims
  - Prediction uncertainty
  - Cross-validation methods
  - Generalized linear models
  - Data compression
  - Over-parametrization and over-fitting
  - etc.
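To give a flavour of the generalized linear models listed above, the following sketch fits a Poisson claim-frequency model by iteratively reweighted least squares; all data and parameter values are invented for illustration and are not course material:

```python
import numpy as np

# Hypothetical portfolio: claim counts for two tariff cells.
rng = np.random.default_rng(0)
n = 5000
young = rng.integers(0, 2, n)              # risk factor: 1 = young driver
true_lambda = np.exp(-2.0 + 0.7 * young)   # true claim frequency per year
claims = rng.poisson(true_lambda)

# Poisson GLM with log link, fitted by IRLS (Fisher scoring).
X = np.column_stack([np.ones(n), young])
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)                  # current fitted means
    W = mu                                 # Poisson: working weights = mu
    z = X @ beta + (claims - mu) / mu      # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(beta)  # should be close to the true coefficients (-2.0, 0.7)
```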

Why are financial intermediaries such as banks and mutual funds different from other industries? How did Lehman Brothers' bankruptcy bring financial systems worldwide to the brink of collapse? What is the "shadow banking system" and how on earth are individual hedge fund managers able to earn more than one billion dollars per year? Contents: Principles of financial intermediation, commercial banks, other lending institutions, insurance companies, securities firms and investment banks, mutual funds, pension funds, hedge funds, private equity and venture capital, rating agencies, shadow banking system, the financial crisis, the European sovereign debt crisis, regulation of financial intermediaries.

Students master the essential techniques of large graph analysis and algorithms and have insight into the structural properties of large graphs. They are able to evaluate and develop new methods.

The course covers optimal stopping time and American options, stochastic interest rates and mortality models, and optimal asset allocation in life/pension insurance.

The lecture Advanced Statistics is a fundamental part of any statistics or data science education, covering, in particular, statistical inference in linear models. Linear models are a key discipline in applied statistics, including the modern fields of analytics/prediction and causality. Topics covered include:

- Multivariate normal distribution
- Random quadratic forms
- Least-squares and BLUE estimators
- Analysis of variance (ANOVA)
- Regression analysis
- Prediction and causality
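As a small illustration of least-squares estimation in a linear model (a hypothetical sketch with invented numbers, not course material):

```python
import numpy as np

# Illustrative linear model y = X beta + eps with known true coefficients.
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([2.0, -1.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Least-squares estimator (the BLUE under the Gauss-Markov assumptions):
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Unbiased estimate of the error variance from the residual sum of squares.
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])
print(beta_hat, sigma2_hat)
```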

Longitudinal data arise when the same individual/experiment unit is measured at a sequence of observation times. Such data combine aspects of both multivariate data and time series. Specific to longitudinal data is that the temporal interdependence implies a highly structured pattern and that typical data sets consist of a moderate to large number of short series, one from each subject.

This course covers the basic facts from probability in a measure-theoretic approach. Specific topics are the definition and properties of measure and Lebesgue integral, fundamentals of probability (probability space, random variables, conditional expectation, modes of convergence, convolutions and characteristic functions, central limit theorem) and an introduction to fundamentals of statistics (simple random sampling, introduction to estimation techniques).

Asset-Liability-Management (ALM) describes the management and controlling of liabilities and assets within an insurance company. It is based on techniques from actuarial science and financial mathematics. The course covers the most important methods used in practice. These methods are becoming increasingly relevant for the risk management and controlling of insurance companies (e.g. due to the "Solvency II" requirements). The course discusses models that treat the insurance company as a whole, as well as models that focus on single insurance products and the matching of insurance guarantees with the asset allocation.

Go to the heart of things. Why is gold attractive even though its average return is low? Read papers which earned their authors the Nobel Prize. Learn why psychology matters for finance. Contents: Review of standard asset pricing models, equity premium puzzle, cross-section of returns, behavioral finance.

This course covers selected topics from the asymptotic theory of estimators and tests. It deals with density estimators, the construction of confidence intervals and the development of asymptotic test approaches. Special attention is given to distribution-free resampling tests, like rank tests, and their application for testing non- or semi-parametric hypotheses.

Many of the classical statistical inference procedures need specific and stringent distributional assumptions which are often not met in practice. In the course it is shown how the tools of asymptotic statistics may provide a way out. In particular, the construction of asymptotic estimates, confidence intervals and tests is studied covering topics such as asymptotic (relative) efficiency, likelihood-ratio statistics, nonparametric density estimation, rank tests, resampling (e.g. permutation and randomization methods), U- and V-statistics.

Behavioral economics extends the narrow view of standard microeconomics that agents are completely rational and selfish. Students will first learn to identify behavior and judgement that systematically deviates from the predictions of standard microeconomics. We will try to explain such behavior by building on concepts from neighboring disciplines, like psychology or sociology. Finally, we will focus on how behavioral theory is built upon the empirical and experimental evidence to arrive at descriptively more accurate models of human behavior. Contents: Behavioral decision theory, time inconsistency and self-control, behavioral game theory, social preferences and fairness, applications to financial markets.

Behavioral Finance explains financial phenomena by agents who are boundedly rational. It consists of two building blocks: the limits to arbitrage and investor psychology. The aim of this lecture is to give students an overview of the field of behavioral finance. At the end of the semester, students should know under which conditions arbitrage does not work, how behavioral patterns guide agents' investment decisions, when herd behavior occurs and what implications bounded rationality has for market outcomes.

The lecture covers blockchain fundamentals from three different angles. First, we cover the architecture and technical fundamentals of blockchain and other distributed ledger technologies (DLT). Second, we explain applications of the blockchain and other DLT on financial markets. We talk about new business models of decentralised finance and non-fungible tokens (NFTs). Finally, we deal with the basic structures of the legal framework for applications of blockchain and other DLT in Europe.

The course deals with the interrelation of strategic considerations and corporate finance. For instance, we will compare the pros and cons of different forms of capital (bank loans, bonds, equity, convertibles etc.) to fund growth. We will also discuss the different financial exit strategies (spin-off, auction, leveraged buyout etc.) if a company decides to split off a segment that its management considers non-core. Contents: Recap of finance basics, growth strategies, corporate restructuring, distribution mechanisms, capital structure, rating management, agency issues, incentive systems, real option valuation, risk management and hedging.

There are two reasons for a statistical analysis. One is the prediction of future data based on what one has learned from past data, accounting for uncertainty. Prediction need not be concerned with understanding cause-effect relationships, but understanding causality is central to our understanding of data and how we use that knowledge. For instance, standard statistical techniques allow one to predict the survival probability of a current smoker, typically predicting earlier death compared to non-smokers. But there is no standard statistical technique that analyses the causal effect of smoking on mortality. The difficulty is that smoking is not assigned in a randomized experiment, and there are more differences between smokers and non-smokers than just smoking status. In fact, defining a causal effect is not even part of the usual statistical and mathematical formalism. In the last 30 years or so, there has been a statistical revolution in the development of causal inference, motivated by practical needs such as the search for effective HIV treatments. The aim of this lecture is to introduce students to this groundbreaking new field.
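The smoking example can be mimicked in a toy simulation: an unobserved confounder drives both "exposure" and outcome, so the naive exposed-vs-unexposed comparison overstates the causal effect. All numbers below are invented for illustration:

```python
import numpy as np

# Toy confounding simulation: an unobserved trait Z raises both the chance
# of exposure and the outcome probability.
rng = np.random.default_rng(42)
n = 100_000
z = rng.integers(0, 2, n)                          # confounder
exposed = rng.random(n) < np.where(z == 1, 0.8, 0.2)
# True causal effect of exposure on the outcome probability is +0.10.
p = 0.1 + 0.10 * exposed + 0.3 * z
outcome = rng.random(n) < p

naive = outcome[exposed].mean() - outcome[~exposed].mean()
# Adjusting for Z (averaging the effect over the confounder strata):
adj = np.mean([outcome[exposed & (z == k)].mean()
               - outcome[~exposed & (z == k)].mean() for k in (0, 1)])
print(naive, adj)   # naive is biased upward; adj is near the true 0.10
```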

The lecture discusses controlling in the context of management and corporate governance, the topics and dimensions of management control, as well as challenges and critical success factors. Selected case studies focus on topics such as the Balanced Scorecard, IT-supported controlling, process-oriented controlling, profitability of investments and specific indicator systems.

When the suppliers of finance (shareholders) are distinct from those who control and operate their assets (managers), the necessity of corporate governance comes into play. Corporate governance tries to resolve the inherent problems related to the separation of ownership and control in corporations and to alleviate the resulting agency problems, which arise when managers act for their own benefit and expropriate the shareholders.

The course covers corporate strategy fundamentals and terminology as well as strategic aspects of corporate governance, organization and leadership, marketing, branding and sales, supply chain management, megatrends, corporate social responsibility, innovation management, organic and acquired growth, controlling, and risk management under uncertainty.

Example contents are the definition of strategy as a core corporate process, the growth/share matrix and its implications, the market-activated corporate strategy framework and capital-market-based strategies.

Acquire the risk management tools that you need to know in banking as well as in industrial firms and consulting: become familiar with state-of-the-art credit analysis as used by banks, rating agencies, fund managers and supervisory authorities. Topics of the lecture include rating methods, validation of rating systems, portfolio credit risk models and bank regulation.

Typically, the student of econometrics is presented with a bewildering range of statistical tools and methods, which differ in the assumptions that are made, in their properties and in what we learn with them from data. The reasoning behind the choices made often remains unexplained (why does OLS focus on the conditional mean and not on the median or the whole conditional distribution?) or is related in a rule-of-thumb fashion to the type of data (fixed or random effects estimators for panel data). But real-world data often do not fall neatly into one of the categories, and even if they do, many approaches remain to choose from. In this course we review a broad range of approaches and try to shed some light on how applied researchers choose specifications and estimation methods. We do this by considering the following three categories: description, inference and causality.

The students get acquainted with advanced data analytics methods in the context of life insurance.

Students can

- assess the advantages and disadvantages of various quantitative methods in the context of life insurance.
- apply the discussed methods to questions in life insurance.
- process and interpret the results of the discussed methods in this context.

Content: The contents of this course may change from year to year, depending on recent developments.

After successful completion of this course, students are familiar with the following topics:

- Univariate and multivariate statistical methods
- Cluster analysis methods
- Visualisation and dimensionality reduction
- Association rule learning
- Classification methods
- Regression and forecasting
- Statistical evaluation

This course covers the fundamental principles and techniques of financial mathematics in discrete-time models. Specific topics are

- Financial market models in discrete time: arbitrage freeness and completeness
- Conditional expectation and discrete time martingales
- Valuation of European, American and path-dependent options
- Interest rate models and derivatives
- Portfolio optimisation
- Risk measures

Extreme Value Theory is a part of stochastics which deals with maxima and minima of random variables, records and extremely rare events. Maxima and minima of random variables play an important role in many areas, for example in sports records, extreme meteorological events or extreme events in insurance mathematics. The lecture covers topics such as limit distributions of the maximum, statistical applications, records and limiting distributions of further quantiles.
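As a quick illustration of a limit distribution of the maximum (an invented simulation, not course material): the suitably centred maxima of iid exponential variables approach the Gumbel law with CDF exp(-e^(-x)).

```python
import numpy as np

# Maxima of n iid Exp(1) variables, centred by log(n), approach a Gumbel law.
rng = np.random.default_rng(7)
reps, n = 10_000, 1_000
maxima = rng.exponential(size=(reps, n)).max(axis=1) - np.log(n)

# Compare the empirical CDF at a few points with the Gumbel CDF exp(-e^{-x}).
for x in (-1.0, 0.0, 1.0, 2.0):
    print(x, (maxima <= x).mean(), np.exp(-np.exp(-x)))
```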

This course covers the fundamental principles and techniques of financial mathematics in discrete- and continuous-time models. Specific topics are financial market models in discrete time (arbitrage freeness and completeness), valuation of European, American and path-dependent options, foundations of continuous time market models and of the Black-Scholes model, interest rate models and derivatives, risk measures, portfolio optimization and CAPM.

Topics discussed in this course include stochastic analysis (stochastic integration, stochastic differential equations, (semi-)martingales), continuous-time financial market models, valuation and hedging of derivatives in complete and incomplete financial markets, stochastic volatility and interest rate models (term structure modeling, interest rate derivatives, LIBOR market models).

Implement financial methods on the PC with Excel. Learn to do things in 2 minutes for which others need 2 days. The course covers topics such as present value calculations, bond pricing, working with data and financial models, trading strategies, and risk and return.

The course includes topics such as ratios of financial statement analysis, investment analysis, financing analysis, liquidity analysis, Economic Value Added, Cash Value Added, Cash Flow Return on Investment, intercompany profit and loss, and ad hoc disclosure.

This course examines the role of central banks and other regulatory institutions in the global economy and financial stability. We will study the basic functioning of financial intermediation, and the objectives, operational framework, decision-making process and policy implementation of central banks. As the 2008 global financial crisis has demonstrated, national monetary policies and other regulatory policies have consequences for global financial stability. We will explore the causes and consequences of financial crises, whether and how policies and institutions can be used to stabilize markets and help countries recover from crises, and what lessons we can learn from recent experiences that might help prevent future crises. We will investigate recent developments in monetary policy experiments throughout the world, and the macroprudential regulation framework used to contain systemic risk. We will place particular emphasis on the interaction of financial markets across the globe and the importance of central banks and other regulatory institutions for global financial stability.

Whenever data is arranged in a network, graphs are the language of choice to represent the data. Examples are the street layout in a city that may be used in routing problems, or relationships in a social network, or dependencies in a software project. In this course we study the theoretical properties of graphs. Apart from the many applications of graphs, they turn out to be a source of attractive results and open problems.

Graph theory offers a rich choice of attractive theorems, methods and open problems. While we normally introduce a wide range of topics in Graph Theory I, we see Graph Theory II as an opportunity to concentrate on fewer topics but deeper and more powerful techniques. Topics we are likely to treat include list colouring, the combinatorial Nullstellensatz, nowhere-zero flows, dual graphs, extremal graphs and the regularity lemma.

This lecture provides an introduction to parallel computer architectures and standard programming interfaces for parallel numerical algorithms. We focus on a few numerical algorithms, such as dense matrix multiplication and LU decomposition, which allow us to approach the theoretical peak performance step by step.

Topics of the lecture are an introduction to the programming language C++ with a special focus on numerical linear algebra, memory hierarchies, and parallel computer architectures such as shared memory with POSIX threads and OpenMP, distributed systems with MPI and GPUs.

Contents of this lecture are the architecture of parallel computers, parallelization, MPI, CUDA, the cluster Pacioli, parallel finite element methods, parallel numerical methods for linear systems of equations, domain decomposition, parallel preconditioning, parallel multigrid methods, symmetric eigenvalue problems and storage formats for sparse matrices.

This course provides an introduction to insurance economics, including topics such as choice under uncertainty (expected utility theory and rational decision under risk, measures of risk aversion, mean-variance preferences), insurance demand by households (base model, insurance demand without fair premium, Pareto-optimal insurance contracts), insurance demand by firms (risk management and diversification, risk management with forwards, futures and options, corporate demand for insurance), insurance supply (traditional premium calculation, financial modelling of insurance pricing, economies of scope, economies of scale), microeconomic analysis (moral hazard, adverse selection) and insurance regulation.

Interest rates are of fundamental importance in the economy in general and in financial markets in particular. Empirical observations suggest that they should be modelled by a stochastic process, since they vary considerably over time. Even when considering only "risk-free" interest rates, there is not just a single interest rate to be modelled, but a whole interest rate curve, i.e. the term structure of interest rates.

In this course we first look at the different possible interest rates and some related financial contracts and discuss ways of estimating the whole term structure based on the interest rates actually observable. Thereafter we turn to the analysis of some models for interest rates, viz. short rate models, LIBOR market models and the Heath-Jarrow-Morton methodology. Furthermore, forward measures, forward and futures contracts and consistent term structure parametrizations are considered. We also look at affine processes and how to incorporate default risk.

Time-to-event data are ubiquitous in fields such as medicine, biology, demography, sociology, economics and reliability theory. In biomedical research, the analysis of time-to-death (hence the name survival analysis) or time to some composite endpoint such as progression-free survival is the most prominent advanced statistical technique. One distinguishing feature is that the data are typically incompletely observed - one has to wait for an event to happen. If the event has not happened by the end of the observation period, the observation is said to be right-censored. This is one reason why the analysis of time-to-event data is based on hazards. This course will emphasize the modern process point of view towards survival data without diving too far into the technicalities.
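The product-limit idea behind survival estimation under right-censoring can be sketched in a few lines; this hand-rolled Kaplan-Meier estimator on a made-up, right-censored sample is purely illustrative:

```python
import numpy as np

# Tiny made-up sample (times in months; event=0 marks right-censoring).
times = np.array([3, 5, 5, 8, 10, 12, 15])
event = np.array([1, 1, 0, 1, 0, 1, 1])    # 1 = death observed, 0 = censored

surv = 1.0
curve = []
for t in np.unique(times[event == 1]):     # distinct observed event times
    at_risk = (times >= t).sum()           # subjects still under observation
    deaths = ((times == t) & (event == 1)).sum()
    surv *= 1 - deaths / at_risk           # Kaplan-Meier product-limit step
    curve.append((t, surv))
print(curve)
```

Censored subjects leave the risk set without contributing a factor to the product, which is exactly how the hazard-based view handles incomplete observation.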

This course deals with topics such as the classic Markowitz analysis, tracking-error minimization, indices and index tracking, portfolio insurance, risk measurement beyond single-period mean-variance (shortfall risk, Value at Risk) and investment styles.

Emerging markets are becoming an increasingly significant part of the global economy and financial markets. By almost any measure, the strength and growth of emerging market economies is impressive. Clearly, many investors recognize the potentially attractive return characteristics and diversification benefits of this asset class. Yet, institutional investors have been cautious in their allocations to emerging market equities and emerging market equities still comprise a relatively small percentage of global market capitalization and institutional investor portfolios. Emerging markets, therefore, present an enigma.

In this course we will visit some of the issues associated with emerging market characteristics, especially growth drivers in emerging markets, understanding the four major BRIC countries, other emerging markets, understanding financial crises in emerging markets, formal versus informal finance in emerging markets, understanding microfinance institutions, and issues in risk management.

A classical example from medical research is time-to-death in HIV patients together with longitudinal CD4 cell counts, which measure the state of the immune system. However, the distinction between the analysis of survival data and the analysis of longitudinal measurements is artificial. Joint models aim at an integrated analysis. As these models may not have found their final form yet, the lecture potentially touches upon rather recent research work towards the end of the semester, including causal modeling and updated prediction.

The course provides a broad introduction to machine learning covering the following areas:

- Concept learning
- Learning in logic-based systems
- Statistical learning
- Unsupervised learning
- Reinforcement learning
- Bayesian learning
- Kernel learning

The course covers advanced learning models both in technical and natural systems including:

- Statistical learning theory
- Multi-agent learning
- Learning in evolutionary systems
- Learning-to-learn
- Learning with kernels
- Learning in feedback systems
- Learning in biological neurons
- Learning in animals
- Learning in robots

Topics discussed in this course include infinitely divisible distributions, the Lévy-Khintchine formula, existence of Lévy processes, moments of Lévy processes, stable distributions and stable Lévy processes, and the Lévy-Itô-decomposition and applications.

This introductory course deals with the mathematical fundamentals of life-, health- and pension- insurance. It covers basic probabilistic models for the computation of future lifetimes and life-tables, actuarial present values, expected payoffs and safety margins. With a focus on the German market, the course discusses: the calculation of premiums (principle of equivalence) for immediate and deferred life annuities, pure endowment, life insurance, health insurance and corporate pension schemes; the calculation of actuarial reserves; and the distribution of surpluses.

It is important to differentiate between anecdotal evidence and evidence that can be tested and proved. Understanding and analyzing data is essential for the latter. All of us use numbers in our day-to-day calculations. Business problems contain a large quantitative element in the form of facts and figures. It is essential for professionals to carry out data analysis and interpretation to make effective decisions, and they need to prepare quantitative arguments to justify those decisions. Decision making using machine learning techniques serves this purpose. The course aims to develop your analytical skills, both the ability to conceptualize problems and to solve them; broadly, it aims to make you understand and appreciate the most widely used tools of machine learning, which form the basis for rational and sound decisions.

The course will cover an introduction to analytics using machine learning, R Studio and QGIS, supervised and unsupervised learning, boosting and bagging, and model comparisons.

This is a special course in reinsurance in cooperation with SCOR Reinsurance. The course will discuss links between reinsurance and financial modeling. During the course the students will work in groups on a set of problems and present their solutions.

The course explores methods to model and empirically estimate the behavior of consumers and the strategic interactions of firms. A large part of the course consists of analyzing data, models, and methods using the freely available statistical programming language R. We will provide an introduction to R, so no previous knowledge of R is needed.

This course covers Markov chains in discrete and continuous time with countable state space, in particular: Definition and elementary properties, examples, stopping times and strong Markov property, recurrence and transience, invariant distributions and limit distributions, classification of states and the generator in continuous time.
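A small sketch of invariant and limit distributions (the 3-state transition matrix is invented for illustration): the invariant distribution pi solves pi P = pi, and for an irreducible aperiodic chain the rows of P^n converge to pi.

```python
import numpy as np

# A 3-state chain with an invented transition matrix P (rows sum to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Invariant distribution: left eigenvector of P for eigenvalue 1, normalized.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

print(pi)                                   # solves pi P = pi
print(np.linalg.matrix_power(P, 50)[0])     # rows of P^n converge to pi
```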

How do popular learning algorithms work? What guarantees are there that the learning was successful? To answer these questions the course covers a number of different topics and draws on techniques from different fields, such as

- convex optimisation,
- sample complexity,
- PAC learning or
- VC dimension.

Game theory is the study of multiperson decision problems. Such problems arise frequently in competitive scenarios where knowledge is distributed, such as in economics. For instance, oligopolies involve multiperson problems: each firm must consider what the others will do in order to succeed in a competitive environment. The same happens in auctions, where several people might be interested in the very same item. How should one act when competition arises among participants with conflicting interests? Does cooperation help in such a scenario? In which cases?

In this class, we will cover a handful of models in game theory, including models of static and dynamic information, as well as of complete and incomplete information. We will also study a variety of stable optimal solutions, including Nash equilibrium, subgame-perfect equilibrium, Bayesian equilibrium and perfect Bayesian equilibrium. Several examples are given, such as in the context of oligopolies, auctions, bargaining, tariff and wage competition, cooperative games, finite and infinite repeated games, zero-sum games, signaling games, mechanism design and reputation.

This advanced course covers new material not considered in the introductory Mathematics of Games course, deepening the knowledge of known classical games (e.g., auctions, bargaining, markets, the prisoner's dilemma) as well as broadening its range with some new ones (e.g., networking). New concepts such as Pareto optimality and social-optimum efficiency parameters are presented. The traditional equilibrium concepts from Mathematics of Games (one-stage, subgame-perfect, Bayesian and perfect Bayesian Nash equilibrium) are used extensively here in problem solving. In addition, equilibrium refinements such as Walrasian and sequential equilibrium are introduced, as well as some aspects of mechanism design, which focuses on the design of the game itself from the point of view of the designer's aim. For instance, an auctioneer running an auction sets the auction's rules, which affect the revenue the auction generates. Finally, a glimpse of communication aspects is given.

This course gives a broad overview of Monte Carlo methods. These methods include many important tools for students interested in applied probability, finance and statistics. The course will cover the following topics pseudo-random numbers and quasi-random numbers, generating random variables, mathematical analysis of Monte Carlo algorithms and variance reduction. Examples will be given of applications in finance, probability, statistics and the physical sciences.
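A minimal example of variance reduction (a standard textbook illustration, not part of the course material) estimates the integral E[e^U] = e - 1 for U uniform on (0,1), with and without antithetic variates:

```python
import numpy as np

# Crude Monte Carlo vs. antithetic variates for I = E[e^U], U ~ Uniform(0,1).
rng = np.random.default_rng(3)
n = 100_000
u = rng.random(n)

plain = np.exp(u).mean()                         # crude Monte Carlo
anti = (0.5 * (np.exp(u) + np.exp(1 - u))).mean()  # antithetic pairs

print(plain, anti, np.e - 1)
# The antithetic estimator uses the same uniforms but has a much smaller
# variance, since e^U and e^{1-U} are negatively correlated.
```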

Students will learn basic algorithms for stochastic simulation, their theoretical properties and typical applications. They will gain the competence to implement these algorithms and to interpret the simulation results. The students will learn algorithms for strong approximation and quadrature of solutions of stochastic differential equations, their theoretical properties and typical applications.

In this course participants will discuss some papers in the fields of dynamic mortality models and equity-linked life insurance.

A main focus lies on the theoretical development of various rank tests commonly applied in practice (e.g., the Wilcoxon-Mann-Whitney test). To analyze their properties, we first discuss their finite-sample properties and explain how they can be carried out distribution-free in specific situations. For more complex situations, central limit theorems for rank statistics are proven.

Prices of options cannot be computed by evaluating a simple formula if the market model is only slightly more complex than Black-Scholes or if option structures are more complex. How can you price options in such cases and how can you guarantee accuracy of the computations? Contents: Generation of random numbers, Monte-Carlo and Quasi-Monte-Carlo methods, numerical methods for the computation of European and American options: binomial, finite difference and finite element methods, numerical methods for the simulation of stochastic processes: numerical treatment of stochastic differential equations.
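As a sketch of the binomial method mentioned above (illustrative parameters only), a Cox-Ross-Rubinstein tree prices a European call and approaches the Black-Scholes value as the number of steps grows:

```python
import math

# Cox-Ross-Rubinstein binomial tree for a European call option.
def crr_call(S0, K, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))        # up factor per step
    d = 1 / u                                  # down factor
    q = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up-probability
    price = 0.0
    for k in range(n + 1):                     # sum over terminal nodes
        ST = S0 * u**k * d**(n - k)
        price += math.comb(n, k) * q**k * (1 - q)**(n - k) * max(ST - K, 0.0)
    return math.exp(-r * T) * price

print(crr_call(100, 100, 0.05, 0.2, 1.0, 500))  # ~ 10.45 (Black-Scholes value)
```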

Core components of the course are:

- Singular value decomposition, compression
- Network analysis
- Numerical schemes for parametric and nonparametric regression
- Numerical aspects of neural networks
- Deep learning: approximation theory and numerical aspects
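The compression item above can be illustrated with a truncated SVD, which by the Eckart-Young theorem gives the best rank-k approximation in Frobenius norm (synthetic data, purely illustrative):

```python
import numpy as np

# A nearly rank-8 matrix: rank-8 product plus a little noise.
rng = np.random.default_rng(5)
A = rng.normal(size=(50, 8)) @ rng.normal(size=(8, 40))
A += 0.01 * rng.normal(size=A.shape)

# Truncated SVD keeps only the k largest singular values and vectors.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 8
A_k = U[:, :k] * s[:k] @ Vt[:k]               # rank-k compressed copy

rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(rel_err)   # small: only the noise beyond rank 8 is discarded
```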

The following topics are an integral part of the course:

- Numerical methods for initial value problems of ODEs
- Stiff ODEs
- Numerical methods for boundary value problems of ODEs

After concentrating on linear programming, integer linear programming, and some efficiently solvable discrete optimization problems in *Optimization and OR 1*, in *Optimization and OR 2* we will focus on algorithmically hard problems, complexity theory, approximation algorithms, and heuristics. Furthermore, we will extend some of the fundamental results of linear programming to more general, and, in particular, convex optimization problems.

Incentive problems arise in any organization. How can employers design compensation schemes or jobs to induce workers to act in their interest? Contents: Theoretical models of team work and tournaments and their predictions of individual behavior, empirical and experimental evidence on actual behavior, implications of non-standard preferences.

Varying topics in the theory of partial differential equations in the context of nonlinear analysis, mathematical physics, variational calculus, or geometric analysis; for instance: elliptic and parabolic regularity, singular integral operators, mathematical fluid dynamics, variational problems in geometry and mechanics.

This course covers the following content:

- Statistical pattern recognition
- Linear and nonlinear classifiers
- Kernel methods
- Structural pattern recognition
- Context-dependent classification
- Feature extraction, selection and reduction
- System performance evaluation

Point processes are popular statistical models for point patterns in space. They have applications in various fields such as astronomy, biology, economics, forestry, geography, materials science or seismology. This course gives an introduction to their general theory with a particular focus on Poisson point processes and related models. Main topics include basic properties and characteristics of point processes, special classes of point processes, the Palm distribution and marked point processes.

The Practical Actuarial Science course covers in particular the features of innovative life insurance products, simulation with MS Excel and VBA, and taxation.

This course offers an introduction to the practical application of tools for pricing and hedging of standard or complex derivative instruments, advanced stochastic simulation and numerical routines.

Students will develop their own equity investment strategies using advanced machine learning approaches in R. The course will combine lectures (covering key concepts in backtesting, machine learning, and R) with hands-on empirical implementations of their own strategies. Student groups with successful backtest results will be invited to participate in the planned Ulm University Student Investment Initiative (U²SI²). Grades will be based on a written project report and a brief presentation.

This is an introductory course in the theory of random functions and fields. It provides an extension of some topics treated in the course "Stochastics II" by studying random processes with a spatial index. The main topics are Kolmogorov's existence theorem, stationarity and isotropy, basic models of random fields, correlation theory of stationary random fields, positive semi-definite functions, orthogonally scattered measures and stochastic integration.

This course is designed to prepare students for doing their own research projects in Finance. Students will learn how to use Stata, how to manage, visualize and clean data, and how to run and interpret cross-sectional and panel regressions. Students are required to solve several empirical problems on their own.

This course provides an introduction to the basics of risk management in insurance. The lecture will be held completely in English. Designated topics are:

- Risk measures, economic capital and capital allocation
- Aspects of Solvency II
- Financial accounting
- Control parameters and performance indicators
- Interest rate risk
- State preference theory
- Equity options and the Black-Scholes-Merton model
- Portfolio planning and optimal decision theory

This course gives a brief introduction to finance and business ethics. Moreover, students will study the Financial Risk Manager Handbook and prepare presentations on key concepts used for the management of different types of risk such as market risk, liquidity risk, credit risk and operational risk.

This course provides an introduction to several stochastic and statistical methods of risk modeling and their applications. Some of the subjects discussed in the lecture are risk measurement using VaR and TVaR, relevant distribution families in risk theory, the collective model, credibility theory, statistical methods in risk theory, generalized linear models and their applications in risk theory, Monte Carlo simulations, mortality modeling, and stochastic processes in risk theory with a focus on Markov chains and Markov processes.
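The risk measures VaR and TVaR named above have simple empirical counterparts: VaR at level α is the α-quantile of the loss distribution, and TVaR is the average of the losses at or beyond that quantile. A minimal Python sketch (the function name `var_tvar` and the lognormal loss model are illustrative assumptions, not course material):

```python
import numpy as np

def var_tvar(losses, alpha=0.95):
    """Empirical Value-at-Risk and Tail-Value-at-Risk of a loss sample.

    VaR_alpha is the alpha-quantile of the losses; TVaR_alpha is the
    mean of all losses at or above that quantile, so TVaR >= VaR.
    """
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    tvar = losses[losses >= var].mean()
    return var, tvar

# illustrative loss model: lognormal claims, as often used in risk theory
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
v, t = var_tvar(sample, alpha=0.99)
```

Unlike VaR, TVaR accounts for the severity of losses beyond the quantile, which is why it is coherent and preferred in parts of the Solvency II framework discussed in the lecture.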

This course provides an introduction to the mathematical models of (non-life) insurance with emphasis on premium calculation, reinsurance, ruin probabilities, claim reserving, Biometric Actuarial Bases, credibility theory, simulation and Markov-Chains.

In modern applications, there is a need to model phenomena that can be measured by very high numerical values which occur rarely. In probability theory, one talks about distributions with heavy tails. One class of such distributions are stable laws which (apart from the Gaussian one) do not have a finite variance. They possess a number of striking properties which make them indispensable in modelling processes in radioelectronics, engineering, radiophysics, astrophysics and cosmology, finance, and insurance, to name just a few fields. This introductory lecture is devoted to basic properties of such distributions.

Main topics are stability with respect to convolution, characteristic functions and densities, non-Gaussian limit theorems for i.i.d. random summands, representations and tail properties, symmetry and skewness, and simulation. Further topics are multivariate stable distributions and elementary stable random processes.
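The simulation topic above is usually treated via the Chambers-Mallows-Stuck representation, which in the symmetric case expresses a stable variate through one uniform and one exponential random variable. A minimal Python sketch of that sampler (the function name `symmetric_stable` is illustrative; the formula below is the standard CMS form for 0 < α ≤ 2, α ≠ 1):

```python
import numpy as np

def symmetric_stable(alpha, size, rng=None):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates.

    With U ~ Uniform(-pi/2, pi/2) and W ~ Exp(1), for alpha != 1:
        X = sin(alpha*U) / cos(U)**(1/alpha)
            * (cos(U - alpha*U) / W)**((1 - alpha)/alpha)
    alpha = 2 yields a Gaussian (variance 2); alpha < 2 has infinite variance.
    """
    rng = np.random.default_rng(rng)
    u = rng.uniform(-np.pi / 2, np.pi / 2, size=size)
    w = rng.exponential(1.0, size=size)
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))

# a heavy-tailed sample with stability index 1.5
sample = symmetric_stable(alpha=1.5, size=100_000, rng=7)
```

A sample with α < 2 shows the power-law tail P(|X| > x) ~ C x^(-α) discussed in the lecture: far more extreme observations occur than under a Gaussian model.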

In this course, linear methods and basis expansion and regularization methods used in supervised statistical learning are analyzed. In particular, we study multivariate linear regression, ridge regression, lasso regression, classification problems, splines and filtering. Additionally, an overview of neural networks and support vector machines is given. Model assessment, selection and inference methodologies are core topics of the course. The course covers both a statistical approach to the study of the models and the implementation of the algorithms in R.
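Of the regularization methods listed above, ridge regression is the one with a closed-form solution: β̂ = (XᵀX + λI)⁻¹Xᵀy, which shrinks the least-squares coefficients toward zero. A minimal sketch, in Python rather than the course's R (the function name `ridge_fit` and the simulated data are illustrative assumptions):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: beta = (X'X + lam*I)^{-1} X'y.

    lam = 0 recovers ordinary least squares (when X'X is invertible);
    larger lam shrinks the coefficient vector toward zero.
    """
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# illustrative simulated data with known coefficients
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=200)

beta_ols = ridge_fit(X, y, lam=0.0)     # least-squares fit
beta_ridge = ridge_fit(X, y, lam=50.0)  # shrunken fit
```

The lasso, by contrast, has no closed form and requires iterative algorithms such as coordinate descent, which is one reason the two penalties are treated separately in the course.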

Stochastic geometry deals with random geometric structures. Stochastic Geometry II covers two selected topics from stochastic geometry: random compact sets and random tessellations. More precisely, the course investigates the stochastic properties of the convex hull of random points and the most important models for random tessellations as well as the corresponding geometric quantities of the so-formed cells.

The course Stochastics II gives an introduction to different classes of stochastic processes. Key aspects are counting processes and renewal processes, Poisson point processes, the Wiener process, martingales, Lévy processes and stationary processes in discrete time. We shall discuss analytic, geometric and asymptotic properties of stochastic models to provide the students with knowledge of statistical methods and simulation algorithms.

Time-to-event data are omnipresent in fields such as medicine, biology, demography, sociology, economics and reliability theory. In biomedical research, the analysis of time-to-death (hence the name survival analysis) or time to some composite endpoint such as progression-free survival is the most prominent advanced statistical technique. At the heart of the statistical methodology are counting processes, martingales and stochastic integrals. This methodology allows for the analysis of time-to-event data which are more complex than composite endpoints and will be the topic of this course. The relevance of these methods is, e.g., illustrated in the current debate on how to analyse adverse events. Time permitting, we will also discuss connections between causal modelling and event histories.

In many application areas, the data to be analyzed form a sequence of observations given at a sequence of time points, that is, a time series. For instance, stock prices, exchange rates or meteorological data are typically recorded at a sequence of time points and thus yield time series. The fact that the data are subject to a certain chronological order is crucial for their analysis and has to be taken into account when formulating statistical models. Trends, seasonal effects, and stationarity will be fundamental notions in this course. We will discuss autocovariance and autocorrelation functions as a tool for analyzing dependencies in time. Particular attention will be given to ARMA (autoregressive moving average) processes as the most important linear model for time series. Within the setting of ARMA processes we will discuss statistical inference and forecasting methods. In addition to problems on the mathematical theory, homework sets will include practical examples.
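The simplest ARMA model is the AR(1) process X_t = φX_{t-1} + ε_t, whose autocorrelation function decays geometrically as φ^h at lag h, a pattern the sample autocorrelation recovers from data. A minimal Python sketch (the names `simulate_ar1` and `acf` are illustrative, not from the course):

```python
import numpy as np

def simulate_ar1(phi, n, sigma=1.0, rng=None):
    """Simulate a stationary AR(1) process X_t = phi * X_{t-1} + eps_t."""
    rng = np.random.default_rng(rng)
    x = np.empty(n)
    # start in the stationary distribution N(0, sigma^2 / (1 - phi^2))
    x[0] = rng.normal(scale=sigma / np.sqrt(1 - phi**2))
    eps = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

def acf(x, lag):
    """Sample autocorrelation of the series x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# for phi = 0.7 the theoretical ACF is 0.7 at lag 1 and 0.49 at lag 2
series = simulate_ar1(phi=0.7, n=20_000, rng=3)
```

Matching such empirical autocorrelation patterns to the theoretical ones of candidate ARMA models is the basic idea behind the model identification methods covered in the course.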

The aim of the course is to discuss topics from the recent journal literature in the area of insurance and quantitative finance. The general focus of the course is on the differences and relations between finance and insurance. The lecture will in most parts be based on recent journal papers of which a list will be provided.

The contents of this course may change from year to year, depending on recent developments. Some examples of subjects are the following: Optimal Control Theory, allocation of risk capital, decision under uncertainty, risk measures, pricing in incomplete markets, pricing of insurance products.

This lecture covers the following topics: Life and pension mathematics, the Black-Scholes-Merton model, unit-linked life and pension products (participating contracts and variable annuities), default risk modeling (structural approach) and optimal asset allocation and its application in life/pension insurance.

The lecture introduces two main methods to estimate firm value: Market-based approaches, such as multiples, and net-present value approaches, such as discounted cash flows or valuation via CAPM. Further attention is given to relevant legal and institutional norms and special issues such as non-operating assets or pensions and other liabilities. The theoretical knowledge is applied to case studies.

# Further module descriptions depending on the chosen specialization

Specialization Actuarial Science

Specialization Financial Economics

Specialization Financial Mathematics