Wednesday 24th March 2021
We propose to combine state-of-the-art tensor completion and machine learning to efficiently learn multivariate functional representations of financial data. We identify two main challenges in learning such representations: 1) a large-data challenge, i.e. the problem of accessing, storing and processing large data sets; and 2) a lack-of-large-data challenge, i.e. modern machine learning methods require large data sets, but in finance often only scarce data is available. Our experiments (with up to 25 underlying assets) show that the method delivers prices quickly and accurately. Indeed, the accuracy of the resulting prices is comparable to that of the benchmark methods, which are computationally more expensive. In the basket option case, the gain in speed is by a factor of roughly 10^4, with a squared error in the tenth digit on average.
The proposed combination of tensor completion with machine learning to overcome the curse of dimensionality is a novel idea and proves highly promising for financial applications.
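As an illustration of the completion step, the sketch below recovers a synthetic low-rank grid of values from a subset of its entries by alternating least squares. This is a hedged stand-in, not the authors' method: it uses the matrix (two-asset) case, illustrative dimensions, random data and plain NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 40, 30, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # true low-rank grid
mask = rng.random((m, n)) < 0.6                                # observed entries only

# Alternating least squares on the factors of A ~ U @ V.T,
# fitting only the observed entries.
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
for _ in range(100):
    for i in range(m):                                  # update row factors
        obs = mask[i]
        U[i] = np.linalg.lstsq(V[obs], A[i, obs], rcond=None)[0]
    for j in range(n):                                  # update column factors
        obs = mask[:, j]
        V[j] = np.linalg.lstsq(U[obs], A[obs, j], rcond=None)[0]

# Fit on observed entries and reconstruction error on the full grid
obs_err = np.linalg.norm((U @ V.T - A)[mask]) / np.linalg.norm(A[mask])
full_err = np.linalg.norm(U @ V.T - A) / np.linalg.norm(A)
```

Because the grid is exactly low-rank and well sampled, the unobserved entries are recovered along with the observed ones; higher-order tensor formats extend the same idea to many assets.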
Kathrin Glau: Lecturer in Financial Mathematics, Queen Mary University of London
Kathrin Glau is currently a Lecturer in Financial Mathematics at Queen Mary University of London and a Marie Skłodowska-Curie Fellow at École Polytechnique Fédérale de Lausanne. Between 2011 and 2017 she was a Junior Professor at the Technical University of Munich. Prior to this she worked as a postdoctoral university assistant at the chair of Prof. Walter Schachermayer at the University of Vienna. In September 2010 she completed her Ph.D. on Feynman-Kac representations for option pricing in Lévy models at the chair of Ernst Eberlein.
Her research is driven by the interdisciplinary nature of computational finance and reaches across the borders of finance, stochastic analysis and numerical analysis. At the core of her current research is the design and implementation of complexity reduction techniques for finance. Key to her approach is the decomposition of algorithms into an offline phase, which is a learning step, and a fast and accurate online phase. The methods range from model order reduction of parametric partial differential equations to learning algorithms, and are designed to facilitate such diverse tasks as uncertainty quantification and calibration, real-time pricing, real-time risk monitoring, and intra-day stress testing.
Many pricing problems boil down to the computation of a high-dimensional integral, which is usually estimated using Monte Carlo. The convergence of this algorithm can be relatively slow, depending on the variance of the function to be integrated. To address this problem, one applies variance reduction techniques such as importance sampling, stratification, or control variates.
We will study two approaches for improving the convergence of Monte Carlo using neural networks. The first approach relies on the fact that many high-dimensional financial problems are of low effective dimension. We present a method to reduce the dimension of such problems in order to keep only the necessary variables. The integration can then be done using fast numerical integration techniques such as Gaussian quadrature. The second approach consists of building an automatic control variate using neural networks. We learn the function to be integrated (which incorporates the diffusion model, the discretization as well as the payoff function) in order to build a network that is highly correlated with it. As the network we use can be integrated exactly, we can use it as a control variate.
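The control-variate idea in the second approach can be sketched in a few lines. Here a quadratic Taylor polynomial stands in for the neural-network surrogate (an assumption for illustration), since its expectation under the standard normal is known in closed form, and its coefficient is chosen by the usual variance-minimizing regression:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)

f = np.exp(x)                # integrand; true mean is exp(1/2)
g = 1.0 + x + 0.5 * x**2     # surrogate with known mean E[g] = 1.5

# Variance-minimizing coefficient beta = Cov(f, g) / Var(g)
beta = ((f - f.mean()) * (g - g.mean())).mean() / g.var()

plain_estimate = f.mean()                         # plain Monte Carlo
cv_estimate = (f - beta * g).mean() + beta * 1.5  # control-variate estimator
```

The residual f - beta*g has markedly smaller variance than f, so the control-variate estimator converges faster at the same sample size; the talk replaces the hand-made polynomial with a trained network that can still be integrated exactly.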
Zineb El Filali Ech-Chafiq: Quantitative Analyst, Natixis
We present the deep parametric PDE method to approximate multi-asset option prices simultaneously for a range of times, states and option parameters of interest. We use an unsupervised learning approach with deep neural networks to numerically solve the high-dimensional parametric partial differential equation. Motivated by the deep Galerkin method, the loss function is based only on the partial differential equation. After a single training phase, the option price for different time, state and parameter values can be computed in milliseconds. We evaluate the performance based on the error in the price and the implied volatility with examples of up to 25 dimensions.
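The loss described above can be illustrated on a toy problem: penalize the mean squared PDE residual at randomly sampled points, with no labelled prices. The sketch below uses the one-dimensional heat equation and finite differences in place of a neural network and automatic differentiation (both substitutions are illustrative assumptions):

```python
import numpy as np

def u_exact(t, x):
    """Exact solution of the heat equation u_t = u_xx."""
    return np.exp(-t) * np.sin(x)

def pde_residual_loss(u, pts, h=1e-3):
    """Mean squared residual of u_t - u_xx at sampled (t, x) points,
    with central finite differences standing in for autodiff."""
    t, x = pts[:, 0], pts[:, 1]
    u_t = (u(t + h, x) - u(t - h, x)) / (2 * h)
    u_xx = (u(t, x + h) - 2 * u(t, x) + u(t, x - h)) / h**2
    return np.mean((u_t - u_xx) ** 2)

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(1000, 2))      # random collocation points
loss_exact = pde_residual_loss(u_exact, pts)     # near zero for the true solution
loss_wrong = pde_residual_loss(lambda t, x: np.sin(x), pts)  # clearly nonzero
```

Training drives this residual towards zero over the whole time-state-parameter domain, which is why a single trained network can then be evaluated for many parameter values at once.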
Linus Wunderlich: Postdoctoral Research Assistant: Queen Mary University of London
- Base line: deep neural networks (DNN) serve as surrogates for computationally expensive financial models
- Pruning yields DNNs with intermediate layers of varying cardinalities
- Global Hessian by chaining given local Hessians of individual layers
- Hessian chaining is NP-complete
- Dynamic programming heuristic yields substantial improvement in terms of computational cost
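The chaining problem behind the last two bullets can be illustrated on its classical tractable special case, matrix-chain ordering, which dynamic programming solves exactly; the NP-complete Hessian-chaining setting of the talk generalizes this. A minimal sketch:

```python
def chain_cost(dims):
    """Minimum number of scalar multiplications needed to multiply a
    chain of matrices, where factor i has shape (dims[i], dims[i+1])."""
    n = len(dims) - 1                      # number of factors
    cost = [[0] * n for _ in range(n)]     # cost[i][j]: best cost for chain i..j
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            # try every split point k and keep the cheapest bracketing
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j]
                + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return cost[0][n - 1]

best = chain_cost([30, 35, 15, 5, 10, 20, 25])
```

The bracketing chosen by the DP can change the cost by orders of magnitude, which is the source of the "substantial improvement" reported for the heuristic.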
Uwe Naumann: Professor of Computer Science, RWTH Aachen University
Uwe Naumann is the author of the popular textbook on (Adjoint) Algorithmic Differentiation (AAD) titled “The Art of Differentiating Computer Programs”, published by SIAM in 2012. He holds a Ph.D. in Applied Mathematics / Scientific Computing from the Technical University Dresden, Germany.
Following postdoctoral appointments in France, the UK and the US, he has been a Professor of Computer Science at RWTH Aachen University, Germany, since 2004. As a Technical Consultant for the Numerical Algorithms Group (NAG) Ltd., Uwe has been playing a leading role in the delivery of AAD software and services to a growing number of tier-1 investment banks since 2008.
Recently it was announced, as part of a trend towards ever more parameters, that Microsoft's DeepSpeed was fitted with 17 billion parameters. However, such deep networks rely on the perceptron, a blunt instrument for building financial models, leading to overly complicated, non-interpretable models built without regard for the stylized properties of financial data.
In this talk, we build on the legacy of financial econometrics, replacing the perceptron as the basic building block of deep networks with econometric models. This solves many of the pain points associated with deep learning in a principled way: (i) providing econometric interpretability; (ii) reducing the number of parameters; and (iii) allowing recovery of the classical model for apples-to-apples performance benchmarking. Moreover, the network intrinsically provides a principled approach to model averaging, without reliance on the Rube Goldberg-like hybrids that plague the industry. We demonstrate two examples of such models: an RNN built from GARCH cells and an RNN built from exponential smoothing cells. Finally, we demonstrate how this approach fits into a Bayesian framework for uncertainty quantification.
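A GARCH(1,1) update already has the shape of a recurrent cell: the hidden state is the conditional variance, updated from the previous state and the latest squared return. A minimal sketch (the parameter values are illustrative, not fitted):

```python
import numpy as np

def garch_cell(r_prev, sigma2_prev, omega=0.05, alpha=0.10, beta=0.85):
    """One step of sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    return omega + alpha * r_prev**2 + beta * sigma2_prev

def unroll(returns, sigma2_0=1.0):
    """Run the cell over a return series, like an RNN over a sequence."""
    sig2 = sigma2_0
    path = []
    for r in returns:
        sig2 = garch_cell(r, sig2)
        path.append(sig2)
    return np.array(path)

rng = np.random.default_rng(0)
variances = unroll(rng.standard_normal(250))   # one year of daily returns
```

For these values the stationary variance omega / (1 - alpha - beta) equals 1, so simulated paths fluctuate around unit variance; swapping the cell for an exponential-smoothing update gives the second model family mentioned in the abstract.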
Matthew Dixon: Stuart School of Business, Illinois Institute of Technology
Matthew Dixon, Ph.D., FRM, began his career as a quant in structured credit trading at Lehman Brothers. He has consulted for numerous investment management, trading and financial technology firms in machine learning and risk analytics. He is the author of the 2020 textbook “Machine Learning in Finance: From Theory to Practice” and has written over 20 peer-reviewed papers on machine learning and computational finance, in outlets including the SIAM Journal on Financial Mathematics and the Journal of Computational Finance. He is the recipient of an Illinois Tech innovation award, and his research has been funded by Intel and the NSF. Matthew has recently contributed to the CFA syllabus on machine learning and he currently serves on the CFA advisory committee for quantitative trading. He has been invited internationally to give talks at prestigious seminars organized by investment banks and universities, in addition to being quoted in the Financial Times and Bloomberg Markets. He holds a Ph.D. in Applied Math from Imperial College, has held visiting academic appointments at Stanford and UC Davis, and is a tenure-track Assistant Professor at Illinois Tech.
Abstract: We propose a framework for the valuation and management of complex life insurance contracts whose design can be described by a portfolio of embedded options, which are activated according to one or more triggering events. These events are in general monitored discretely over the life of the policy, due to the contract terms. The framework is based on Fourier transform methods, as they allow us to derive convenient closed analytical formulas for a broad spectrum of underlying dynamics. Multidimensionality issues generated by the discrete monitoring of the triggering events are dealt with via efficiently designed Monte Carlo integration strategies. We illustrate the tractability of the proposed approach by means of a detailed study of ratchet variable annuities, which can be considered a prototypical example of these complex structured products.
Laura Ballotta: Reader, Financial Mathematics, Cass Business School
Dr Ballotta works in the areas of quantitative finance and risk management. She has written on topics including stochastic modelling for financial valuation and risk management, numerical methods aimed at supporting financial applications, and the interplay between finance and insurance.
Recent major contributions have appeared in Journal of Financial and Quantitative Analysis, European Journal of Operational Research and Quantitative Finance among others.
She serves as associate editor and referee for a number of international journals in the field.
Laura Ballotta obtained her PhD in Mathematical and Computational Methods for Economics and Finance from the Università degli Studi di Bergamo (Italy), following her BSc in Economics from Università Cattolica del Sacro Cuore, Piacenza (Italy), and MSc in Financial Mathematics from the University of Edinburgh – jointly awarded with Heriot-Watt University (UK). Laura has previously held positions at Università Cattolica del Sacro Cuore, Piacenza (Italy), and Department of Actuarial Science and Statistics, City University London (UK).
In this talk we give an overview of some recent applications of machine learning in portfolio management. We provide an example of each of the following (time permitting):
- Unsupervised learning: Regime detection and risk on / risk off
- Supervised learning: Using alternative data to improve revenue predictions
- Reinforcement learning: Portfolio risk management and hedging
Petter Kolm: Director of the Mathematics in Finance Master’s Program and Clinical Professor, Courant Institute of Mathematical Sciences, New York University
Petter Kolm is the Director of the Mathematics in Finance Master’s Program and Clinical Professor at the Courant Institute of Mathematical Sciences, New York University. Previously, Petter worked in the Quantitative Strategies Group at Goldman Sachs Asset Management where his responsibilities included researching and developing new quantitative investment strategies for the group’s hedge fund. Petter has coauthored four books: Financial Modeling of the Equity Market: From CAPM to Cointegration (Wiley, 2006), Trends in Quantitative Finance (CFA Research Institute, 2006), Robust Portfolio Management and Optimization (Wiley, 2007), and Quantitative Equity Investing: Techniques and Strategies (Wiley, 2010). He holds a Ph.D. in Mathematics from Yale, an M.Phil. in Applied Mathematics from the Royal Institute of Technology, and an M.S. in Mathematics from ETH Zurich.
Petter is a member of the editorial boards of the International Journal of Portfolio Analysis and Management (IJPAM), Journal of Financial Data Science (JFDS), Journal of Investment Strategies (JoIS) and Journal of Portfolio Management (JPM). He is an Advisory Board Member of Betterment (one of the largest robo-advisors) and Alternative Data Group (ADG). Petter is also on the Board of Directors of the International Association for Quantitative Finance (IAQF) and Advisory Board Member of Artificial Intelligence Finance Institute (AIFI).
As a consultant and expert witness, Petter has provided his services in areas including alternative data, data science, econometrics, forecasting models, high frequency trading, machine learning, portfolio optimization with transaction costs and taxes, quantitative and systematic trading, risk management, robo-advisory and investing, smart beta strategies, transaction costs, and tax-aware investing.
- We will discuss the considerations when designing a software library to develop systematic trading strategies for FX options
- We shall give examples of FX options trading strategies and show how to implement them using the open source finmarketpy library
Saeed Amen: Founder, Cuemacro
Saeed has a decade of experience creating and successfully running systematic trading models at Lehman Brothers and Nomura. He is the founder of Cuemacro, a company focused on understanding macro markets from a quantitative perspective. He is the author of ‘Trading Thalesians – What the ancient world can teach us about trading today’ (Palgrave Macmillan) and graduated with a first-class honours master’s degree in Mathematics & Computer Science from Imperial College.
Ángel Rodríguez-Rozas: Associate Director, Quantitative Analyst, Model Validation, Banco Santander
Ángel Rodríguez Rozas holds a Ph.D. in Computational and Applied Mathematics from the University of Lisbon and an M.Sc. in Artificial Intelligence from the Universitat Rovira i Virgili (URV) and the Polytechnic University of Catalonia (UPC). He has authored more than 20 research articles in international peer-reviewed journals in many different areas, including artificial intelligence, numerical methods for PDEs, high-performance computing, plasma physics, the finite element method, seismic wave propagation, and oil & gas simulation and inversion of petrophysical measurements.
Ángel joined Banco Santander in 2018 where he is working as a Quant Analyst in the Internal Validation team, within the Risk Department. As part of his role, Ángel is responsible for leading the design and development of a numerical library for the internal validation of pricing models, including interest rates, FX, credit, commodities, equity, inflation, and xVA. His research efforts are currently focusing on the finance industry, investigating efficient numerical methods (Quasi- and Monte Carlo methods, Finite Elements) and quantum computing algorithms (digital and analog) for the pricing of financial derivatives.
- Universal Quantum Computers vs Adiabatic Quantum Computers
- Quantum Annealing
- Reverse Quantum Annealing
- Quantum Random Walk
- Overview of some published research on quantum computing in finance
- Overview of the size of the current quantum computing market and its predicted future
Nedeen Alsharif: PhD Student in Quantum Computing, UCL