The future of financial mathematics

Nicole El Karoui / Professor of Applied Mathematics, Pierre et Marie Curie University (Paris VI), Research Director, Center for Applied Mathematics, École Polytechnique / September 6th, 2013

How do financial mathematics specialists imagine the markets in five to ten years? What exactly will their role look like? In a speech before ENSAE ParisTech alumni, specialist Nicole El Karoui addresses these issues through a retrospective history of both the academic discipline and business practices. Over time, specific stakes have shaped a field that remains crucial to the financial business and keeps evolving.

In the late 1980s, when I began to get involved in the training of future professionals of the financial sector, mathematics did not yet play the role it would later be given. But with Helyette Geman (ESSEC), I had spent a whole year in a bank analyzing and explaining the first stochastic interest rate models to practitioners. At that moment, I experienced a genuine intellectual encounter.

In fact, it soon emerged that there would be a growing need for professionals capable of understanding mathematics: quantitative issues were becoming a central part of finance. Meanwhile, we also needed to train mathematicians able to understand finance: “quants” with enough understanding of the markets to implement, evaluate and use appropriate models. In 1990, Helyette Geman and I created a Probability and Finance option in the Master of Advanced Studies of Probability at the Université Pierre et Marie Curie-Paris VI, jointly accredited by the École Polytechnique, ENPC and ESSEC. It was the first training of its kind in the scientific community.

However, at that time, the main issue wasn’t only about the use of high-level mathematics in the world of finance: it was also the beginning of a new conception of risk.

“Noise” vs. the fundamentals?
From the start, one of the aims of the world of finance has consisted in managing and reducing risks as much as possible. During the late 1980s, a quantum leap was achieved thanks to the rapid development of dynamic strategies, as opposed to the techniques previously applied, which controlled an average risk through the weighting of different asset classes.

During the development of financial mathematics in the early 1990s, the first challenge was to implement dynamic hedging, supported by increasingly sophisticated mathematical models. And as the models strengthened, researchers and practitioners started to examine more closely not only the models themselves, but also the disruptive factors, the random elements that distorted the curves.
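To make the idea concrete: in the Black-Scholes framework, the canonical example of such a model, the hedge ratio (delta) of an option is recomputed as the market moves and the position is rebalanced accordingly. A minimal sketch, with purely hypothetical parameters:

```python
import math

def bs_call_delta(s, k, r, sigma, t):
    """Black-Scholes delta of a European call: the number of shares to
    hold against one short call. Standard normal CDF written via erf."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

# The hedge ratio moves with the spot, so the position must be rebalanced.
delta_at_100 = bs_call_delta(s=100, k=100, r=0.05, sigma=0.2, t=1.0)
delta_at_110 = bs_call_delta(s=110, k=100, r=0.05, sigma=0.2, t=1.0)
```

Because delta rises as the spot rises, a short-call hedger must buy more stock into a rally; this continual rebalancing is what “dynamic” hedging means in practice.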

Indeed, markets can be represented as dynamic systems disturbed by “noise”. It is precisely this noise (i.e. day-to-day volatility) that received all the attention in the mid-1990s. Not without consequences, as we shall see. And it would be interesting to know what led traders, those who trained them, and those who designed the software they use, to focus on this immediate aspect.

During those years, from a technical point of view, the context was changing rapidly. With the growth of computer processing power, simple models (binomial, with the possibility of additional parameters) that could be calculated by hand were replaced by more sophisticated calculations involving specialized software. In 1990, in Chicago, they were still using the same computer systems as in 1973! And other world stock markets were not much more advanced.
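A binomial model of the kind mentioned here fits in a few lines; the Cox-Ross-Rubinstein tree below prices a European call by backward induction (the parameters are illustrative, not taken from the text):

```python
import math

def binomial_call(s0, k, r, sigma, t, n):
    """Price a European call on a Cox-Ross-Rubinstein binomial tree
    with n time steps, by backward induction from the terminal payoffs."""
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))     # up factor per step
    d = 1.0 / u                             # down factor per step
    q = (math.exp(r * dt) - d) / (u - d)    # risk-neutral up probability
    disc = math.exp(-r * dt)                # one-step discount factor
    # Payoffs at maturity for j up-moves out of n steps.
    values = [max(s0 * u**j * d**(n - j) - k, 0.0) for j in range(n + 1)]
    # Roll back through the tree to time 0.
    for _ in range(n):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

price = binomial_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0, n=200)
```

With a few hundred steps this converges to the continuous-time Black-Scholes value, which is why such trees were a natural first use of growing computer power.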

This situation didn’t last very long, and progress was meteoric. On one hand, we saw a very rapid growth in market size; on the other, an equally rapid implementation of sophisticated techniques to handle it. For instance, partial differential equations can be solved very quickly by computers. This paves the way for more complex probabilistic techniques, which allow more refined models: for example, Monte Carlo simulation, which consists in repeating a random experiment many times in order to obtain a reliable approximation of the true value of the mathematical expectation.
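The Monte Carlo idea described above can be illustrated as follows: simulate many terminal prices of a geometric Brownian motion (a modeling assumption of this sketch, not specified in the text) and average the discounted payoffs to approximate the expectation:

```python
import math
import random

def mc_call(s0, k, r, sigma, t, n_paths, seed=0):
    """Monte Carlo estimate of a European call price: the average
    discounted payoff over simulated terminal prices of a GBM."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma**2) * t      # log-price drift over [0, t]
    vol = sigma * math.sqrt(t)            # log-price volatility over [0, t]
    total = 0.0
    for _ in range(n_paths):
        s_t = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        total += max(s_t - k, 0.0)        # call payoff on this path
    return math.exp(-r * t) * total / n_paths

estimate = mc_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0, n_paths=100_000)
```

The standard error shrinks like the inverse square root of the number of paths, which is why the technique only became practical as computing power grew.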

The nature of financial expertise changed completely. From a scientific point of view, it was very exciting. The aim was to integrate increasingly complex objects, such as stochastic processes (random time-dependent phenomena), and to compare the resulting models with the movements actually observed. Brownian “noise” became the object of attention, because there is a difference between the statistical level of noise and the level actually observed.

This is explained by the fact that market participants react to immediate information; in other words, noise creates noise. This phenomenon was exacerbated by the regulators’ decision in 1998 to provide operators with daily information centered on the concept of value at risk. Fundamentally, it wasn’t a bad idea: it provided richer and more complete information, and day-to-day changes in market prices contain much useful information, for example on liquidity.
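As a toy illustration of the value-at-risk concept: a one-day historical VaR at the 99% level is the loss threshold exceeded on only 1% of past days. The sketch below (with made-up daily returns) computes it by sorting past losses:

```python
def historical_var(returns, level=0.99):
    """One-day historical VaR: the loss exceeded on only (1 - level)
    of past days. Returned as a positive loss number."""
    losses = sorted(-r for r in returns)        # losses in ascending order
    idx = min(int(level * len(losses)), len(losses) - 1)
    return losses[idx]

# Hypothetical daily returns: mostly small moves, a few larger losses.
sample = [0.001, -0.002, 0.003, -0.01, 0.004, -0.03, 0.002, -0.001,
          0.005, -0.004, 0.0, -0.015, 0.002, 0.001, -0.006, 0.003,
          -0.002, 0.004, -0.008, 0.001]
var_99 = historical_var(sample, level=0.99)     # worst loss in this window
```

Note that the figure is driven entirely by the recent window of data, which is precisely the focus on immediate information that the article goes on to criticize.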

But it went too far, in the sense that it led practitioners to underestimate information based on the historical behavior of prices. The young traders we had trained had neither the experience nor the hindsight to free themselves from this focus on immediate data. They were mathematicians, not economists: they didn’t care enough about the past or about the underlying. Both the models and the young people who used them lacked the ability to connect the risk of the underlying with the risk of the market itself. Models became more sophisticated, but the same logic persisted: we buy and sell on a daily basis, and the statistics and historical information are neglected, and even disappear from models. But these are fundamental.

This is a lesson for all players, but especially for those who, like me, are in charge of training: trainings focused on prices will have to rely more heavily on the historical aspect. Actually, regulators have been asking for it for almost fifteen years. And the recent crises have further increased the need to change the approach. Easier said than done! To take an example, the calculation of counterparty risk is far more complicated over a one-year horizon than at a given instant.

Today’s challenges

The world in which we live today is not experiencing the same expansion as the years 1990-2000. Moreover, the crisis has highlighted the importance of hidden, implicit links, and more generally of what we call “systemic risk”. To put things simply, until 2008 the focus was on the “risk of noise”. Since 2008, we have been learning to manage the risk of chaos.

In terms of risk management, this implies in particular a stronger consideration of collateralization. In this context, financial mathematics is set to undergo an inflection. Earlier systems need to integrate different techniques: collateral management requires following several curves at the same time, and taking systemic risk into account poses considerable problems, since it requires both a better understanding of it and preventing the creation of systems that, by responding to signals of systemic risk, will amplify that very risk.

Take, as an example, the credit derivatives that proliferated in the 2000s and that played a crucial part in the 2008 crisis.

Credit derivatives had grown significantly in the preceding years, and even before the crisis there were doubts about the robustness of the models on which they relied. As early as 1998, the U.S. regulator, which still remembered the Black Monday of October 1987, asked financial institutions to produce a daily Value at Risk of their market risks, that is to say of the aggregate activity of a trading room. Although computers had gained significant power and the models had been refined, it was, and remains, a real challenge for financial institutions, and also for the training facilities that provide the quants who implement this measure. Exchanges between the academic world and the market intensified, especially around the relevance of VaR as a risk measure. In particular, in the years preceding the crisis, we discussed the risk of default with professionals from trading desks. To summarize, the management of default risk worked as long as default remained a mere probability, but what would actually happen if a default really occurred was never contemplated. Those discussions were not successful (it’s always difficult to discuss anything during a financial bubble!) and the issue is now back on the agenda.

Addressing this problem is all the more urgent because the concentration of the financial sector, further increased by the crisis, tends to make every trading room a systemic player. In this regard, it should be noted that until recently, mathematicians would validate models without knowing the size of the “poses” (exposures) behind them. That’s absurd!

And tomorrow?
All in all, a number of developments are underway or will be required, regarding both practices and training. Trainers are encouraged to put particular emphasis on statistics and to get students to work on a vision of global quantitative risk. This is now a central aspect. Some subjects have been reinforced: regulation, market risk.

The range of disciplines involved raises another problem. Indeed, a few months of training are barely enough for a student, even the most brilliant, to assimilate all of stochastic calculus, finance, statistics, law… The question of the time needed to absorb all this science is complicated, because banks “hunt” our students as early as their internships.

Quants are very active in the areas of overall risk (risk analysis, simulations), but also increasingly in model validation, ultimately with the awareness that when they build or validate a model, it will have an impact on prices. Taking the systemic dimension into account is a real challenge, but it offers a real opportunity for financial mathematics. Markets have operated for more than fifteen years on an imaginary real time: they need to build a different relationship with time. On this precise issue, quantitative finance has its say.

The quantitative view of finance will not set the pace, although in some sectors (asset management, finance) some dream of going back to the old-style, less sophisticated way of doing things. That will certainly not eliminate systemic risks! More generally, “rare” risks do not exist: we can at least work on that basis, unless we want to play with fire. Today, it is crucial to develop new tools (and train professionals) for risk detection, by reworking the determination of the exposure to risk.

Models are structurally imperfect. At best, they simplify reality; a model is wrong by definition. One important challenge is to clarify the use that can be made of them. Basically, the issue is to use them while knowing that they are false and, if possible, to understand their limits. This is a critical issue for professionals today: they must be able to understand the mechanics and identify the logic of what they use, and be able to worry if they don’t understand that logic or if they spot aberrations. I believe that the question of one’s stance toward the model is absolutely essential.

A problem in this respect is that the regulator fixes a number of positions instead of letting them evolve. It is dangerous to think that defining a standard requires a consensus. That approach does not capture complexity, which requires a different treatment. The more time is introduced into the analysis, the more complex it becomes, because it involves thousands of factors. Under these conditions, one can rely on a standard, sufficiently robust vision. But then it is important to explore the areas where we cannot explain everything.

Another problem: during the crisis, much attention was paid to economists, even though they are disconnected from technological realities (high-frequency trading, new software…). The same goes for regulators and policy makers. There is therefore a danger of creating regulation completely disconnected from market reality. I strongly believe in the necessity of a dialogue between the various players and the different disciplines involved.

Comments
  • Brain Molecule Marketing

    Until any of this is fully experimentally tested in double-blind experiments and replicated and picked apart – this is a vast waste of resources. Back testing is not proof – it is curve-fitting.

    None of this, or econ, has any experimental or empirical basis. Econ doesn’t even have theories, but just models. Theories must be testable and falsifiable. duh…

  • Brain Molecule Marketing

Oh yeah, the proper word and idea is “uncertainty,” not “risk.” Risk is a known probability of an outcome. There is very little of that in finance and econ. Probably none.


  • Thibaut Lemoine

    If we follow your reasoning, NASA should not waste more resources trying to build space stations, because none of the tests they run can prove that anything will succeed with absolute certainty… Neither theories nor models can “prove” anything about reality and its mysteries: what if I told you that applied mathematics does not aim to help us predict the future, but rather to help us deal with uncertainty?

    That’s why you confuse backtesting (which is nothing more than statistics) with calibration (which is TRUE curve-fitting): you don’t know what all of this is about. Maybe you still think that the Black-Scholes model aims to “predict” the price of an option?

    I won’t go into details, and I invite you to review a bit of the literature before pretending you are qualified to judge a science, but I will finish on the following thought: the concept of risk does not necessarily reject the idea of uncertainty. There is a risk that a lion comes to your house and eats you, but can you assign a probability to this event? And more importantly, do you really have to consider such an event to make your everyday decisions? Please tell me how you live in an ideal and deterministic world…


  • The Magus

    Oh dear Nicole, this is expected from an academic when financial mathematics is entering oblivion. One crisis after another has just “proved” that mathematics has failed miserably in financial markets. “Some dream of going back to the old-style” is perhaps the right way to go. Embracing mathematical complexity would only intensify the already troubled markets. Was mathematical complexity a success? Where have the exotic markets gone? Are you one of those who attempt to “teach birds how to fly”?

  • The Magus

    Dear Nicole,

    You have used the term “risk” so often in your article – “new conception of risk”, “reducing risks”, “average risk”, “concept of value at risk”, “systemic risk”, “risk of noise”, “global quantitative risk”, “overall risk”, “exposure to risk”, etc. – but have failed to define risk. By the way, what is your precise definition of risk? What is uncertainty?

    “trainings focused on prices will have to rely more heavily on the historical aspect.”… What has the historical aspect (or the use of statistics) got to do with risk?

    “and the statistics and historical information are neglected, and even disappear from models. But these are fundamental.” This is a contradiction of your own philosophy of applied mathematics. Why? Remember, back in the 1960s, statisticians were the only ones operating in the financial markets determining the prices of derivatives; they were then wiped out of the markets by fancy mathematics, the so-called stochastic calculus introduced by Merton. This new mathematics and its theory disregard historical data (prices or information). The use of the Black-Scholes formula and its implied volatility is a shining example of such a contradiction.

    You oppose the old style (“some dream of going back to the old-style, less sophisticated way of doing things.”), yet you encourage the use of statistics, which is itself an old style, the 1960s style pointed out above.

    All in all, the points you try to make in your article are very confusing and aimless… Regarding the FUTURE of financial markets, as opposed to “the future of applied mathematics”, I think your article has just added additional UNCERTAINTY to the markets, which could lead to another crisis.

    I sincerely expect your responses, as I think you are passionate about what you do.


This content is licensed under a Creative Commons Attribution 3.0 License
You are free to share, copy, distribute and transmit this content
