How Important is Academia in Today’s Application of Finance?
Abstract — This article looks into the relationship between academic theory and practical applications in modern finance, highlighting ongoing debates regarding the relevance of traditional theories in real-world scenarios. By scrutinizing the flaws in existing models and analyzing their practical implications, the article aims to provide insight into the evolving landscape of financial theory and its application in today’s markets.
I. Introduction
As I delved deeper into the field of Quantitative Finance, I became increasingly aware of the ongoing debates surrounding the relevance of academic theories in the practical realm of modern finance. Business schools drill the concepts of "modern" finance, concepts that emerged from the mathematics of chance and statistics between the 1900s and the 1960s. Distinct branches of market analysis became apparent, and investors developed an elaborate toolkit for analyzing markets: Technical, Fundamental and Quantitative. With this, investors began to debate which of them is the financial equivalent of alchemy. Focusing on the quantitative aspect, mathematicians and economists derived intricate mathematical systems to beat the market. However, market crashes (Black Monday of 1987, the Asian financial crisis of 1997, the Russian summer of 1998, the collapse of Long Term Capital Management in 1998 and the Dot-Com crash of 2000, to name a few) have repeatedly exposed these theories as inadequate. Surely, many have now realized that something is not right. If the theories can truly model reality, why is the outcome so disappointing time and time again? I shall dive into the topic: the relevance of academia in the application of modern finance today.
This article is organized into the following sections:
1) Efficient Market Hypothesis and Random Walk Theory
2) CAPM and Modern Portfolio Theory
3) The fundamentals of Quantitative Finance
4) The Flaws in Quantitative Finance
5) Evaluation of the theoretical versus practical aspects
6) Final Thoughts
II. Efficient Market Hypothesis and the Random Walk Theory
For much of the market’s history, hedge funds have skirmished with the academic view of the market. The prevailing view was that the market is efficient, that prices follow a random walk, and that hedge funds’ success was mainly driven by luck. As this critique goes, hedge funds have no real “edge”: there is no distinctive investment insight that allows them to beat the market consistently.
The Efficient Market Hypothesis holds in an ideal market where all relevant information is already priced into securities today. One implication is that yesterday’s change does not influence today’s, nor today’s tomorrow’s; each price change is “independent” of the last. If the hypothesis holds, successful investing is largely a matter of luck. The Efficient Market Hypothesis requires reactions to news to be sufficiently random and rapid that no excess profit can be made when news or data is released, and past price data is futile in predicting the market’s directional movement. Suppose, for instance, that one has spotted a pattern (technical analysis) which indicates a coming rise in the price of a stock. Assuming the market is large and efficient, others will spot the same trend and initiate the same trades ahead of the predicted rally, and to beat those earlier trades, still others will act even earlier. The whole phenomenon is spread out over so large a time frame that it ceases to be noticeable. The trend vanishes, killed by its very discovery.
The random walk model (a component of the Efficient Market Hypothesis) originated with the French mathematician Louis Bachelier before being formalized by Eugene Fama in the 1960s1. The theory postulates that market prices go either up or down with equal probability, like a fair coin turning up heads or tails. The magnitude of price variation can be measured by the standard deviation (also known as volatility, a metric for risk). Under a Gaussian distribution, most changes, 68 percent, are small moves up or down within one standard deviation; 95 percent fall within two standard deviations; and 99.7 percent within three. As such, Bachelier’s theory depends heavily on Gaussian distributions of price changes, in which anomalies are as rare as long runs in a series of coin tosses. One can have long streaks of tossing only heads, but averaging over the long run, one expects to recover the mean, by the Gaussian property together with the Law of Large Numbers. In the absence of new information that might change the balance of supply and demand, what is the best possible forecast for tomorrow? Again, prices can go up or down, by big or small increments, but with no new information to push the price decisively in one direction or another, the price on average will fluctuate around its starting point. Moreover, each variation in price is unrelated to the last: price changes form a sequence of independent and identically distributed random variables (i.i.d. in statistics and econometrics). This is the foundation of quantitative finance, leading to the valuation of market instruments via the binomial tree in discrete time and Brownian Motion in continuous time (more on that later).
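A minimal simulation of this model, using only the Python standard library, reproduces the 68/95/99.7 percent pattern from i.i.d. Gaussian price changes (the seed and the 1% daily volatility are illustrative assumptions, not market estimates):

```python
import random
import statistics

random.seed(42)

# Simulate 10,000 daily price changes as i.i.d. Gaussian draws
# (zero mean, 1% daily volatility), per the random walk model.
changes = [random.gauss(0, 0.01) for _ in range(10_000)]

sd = statistics.stdev(changes)
within_1sd = sum(abs(c) <= 1 * sd for c in changes) / len(changes)
within_2sd = sum(abs(c) <= 2 * sd for c in changes) / len(changes)
within_3sd = sum(abs(c) <= 3 * sd for c in changes) / len(changes)

print(f"within 1 sd: {within_1sd:.2%}")  # ~68%
print(f"within 2 sd: {within_2sd:.2%}")  # ~95%
print(f"within 3 sd: {within_3sd:.2%}")  # ~99.7%
```

Real return series fail exactly this check: the empirical tails hold far more mass than the simulated Gaussian ones.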
Therefore, reading price charts, analyzing public information and acting on inside information are all futile, as the market quickly discounts any new information. Prices rise or fall to reach a new equilibrium of buyers and sellers, and the next price change repeats that exact same cycle. If you have special insight into a stock, you could profit by being the first in the market to act on it, but how can you know that you are right, or first?
The Efficient Market Hypothesis, despite being an elegant theory, is flawed due to two main underlying assumptions. The old financial orthodoxy assumed that price changes are statistically independent and that they follow a Gaussian distribution. The first assumption has already been challenged by research indicating that financial price series have some form of ‘memory’: today does in fact influence tomorrow. Market turbulence tends to cluster; when a market opens choppily, it may well continue that way. This is the logic of volatility clustering arising from dependence, and one of the main motivations for the Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH) family of models used to model risk in econometrics. If prices take a big leap up or down, there is a measurably greater likelihood that they will move just as violently the next day.
The second assumption implies that price changes should cluster around a mean of no change. However, the crashes observed from 1990 to 2008 reinforce the notion that the bell curve fits reality very poorly. The Gaussian distribution does not account for fat tails (excess kurtosis), which are the anomalies. Extreme occurrences can be far more frequent than theory expects: markets are governed by human behaviour, and under- and overreactions to data and indicators, together with the herding instincts of participants, push prices to extremes, explaining the prevalence of such extreme but infrequent events in reality. Yet in many Value-At-Risk (VAR) models, the odds of crashes happening are vanishingly low. Calculations of this sort imply that a crash would require a ten-sigma event, one with odds of roughly one in 10²⁴. Theory predicts that index swings of more than 7 percent should come once every 300,000 years. With odds like that, you are more likely to be vaporized by a meteorite landing on your house than to go bankrupt in a financial market. In fact, the twentieth century saw 48 such days. This is strong evidence that theory has failed to model reality. Long Term Capital Management, for instance, collapsed despite multiple VAR models, portfolio stress tests and ‘what if’ analyses2. Extreme price swings are the norm in financial markets, and price movements do not follow the Gaussian distribution assumed by modern finance. A sound trading strategy or portfolio metric would build this fact into its foundations: big news causes big market action, and that action concentrates in small slices of time.
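The rarity that the Gaussian assumption attaches to a ten-sigma day can be computed directly from the normal tail. This quick standard-library sketch (the 252 trading days per year figure is the usual convention) makes the “once every umpteen eons” claim concrete:

```python
import math

def two_sided_tail(k: float) -> float:
    """P(|Z| > k) for a standard normal Z, via the complementary error function."""
    return math.erfc(k / math.sqrt(2))

# A "ten-sigma" day under the Gaussian assumption:
p10 = two_sided_tail(10)
print(f"P(|move| > 10 sigma) = {p10:.2e}")

# Expected waiting time in years, at ~252 trading days per year:
years = 1 / (p10 * 252)
print(f"...expected once every {years:.1e} years")
```

The probability comes out around 10⁻²³, i.e. a waiting time vastly longer than the age of the universe, which is precisely the absurdity the crash record exposes.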
After discussing the Efficient Market Hypothesis and the Random Walk Theory, are they still relevant in a practitioner’s approach to seeking alpha today? Is it even possible to profit from trading strategies without dismissing the results as sheer luck? I will leave you with the following exchange, picked up from an interview with Jim Rogers3.
Do you think long-term success stories in markets are lucky?
If somebody flips a coin a hundred times, somebody is going to come out on top. But there are enough people who have made investment decisions and have explained why they made them and have turned out to be right, to know that they were not just flipping coins. If a guy says, “I’m going to buy X for these reasons” and it goes up for 10 years for those reasons, I hardly call that flipping a coin.
III. CAPM and Modern Portfolio Theory
In 1952, Harry Markowitz was the first to propose a modern quantitative methodology for portfolio selection. The idea was simple: if the market price of a stock is lower than its true valuation, buy it. Eventually the rest of the market will agree with you, demand for the stock will rise, its price will follow, and you will end up with a profit. If it is risky, no problem: pick several stocks to spread your bets. This concept is best explained via the example of a casino, where the famous saying goes, “the house always wins”. For every bet someone offers, the casino has a theoretical edge in the long run due to the laws of probability, regardless of the amount of money wagered. From a player’s point of view, if he knows the odds are against him and wants the greatest chance of showing a profit, his best course is to play the short-run game and hope he gets lucky. If he continues to bet over a long period of time, the laws of probability eventually catch up with him, and the casino ends up with his money. Casinos do not like to see an individual player wager a large amount of money on one outcome, no matter the game. The laws of probability are in the casino’s favor, but if the bet is large enough and the bettor happens to win, the short-term bad luck can overwhelm the casino. To the casino, this is tail risk, which is why casinos often have betting limits. The casino would rather have multiple players betting at the same table to spread its exposure. Spreading maintains the profit potential while reducing the short-term bad luck. The casino, in this analogy, represents traders. Traders do not bet all their capital on one stock in the hope that they are right. Instead, they spread their risks by diversifying their portfolios. If you are good at it, you will end up with enough winners to offset your losers.
In Corporate Finance, the way to value a stock is to calculate its cash flows (the dividends it will pay) and then adjust for inflation, forgone interest and other factors that make the forecast uncertain. This is the premise of discounted cash flow analysis. However, investors do not think that way. They do not just purchase one stock and wait for the winnings to roll in; they think about diversification. They judge a stock’s risk-to-reward ratio in comparison to other stocks. “Don’t put all your eggs in one basket” is the fundamental premise of Modern Portfolio Theory. The prospects of a stock can be described by its expected return (reward) and variance (risk). Analyze each stock, plot them all on a graph of expected return against standard deviation (the square root of variance), then combine the stocks to build a portfolio. As each possible combination of stocks gives a different overall return and a different overall risk, estimating their profiles is not a simple matter of addition. It is more complicated than that: we need correlation to build a well-diversified portfolio. For instance, if we toss a coin 100 times and the outcomes are totally uncorrelated, we will likely come out even; our bets are diversified. If the outcomes are correlated, however, we are at the mercy of the toss, ending up either rich or poor. The key input of Modern Portfolio Theory is the correlation between the financial instruments in the portfolio. If we mix some stocks that flip tails with others that flip heads (uncorrelated), we lower the risk of the overall portfolio. As we keep adding stocks in different combinations, together with the mean-variance optimization technique (through Lagrange multipliers and constraints), we obtain an “efficient” portfolio that produces the most profit for the least risk.
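The effect of correlation on diversification can be sketched with the standard two-asset portfolio variance formula (the weights and volatilities below are illustrative, not market estimates):

```python
import math

def portfolio_vol(w1: float, s1: float, s2: float, rho: float) -> float:
    """Volatility of a two-asset portfolio with weights w1 and 1 - w1,
    asset volatilities s1 and s2, and correlation rho."""
    w2 = 1 - w1
    var = w1**2 * s1**2 + w2**2 * s2**2 + 2 * w1 * w2 * rho * s1 * s2
    return math.sqrt(var)

# Two stocks, each with 20% volatility, held 50/50:
for rho in (1.0, 0.5, 0.0, -1.0):
    print(f"rho = {rho:+.1f}: portfolio vol = {portfolio_vol(0.5, 0.2, 0.2, rho):.1%}")
```

With perfect correlation nothing is gained (20% volatility), with zero correlation the risk drops to about 14%, and with perfect negative correlation it vanishes entirely: the coin-flip intuition in one formula.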
According to the theory, no one should hold portfolios that are not on the efficient frontier (the curve in risk-return space with the largest expected return for each level of risk). Our appetite for risk then decides which portfolio we choose (e.g. aggressive traders go for the section near the top of the frontier, where equities dominate, while conservative traders take the bottom section, where bonds dominate).
Despite being a concise and logical theory, it comes with its fair share of flaws. Firstly, Modern Portfolio Theory depends on the Gaussian distribution, which, as discussed in Chapter II, is not the best way to measure the stock market (easy, but not necessarily right). Secondly, to build efficient portfolios you need good data: good forecasts of earnings, share prices and volatility for thousands of stocks. “Garbage in, garbage out” is a common saying in every field of quantitative modelling (though I admit this flaw applies to any field that requires a model). Lastly, you must laboriously calculate each stock’s “covariance” with, that is, how it fluctuates against, every other stock. For a thirty-stock portfolio, about the minimum needed to make the numbers work well, that means 495 different estimates of means, variances and covariances. Is such a laborious calculation justified for a static snapshot of a portfolio?
Moving on to the development of the Capital Asset Pricing Model (CAPM), derived from Modern Portfolio Theory by William Sharpe. The CAPM relates the returns on individual assets or entire portfolios to the return on the market as a whole. It introduces the concepts of specific risk (unique to an individual asset) and systematic risk (associated with the market). In CAPM, investors are compensated for taking systematic risk but not for taking specific risk, since the latter can be diversified away by holding many assets. If everybody in the market plays by Markowitz’s rules in Modern Portfolio Theory, there will not be as many efficient portfolios as there are people in the market; instead there is one for all, known as the “market portfolio”. And if all that matters is the market portfolio, then the value of an individual stock depends only on how it moves relative to the rest of the market. The CAPM is straightforward: the more risk you take, the more you expect to get paid. It says the most important risk you face as a stock market investor is the general state of the economy, reflected in how the market is doing; a stock’s sensitivity to it is known as its Beta. Compared to Modern Portfolio Theory, what made this model just as Nobel Prize worthy? It takes all of Markowitz’s tedious portfolio calculations and reduces them to just a few: work up a forecast for the market overall, then estimate a Beta for each stock under consideration. The 495 estimates required for a 30-stock portfolio under Markowitz’s theory (via the covariance and correlation matrices) are reduced to 31 under CAPM. Where Modern Portfolio Theory allows arbitrary correlations between all investments, CAPM links investments only through the market as a whole. It is an example of an equilibrium model.
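The bookkeeping difference between the two approaches is easy to verify: n means, n variances and n(n-1)/2 pairwise covariances for Markowitz, versus one market forecast plus one Beta per stock for CAPM.

```python
def mpt_estimates(n: int) -> int:
    # n means + n variances + n(n-1)/2 pairwise covariances
    return n + n + n * (n - 1) // 2

def capm_estimates(n: int) -> int:
    # one market forecast + one Beta per stock
    return 1 + n

print(mpt_estimates(30))   # 495
print(capm_estimates(30))  # 31
```

The gap widens quadratically: for 1,000 stocks Markowitz needs over half a million estimates while CAPM still needs only 1,001, which is exactly why the simplification was considered so valuable.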
On the other end of the spectrum, believing that MPT and CAPM are sufficient for modelling portfolios means believing that variance and standard deviation are good proxies for risk, which holds only if the Gaussian distribution correctly describes how prices move. As mentioned in Chapter I, the academics said that the market crashes should not have happened, that they were once-in-an-eon events. Yet the mathematically designed portfolios blew up. On the topic of risk, MPT bases everything on the conventional market assumptions that prices vary mildly, independently and smoothly from one moment to the next. If those assumptions are wrong (and they are; I will discuss the flaws in independence and continuity later), everything falls apart.
Secondly, the theory rests on the assumption that people are rational and seek to maximize their utility, modelled via a utility function (a given input always yields the same output). When presented with all the relevant information about a stock or bond, rational investors will not ignore it, nor pay a lot for a stock they expect to fall, and so they drive the market efficiently to the ‘correct’ level. However, investors are not always rational and self-interested. They misinterpret information, their emotions distort their decisions, and they miscalculate probabilities. This is similar to the psychology of poker, where a player with a few chips left after many losses behaves differently from a player with a large stack after many wins. Both should make the same decisions when faced with the same options, but because of their different circumstances, their choices will likely differ. A real, irrational man, who feels differently about loss than about gain, will act very differently, and the outcomes will be vastly different. These claims belong to the field of Behavioral Economics, and they show clearly that the assumption of rational, profit-maximizing investors is flawed.
Therefore, the old models should no longer be viewed with the same degree of respect they commanded in the 1990s. Although they are still taught, refined, retailed and used in business schools, future generations of investors should be more aware of, and cautious about, their flaws.
IV. The Fundamentals of Quantitative Finance
From the Random Walk Theory emerged the binomial model, which became a cornerstone of financial theory. The Binomial Tree Model is a simple instance of finite differences in discrete time and the baseline for Brownian Motion in stock prices and Black-Scholes option pricing through the no-arbitrage principle. It illustrates the concept of Delta Hedging (the theoretically perfect elimination of risk by hedging the option with the underlying, exploiting the perfect correlation between changes in the option value and changes in the stock price). As we shrink the time step between intervals toward zero, the tree converges to random motion in continuous time. Earlier, the French mathematician Jean Baptiste Joseph Fourier had devised equations to describe the way heat spreads. Bachelier, with his physics background, knew the formulae well and adapted them to calculate the probability of bond prices moving up or down, a technique he called the “radiation of probability”. It parallels the work of the Scottish botanist Robert Brown, who studied the erratic way tiny pollen grains jiggle about in water, what we know today as “Brownian Motion”, a physical phenomenon of randomness. Bachelier modelled the motion of security prices in continuous time just as Brown and Fourier had modelled the motion of particles and the diffusion of heat respectively.
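As a concrete illustration of the discrete-time side, here is a minimal Cox-Ross-Rubinstein binomial pricer for a European call (a textbook sketch, not a production model; all parameters are illustrative):

```python
import math

def crr_call(S: float, K: float, r: float, sigma: float, T: float, steps: int) -> float:
    """European call via a Cox-Ross-Rubinstein binomial tree.
    Converges to the Black-Scholes value as the number of steps grows."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1 / u                                 # down factor
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = math.exp(-r * dt)                  # one-step discount factor

    # Option payoffs at the terminal nodes (j up-moves out of `steps`)
    values = [max(S * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]

    # Roll back through the tree, discounting risk-neutral expectations
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

price = crr_call(100, 100, 0.05, 0.2, 1.0, 500)
print(round(price, 4))  # ≈ 10.45, close to the Black-Scholes value
```

Note how no probability of the stock actually rising appears anywhere: the risk-neutral probability p is fixed entirely by no-arbitrage, which is the delta-hedging argument in discrete form.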
Brownian Motion, also known as the Wiener Process, is a stochastic process with stationary, independent, normally distributed increments and continuous sample paths. The important properties of Brownian Motion, and the modelling assumptions layered on top of it (which will be challenged shortly), are as follows:
1. Finiteness: The scaling of variance with the time step is crucial to Brownian Motion remaining finite.
2. Continuity: The paths are continuous, fractal and not differentiable anywhere.
3. Markov: The conditional distribution of the process, given all information up to the current time, depends only on the current value.
4. Martingale: Given all information up to the current time, the conditional expectation of any future value is the current value.
5. Frictionless markets: not a property of the process itself, but a modelling assumption layered on top of it.
6. Normality: increments over any time step are normally distributed.
Brownian Motion is a very simple yet rich process, extremely useful for representing many random processes, especially those in finance. Its simplicity allows calculations and analysis that would not be possible with other processes (e.g. closed-form pricing formulae for vanilla options).
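The finiteness property, that variance scales linearly with time, can be checked by simulation (a rough sketch; the path and sample counts are arbitrary choices):

```python
import random

random.seed(0)

# Simulate many Brownian paths on [0, T] and check Var(W_T) ≈ T,
# the scaling that keeps the process finite (property 1 above).
T, steps, paths = 1.0, 100, 20_000
dt = T / steps

endpoints = []
for _ in range(paths):
    w = 0.0
    for _ in range(steps):
        w += random.gauss(0, dt ** 0.5)   # each increment ~ N(0, dt)
    endpoints.append(w)

var_T = sum(w * w for w in endpoints) / paths   # mean is 0, so this estimates Var(W_T)
print(f"Var(W_T) ≈ {var_T:.3f} (theory: {T})")
```

Had the increments been scaled by dt instead of its square root, the variance would collapse to zero as the step shrinks; that square-root-of-time scaling is the whole point.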
In options theory, the Black-Scholes equation is derived from Geometric Brownian Motion under the same assumptions: the underlying follows a lognormal random walk, delta hedging is done continuously and there are no arbitrage opportunities. The lognormal distribution is often used as a model for the distribution of market prices, just as the Gaussian distribution is used to model returns (the logarithm of a lognormally distributed price is Gaussian). This is because you would expect equity prices to follow a random walk around an exponentially growing average; taking the logarithm of the stock price, you expect it to be normal about some mean. Mathematically, lognormality follows from the Central Limit Theorem: if each period’s return is random, independent and identically distributed, then so is its logarithm, and the logarithm of today’s stock price is just the sum of the logarithms of those random returns, hence approximately normal, making the price itself lognormal. The mean of the logarithm of the stock price grows linearly with time, and its standard deviation grows with the square root of time. From here, we can apply Ito’s Lemma to form the partial differential equation behind the Black-Scholes Model. This article will not dwell on the technicalities of deriving Black-Scholes, as they can be found in most Financial Engineering textbooks.
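For reference, the resulting closed-form call price can be written in a few lines using the error function (the standard textbook formula, shown here as a sketch with illustrative parameters):

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def bs_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes price of a European call under lognormal dynamics."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call: spot 100, strike 100, 5% rate, 20% vol, one year
print(round(bs_call(100, 100, 0.05, 0.2, 1.0), 4))  # 10.4506
```

Every input here is observable except sigma, which is exactly why the next chapter’s discussion of implied volatility is where the model meets reality.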
V. The Flaws in Quantitative Finance
A trader who uses a theoretical pricing model is exposed to two types of risk:
1. The risk that the trader feeds the wrong inputs into the model.
2. The risk that the model itself is wrong because it is based on false or unrealistic assumptions.
When a trader feeds a volatility into a model, he is specifying the magnitude and frequency of the price changes that will occur over the life of the option. In the Black-Scholes Model, the volatility of the stock price is assumed to be constant, independent of strike and time to maturity. If the model were correct, a plot of the volatility implied from option prices at a fixed maturity would be a flat line, and the whole volatility surface would be flat. In practice, however, volatilities implied by option prices differ across strikes and maturities. The shape resembles a smile, particularly for short-term options: volatilities implied by out-of-the-money and in-the-money options are higher than those of at-the-money options4. In many markets the typical pattern is even asymmetric, a volatility smirk.
The volatility smile is the U-shaped curve obtained when implied volatility is plotted as a function of strike price: implied volatility increases as the strike moves above or below the underlying price. Such a pattern is typically observed for near-term options, or in markets without a strong asymmetry in the return distribution. It can be explained by investors worrying about market jumps: they assign a higher probability to jumps, on the upside or the downside, than a normal distribution would, i.e. a fat-tailed distribution with extreme movements, measurable by its kurtosis. The smile implies a distribution different from the lognormal one used in the Black-Scholes model, and in that sense the market itself adjusts for one of the model’s most commonly cited flaws. The takeaway from smiles and skews is that if an underlying has a distribution different from the lognormal distribution we use in pricing, we will see correspondingly different prices for options on that same underlying; the smile is the gap between the actual distribution and the one in our model. Smile and skew can be thought of as the consequence of the Black-Scholes assumption, under Geometric Brownian Motion, that the average return and the volatility are fixed over the life of the option; that is not how stock prices behave in the real world, so the discrepancy has to be absorbed into the skew. We can also think of skew as a measure of fear in the market: option sellers mark up their prices (reducing supply) while demand for protection rises, and both push prices higher.
These sellers are reluctant to sell options deep out of the money or deep in the money, as they do not want to carry the tail risk of the market.
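The smile itself is read off market prices by inverting the model: given a quoted option price, solve for the volatility Black-Scholes would need to reproduce it. A minimal bisection sketch (the quoted price below is illustrative, chosen to correspond to a 20% volatility):

```python
import math

def norm_cdf(x: float) -> float:
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def bs_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_vol(price: float, S: float, K: float, r: float, T: float,
                lo: float = 1e-4, hi: float = 5.0, tol: float = 1e-8) -> float:
    """Back out Black-Scholes implied volatility by bisection.
    The call price is strictly increasing in sigma, so the root is unique."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if bs_call(S, K, r, mid, T) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

iv = implied_vol(10.4506, 100, 100, 0.05, 1.0)
print(round(iv, 4))  # ≈ 0.2
```

Repeating this across strikes on real quotes traces out the smile: if the model were right, every strike would return the same number, and it does not.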
In Chapter IV, we saw that Brownian Motion and Black-Scholes rest on the following assumptions:
1. Independence: Tomorrow’s stock price is independent of past prices. (In Chapter II, I briefly mentioned the flaw in this assumption. I have not read enough sources to illustrate the point beyond observation of the market, but I do believe prices contain some sort of ‘memory’ to a certain degree, which explains why volatility clusters.)
2. Normality: All price changes taken together, from small to large, vary within the mild Gaussian distribution. (I have said enough about this in Chapter II and will not elaborate further.)
3. Frictionless markets: Liquidity is free; traders can enter and exit positions without constraint.
4. Continuity: Prices move continuously from one level to the next, following a continuous diffusion process.
Continuity stems from a common human assumption. If we see a man running on the street now, and an hour later running somewhere else, we assume he ran in a line covering all the ground in between. It does not occur to us that he may have stopped to rest and hitched a ride to his current location. Continuity is a fundamental assumption of conventional finance, and it is simply wrong: financial prices do jump up and down. This is also one of the principles that separates finance from physics. In physics, under the Ideal Gas Law, molecules collide and exchange heat, and their billions of individually infinitesimal interactions collectively produce an average temperature, around which smooth gradients lead up or down the scale. In finance, however, the news that moves an investor can be minor or major, and his decision can be based on an instantaneous change of heart, switching from bearish to bullish and back again. The result is a far wilder distribution of price changes: not just price movements but price dislocations, especially when everyone has the same change of heart at once, a phenomenon reinforced by the speed of information today. For example, news of a declaration of war (e.g. RUSSIA INVADES UKRAINE!) flashes across the globe to millions of investors, who act on it within seconds by dumping their positions. Liquidity dries up (which is one reason markets are not frictionless) and people are unable to close their positions, like a crowd rushing out of a train with only one exit; anyone caught on the wrong side is in for a squeeze. The next moment a crash happens and the market is down 30%. Even when circumstances are more ‘normal’, exchange-traded contracts cannot follow a pure diffusion process, because exchanges are not open 24 hours a day.
A contract may close at one price at the end of the trading day and open the next day at a different price, creating a gap that a continuous diffusion process does not permit. Prices can be discontinuous.
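Discontinuity can be illustrated with a toy jump-diffusion process, in the spirit of Merton’s model: ordinary Geometric Brownian Motion punctuated by Poisson-timed jumps. All parameters here are illustrative assumptions, not calibrated to any market.

```python
import math
import random

random.seed(1)

def gbm_with_jumps(S0: float, mu: float, sigma: float, lam: float,
                   jump: float, T: float, steps: int) -> list[float]:
    """Geometric Brownian Motion plus Poisson-timed log-jumps: a minimal
    sketch of why prices need not be continuous (parameters illustrative)."""
    dt = T / steps
    S = S0
    path = [S]
    for _ in range(steps):
        z = random.gauss(0, 1)
        # ordinary diffusion step
        S *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        # with probability lam * dt, a jump arrives this step
        if random.random() < lam * dt:
            S *= math.exp(jump)   # e.g. a sudden 20% gap down
        path.append(S)
    return path

# One year of "daily" prices, ~2 jumps per year on average, 20% gaps
path = gbm_with_jumps(100, 0.05, 0.2, 2.0, math.log(0.8), 1.0, 252)
max_drop = max(path[i] / path[i + 1] for i in range(len(path) - 1))
print(f"largest one-step drop ratio: {max_drop:.2f}")
```

On a path that includes a jump, the largest one-step move dwarfs anything the pure diffusion produces, which is the empirical signature of gaps and crashes.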
There are clearly real problems associated with the use of such theoretical pricing models. Markets are not frictionless, prices are not always continuous, and volatilities vary across market securities. With all these weaknesses, one might wonder whether theoretical pricing models are worth using at all.
VI. Evaluation of Theoretical Versus Practical Aspects
Most traders have found that pricing models, while not perfect, are an invaluable tool for making decisions in the market. Even a flawed model is usually better than no model at all. Still, a trader who wants to make the best possible decisions cannot afford to ignore the problems associated with a theoretical pricing model. Consequently, such a trader might look for a way to reduce the potential errors resulting from these weaknesses by finding a better model. However, “better” is a relative term. A model might be better in the sense that it gives slightly more accurate theoretical values, but if that comes at the expense of being extremely complex and difficult to use, the model may merely substitute one set of problems for another.
Critics say not to use Black-Scholes because traders quote implied volatility skews and smiles that are inconsistent with the model. Yes, I have criticized Black-Scholes and Geometric Brownian Motion throughout Chapter V, but we must still acknowledge that it is far simpler than modern-day ‘improvements’. For instance, stochastic volatility models such as Heston and SABR are attempts by quants and theoreticians to make Black-Scholes more consistent with the volatility smile by adding calibrated parameters, but their proponents often understate the complexity of calibration. Dive deep into the Heston Stochastic Volatility Model and you will find yourself doing numerical integration in the complex plane. Is it worth the effort? In Chapter III, I hinted at the problem of calibrating to a static dataset regardless of the dynamics of that dataset. When you calibrate a model, you are saying that whatever the market reflects today will pertain forever (static: input today’s data, output today’s results). In reality, such a model’s sensitivity to data and its lack of stability make it a far more dangerous model to use, despite being ‘theoretically’ better. This contrasts with the robustness of Black-Scholes, whose assumptions are clear: tackle those assumptions and your risk can be minimized. For example, use the Gaussian Black-Scholes model for pricing, but keep worst-case scenarios in mind for managing tail risk. You cannot say that a stochastic volatility model is similarly robust. Sure, it would be nice to have a theory that accommodates fat tails, but a far more complicated model that is harder to understand and takes far longer to compute is a heavy price to pay for accommodating events that are short-lived.
To overcome the assumptions of Black-Scholes, you can add more variables to your model to account for the dynamic nature of the market. However, adding variables to create a more precise tool does not make the model robust; the complex calibration and instability that come with them create an unrealistic weapon for beating the markets. The practical aim is a simpler model, with more focus on diversification and risk management through tackling the assumptions that modern financial theory posits. I believe a robust, all-in-one equation does not exist.
Many ‘improvements’ on Black-Scholes are rarely improvements; many are just better at hiding their faults. In that sense, Black-Scholes and Brownian Motion have just the ‘right’ number of parameters. Had the model carried many unobservable parameters, as models today often do, it would have been useless and totally impractical. It is true that there is a plethora of better models from quant researchers today, but my point is that complexity does not equate to quality. Expertise in advanced mathematics is important, but trading is not just a science; it is also an art. A lot of quants treat the markets like a problem set, believing that the more complicated the mathematics, the better, when in fact there is no substitute for a little common sense and an open mind. This frame of thinking is reinforced by academics who take an almost axiomatic approach to modern finance. Axioms such as no arbitrage, Gaussian distributions and independent prices have been drilled into quants today. Adapting beyond them is therefore a steep learning curve, and quants who do not know what they do not understand are bound for disaster.
Given that most traders are not theoreticians, a more realistic solution might be to use a less complex model and fine-tune it so that it is consistent with the realities of the marketplace.
VII. Final Thoughts
In finance, what truly matters is not the elegance of theories, but rather their practical effectiveness in making money. The beauty lies in the fact that there’s no single definitive answer to what generates profits. While academic purists may find flaws in real-world trading environments, relying too heavily on financial theories — whether outdated or updated — can lead to repeating past financial crises. To prevent this, we need stronger, more straightforward models with transparent assumptions and flaws. By acknowledging the limitations of current models and emphasizing transparency, we can develop frameworks better suited to navigating the complexities of financial markets and reducing the risk of catastrophic outcomes.
Theories play a crucial role in shaping our understanding of markets and financial history. But they are not everything. If we fail to grasp the reasons behind past financial crashes, our ability to improve our practices will be limited. Therefore, I believe it’s important not to focus excessively on theoretical concepts in finance. Instead, we should prioritize applying these theories practically in the market while enhancing their robustness and reliability in answering the fundamental question of finance: what makes money.
References
[1] The Misbehavior of Markets: A Fractal View of Financial Turbulence by Benoit Mandelbrot
[2] When Genius Failed: The Rise and Fall of Long Term Capital Management by Roger Lowenstein
[3] Inside the House of Money: Top Hedge Fund Traders on Profiting in Global Markets (Chapter 11: The Pioneer) by Steven Drobny
[4] Option Volatility and Pricing Strategies by Sheldon Natenberg