Phorgy Phynance

Archive for the ‘Quantitative Analysis’ Category

Weighted Likelihood for Time-Varying Gaussian Parameter Estimation

leave a comment »

In a previous article, we presented a weighted likelihood technique for estimating parameters \theta of a probability density function \rho(x|\theta). The motivation is that, for time series, we may wish to weight more recent data more heavily. In this article, we will apply the technique to a simple Gaussian density

\rho(x|\mu,\nu) = \frac{1}{\sqrt{\pi\nu}} \exp\left[-\frac{(x-\mu)^2}{\nu}\right].

In this case, the log likelihood is given by

\begin{aligned} \log\mathcal{L}(\mu,\nu) &= \sum_{i=1}^N w_i \log\rho(x_i|\mu,\nu) \\ &= -\frac{1}{2} \log(\pi\nu) - \frac{1}{\nu} \sum_{i=1}^N w_i \left(x_i - \mu\right)^2, \end{aligned}

where we have used the normalization \sum_{i=1}^N w_i = 1.

Recall that the maximum likelihood occurs when

\begin{aligned} \frac{\partial}{\partial\mu} \log\mathcal{L}(\mu,\nu) = \frac{\partial}{\partial\nu} \log\mathcal{L}(\mu,\nu) = 0. \end{aligned}
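Writing the two derivatives out explicitly,

\begin{aligned} \frac{\partial}{\partial\mu} \log\mathcal{L}(\mu,\nu) &= \frac{2}{\nu} \sum_{i=1}^N w_i \left(x_i - \mu\right) \\ \frac{\partial}{\partial\nu} \log\mathcal{L}(\mu,\nu) &= -\frac{1}{2\nu} + \frac{1}{\nu^2} \sum_{i=1}^N w_i \left(x_i - \mu\right)^2. \end{aligned}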

Setting both derivatives to zero shows that this occurs when

\begin{aligned} \mu = \sum_{i=1}^N w_i x_i \end{aligned}

and

\begin{aligned} \sigma^2 = \sum_{i=1}^N w_i \left(x_i - \mu\right)^2, \end{aligned}

where \sigma^2 = \nu/2.

Introducing a weighted expectation operator for a random variable X with samples x_i given by

\begin{aligned} E_w(X) = \sum_{i=1}^N w_i x_i, \end{aligned}

the Gaussian parameters may be expressed in a familiar form via

\mu = E_w(X)

and

\sigma^2 = E_w(X^2) - \left[E_w(X)\right]^2.

This simple result justifies the use of weighted expectations for time-varying Gaussian parameter estimation. As we will see, it is also convenient when coding financial time series analysis.
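To make this concrete, here is a minimal Python sketch (the exponential-decay weighting scheme, the half-life parameter, and the function names are my own illustrative choices, not anything prescribed above):

```python
import numpy as np

def exp_weights(n, halflife):
    """Exponentially decaying weights, oldest sample first,
    normalized so that they sum to 1."""
    decay = 0.5 ** (1.0 / halflife)
    w = decay ** np.arange(n - 1, -1, -1)  # most recent sample gets the largest weight
    return w / w.sum()

def weighted_gaussian_params(x, w):
    """mu = E_w(X) and sigma^2 = E_w(X^2) - E_w(X)^2, as derived above."""
    mu = np.sum(w * x)
    var = np.sum(w * x * x) - mu * mu
    return mu, var

# Toy example: observations whose mean drifts upward over time
rng = np.random.default_rng(0)
x = rng.normal(loc=np.linspace(0.0, 0.1, 500), scale=0.02)

mu, var = weighted_gaussian_params(x, exp_weights(len(x), halflife=60))
print(mu, np.sqrt(var))  # mu lands near the recent mean ~0.1, not the full-sample mean ~0.05
```

With a half-life of 60 observations, the estimates track the recent level of the series rather than its full-sample average, which is exactly the point of the weighting.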


Written by Eric

February 3, 2013 at 4:33 pm

More fun with maximum likelihood estimation

with one comment

A while ago, I wrote a post

Fun with maximum likelihood estimation

where I jotted down some notes. I ended the post with the following:

Note: The first time I worked through this exercise, I thought it was cute, but I would never compute \mu and \sigma^2 as above so the maximum likelihood estimation, as presented, is not meaningful to me. Hence, this is just a warm up for what comes next. Stay tuned…

Well, it has been over a year, and I’m trying to get a friend interested in MLE for a side project we might work on together, so I thought it would be good to revisit it now.

To briefly review, the probability of observing N independent samples X\in\mathbb{R}^N may be approximated by

\begin{aligned} P(X|\theta) = \prod_{i = 1}^N P(x_i|\theta) = \left(\Delta x\right)^N \prod_{i=1}^N \rho(x_i|\theta),\end{aligned}

where \rho(x|\theta) is a probability density, \Delta x is the width of the small bins used to discretize the observations, and \theta represents the parameters we are trying to estimate. The key observation becomes clear after a slight change in perspective.

If we take the Nth root of the above probability (and divide by \Delta x), we obtain the geometric mean of the individual densities, i.e.

\begin{aligned} \langle \rho(X|\theta)\rangle_{\text{geom}} = \prod_{i=1}^N \left[\rho(x_i|\theta)\right]^{1/N}.\end{aligned}

In computing the geometric mean above, each sample is given the same weight, i.e. 1/N. However, we may have reason to weight some samples more heavily than others, e.g. if we are studying samples from a time series, we may want to weight more recent data more heavily. This inspired me to replace 1/N with an arbitrary weight w_i satisfying

\begin{aligned} w_i\ge 0,\quad\text{and}\quad \sum_{i=1}^N w_i = 1.\end{aligned}

With no apologies for abusing terminology, I’ll refer to this as the likelihood function

\begin{aligned} \mathcal{L}(\theta) = \prod_{i=1}^N \rho(x_i|\theta)^{w_i}.\end{aligned}

Replacing w_i with 1/N would result in the same parameter estimation as the traditional maximum likelihood method.

It is often more convenient to work with the log likelihood, which has an even more intuitive expression

\begin{aligned}\log\mathcal{L}(\theta) = \sum_{i=1}^N w_i \log \rho(x_i|\theta),\end{aligned}

i.e. the log likelihood is simply the weighted (arithmetic) average of the log densities.
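As a sketch of how this looks in code (the helper below is my own illustration; for a Gaussian it simply reproduces the closed-form weighted estimates, but the same pattern works for any density whose log pdf you can evaluate):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def weighted_mle(x, w, logpdf, theta0):
    """Maximize log L(theta) = sum_i w_i log rho(x_i | theta)."""
    def neg_log_lik(theta):
        return -np.sum(w * logpdf(x, theta))
    return minimize(neg_log_lik, theta0, method="Nelder-Mead").x

def gauss_logpdf(x, theta):
    # abs() keeps the scale parameter positive during the search
    return norm.logpdf(x, loc=theta[0], scale=abs(theta[1]))

rng = np.random.default_rng(1)
x = rng.normal(0.02, 0.05, size=250)

w = 0.97 ** np.arange(len(x) - 1, -1, -1)  # heavier weights on more recent samples
w /= w.sum()

theta = weighted_mle(x, w, gauss_logpdf, theta0=np.array([0.0, 1.0]))
print(theta)  # matches the weighted mean and weighted standard deviation
```

Swapping in a heavier-tailed log density (e.g. that of scipy.stats.levy_stable) gives the weighted stable fits mentioned below, at considerably greater computational cost.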

I use this approach to estimate stable density parameters for time series analysis, since stable distributions are better suited to capturing risk in the tails. For instance, I used this technique when generating the charts in a post from back in 2009:

80 Years of Daily S&P 500 Value-at-Risk Estimates

which was subsequently picked up by Felix Salmon of Reuters in

How has VaR changed over time?

and Tracy Alloway of Financial Times in

On baseline VaR

If I find a spare moment, which is rare these days, I’d like to update that analysis and expand it to other markets. A lot has happened since August 2009. The markets I’d like to look at include other equity markets as well as fixed income. Due to their ability to cleanly model skew, stable distributions are particularly useful for analyzing fixed income returns.

Modeling Currencies

leave a comment »

I hope to begin some research into currencies. Before I come out with any results, though, I thought I’d ask an open question and hope someone comes by with a response.

First of all, as a former scientist, I find thinking about currencies very fun. See, for example, my previous article

This morning, I happened across a recent article that appeared on the arXiv:

The second paragraph really stood out:

One of the problems in foreign exchange research is that currencies are priced against each other so no independent numeraire exists. Any currency chosen as a numeraire will be excluded from the results, yet its intrinsic patterns can indirectly affect overall patterns. There is no standard solution to this issue or a standard numeraire candidate. Gold was considered, but rejected due to its high volatility. This is an important problem as different numeraires will give different results if strong multidimensional cross-correlations are present. Different bases can also generate different tree structures. The inclusion or exclusion of currencies from the sample can also give different results.

This is interesting because financial modeling is often about prices of securities or changes in prices. Currencies are about the relationship between prices. In graph theoretic (or category theoretic) terms, it is tempting to say that currency models should be about directed edges (or morphisms).

Is the best way to model currencies to choose some numeraire as is done in this paper? Or is there a way to study the relationships (morphisms) directly?

Written by Eric

October 28, 2010 at 6:01 pm

Discrete stochastic calculus and commutators

leave a comment »

This post is in response to a question from Tim van Beek over at the Azimuth Project blog hosted by my friend John Baez regarding my paper

The basic fact needed to address the question is that we have a set of 0-dimensional objects \mathbf{e}^\kappa and a set of 1-dimensional objects \mathbf{e}^{\kappa\lambda} that obey the following geometrically motivated multiplication rules:

  1. \mathbf{e}^\kappa \mathbf{e}^\lambda = \delta_{\kappa,\lambda} \mathbf{e}^\kappa
  2. \mathbf{e}^{\kappa\lambda} \mathbf{e}^\mu = \delta_{\lambda,\mu}  \mathbf{e}^{\kappa\lambda}
  3. \mathbf{e}^\mu \mathbf{e}^{\kappa\lambda} = \delta_{\mu,\kappa} \mathbf{e}^{\kappa\lambda}

To see the geometrical meaning of these multiplications, it might help to consider discrete 0-forms

f = \sum_\kappa f(\kappa) \mathbf{e}^\kappa

and

g = \sum_\lambda g(\lambda) \mathbf{e}^\lambda.

Multiplication 1.) above implies

f g = \sum_\kappa f(\kappa) g(\kappa) \mathbf{e}^\kappa,

which is just the pointwise multiplication of functions, where we think of f(\kappa) as the value of the function at the “node” \mathbf{e}^\kappa.

Multiplications 2.) and 3.) introduce new concepts, but their geometrical interpretation is quite intuitive. To see this, consider

f \mathbf{e}^{\kappa\lambda} = f(\kappa) \mathbf{e}^{\kappa\lambda}

and

\mathbf{e}^{\kappa\lambda} f = f(\lambda) \mathbf{e}^{\kappa\lambda}.

Multiplying the function f on the left of the “directed edge” \mathbf{e}^{\kappa\lambda} projects out the value of the function at the beginning of the edge, and multiplying on the right projects out the value at the end. Hence, products of functions and edges do not commute.

In the paper, it was shown that the exterior derivative of a discrete 0-form is given by

df = \sum_{\kappa,\lambda} \left[f(\lambda) - f(\kappa)\right] \mathbf{e}^{\kappa\lambda}.

Here, we show that this may be expressed as the commutator with the “graph operator”

\mathbf{G} = \sum_{\kappa,\lambda} \mathbf{e}^{\kappa\lambda}.

The result is quite simple and follows directly from the definition of \mathbf{G} and the multiplication rules, i.e.

f\mathbf{G} = \sum_{\kappa,\lambda} f(\kappa) \mathbf{e}^{\kappa\lambda}

and

\mathbf{G} f = \sum_{\kappa,\lambda} f(\lambda) \mathbf{e}^{\kappa\lambda}

so that

[\mathbf{G},f] = \sum_{\kappa,\lambda} \left[f(\lambda) - f(\kappa)\right] \mathbf{e}^{\kappa\lambda}

or

df = [\mathbf{G},f].
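As a quick numerical check (my own sketch, not from the paper): the multiplication rules above are exactly those of matrix units, so we can represent \mathbf{e}^\kappa as the diagonal matrix unit E_{\kappa\kappa}, \mathbf{e}^{\kappa\lambda} as the off-diagonal unit E_{\kappa\lambda}, and verify df = [\mathbf{G},f] in a few lines of Python, taking the complete directed graph on four nodes:

```python
import numpy as np

n = 4
f_vals = np.array([1.0, 3.0, -2.0, 5.0])  # the values f(kappa)

# A 0-form f = sum_k f(k) e^k is a diagonal matrix
f = np.diag(f_vals)

# Left/right multiplication by f projects out the endpoint values
E01 = np.zeros((n, n)); E01[0, 1] = 1.0        # the directed edge e^{01}
assert np.allclose(f @ E01, f_vals[0] * E01)   # f e^{01} = f(0) e^{01}
assert np.allclose(E01 @ f, f_vals[1] * E01)   # e^{01} f = f(1) e^{01}

# Graph operator G = sum over directed edges of e^{kl};
# here every off-diagonal pair is an edge
G = np.ones((n, n)) - np.eye(n)

# Exterior derivative as a commutator: df = [G, f] = Gf - fG
df = G @ f - f @ G

# Entry (k, l) is f(l) - f(k), as claimed
assert np.allclose(df, f_vals[None, :] - f_vals[:, None])
print(df)
```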

Written by Eric

October 27, 2010 at 3:47 pm

Barclays quants error on leveraged ETFs

with 14 comments

Cheng and Madhavan of Barclays Global Investors recently published a good article on leveraged ETFs

The Dynamics of Leveraged and Inverse Exchange-Traded Funds
April 8, 2009

Check it out.

The Error

They begin from a fairly standard starting point

dS_t = \mu S_t dt + \sigma S_t dW_t

However, they proceed to state that since

\frac{A_{t_i}-A_{t_{i-1}}}{A_{t_{i-1}}} = x\frac{S_{t_i}-S_{t_{i-1}}}{S_{t_{i-1}}}

“holds for any period”, then it follows that

\frac{dA_t}{A_t} = x\frac{dS_t}{S_t}

where A_t is the ETF NAV and x is the leverage factor.

Unfortunately, that is not correct. The problem is that

\frac{A_{t_i}-A_{t_{i-1}}}{A_{t_{i-1}}} = x\frac{S_{t_i}-S_{t_{i-1}}}{S_{t_{i-1}}}.

only holds when t_i - t_{i-1} is 1 day. Otherwise, we could let t_i - t_{i-1} be 1 year and this would say that the 1-year return of the ETF is x times the 1-year return of the index, which we already know is not true.
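To see this concretely, here is a tiny two-day example (my own illustration):

```python
# Index moves +10% then -10%; consider a 2x daily-rebalanced leveraged ETF
r = [0.10, -0.10]
x = 2.0  # leverage factor

index_growth = (1 + r[0]) * (1 + r[1])        # 0.99, i.e. a -1% two-day return
etf_growth = (1 + x * r[0]) * (1 + x * r[1])  # 0.96, i.e. a -4% two-day return

print(etf_growth - 1)          # -0.04: the actual two-day ETF return
print(x * (index_growth - 1))  # -0.02: what "holds for any period" would predict
```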

This should have also been obvious by plugging t=1 into their final expression

\frac{A_t}{A_0} = \left(\frac{S_t}{S_0}\right)^x \exp\left[\frac{\left(x-x^2\right)\sigma^2 t}{2}\right],

which violates the relation defining leveraged ETFs that they started with. As a result of this error, their discussion of return dynamics in Section 4 must be re-examined.

The Solution

The correct way to look at this is to let

G_{i-1,i} =\frac{S_{t_i}}{S_{t_{i-1}}} and G_{x,i-1,i} = \frac{A_{t_i}}{A_{t_{i-1}}}.

If \Delta t is 1 day, then

\begin{aligned} G_{x,i-1,i} &= 1 + x \left(G_{i-1,i} - 1\right) \\ &= (1-x)\left[1+\left(\frac{x}{1-x}\right) G_{i-1,i}\right]\end{aligned}

so that

\begin{aligned} G_{x,0,n} &= \prod_{i=1}^n G_{x,i-1,i} \\ &= (1-x)^n\prod_{i=1}^n \left[1+\left(\frac{x}{1-x}\right) G_{i-1,i}\right].\end{aligned}

If we assume S_t is a geometric Brownian motion (as they do), then

G_{i-1,i} = \exp\left(\bar\mu \Delta t + \sigma \sqrt{\Delta t} W_{\Delta t}\right),

where \bar\mu = \mu - \frac{\sigma^2}{2} and W_{\Delta t} denotes a standard normal random variable. With a slight abuse of notation, we can drop the indices and let

G =\exp\left(\bar\mu \Delta t + \sigma\sqrt{\Delta t} W_{\Delta t}\right)

so that a product of i independent copies of G (one for each day), which we write as G^i, satisfies

G^i =\exp\left(\bar\mu i \Delta t + \sigma\sqrt{i\Delta t}W_{i \Delta t}\right).

This allows us to rewrite (using the binomial theorem)

\begin{aligned} G_{x,0,n} &= (1-x)^n \left[1+\left(\frac{x}{1-x}\right) G \right]^n \\ &=(1-x)^n \sum_{i=0}^n \binom{n}{i}\left(\frac{x}{1-x}\right)^i G^i. \end{aligned}

Noting that

E(G) = \exp\left[\left(\bar\mu + \frac{\sigma^2}{2}\right)\Delta t\right] = \exp\left(\mu\Delta t\right)

and

E(G^i) = \exp\left(\mu i\Delta t\right) = E(G)^i,

which holds precisely because G^i is a product of i independent daily factors, we arrive at a disappointingly simple, yet important, expression

\begin{aligned} E(G_{x,0,n}) &=(1-x)^n \sum_{i=0}^n \binom{n}{i}\left(\frac{x}{1-x}\right)^i E(G)^i \\ &= (1-x)^n \left[1+\left(\frac{x}{1-x}\right) E(G) \right]^n \\ &= \left[1-x+x E(G)\right]^n. \end{aligned}

The expression above governing leveraged ETFs is the starting point for further analysis. We will come back to this in a subsequent post.
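As a sanity check, the expectation formula can be verified by Monte Carlo under the same geometric Brownian motion assumption (a sketch of my own; the drift, volatility, and path-count values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

mu, sigma = 0.05, 0.30   # illustrative annualized drift and volatility
dt = 1.0 / 252           # one trading day
n, x = 252, 2.0          # one year of daily rebalancing at 2x leverage
n_paths = 200_000

mu_bar = mu - 0.5 * sigma**2

# Accumulate G_{x,0,n} = prod_i [1 + x (G_{i-1,i} - 1)] one day at a time
G_x = np.ones(n_paths)
for _ in range(n):
    G = np.exp(mu_bar * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
    G_x *= 1 + x * (G - 1)

closed_form = (1 - x + x * np.exp(mu * dt)) ** n
print(G_x.mean(), closed_form)  # agree to within Monte Carlo error
```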

To be continued…

Written by Eric

May 4, 2009 at 7:38 pm

Why do scientists go into finance?

with 2 comments

Why do mathematicians and physicists go into finance?

One reason people may sympathize with is mere survival. Job prospects for mathematicians and physicists in academia are horribly bleak. Each PhD program churns out tens if not hundreds of PhDs each year. How many PhDs do these same institutions hire each year? Fewer than ten, for sure. Most likely none. Scientists are a different breed. Most pursue higher education simply because they love what they do with little thought about what will happen after school. It is often not until a few months before being cut loose that many graduate students think to themselves, “Oh %$#%! What am I going to do now?” A good picture to keep in mind is the classic absent-minded professor.

Although Wall Street rolled out the red carpet to scientists in the 70’s and 80’s, I suspect the idea was not high on most graduate students’ minds at the time. When quantitative risk management systems began being deployed on a large scale in the 90’s, coincident with significant improvements in computational power, that marked a turning point. By the mid 90’s, Wall Street was becoming a clear beacon for mathematicians and physicists about to hit the job market. Leading up to Basel II setting capital requirements based on value-at-risk measurements in 2004, banks literally went on a hiring spree of PhDs. I know that when I first tested the waters on Wall Street in 2002, each advertised quant opening was receiving no fewer than 30 PhD resumes. Most of these had prior work experience in finance. Today, many physics and mathematics PhD programs offer minors in finance. Clearly, Wall Street is a destination for many graduating scientists these days.

Is survival the only reason scientists go to Wall Street? Clearly not. The real and only important reason physicists and mathematicians go into finance is that they can potentially make lots of money doing very interesting and rewarding work. Who wouldn’t want to work in finance? I know I absolutely fell in love with finance at first sight. The first time I set foot on a trading floor, I knew I had found my calling in life. It was a truly transformative experience.

There are about as many different kinds of quants as there are scientists. I have been fortunate enough to have seen the quant world from many perspectives. I started finance life as a “risk quant” working at a large bank with a group of 12 other PhDs building risk models that spanned trading desks across the globe. These guys have been getting a bad rap lately and I’ll have more to say about that another time. Make no mistake though. The risk quants know quite well the strengths and limitations of their models and, given more authority, they could have and would have kept the credit bubble from getting out of hand. Unfortunately, the reality is that risk quants have been relegated to secondary roles whose purpose is often to massage numbers to tell the risk managers what they want to hear. For example, at one point, a friend told me that their risk manager did not like the numbers produced for a particular trading desk. This trader had significant influence. So the risk manager came back and told them to recompute the correlation matrix until it output what the trader wanted to see. Did the quant have a choice? Not if he wanted to keep his job. At another point, another friend was told that they needed to modify the risk numbers coming out of the models because they were too high, which forced the bank to retain too much capital. He was warned that people higher up were becoming unhappy and that the entire group could be eliminated if they didn’t do something about it. Since the job market was so competitive and since the pay was quite good, there really was no incentive to rock the boat. This has absolutely nothing to do with poor models or “black swans”. It has everything to do with greed. Period.

There are some really good aspects of being a risk quant. Usually, it is a good entry point to other things since you get a general introduction to a large variety of securities. The typical entry requirements are often lower as well. The downside is that you are effectively a NARC with absolutely no authority. You may think your job is to rein in excessive risk takers, but the reality is that you are most likely a puppet for upper management.

As a byproduct of the proliferation of risk management systems, clients and investors are becoming increasingly demanding in terms of risk reporting. This has trickled down from investment banks on the “sell side” to money managers on the “buy side”. Traditional asset managers who previously had no interest in quants or their models are now being forced to hire quants simply due to client demands. This can be a very good place for scientists to end up. You will often come across as a super star rocket scientist regardless of what you actually contribute. The downside is that many traditional investors may view you as a necessary evil and don’t really want you there. It is a challenge in such an environment to demonstrate the value of the work you do. Yes, I am speaking from experience 🙂 There are definitely good things to be learned from investors who are firmly “anti-quant” though. I value the experience obtained from attempting to understand the way traditional investors think and invest. It has had a definite positive impact on the way I look at things. My advice to any quants moving into traditional asset management is to try to find a way to “quantify” your contributions. Make it clear that you are doing things that few others could do. My biggest mistake was assuming that my hard work and the contributions I was making would be obvious and rightfully recognized. Make sure you have champions and make sure these champions speak up for you. Working on the buy side can be quite rewarding both scientifically and financially. I know it is where I belong.

Another type of quant is the “front-office quant” whose job it is to build derivatives models to assist traders directly. From my experience, this is where most quants would like to end up. It is often fast-paced and quite demanding. You have to be willing to be brutalized and cannot be sensitive to foul language 🙂 A part of me would love to work on a fast-paced desk. I almost look at these guys as the rock stars of quants. These guys can enjoy quite ridiculous compensation since they participate more directly in the profit sharing. Plus, the closer to the money you are, the better. This role can also lead to opportunities to become a trader. I think secretly (or not so secretly) most quants dream of becoming traders.

When I grow up, I hope to become a quantitative portfolio manager. I envision this as somewhat of a hybrid between the traditional asset manager and the traditional quant. People need some place to put their retirement investments. Traditional asset managers have let many retirees down in a bad way. They often charge high fees for unremarkable performance. Many asset managers saw the current crisis coming and positioned themselves appropriately. Others had their heads in the sand for far too long and ended up destroying a lot of hard-earned wealth.

I love finance. I do not feel like I’ve given anything up by leaving physics. The modeling is quite enjoyable and, regardless of what some talking heads in the media would have you believe, can be quite valuable to investors. Any decent credit model was screaming that fixed-income securities were grossly overpriced leading up to the crash. I know that I literally begged my research directors to let me work with the high yield analysts when I saw the risk premium go negative in 2006. Every other quant I talked to knew it too. As long as the music plays, you need to keep dancing, right?

What do I think about markets now? I hope to say more in a separate post, but I started this blog on July 10, 2007 with a post entitled:

“The End is Near”

At the time, I claimed to be an optimist and I am. I was scared because very few others were scared. Now, everyone is scared, as they should be, but I see that as the first step to recovery. You have to recognize how serious the situation is before it can get better. Spreads in fixed income have priced in some very gruesome scenarios. I think many of these gruesome scenarios will come to pass. Corporate defaults will obviously increase and this will put a strain on the CDS market. I was more scared about this before, but recent efforts to move CDS to clearinghouses have dramatically reduced my fears. There will be more blood before things hit a bottom, but investors are slowly beginning to see beyond it. We’re not out of the woods by any means and risks remain extremely elevated, but I am optimistic that in 2-3 years, the equity markets will be much higher than they are today, regardless of how low they go in the interim.

Written by Eric

February 28, 2009 at 12:24 pm

Quants R Us

leave a comment »

Howdy!

It’s been a while since I’ve been able to find the time to post anything. Work is completely insane, but I’m loving it 🙂

I’ve been in a “tool” building frenzy lately. When you’re building quant models, one of their best uses is sensitivity analysis, i.e. seeing how sensitive securities are to various parameters. After fiddling with one too many Excel charts, I decided to automate a lot of the sensitivity analysis we’re doing.

Building tools to automate mundane tasks is one thing, but I also enjoy throwing fancy looking interfaces on my tools. This is motivated in large part by Emanuel Derman. Both in his book and at conferences where I’ve heard him speak, he often emphasizes that some of the biggest impacts he made early in his quant career were in building nice user-friendly interfaces. Stuff that even the traders can figure out 😉

Building tools requires a bit of effort up front and when you are barraged by demands that need to be done yesterday, you can either succumb to the “easy” way and just crank out results as fast as you can (knowing full well you will have to repeat the same process next week), or double your working hours to get the tools working asap. I chose the latter, which explains my hiatus.

Things are working quite nicely now and it was pleasing to watch the younger quants admiring the analysis tool I built 🙂 There is no doubt in my mind that the extra effort it took to build some useful tools will be paid back tenfold (or more!) in time.

Written by Eric

October 9, 2007 at 8:00 pm