Prices convolution, a practical approach

In this post, we approach the problem of convolution from a matrix point of view.

What is convolution?

By convolution we mean composing two different functions to obtain a third one. The particularity of convolution is that one function slides across the domain of the other.

Convolution is a useful tool in many areas of data science, from computing cross-correlations to filtering images. It is also closely related to the "stencil pattern" in parallel programming: for each point of the domain, this pattern gathers a surrounding neighbourhood, applies a reduction to it, and writes the result back into the problem domain.

We define the convolution of a function f with a kernel h like this:

$$(f * h)(t) = \int_{-\infty}^{\infty} f(\tau)\, h(t - \tau)\, d\tau$$
In a discrete problem, our convolution turns into the following sum:

$$y[n] = (f * h)[n] = \sum_{k} h[k]\, f[n - k]$$
The previous formula is just a generalization of a well-known technical indicator: the moving average. So we can use convolution to smooth a curve or, as in this case, to compute an exponential average over a price series.
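As a minimal illustration of that point (our own sketch, not part of the original post), a simple moving average is exactly a discrete convolution with a constant kernel:

```python
import numpy as np

# A 3-day simple moving average is a convolution with a constant kernel:
prices = np.array([10., 11., 13., 12., 14.])
kernel = np.ones(3) / 3.0

# 'valid' keeps only the positions where the kernel fully overlaps the series
sma = np.convolve(prices, kernel, mode='valid')
print(sma)  # each value is the average of a 3-price window
```

Replacing the constant kernel with decaying weights gives the exponential average used below.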

But what about our matrix approach?

Well, the idea is to perform the convolution through a Toeplitz matrix in order to achieve a more efficient calculation. We obtain our convolution y as the matrix-vector product between a Toeplitz matrix built from the h function (the convolution kernel in this setting) and the series f that we want to convolve:

$$y = T_h \cdot f$$
Let’s check out the Python code to achieve it:

import numpy as np
from scipy.linalg import toeplitz

# Convolution matrix: lower-triangular Toeplitz matrix built from the
# kernel h, zero-padded up to dimension dim:
convolution_matrix = lambda h, dim: np.tril(toeplitz(np.r_[h[::-1], [0] * (dim - h.shape[0])]))

# Convolution: matrix-vector product between the Toeplitz matrix of f
# and the series g:
convolution = lambda f, g:, g.shape[0]), g)

# Convolution kernel: exponential weights normalized to sum one:
hN = lambda n: (lambda x: np.exp(x) / np.exp(x).sum())(np.r_[0:1 + (1. / n):1. / n])

# Random-walk price series, convolved over ~3 months (63 trading days):
prices = np.cumsum(np.random.randn(10000, 1))
smoothed = convolution(hN(62), prices)

The results of applying the convolution with 63 evaluation points in the convolution matrix are:


The blue curve is the price series, the red curve is the exponential function used to perform the convolution, and the green curve is the result of the convolution.
