Python

Prices convolution, a practical approach

fuzzyperson

17/06/2015


In this post we will approach the problem of convolution from a matrix point of view.

What is convolution?

By convolution we mean composing two different functions to obtain a third one. The particular feature of convolution is that one function moves across the domain of the other.

Convolution is a useful tool in many areas of data science, from computing cross-correlations to image filtering. It is also closely related to the “stencil pattern” in parallel programming: for each point under study, this pattern gathers a surrounding neighbourhood, applies a reduction over it and writes the result back into the problem domain.
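As a rough illustration of that idea (a sketch of my own, not part of the original post), a one-dimensional three-point stencil that replaces each interior point by the mean of its neighbourhood can be written as:

import numpy as np

series = np.array([1., 2., 4., 7., 11., 16.])
# three-point stencil: each interior point becomes the mean of itself
# and its two neighbours (the edges are left untouched in this sketch)
smoothed = series.copy()
smoothed[1:-1] = (series[:-2] + series[1:-1] + series[2:]) / 3.0
print(smoothed)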

We define a convolution like this:

$$(f * g)(t) = \int_{-\infty}^{\infty} f(\tau)\, g(t - \tau)\, d\tau$$

In a discrete problem, the convolution becomes the following formula:

$$(f * g)[n] = \sum_{m=-\infty}^{\infty} f[m]\, g[n - m]$$

The previous formula is just a generalisation of a well-known technical indicator: the moving average. So we could use the convolution to smooth a curve or, as in this case, to calculate an exponential average over a price series.
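For instance (a minimal sketch of my own, assuming a NumPy setting), convolving a toy price series with n uniform weights 1/n reproduces the familiar n-period simple moving average:

import numpy as np

np.random.seed(0)
prices = np.cumsum(np.random.randn(100))  # toy random-walk price series

n = 5
uniform_weights = np.ones(n) / n
# 'valid' keeps only the positions where the full window fits,
# which is exactly the n-period simple moving average
sma = np.convolve(prices, uniform_weights, mode='valid')
print(sma[:3])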

But what about our matrix approach?

Well, the idea is to perform the convolution through a Toeplitz matrix in order to achieve a more efficient calculation. We can obtain the convolution (y) as the product between a Toeplitz matrix built from the h function (the convolution kernel in this specific domain) and the series we want to convolve (see the next formula):

$$y = H\,g, \qquad H = \begin{pmatrix} h_0 & 0 & 0 & \cdots \\ h_1 & h_0 & 0 & \cdots \\ h_2 & h_1 & h_0 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$

Let’s check out the Python code to achieve it:

import numpy as np
from scipy.linalg import toeplitz

# Convolution matrix functions:
# build a lower-triangular Toeplitz matrix whose first column is the
# reversed kernel h padded with zeros up to the series length (dim)
convolution_matrix = lambda h, dim: np.tril(toeplitz(np.r_[h[::-1], [0]*(dim - h.shape[0])]))
# convolve f with g as a matrix-vector product
convolution = lambda f, g: np.dot(convolution_matrix(f, g.shape[0]), g)

# Convolution functions:
# exponential weights evaluated on the grid 0, 1/n, ..., 1 and normalised to sum to 1
hN = lambda n: (lambda x: np.exp(x)/np.exp(x).sum())(np.r_[0:1+(1./n):1./n])

# Convolution over ~3 months (63 trading days):
np.random.seed(19940907)
prices = np.cumsum(np.random.randn(10000, 1))  # simulated random-walk price series
smoothed_prices = convolution(hN(63), prices)
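As a quick sanity check (my own addition rather than part of the original post), the Toeplitz-based product can be compared against NumPy's built-in convolution on a small series. Since the matrix is built from the reversed kernel, it should match np.convolve applied with the reversed weights, truncated to the series length:

# sanity check: compare the Toeplitz product with np.convolve on a small series
h_small = hN(3)                       # 4 exponential weights
g_small = np.arange(10, dtype=float)  # tiny toy series
toeplitz_result = convolution(h_small, g_small)
numpy_result = np.convolve(g_small, h_small[::-1])[:g_small.size]
print(np.allclose(toeplitz_result, numpy_result))  # True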

This is the result of applying the convolution with 63 evaluation points in the convolution matrix:

[Figure: blue price series, red exponential convolution function and green convolved series]

The red curve is the exponential function used to perform the convolution over the blue price series, and the green curve is the result of the convolution.
