# Concepts of Entropy in Finance: Transfer entropy

### Konstantinos Pappas

#### 10/06/2021

The concept of entropy has many useful applications in finance such as measuring risk, uncertainty, or noise in a signal. In this post we will focus on transfer entropy, a useful tool for causal inference between financial time series.

## What is entropy?

Entropy, broadly, quantifies the uncertainty, ambiguity, and disorder of a stochastic process. The concept appears across many scientific fields (thermodynamics, statistical mechanics, information theory, topological dynamics, and econometrics), largely because it is not restricted by the form of the theoretical probability distribution of the random variables it is applied to, and as a measure it can capture different properties: energy that cannot produce work, disorder, uncertainty, randomness, complexity, causality, and so on. Depending on the axioms imposed on the probability distributions whose entropic quantities we want to measure, we obtain a variety of entropic functions that are useful in finance: Shannon entropy, Rényi entropy, Tsallis entropy, conditional entropy, permutation entropy, approximate entropy, and transfer entropy.

For a detailed introduction to calculating the entropy of a random variable and its applications in machine learning, visit the following links:

## What is transfer entropy?

Transfer entropy (TE) measures the amount of time-directed information between two random processes. TE from a process X to a process Y is the reduction in Shannon uncertainty about the future evolution of Y, given its own past, when knowledge of the past evolution of X is added. The TE formula can be written as a sum of Shannon entropies:

$$TE_{X \rightarrow Y}=H\left(Y_{t-1: t-L}, X_{t-1: t-L}\right)-H\left(Y_{t}, Y_{t-1: t-L}, X_{t-1: t-L}\right)+H\left(Y_{t}, Y_{t-1: t-L}\right)-H\left(Y_{t-1: t-L}\right)$$

Where Shannon entropy is defined as:
$$H(X) = -\sum_{x_{i}\in X}P(X=x_{i})\cdot \log P(X=x_{i})$$
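As a quick sanity check, Shannon entropy can be computed directly from an empirical probability distribution. A minimal sketch in Python (the distributions below are illustrative, not taken from the data in this post):

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H(X) = -sum p_i * log(p_i), with 0 * log(0) taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability states
    return -np.sum(p * np.log(p)) / np.log(base)

# A fair coin attains the maximal entropy of 1 bit; a biased coin carries less uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits
```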

or the difference of conditional entropies:

$$TE_{X \rightarrow Y}=H\left(Y_{t} \mid Y_{t-1: t-L}\right)-H\left(Y_{t} \mid Y_{t-1: t-L}, X_{t-1: t-L}\right)$$

Where conditional entropy is defined as:
$$H(X|Y) = -\sum_{x,y} p(x,y)\cdot \log p(x|y)$$
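Conditional entropy follows the same pattern, estimated here from a joint probability table. A minimal illustration (the joint distributions are made up for the example):

```python
import numpy as np

def conditional_entropy(p_xy, base=2):
    """H(X|Y) = -sum_{x,y} p(x,y) * log p(x|y), where rows index x and columns index y."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_y = p_xy.sum(axis=0, keepdims=True)          # marginal distribution of Y
    mask = p_xy > 0                                # terms with p(x,y) = 0 contribute 0
    p_x_given_y = np.where(mask, p_xy / p_y, 1.0)  # log(1) = 0 for the masked-out cells
    return -np.sum(p_xy[mask] * np.log(p_x_given_y[mask])) / np.log(base)

# If X and Y are independent, H(X|Y) equals H(X): knowing Y removes no uncertainty.
p_indep = np.array([[0.25, 0.25],
                    [0.25, 0.25]])
print(conditional_entropy(p_indep))   # 1.0 bit

# If X is fully determined by Y, H(X|Y) is zero.
p_determ = np.array([[0.5, 0.0],
                     [0.0, 0.5]])
print(conditional_entropy(p_determ))  # 0.0 bits
```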

Because TE does not assume any particular functional form for the interactions between systems, it is an important tool for analysing causal relationships: it remains applicable when the model assumptions of typical causality measures such as Granger causality do not hold, for example with non-linear signals, a common occurrence in finance.
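To make this model-free character concrete, here is a minimal histogram-based TE estimator (lag L = 1, quartile-discretized states), applied to a synthetic pair where X drives Y through a non-linear map. This is only an illustrative sketch, not the PyCausality estimator; the coupling and all parameters are invented for the example:

```python
import numpy as np

def transfer_entropy(x, y, ):
    """Plug-in histogram estimate of TE_{X->Y} with lag 1, in bits:
    H(Y_t, Y_{t-1}) - H(Y_{t-1}) - H(Y_t, Y_{t-1}, X_{t-1}) + H(Y_{t-1}, X_{t-1})."""
    # Discretize each series into its empirical quartile bins (states 0..3)
    x = np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))
    y = np.digitize(y, np.quantile(y, [0.25, 0.5, 0.75]))
    yt, yp, xp = y[1:], y[:-1], x[:-1]   # present of Y, past of Y, past of X

    def H(*cols):
        # Joint Shannon entropy of the discretized columns, from state counts
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    return H(yt, yp) - H(yp) - H(yt, yp, xp) + H(yp, xp)

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.empty_like(x)
y[0] = 0.0
for t in range(1, len(x)):
    # Non-linear coupling: Y depends on the *square* of lagged X,
    # a relationship a linear Granger test is poorly suited to detect
    y[t] = 0.6 * x[t - 1] ** 2 + 0.3 * rng.normal()

print(transfer_entropy(x, y))  # clearly positive: X drives Y
print(transfer_entropy(y, x))  # near zero: no feedback from Y to X
```

The asymmetry between the two directions is the point: TE picks up the X → Y information flow even though the dependence is purely non-linear.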

## Now let’s see a practical example

We will examine what causal relationships arise between the S&P 500 index and three instruments that do not have an explicit relationship with the dependent variable: the VIX index, the 10-year US Treasury yield, and the SPDR Bloomberg Barclays 10+ Year U.S. Corporate Bond ETF.

For this example, a 6-day lag and a 2-day stride window are used to calculate the local non-linear transfer entropy of the log returns of each series. The data was retrieved with yfinance, and the transfer entropy calculations were done with PyCausality.
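The download itself needs a network call, but the log-return preprocessing can be sketched on its own. A minimal version with pandas, where the price series is a stand-in for the `Close` column returned by yfinance:

```python
import numpy as np
import pandas as pd

# Stand-in for the 'Close' column of a yfinance download (values are illustrative)
close = pd.Series([4300.0, 4320.5, 4280.0, 4301.2, 4355.9])

# Log returns: r_t = ln(P_t / P_{t-1}); the first value is NaN and is dropped
log_returns = np.log(close / close.shift(1)).dropna()
print(log_returns)
```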

Local entropy values (shown on the y-axis) may be either positive or negative and can be interpreted as informative or misinformative, respectively.

It appears that all the series tested have significant entropy values only at certain points in time (as expected given the noisy nature of the data), mainly during 2020:

- VIX has a stronger causal impact on the S&P 500 index (higher TE).
- The S&P 500 index has a stronger causal impact on the Corporate Bond ETF.
- There is no clear causal direction between the 10-year Treasury yield and the S&P 500 index; they show an equal net TE across time.

We can search for a potentially more stable causal relationship by optimizing the time lag and window used in the calculations.

## References

Bossomaier, T., Barnett, L., Harré, M., & Lizier, J. T. (2016). An Introduction to Transfer Entropy. Springer: Berlin, Germany.

Marschinski, R., & Kantz, H. (2002). Analysing the information flow between financial time series. The European Physical Journal B - Condensed Matter and Complex Systems, 30(2), 275-281.

Knuth, K. H. (2006). Optimal data-based binning for histograms. arXiv preprint physics/0605197.

San Liang, X. (2014). Unraveling the cause-effect relation between time series. Physical Review E, 90(5), 052150.

Schreiber, T. (2000). Measuring information transfer. Physical Review Letters, 85(2), 461.

Zaremba, A., & Aste, T. (2014). Measures of causality in complex datasets with application to financial data. Entropy, 16(4), 2309-2349.