Entropy++

This post explains how to install Entropy++ on macOS and Linux (Debian) and shows in a number of examples how the library can be used.

Installing Entropy++ on macOS

Download the required packages.
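With Homebrew, this typically looks as follows (the package selection is an assumption, not the post's original listing):

    # Build tools; llvm provides an OpenMP-capable compiler (see below).
    brew install cmake git llvm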

LLVM is required because Entropy++ uses OpenMP for parallelisation, which the compiler shipped with Xcode does not support out of the box.

Optional
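If you want to build with Ninja (see below), install it as well:

    brew install ninja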

To download and compile the code, follow these steps:
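A typical out-of-source CMake build looks like this (the repository URL is an assumption, and the compiler paths assume a standard Homebrew LLVM installation):

    # Repository URL is assumed; adjust as necessary.
    git clone https://github.com/kzahedi/entropy.git entropy++
    cd entropy++
    mkdir build && cd build
    # Use the Homebrew LLVM so that OpenMP is available.
    CC=/usr/local/opt/llvm/bin/clang CXX=/usr/local/opt/llvm/bin/clang++ cmake ..
    make
    sudo make install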

You can also use Ninja instead of make:
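    # From the build directory: generate Ninja files and build.
    cmake -G Ninja ..
    ninja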

Installing Entropy++ on Debian

Download the required packages.
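On Debian, the stock GCC tool chain already supports OpenMP, so the following is sufficient (the package selection is an assumption, not the post's original listing):

    sudo apt-get install build-essential cmake git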

Optional
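If you want to build with Ninja (see below), install it as well:

    sudo apt-get install ninja-build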

To download and compile the code, follow these steps:
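The steps are the same as on macOS, except that no special compiler needs to be selected (repository URL again assumed):

    git clone https://github.com/kzahedi/entropy.git entropy++
    cd entropy++
    mkdir build && cd build
    cmake ..
    make
    sudo make install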

You can also use Ninja instead of make:
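    # From the build directory: generate Ninja files and build.
    cmake -G Ninja ..
    ninja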

Available Tools

Currently, Entropy++ only works with discretised data. This is why the discretisation procedure in Entropy++ is discussed first.

Discretisation

Entropy++ has a class called Container, whose purpose is to store real-valued data and preprocess it for the methods below. All functions (entropy, mutual information, etc.) take Containers as their input.

For efficiency, the size of a container must be specified before it is used. The following snippet creates a container with 3 rows and 2 columns (the header path and constructor signature are assumptions):
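    // Assumed header path and namespace; adjust to your installation.
    #include <entropy++/Container.h>
    using namespace entropy;

    DContainer c(3, 2); // create a container with 3 rows and 2 columns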

Containers are easily filled in the following way:
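The streaming operator used below is an assumption about the Container interface:

    // Fill the 3x2 container with six values; by default this is row-wise.
    c << 1.0 << 2.0
      << 3.0 << 4.0
      << 5.0 << 6.0;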

Containers can be configured to be filled row-wise or column-wise. By default, containers are filled row-wise, which means that, for the example above, the following code (the stream output operator is an assumed part of the interface)
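    // Print the container contents (assumed operator<< for Containers).
    std::cout << c << std::endl;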

will lead to the following output
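    1 2
    3 4
    5 6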

Column-wise filling would have resulted in
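    1 4
    2 5
    3 6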

Containers can be discretised column-wise or over all columns at once. To discretise, one has to specify the bins and domains for each column. It is also possible to set the same bins and domains for all columns simultaneously:
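A sketch with assumed method names (the actual Entropy++ interface may differ):

    // Set the same domain and number of bins for all columns at once
    // (method names are assumptions).
    c.setDomains(1.0, 6.0); // values of the example lie in [1, 6]
    c.setBinSizes(2);       // two bins per column

    // Discretise over all columns; the result is an integer-valued container.
    ULContainer* d = c.discretise();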

Container itself is a template class: DContainer is equivalent to Container<double>, and ULContainer is equivalent to Container<unsigned long>.

Entropy

Entropy is defined in the following way

H(X) = - \sum_{x\in\mathcal{X}} p(x) \log_2 p(x)

The following code illustrates the usage shown in entropy_test.cpp, which ships with Entropy++ (the headers and call signatures below are assumptions, not the verbatim test file):
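    #include <entropy++/Container.h> // assumed header paths
    #include <entropy++/H.h>
    using namespace entropy;

    // Two symbols, each occurring half of the time, give exactly one bit.
    ULContainer x(4, 1);
    x << 0 << 1 << 0 << 1; // assumed streaming fill, as above

    double t = H(&x);         // dense implementation (entropy::H)
    double s = sparse::H(&x); // sparse implementation (entropy::sparse::H)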

Both t and s are equal to 1.0.
There are two implementations of the entropy H, one in the entropy namespace and one in entropy::sparse. The classes in entropy use a dense Matrix implementation and are suitable for small data sets or data sets that are not sparse; the H implementation in entropy::sparse uses a sparse matrix, which is preferable for large, sparse data sets.

Mutual Information

Mutual Information is defined in the following way

I(X,Y) = \sum_{x\in\mathcal{X},y\in\mathcal{Y}} p(x,y) \log_2 \frac{p(x,y)}{p(x)p(y)}

The following code illustrates the usage shown in mi_test.cpp, which ships with Entropy++ (again, the exact calls are assumptions):
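    #include <entropy++/MI.h> // assumed header
    using namespace entropy;

    // Two binary sequences; X determines Y completely here, so I(X,Y) = 1 bit.
    ULContainer x(4, 1), y(4, 1);
    x << 0 << 1 << 0 << 1;
    y << 1 << 0 << 1 << 0;

    double mi  = MI(&x, &y);         // dense implementation (assumed name)
    double mis = sparse::MI(&x, &y); // sparse implementation (assumed name)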

There are two implementations of the mutual information, one in the entropy namespace and one in entropy::sparse. The classes in entropy use a dense Matrix implementation and are suitable for small data sets or data sets that are not sparse; the mutual information implementation in entropy::sparse uses a sparse matrix, which is preferable for large, sparse data sets.

Conditional Mutual Information

Conditional Mutual Information is defined in the following way

I(X,Y|Z) = \sum_{x\in\mathcal{X},y\in\mathcal{Y},z\in\mathcal{Z}} p(x,y,z) \log_2 \frac{p(x|y,z)}{p(x|z)}

The following code illustrates the usage shown in cmi_test.cpp, which ships with Entropy++ (again, the exact calls are assumptions):
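    #include <entropy++/CMI.h> // assumed header
    using namespace entropy;

    ULContainer x(4, 1), y(4, 1), z(4, 1);
    x << 0 << 1 << 0 << 1;
    y << 0 << 0 << 1 << 1;
    z << 0 << 1 << 1 << 0;

    // I(X,Y|Z) with the argument order X, Y, Z (assumed).
    double cmi  = CMI(&x, &y, &z);
    double cmis = sparse::CMI(&x, &y, &z);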

There are two implementations of the conditional mutual information, one in the entropy namespace and one in entropy::sparse. The classes in entropy use a dense Matrix implementation and are suitable for small data sets or data sets that are not sparse; the conditional mutual information implementation in entropy::sparse uses a sparse matrix, which is preferable for large, sparse data sets.

Predictive Information

Predictive Information is defined in the following way (one-step approximation)

PI(X) = I(X_{t},X_{t+1}) = \sum_{x_t,x_{t+1}\in\mathcal{X}} p(x_t,x_{t+1}) \log_2 \frac{p(x_t,x_{t+1})}{p(x_t)p(x_{t+1})}
It can be used in the following way (as before, the exact calls are assumptions):
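    #include <entropy++/PI.h> // assumed header
    using namespace entropy;

    // A single time series; PI treats consecutive rows as (x_t, x_{t+1}) pairs.
    ULContainer x(6, 1);
    x << 0 << 1 << 0 << 1 << 0 << 1;

    double pi  = PI(&x);         // dense implementation (assumed name)
    double pis = sparse::PI(&x); // sparse implementation (assumed name)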

There are two implementations of the predictive information, one in the entropy namespace and one in entropy::sparse. The classes in entropy use a dense Matrix implementation and are suitable for small data sets or data sets that are not sparse; the predictive information implementation in entropy::sparse uses a sparse matrix, which is preferable for large, sparse data sets.

Morphological Computation

Currently, Entropy++ provides a set of quantifications of morphological computation, which have been investigated in different publications (see below).

MC_W

MC_W[1] is defined in the following way

\mathrm{MC}_\mathrm{W} = \sum_{w',w\in\mathcal{W},a\in\mathcal{A}} p(w',w,a)\log_2 \frac{p(w'|w,a)}{p(w'|a)}

MC_A

MC_A[1] is defined as

\mathrm{MC}_\mathrm{A} = \sum_{w',w\in\mathcal{W},a\in\mathcal{A}} p(w',w,a)\log_2 \frac{p(w'|w,a)}{p(w'|w)}

MC_MI

MC_MI[2] is defined as

\mathrm{MC}_\mathrm{MI} = I(W';W) - I(A;S) = \sum_{w',w\in\mathcal{W}} p(w',w) \log_2\frac{p(w',w)}{p(w')p(w)} - \sum_{s\in\mathcal{S},a\in\mathcal{A}} p(s,a) \log_2\frac{p(s,a)}{p(s)p(a)}

Bibliography

[1] K. Zahedi and N. Ay, "Quantifying morphological computation," Entropy, vol. 15, no. 5, pp. 1887–1915, 2013. http://www.mdpi.com/1099-4300/15/5/1887
[2] K. Ghazi-Zahedi, D. F. B. Haeufle, G. F. Montufar, S. Schmitt, and N. Ay, "Evaluating morphological computation in muscle and DC-motor driven models of hopping movements," Frontiers in Robotics and AI, vol. 3, no. 42, 2016. http://www.frontiersin.org/computational_intelligence/10.3389/frobt.2016.00042/abstract
