This post explains how to install Entropy++ on macOS and Linux (Debian) and shows, with various examples, how the library can be used.
Installing Entropy++ on macOS
Install the required packages:
brew install cmake llvm boost gflags glog
LLVM is required because Entropy++ uses OpenMP for parallelisation, which Apple's default toolchain does not support out of the box.
Optional
brew install ninja
To download and compile the code, follow these steps:
git clone https://github.com/kzahedi/entropy
mkdir entropy/build
cd entropy/build
cmake ..
make -j
You can also use Ninja instead of make:

git clone https://github.com/kzahedi/entropy
mkdir entropy/build
cd entropy/build
cmake .. -G Ninja
ninja
Installing Entropy++ on Debian
Install the required packages:
sudo apt-get install git cmake g++ libboost-dev libgflags-dev libgoogle-glog-dev
Optional
1 |
sudo apt-get install cmake-curses-gui ninja |
To download and compile the code, follow these steps:
git clone https://github.com/kzahedi/entropy
mkdir entropy/build
cd entropy/build
cmake ..
make -j
You can also use Ninja instead of make:

git clone https://github.com/kzahedi/entropy
mkdir entropy/build
cd entropy/build
cmake .. -G Ninja
ninja
Available Tools
Currently, Entropy++ only works with discretised data, which is why the discretisation procedure is discussed first.
Discretisation
Entropy++ has a class called Container, whose purpose is to store real-valued data and preprocess it for the methods discussed below. All functions (entropy, mutual information, etc.) take a Container as their input.
For efficiency, the size of a container must be specified before it is used:
DContainer* container = new DContainer(3,2);
creates a container with 3 rows and 2 columns.
Containers are easily filled in the following way:
double index = 0.0;
for(int row = 0; row < 3; row++)
{
  for(int column = 0; column < 2; column++)
  {
    (*container) << index;
    index = index + 0.1;
  }
}
Containers can be set to fill rows first or columns first. By default, containers are filled row-wise, which means that, for the example above, the following code
cout << *container << endl;
will lead to the following output:

3x2
0.0 0.1
0.2 0.3
0.4 0.5
Column-wise filling would have resulted in:

3x2
0.0 0.3
0.1 0.4
0.2 0.5
Containers can be discretised column-wise or over all columns at once. To discretise, the number of bins and the domain (value range) must be specified for each column. It is also possible to set the same bins and domain for all columns simultaneously:
container->setBins(10);
container->setDomains(0.0,0.5);
ULContainer* colwise = container->discretiseByColumn();
ULContainer* combined = container->discretise();
Container itself is a template class: DContainer is equivalent to Container<double>, and ULContainer is equivalent to Container<unsigned long>.
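In other words, the relationship can be thought of as the following two type aliases (a sketch; the actual declarations in the library may differ):

typedef Container<double>        DContainer;  // raw, real-valued input data
typedef Container<unsigned long> ULContainer; // discretised data (bin indices)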
Entropy
Entropy is defined in the following way:

H(X) = - \sum_{x} p(x) \log p(x)
The following code is taken from entropy_test.cpp, which is provided with Entropy++:

DContainer X(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  X << i;
}
X.setDomains(0.0, 999.0);
X.setBinSizes(1000);
ULContainer *dx = X.discretise();
double s = entropy::H(dx);
double t = entropy::sparse::H(dx);
delete dx;
Both t and s are equal to 1.0: the 1000 samples are distributed uniformly over the 1000 bins, so the entropy is maximal (the returned value appears to be normalised to the range [0,1]).
There are two implementations of the entropy H, which can be found in the namespaces entropy and entropy::sparse. The functions in entropy use the Matrix implementation and are suitable for small or non-sparse data sets. The implementation in entropy::sparse uses a sparse matrix implementation, which is preferable for large, sparse data sets.
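As a quick sanity check (a minimal sketch using only the API shown above), a constant signal should yield zero entropy, because all samples fall into a single bin:

DContainer X(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  X << 0.5;                  // every sample is identical
}
X.setDomains(0.0, 1.0);
X.setBinSizes(10);
ULContainer *dx = X.discretise();
double h = entropy::H(dx);   // expected: 0.0, all probability mass in one bin
delete dx;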
Mutual Information
Mutual Information is defined in the following way:

I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x) p(y)}
The following code is taken from mi_test.cpp, which is provided with Entropy++:

DContainer X(1000,1);
DContainer Y(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  X << cos(i/10.0);
  Y << sin(i/5.0);
}
X.setDomains(-1.0,1.0);
X.setBinSizes(100);
Y.setDomains(-1.0,1.0);
Y.setBinSizes(100);
ULContainer *dx = X.discretise();
ULContainer *dy = Y.discretise();
double s = entropy::MI(dx, dy);
double t = entropy::sparse::MI(dx, dy);
delete dx;
delete dy;
As for the entropy, there are two implementations of the mutual information: entropy::MI uses the Matrix implementation and is suitable for small or non-sparse data sets, while entropy::sparse::MI uses a sparse matrix implementation, which is preferable for large, sparse data sets.
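For intuition, here is a sketch (same API assumptions as the test above) of the extreme case of two identical signals, for which the mutual information equals the entropy, I(X;Y) = H(X):

DContainer X(1000,1);
DContainer Y(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  X << cos(i/10.0);
  Y << cos(i/10.0);              // Y is an exact copy of X
}
X.setDomains(-1.0,1.0);
X.setBinSizes(100);
Y.setDomains(-1.0,1.0);
Y.setBinSizes(100);
ULContainer *dx = X.discretise();
ULContainer *dy = Y.discretise();
double mi = entropy::MI(dx, dy); // equals entropy::H(dx) for identical signals
delete dx;
delete dy;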
Conditional Mutual Information
Conditional Mutual Information is defined in the following way:

I(X;Y|Z) = \sum_{x,y,z} p(x,y,z) \log \frac{p(x,y|z)}{p(x|z) p(y|z)}
The following code is taken from cmi_test.cpp, which is provided with Entropy++:

DContainer X(1000,1);
DContainer Y(1000,1);
DContainer Z(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  X << cos(i/10.0);
  Y << sin(i/5.0);
  Z << cos(i/5.0) * sin(i/5.0);
}
X.setDomains(-1.0,1.0);
X.setBinSizes(100);
Y.setDomains(-1.0,1.0);
Y.setBinSizes(100);
Z.setDomains(-1.0,1.0);
Z.setBinSizes(100);
ULContainer *dx = X.discretise();
ULContainer *dy = Y.discretise();
ULContainer *dz = Z.discretise();
double s = entropy::CMI(dx, dy, dz);
double t = entropy::sparse::CMI(dx, dy, dz);
delete dx;
delete dy;
delete dz;
Again, there are two implementations of the conditional mutual information: entropy::CMI uses the Matrix implementation and is suitable for small or non-sparse data sets, while entropy::sparse::CMI uses a sparse matrix implementation, which is preferable for large, sparse data sets.
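One useful property for testing: conditioning on a constant Z changes nothing, so the conditional mutual information reduces to the plain mutual information. A sketch, reusing dx and dy from the example above:

DContainer Z(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  Z << 0.0;                  // constant, carries no information
}
Z.setDomains(-1.0,1.0);
Z.setBinSizes(100);
ULContainer *dz = Z.discretise();
// for a constant Z, this should equal entropy::MI(dx, dy)
double cmi = entropy::CMI(dx, dy, dz);
delete dz;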
Predictive Information
Predictive Information is defined in the following way (one-step approximation), i.e., as the mutual information between consecutive time steps:

PI(X) = I(X_{t+1}; X_t) = \sum_{x_{t+1}, x_t} p(x_{t+1}, x_t) \log \frac{p(x_{t+1}, x_t)}{p(x_{t+1}) p(x_t)}
It can be used in the following way:

DContainer X(1000,1);
for(float i = 0; i < 1000.0; i = i + 1.0)
{
  X << cos(i/10.0);
}
X.setDomains(-1.0,1.0);
X.setBinSizes(100);
ULContainer *dx = X.discretise();
double s = entropy::PI(dx);
double t = entropy::sparse::PI(dx);
delete dx;
As before, there are two implementations of the predictive information: entropy::PI uses the Matrix implementation and is suitable for small or non-sparse data sets, while entropy::sparse::PI uses a sparse matrix implementation, which is preferable for large, sparse data sets.
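Since the one-step predictive information is the mutual information between consecutive time steps, it can also be approximated by hand with two time-shifted copies of the data. This sketch (same API assumptions) should give a value close to entropy::PI(dx) from the example above:

DContainer X1(999,1);        // X_t:     samples 0 .. 998
DContainer X2(999,1);        // X_{t+1}: samples 1 .. 999
for(float i = 0; i < 999.0; i = i + 1.0)
{
  X1 << cos(i/10.0);
  X2 << cos((i + 1.0)/10.0);
}
X1.setDomains(-1.0,1.0);
X1.setBinSizes(100);
X2.setDomains(-1.0,1.0);
X2.setBinSizes(100);
ULContainer *d1 = X1.discretise();
ULContainer *d2 = X2.discretise();
double pi = entropy::MI(d1, d2); // approximates entropy::PI on the full series
delete d1;
delete d2;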
Morphological Computation
Currently, Entropy++ provides a set of quantifications of morphological computation, which have been investigated in several publications (see the bibliography below).
MC_W
MC_W [1] is defined in the following way, where W and W' denote consecutive world states and A denotes the action:

MC_W = I(W'; W | A)
MC_A
MC_A [1] is defined in terms of the conditional mutual information I(W'; A | W): morphological computation is high when the action A, given the current world state W, contributes little information about the next world state W' (see [1] for the exact definition).
MC_MI
MC_MI [2] compares the information flow within the world with the information flow through the controller; it is defined as

MC_MI = I(W'; W) - I(A; S)

where S denotes the sensor state (see [2] for details).
Bibliography
[1] K. Zahedi and N. Ay, "Quantifying Morphological Computation," Entropy, vol. 15, no. 5, pp. 1887-1915, 2013. http://www.mdpi.com/1099-4300/15/5/1887

[2] K. Ghazi-Zahedi, D. F. B. Haeufle, G. F. Montufar, S. Schmitt, and N. Ay, "Evaluating Morphological Computation in Muscle and DC-motor Driven Models of Hopping Movements," Frontiers in Robotics and AI, vol. 3, no. 42, 2016. http://www.frontiersin.org/computational_intelligence/10.3389/frobt.2016.00042/abstract