Gists


Here you can find code snippets and the source code of short, complete programs that might interest you.

Miscellaneous


omnipyseed

Omnipyseed is a tiny, simple Python package for seeding the random number generators (RNGs) of Python, NumPy, and PyTorch.
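
The package's own API is not reproduced here; the sketch below simply shows the calls such a package typically wraps (the seed_everything name is hypothetical):

```python
import random

import numpy as np
import torch


def seed_everything(seed: int = 42) -> None:
    """Seed Python's, NumPy's, and PyTorch's RNGs (hypothetical helper,
    not omnipyseed's actual interface)."""
    random.seed(seed)                     # Python's built-in RNG
    np.random.seed(seed)                  # NumPy's legacy global RNG
    torch.manual_seed(seed)               # PyTorch CPU RNG
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)  # all CUDA devices


seed_everything(1234)
```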

timeseries_imputlorer

Imputlorer is a collection of scripts implementing univariate time series imputation methods relying on sklearn. In addition, the repository provides statistical tools for evaluating and comparing the performance of different imputation techniques. Furthermore, it allows the user to test the imputed data in regression problems via the XGBoost regressor. The user can optimize the XGBoost hyperparameters using Ray Tune.
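
As a point of reference (this is plain scikit-learn, not Imputlorer's own interface), univariate imputation can look like:

```python
import numpy as np
from sklearn.impute import SimpleImputer

# A univariate series with missing values encoded as NaN
x = np.array([1.0, 2.0, np.nan, 4.0, np.nan, 6.0]).reshape(-1, 1)

# Mean imputation; other strategies: "median", "most_frequent", "constant"
imputer = SimpleImputer(strategy="mean")
x_imputed = imputer.fit_transform(x)
print(x_imputed.ravel())
```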

Machine/Deep Learning


DNN for Solving PDEs & ODEs

This repository contains a primer for solving partial and ordinary differential equations using deep neural networks (DNNs). The primer consists of a PDF file explaining the basic concepts of deep learning and differential equations and the complete source code implementing the solutions.
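
To give a flavor of the approach (a toy sketch, not the repository's code), a network can be trained so that its output satisfies an ODE residual, here dy/dt = -y with y(0) = 1:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small fully connected network approximating y(t)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

t = torch.linspace(0, 2, 100).reshape(-1, 1)
t.requires_grad_(True)

for epoch in range(3000):
    optimizer.zero_grad()
    y = net(t)
    # dy/dt via automatic differentiation
    dydt = torch.autograd.grad(y, t, grad_outputs=torch.ones_like(y),
                               create_graph=True)[0]
    residual = dydt + y                # ODE: dy/dt + y = 0
    ic = net(torch.zeros(1, 1)) - 1.0  # initial condition y(0) = 1
    loss = (residual**2).mean() + (ic**2).mean()
    loss.backward()
    optimizer.step()

# The trained network should approximate y(t) = exp(-t) on [0, 2].
```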

Adversarial Validation

This Python class implements the adversarial validation (AV) method. One can use the AV method to test whether the training and test data sets come from the same distribution.
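
The general recipe (sketched below with scikit-learn; the adversarial_auc helper is hypothetical, not this class's API) is to label the training set 0 and the test set 1 and check how well a classifier can tell them apart; an AUC close to 0.5 suggests the two sets come from the same distribution.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def adversarial_auc(X_train, X_test):
    """Hypothetical helper (not this class's API): cross-validated AUC
    of a classifier trying to separate train from test samples."""
    X = np.vstack([X_train, X_test])
    y = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_test))])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()


rng = np.random.default_rng(0)
print(adversarial_auc(rng.normal(size=(500, 5)),
                      rng.normal(size=(200, 5))))  # ~0.5: same distribution
```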

Pytorch TimeseriesLoader

A PyTorch Dataset class for time series data sets. The class provides several methods for preprocessing raw time series data (Box-Cox transform, normalization, standardization, mu-law).
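
For orientation, a bare-bones sliding-window Dataset (not the repository's class, and with only standardization as preprocessing) might look like:

```python
import torch
from torch.utils.data import DataLoader, Dataset


class WindowedSeries(Dataset):
    """Each item is (window of past values, next value). Hypothetical sketch."""

    def __init__(self, series, window=16, standardize=True):
        x = torch.as_tensor(series, dtype=torch.float32)
        if standardize:
            x = (x - x.mean()) / (x.std() + 1e-8)
        self.x, self.window = x, window

    def __len__(self):
        return len(self.x) - self.window

    def __getitem__(self, i):
        return self.x[i:i + self.window], self.x[i + self.window]


series = torch.sin(torch.linspace(0, 20, 500))
loader = DataLoader(WindowedSeries(series), batch_size=32, shuffle=True)
```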

Time series collection

There are many ways to perform time series forecasting. One of the more modern approaches is to use artificial neural networks (or even deep neural networks) to model time series and produce predictions using those models. The present repository provides three mainstream artificial neural networks, an LSTM, an MLP, and a TCN, for time series forecasting.

Autoencoders for time series

Autoencoders are a type of artificial neural network that can learn efficient codings of unlabeled data. Autoencoders learn the code by trying to approximate the identity function (i.e., they try to reconstruct the input at their output layer). This repository contains four types of autoencoders: (i) a standard linear autoencoder (AE and variational AE), (ii) an LSTM autoencoder (AE and VAE), (iii) a convolutional autoencoder, and (iv) a causal convolutional autoencoder.
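
A minimal (non-variational) example of the idea, kept deliberately small and not taken from the repository:

```python
import torch.nn as nn


class LinearAE(nn.Module):
    """Tiny linear autoencoder: compress the input to a low-dimensional
    code, then reconstruct it (sketch, not the repository's models)."""

    def __init__(self, n_features=64, code_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, code_dim))
        self.decoder = nn.Sequential(nn.Linear(code_dim, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))


# Training minimizes a reconstruction loss, e.g. nn.MSELoss()(model(x), x).
```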

Pytorch-Time2Vec

Time2Vec is an algorithm that provides a learnable vector representation of time. The representation is model agnostic and can be used to encode temporal dynamics in many different applications (e.g., in Transformers for time series).
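
Following my reading of the original Time2Vec formulation (one linear component plus k periodic, sine components of the scalar time input), a layer could be sketched as below; this is an assumption about the formulation, not necessarily this repository's implementation:

```python
import torch
import torch.nn as nn


class Time2Vec(nn.Module):
    """Time2Vec-style embedding: [w0*t + b0, sin(w*t + b)]."""

    def __init__(self, k=8):
        super().__init__()
        self.w0 = nn.Parameter(torch.randn(1))
        self.b0 = nn.Parameter(torch.zeros(1))
        self.w = nn.Parameter(torch.randn(k))
        self.b = nn.Parameter(torch.zeros(k))

    def forward(self, t):                          # t: (batch, 1)
        linear = self.w0 * t + self.b0             # (batch, 1)
        periodic = torch.sin(t * self.w + self.b)  # (batch, k) via broadcasting
        return torch.cat([linear, periodic], dim=-1)
```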

Neural Sampling Machines

NSMs, or Neural Sampling Machines, is a PyTorch implementation of the algorithms proposed in the original NSM publication.

Random Contrastive Hebbian Learning (rCHL)

rCHL is a learning algorithm that relies on the contrastive Hebbian learning algorithm. The major difference is that the feedback pathway does not use any kind of learnable weights. Instead, it exploits fixed random weights.

Restricted Boltzmann Machine (in C)

A Restricted Boltzmann Machine (RBM) is an artificial neural network with generative capabilities. Usually, it consists of two layers and can learn a probability distribution over its set of inputs through the contrastive divergence algorithm.
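
The project itself is written in C; as a language-agnostic reference, one contrastive-divergence (CD-1) update for a Bernoulli RBM can be sketched in NumPy as follows:

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def cd1_update(W, a, b, v0, lr=0.01):
    """One CD-1 step (sketch, not the C project's code). W: (n_visible,
    n_hidden); a, b: visible/hidden biases; v0: (batch, n_visible) binary data."""
    ph0 = sigmoid(v0 @ W + b)                         # p(h = 1 | v0)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
    pv1 = sigmoid(h0 @ W.T + a)                       # p(v = 1 | h0)
    ph1 = sigmoid(pv1 @ W + b)                        # p(h = 1 | v1)
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)    # positive - negative phase
    a += lr * (v0 - pv1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return W, a, b
```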

Annealed Importance Sampling (pyAIS)

When one has to compute the partition function Z of a probability distribution (Z = \sum_{x} f(x), where p(x) = \frac{1}{Z} f(x)), one often applies an Importance Sampling (IS) method. The issue with IS is the choice of its single hyperparameter, namely the proposal distribution. Annealed Importance Sampling (AIS) overcomes that problem by creating intermediate distributions: it “moves” the proposal distribution towards the target (original) distribution until one gets a fair approximation of it.
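
A minimal one-dimensional sketch of AIS (my own toy example, not pyAIS's API): anneal from a normalized standard normal proposal to an unnormalized Gaussian target through geometric intermediate distributions, accumulating the log importance weight and applying a few Metropolis moves at every temperature.

```python
import numpy as np

rng = np.random.default_rng(0)

log_f = lambda x: -0.5 * ((x - 3.0) / 0.5) ** 2           # unnormalized target; exact Z = sqrt(2*pi)*0.5
log_p0 = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)  # normalized standard normal proposal


def ais_log_weight(n_steps=200, n_mh=5, step=0.5):
    """One AIS run: returns the log importance weight."""
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    log_pj = lambda x, b: (1 - b) * log_p0(x) + b * log_f(x)  # intermediate distributions
    x = rng.normal()                                          # draw from the proposal
    logw = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += log_pj(x, b) - log_pj(x, b_prev)              # weight increment
        for _ in range(n_mh):                                 # Metropolis moves targeting the current distribution
            prop = x + step * rng.normal()
            if np.log(rng.random()) < log_pj(prop, b) - log_pj(x, b):
                x = prop
    return logw


log_Z = np.log(np.mean(np.exp([ais_log_weight() for _ in range(200)])))
print(log_Z, np.log(np.sqrt(2 * np.pi) * 0.5))  # estimate vs. exact log Z
```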

Self-organizing maps Dx-Dy representations

SOM-DyDx is a Python script that implements Demartines' dy-dx representation method. This method is useful for inspecting whether a self-organizing map is well organized.
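
The gist of the representation, as I understand it, is a scatter plot of pairwise inter-unit distances on the map lattice against pairwise distances between the corresponding weight vectors; the axis naming below is my own convention and may differ from the script's.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial.distance import pdist

# Hypothetical trained 10x10 SOM: grid positions and 3-D weight vectors
grid = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
weights = np.random.default_rng(0).random((100, 3))  # replace with trained weights

d_map = pdist(grid)      # pairwise distances on the map lattice
d_input = pdist(weights) # pairwise distances in input (weight) space

plt.scatter(d_map, d_input, s=2, alpha=0.3)
plt.xlabel("map-space distance")
plt.ylabel("input-space distance")
plt.show()
```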

Neuromorphic Computing


Neural and Synaptic Array Transceiver (NSAT)

Neural and Synaptic Array Transceiver is a neuromorphic computational framework that facilitates flexible and efficient embedded learning by matching algorithmic requirements to neural and synaptic dynamics. NSAT supports event-driven supervised, unsupervised, and reinforcement learning algorithms.

NSATCarl

NSATCarl is a simple C++ library that brings together the Neural and Synaptic Array Transceiver (NSAT) and the neural simulator CARLsim.

Optimization


Genetic Algorithms and Island Models (GAIM)

Genetic Algorithms (GAs) are optimization metaheuristics inspired by the theory of evolution and the process of natural selection. GAs belong to a larger class of heuristics called evolutionary algorithms (EAs). The Island Model (IM) is a collection of algorithms that lets several populations evolve on the same optimization problem, providing faster convergence to a solution than a simple GA. Furthermore, IMs can overcome the problem of local minima thanks to a migration policy between islands (the different populations).
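
For readers unfamiliar with GAs, here is a minimal real-valued GA (tournament selection, uniform crossover, Gaussian mutation) on a toy objective; it is a generic sketch, not GAIM's code, and an Island Model would simply run several such populations and periodically migrate good individuals between them.

```python
import numpy as np

rng = np.random.default_rng(0)


def fitness(x):
    """Toy objective: maximize -sum(x^2); the optimum is the origin."""
    return -np.sum(x**2)


def genetic_algorithm(dim=5, pop_size=50, generations=200, mut_std=0.1):
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        # Tournament selection: keep the better of two random individuals
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((fit[i] > fit[j])[:, None], pop[i], pop[j])
        # Uniform crossover between consecutive parents
        mask = rng.random((pop_size, dim)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation
        pop = children + rng.normal(0.0, mut_std, size=children.shape)
    return pop[np.argmax([fitness(ind) for ind in pop])]


print(genetic_algorithm())  # should be close to the zero vector
```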

Dynamical Systems


Neural Field Self-organizing Map Stability Analysis

This project is about establishing a stability condition for a class of neural fields. The main idea is to provide a stability condition for a self-organizing map based on neural fields, which allows us to know a priori if the learning will converge to a stable map.

Empirical Dynamic Modeling (empyred)

Empirical Dynamic Modeling (EDM) is an equation-free framework for modeling non-linear dynamical systems. EDM relies on Takens' (1981) embedding theorem and the reconstruction of system attractors from time-delay embeddings. Empyred is a pure Python implementation of the EDM methods Simplex and SMAP.
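
To illustrate the Simplex idea only (my own sketch, not empyred's interface): embed the series with time delays, then forecast a state one step ahead from the exponentially weighted futures of its E+1 nearest neighbors.

```python
import numpy as np


def delay_embed(x, E, tau=1):
    """Rows are [x_t, x_{t-tau}, ..., x_{t-(E-1)tau}]."""
    idx = np.arange((E - 1) * tau, len(x))
    return np.column_stack([x[idx - i * tau] for i in range(E)])


def simplex_forecast(X, y, target, E):
    """Predict the one-step future of `target` from its E+1 nearest
    neighbors among the embedded states X (whose futures y are known)."""
    d = np.linalg.norm(X - target, axis=1)
    nn = np.argsort(d)[:E + 1]
    w = np.exp(-d[nn] / (d[nn].min() + 1e-12))  # exponential distance weights
    return np.sum(w * y[nn]) / np.sum(w)


rng = np.random.default_rng(0)
x = np.sin(0.3 * np.arange(300)) + 0.05 * rng.normal(size=300)
E = 3
M = delay_embed(x, E)
X, y = M[:-1], x[E:]                             # states and their one-step futures
print(simplex_forecast(X[:-50], y[:-50], X[-1], E), y[-1])  # forecast vs. actual
```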

Neuroscience


Correlated Spike Trains (CorrSpikeTrains)

This is a Python implementation of the paper “Generation of correlated spike trains”. Both the doubly stochastic processes and the mixture method described in the original paper are implemented.

Spike sorting - SPySort

Spike sorting is a class of algorithms for classifying spikes into clusters based on a similarity measure. Usually, spike sorting identifies the waveforms of neural spiking in signals collected from extracellular recordings. SPySort is a Python package that implements a spike sorting algorithm.

Neural Fields and Deep Brain Stimulation

This project regards a deep brain stimulation (DBS) model of the globus pallidus and the subthalamic nucleus of the basal ganglia. The model explores the effects of optogenetic DBS stimulation in closed-loop treatments of motor symptoms of Parkinson’s disease. The model relies on neural fields and a simple proportional controller.

Primary Somatosensory Cortex and Structure of its Receptive Fields

This repo contains a computational model of area 3b of the somatosensory cortex. The model relies on neural field theory and reproduces an exact experimental protocol described in DiCarlo et al., 1998. The model describes the structure of receptive fields in area 3b of the primary somatosensory cortex and how attention mechanisms affect that structure. The source code, beyond the model, provides tools to analyze the results and generate the figures from the original paper. The model is capable of obtaining results similar to those of the original experiment described in DiCarlo et al.

Self-Organization and Sensory Topographic Maps

A collection of Python scripts implementing a self-organization model of the primary somatosensory cortex. The model relies on neural field theory and describes the dynamics of the primary somatosensory cortex of monkeys. It forms, maintains, and reorganizes somatosensory topographic maps following a biologically plausible Oja-like learning rule.
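
The repository's rule is described only as "Oja-like"; for reference, the textbook Oja update for a single linear unit looks like the following (a generic sketch, not the model's actual rule):

```python
import numpy as np


def oja_step(w, x, lr=0.01):
    """One step of Oja's rule: Hebbian growth plus a decay term that
    keeps the weight norm bounded."""
    y = np.dot(w, x)                 # unit's activation
    return w + lr * y * (x - y * w)  # dw = lr * y * (x - y * w)
```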

Signal Processing


Semi-Classical Signal Analysis

This repository includes a Python implementation of the semi-classical signal analysis algorithm. The algorithm is designed to denoise and analyze pulse-shaped signals, such as ECG signals or neural spikes. Users will find examples demonstrating how to use the SCSA class and all the necessary documentation.

Signal denoising with Wavelets

In this repository, one will find an implementation of signal denoising using wavelets. Denoising means reducing the noise in a signal as much as possible without distorting it. The user will find examples of using the library and all the necessary documentation.
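
For context (this uses PyWavelets directly and is not necessarily how the library in the repository is organized), a standard soft-thresholding scheme looks like:

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)

# Decompose, soft-threshold the detail coefficients, reconstruct
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise estimate from the finest scale
thr = sigma * np.sqrt(2 * np.log(len(noisy)))   # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")
```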

Numerical Analysis


Pseudospectra Analysis for Rectangular Matrices (pygpsa)

The pseudospectrum of a matrix (or an operator) is a set containing its spectrum and its “pseudo”-eigenvalues. Pseudospectra are particularly useful for understanding and/or revealing information about non-normal matrices (operators). pygpsa is a Python script that computes the pseudospectra of a rectangular matrix. For more information about pseudospectra, visit the Pseudospectra Gateway.
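
For a square matrix, the underlying computation can be sketched as evaluating the smallest singular value of zI - A on a grid of complex points z (the ε-pseudospectrum is the region where this value falls below ε); the rectangular case handled by the repository generalizes this idea.

```python
import numpy as np


def pseudospectrum(A, re, im):
    """sigma_min(z*I - A) over a grid of complex points z = x + iy
    (square-matrix sketch, not pygpsa's rectangular-case code)."""
    n = A.shape[0]
    smin = np.empty((len(im), len(re)))
    for i, y in enumerate(im):
        for j, x in enumerate(re):
            smin[i, j] = np.linalg.svd((x + 1j * y) * np.eye(n) - A,
                                       compute_uv=False)[-1]
    return smin


A = np.array([[1.0, 100.0], [0.0, 2.0]])  # a non-normal example
grid = np.linspace(-5.0, 8.0, 60)
S = pseudospectrum(A, grid, grid)
# Contours of S at levels epsilon trace the epsilon-pseudospectra boundaries.
```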

CRKMethods

Runge-Kutta (RK) methods are iterative methods used in the temporal discretization of (numerical) approximations of Ordinary Differential Equations (ODEs). CRKMethods is a collection of explicit RK methods: (i) Forward Euler's method, (ii) RK45, (iii) Refined RK45, and (iv) Fehlberg's method. Each method is implemented as a single step, so the end-user has to run over N timesteps to get the final solution. Furthermore, the end-user is responsible for providing the time-step (dt). For more details, please have a look at the file main.c in the src directory.
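
The repository is written in C; to illustrate the single-step design in a language-agnostic way, here is a classical RK4 step in Python (not one of the repository's exact methods), with the caller looping over time and supplying dt:

```python
def rk4_step(f, t, y, dt):
    """One explicit classical Runge-Kutta (RK4) step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + 0.5 * dt, y + 0.5 * dt * k1)
    k3 = f(t + 0.5 * dt, y + 0.5 * dt * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0


# Example: integrate y' = -y from y(0) = 1 over N = 100 steps of size dt
y, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    y = rk4_step(lambda t, y: -y, t, y, dt)
    t += dt
print(y)  # close to exp(-t) = exp(-1) at t = 1
```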