Spark library for generalized K-Means clustering. Supports general Bregman divergences. Suitable for clustering probabilistic data, time series data, high dimensional data, and very large data.
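The generalization of K-Means described above replaces squared Euclidean distance with an arbitrary Bregman divergence. As a minimal illustration (plain Python, not this library's API; all names here are invented for the sketch), both squared Euclidean distance and the KL divergence arise from the same formula D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>:

```python
import math

def bregman(phi, grad_phi, x, y):
    """Generic Bregman divergence: D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - sum(g * (a - b) for g, a, b in zip(grad_phi(y), x, y))

# phi(v) = ||v||^2 recovers the squared Euclidean distance used by classic K-Means.
sq_norm = lambda v: sum(t * t for t in v)
sq_norm_grad = lambda v: [2.0 * t for t in v]

# phi(v) = sum_i v_i log v_i (negative entropy) recovers the KL divergence
# when x and y are probability vectors.
neg_entropy = lambda v: sum(t * math.log(t) for t in v)
neg_entropy_grad = lambda v: [math.log(t) + 1.0 for t in v]

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]

d_euclid = bregman(sq_norm, sq_norm_grad, x, y)      # equals ||x - y||^2
d_kl = bregman(neg_entropy, neg_entropy_grad, x, y)  # equals sum_i x_i * log(x_i / y_i)
```

Swapping the convex generator phi changes the cluster geometry while the K-Means update rule stays the same, which is what makes the Bregman family a natural fit for generalized clustering.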
Trending algorithm based on the article "Trending at Instagram"
Maximum entropy and minimum divergence models in Python
Methods for computational information geometry
Kullback-Leibler projections for Bayesian model selection in Python
Code for Variable Selection in Black Box Methods with RelATive cEntrality (RATE) Measures
[CVPR 2023] Modeling Inter-Class and Intra-Class Constraints in Novel Class Discovery
Experiments with the three PPO algorithms (PPO, clipped PPO, and PPO with KL penalty) proposed by John Schulman et al. on the 'CartPole-v1' environment.
PyTorch implementations of the beta divergence loss.
🐍 🔬 Fast Python implementation of various Kullback-Leibler divergences for 1D and 2D parametric distributions. Also provides optimized code for kl-UCB indexes
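For 1D parametric distributions like those handled by the repo above, many KL divergences have closed forms. A hedged sketch (not that repo's code) of the standard closed-form KL between two univariate Gaussians:

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ) for 1D Gaussians.

    Derived from the Gaussian density: log(sigma2/sigma1)
    + (sigma1^2 + (mu1 - mu2)^2) / (2 * sigma2^2) - 1/2.
    """
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * sigma2 ** 2)
            - 0.5)

print(kl_gaussian(0.0, 1.0, 1.0, 1.0))  # KL between N(0,1) and N(1,1): 0.5
```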
[Python] Comparison of empirical probability distributions. Integral probability metrics (e.g. Kantorovich metric). f-divergences (e.g. Kullback-Leibler). Application to the Choquet integral.
Code, data, and tutorials for "Sense organ control in moths to moles is a gamble on information through motion"
Non-Negative Matrix Factorization for Gene Expression Clustering
Basic GANs with a variety of loss functions (KL, reverse KL, JS, and Wasserstein GAN) as an exercise for my thesis with Prof. Randy Paffenroth.
This project implements in Python several common statistical analysis methods, including entropy, mutual information, the Kolmogorov–Smirnov test, Kullback-Leibler divergence (KLD), and A/B tests (Mann-Whitney U and t-tests).
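For discrete distributions, the KL divergence listed in the project above reduces to a short sum. An illustrative sketch (not the project's code; function name and the zero-handling convention are my own) for two distributions given as equal-length probability lists:

```python
import math

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as equal-length probability lists.

    Terms with p_i == 0 contribute 0 by convention; q_i must be > 0 wherever p_i > 0,
    otherwise the divergence is infinite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
d = kl_divergence(p, q)  # positive, since p != q; zero only when p == q
```

Note the asymmetry: kl_divergence(p, q) and kl_divergence(q, p) generally differ, which is why symmetrized variants (e.g. Jensen-Shannon) are often used in A/B-style comparisons.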
Can we identify key events in a war by analyzing raw text from news stories?
💫 Fast Julia implementation of various Kullback-Leibler divergences for 1D parametric distributions. 🏋 Also provides optimized code for kl-UCB indexes
Using entities from NER on GOV.UK content to power personalisation.
Particle Filter tracker and square-shape detection
A collection of statistics algorithms.