KL Divergence of Gaussians

Preliminary: KL Divergence. Kullback–Leibler (KL) divergence, also known as relative entropy or I-divergence, quantifies how one probability distribution differs from another (it is not a true distance metric, since it is not symmetric). We de...
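Although the preview above is truncated, the post's topic has a well-known closed form: for two univariate Gaussians, KL(N(μ₁, σ₁²) ‖ N(μ₂, σ₂²)) = log(σ₂/σ₁) + (σ₁² + (μ₁ − μ₂)²)/(2σ₂²) − 1/2. A minimal sketch (the function name is illustrative, not from the post):

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL divergence KL(N(mu1, sigma1^2) || N(mu2, sigma2^2))
    for univariate Gaussians with standard deviations sigma1, sigma2 > 0."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# KL divergence of a distribution with itself is zero
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # → 0.0
```

Note the asymmetry: `kl_gaussian(0, 1, 1, 2)` and `kl_gaussian(1, 2, 0, 1)` give different values, which is why KL divergence is not a distance metric.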

Published in Statistics

Basic Skills in Data Analysis

This blog has been migrated to Microsoft Azure and is generated automatically using an Azure DevOps Pipeline. Due to issues in the Node.js build, some inline MathJax may render incorrectly. The author is working ...

Published in Data Analysis