In mathematical statistics, the Kullback–Leibler divergence (KL divergence, also called relative entropy) is a measure of how one probability distribution differs from a second, reference probability distribution.
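
For two discrete distributions P and Q defined over the same outcomes, the KL divergence is D_KL(P || Q) = Σ P(x) · log(P(x) / Q(x)). As a minimal sketch (the distributions p and q below are hypothetical example values), it can be computed directly with NumPy or with scipy.stats.entropy, which returns the relative entropy when given two distributions:

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) gives the KL divergence of p from q

# Two example discrete distributions over the same three outcomes (hypothetical values)
p = np.array([0.36, 0.48, 0.16])   # distribution P of interest
q = np.array([1/3, 1/3, 1/3])      # reference distribution Q (uniform)

# Direct computation: D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x))
kl_manual = np.sum(p * np.log(p / q))

# Same quantity via SciPy (natural log by default, so the result is in nats)
kl_scipy = entropy(p, q)

print(kl_manual, kl_scipy)  # both ~0.085 nats
```

Note that the KL divergence is not symmetric: D_KL(P || Q) is generally not equal to D_KL(Q || P), which is why P and Q play distinct roles (the distribution of interest versus the reference).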