Jensen-Shannon Divergence

Jensen-Shannon (JS) divergence is a statistical measure used for concept drift detection. It is built on the Kullback-Leibler (KL) divergence.

JS(Q \| P) = \frac{1}{2}\left(KL(Q \| M) + KL(P \| M)\right)

where M = \frac{Q+P}{2} is the mixture (mean) of P and Q.

The main differences between JS divergence and KL divergence are that JS divergence is symmetric and always has a finite value (bounded by ln 2 when natural logarithms are used).
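The definition above can be sketched directly in code. The snippet below is a minimal illustration for discrete distributions (e.g. histogram bins of a feature); the function names are ours, not from any particular library.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions, skipping zero-probability bins of p."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """JS(q || p) = 1/2 * KL(q || m) + 1/2 * KL(p || m), with m the mixture (p + q) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = (p + q) / 2  # m > 0 wherever p or q is > 0, so the KL terms stay finite
    return 0.5 * kl_divergence(q, m) + 0.5 * kl_divergence(p, m)
```

Because both KL terms are taken against the mixture M, the result is always finite: for completely disjoint distributions such as `[1, 0]` and `[0, 1]`, JS divergence equals ln 2 rather than diverging to infinity as KL would.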

