
Jensen-Shannon Divergence

Jensen-Shannon (JS) divergence is a statistical measure of the similarity between two probability distributions, built on top of the KL divergence. It is commonly used as a concept drift detection method: comparing the distribution of incoming production data against a reference distribution.

JS(Q||P) = \frac{1}{2}\left(KL(Q||M) + KL(P||M)\right)

where M = \frac{Q+P}{2} is the mixture (pointwise average) of P and Q.

The main differences between JS divergence and KL divergence are that JS divergence is symmetric (JS(Q||P) = JS(P||Q)) and always finite: it is bounded between 0 and ln 2 (or 1 if log base 2 is used), even when the two distributions have non-overlapping support, a case where KL divergence is infinite.
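The definition above can be sketched directly in NumPy. This is a minimal illustration for discrete distributions, not a production drift detector; the function names are hypothetical, and terms where a distribution assigns zero probability are skipped, following the convention 0 · log 0 = 0:

```python
import numpy as np

def kl_divergence(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i); terms with p_i == 0 contribute 0
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # JS(q || p) = 1/2 * (KL(q || m) + KL(p || m)), with m = (q + p) / 2
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(q, m) + 0.5 * kl_divergence(p, m)

# Example: two discrete distributions over the same three bins
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.4, 0.3])
print(js_divergence(p, q))
```

Because the mixture M always covers the support of both P and Q, the result stays finite even for fully disjoint distributions, where it reaches its maximum of ln 2. Note that SciPy's `scipy.spatial.distance.jensenshannon` returns the square root of this quantity (the Jensen-Shannon distance), not the divergence itself.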

Concept drift detection method

