
Jensen-Shannon Divergence

Jensen-Shannon (JS) Divergence is a statistical method for detecting concept drift, built on the Kullback–Leibler (KL) divergence.

JS(Q||P) = \frac{1}{2}\left(KL(Q||M) + KL(P||M)\right)

Where M = \frac{Q+P}{2} is the mixture distribution, i.e. the pointwise mean of P and Q.

The main differences between JS divergence and KL divergence are that JS divergence is symmetric and always takes a finite value (bounded above by \ln 2 when using the natural logarithm), even when the two distributions have disjoint support.
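The formula above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production drift detector: it assumes both inputs are already binned into discrete probability distributions over the same bins.

```python
import numpy as np

def kl_divergence(q, p):
    """KL(Q || P) = sum q * log(q / p), skipping zero-probability bins of Q."""
    q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
    mask = q > 0
    return np.sum(q[mask] * np.log(q[mask] / p[mask]))

def js_divergence(q, p):
    """JS(Q || P) = 0.5 * (KL(Q || M) + KL(P || M)), where M = (Q + P) / 2."""
    q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
    m = (q + p) / 2
    return 0.5 * (kl_divergence(q, m) + kl_divergence(p, m))

# Hypothetical example: a reference (training) distribution vs. a
# drifted production distribution over three bins.
p = np.array([0.1, 0.4, 0.5])   # reference distribution
q = np.array([0.3, 0.4, 0.3])   # current production distribution

print(js_divergence(q, p))      # symmetric: equals js_divergence(p, q)
print(js_divergence(p, p))      # identical distributions -> 0.0
```

Because the mixture M is never zero wherever P or Q has mass, the two KL terms stay finite, which is exactly why JS divergence avoids the infinities that plain KL divergence can produce on disjoint supports.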

Concept drift detection method

Learn more about these concepts in our articles: