

# Jensen-Shannon Divergence (JS)
<a name="clarify-data-bias-metric-jensen-shannon-divergence"></a>

The Jensen-Shannon divergence (JS) measures how much the label distributions of different facets diverge from each other entropically. It is based on the Kullback-Leibler divergence, but unlike KL it is symmetric.

The formula for the Jensen-Shannon divergence is as follows:

        JS = ½*[KL(Pa || P) + KL(Pd || P)]

Where P = ½( Pa + Pd ), the average label distribution across facets *a* and *d*.
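
As an illustration of the formula, the following is a minimal Python sketch, assuming the facet label distributions are given as discrete probability vectors over the same set of label values. The helper names `kl_divergence` and `js_divergence` are illustrative, not part of any library API.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes with nonzero probability in p (0 * log 0 = 0).
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence(pa, pd):
    """Jensen-Shannon divergence between the label distributions of facets a and d."""
    pa = np.asarray(pa, dtype=float)
    pd = np.asarray(pd, dtype=float)
    p = 0.5 * (pa + pd)  # P, the average label distribution across facets
    return 0.5 * (kl_divergence(pa, p) + kl_divergence(pd, p))

# Hypothetical example: label distributions over {accepted, rejected} for facets a and d
pa = [0.6, 0.4]
pd = [0.4, 0.6]
print(js_divergence(pa, pd))  # small positive value, below ln(2) ≈ 0.693
```

Because P is the average of Pa and Pd, it is nonzero wherever either facet distribution is nonzero, so the KL terms are always well defined and the result stays within [0, ln(2)).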

The range of JS values for binary, multicategory, and continuous outcomes is [0, ln(2)).
+ Values near zero mean the labels are similarly distributed.
+ Positive values mean the label distributions diverge; the larger the value, the greater the divergence.

This metric indicates whether there is a large divergence in one of the labels across facets.