mars.tensor.special.rel_entr#

mars.tensor.special.rel_entr(x, y, out=None, where=None, **kwargs)[source]#

Elementwise function for computing relative entropy.

$\begin{split}\mathrm{rel\_entr}(x, y) = \begin{cases} x \log(x / y) & x > 0, y > 0 \\ 0 & x = 0, y \ge 0 \\ \infty & \text{otherwise} \end{cases}\end{split}$
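The piecewise definition can be sketched in plain NumPy (a reference version for illustration only; the actual mars.tensor routine evaluates the same formula lazily on chunked tensors, and the function name here is hypothetical):

```python
import numpy as np

def rel_entr_ref(x, y):
    """Plain-NumPy sketch of the piecewise rel_entr definition above."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Start from the "otherwise" branch (+inf), then fill in the other cases.
    out = np.full(np.broadcast(x, y).shape, np.inf)
    pos = (x > 0) & (y > 0)
    zero = (x == 0) & (y >= 0)
    # Suppress log/divide warnings from entries outside the positive branch.
    with np.errstate(divide="ignore", invalid="ignore"):
        out[pos] = (x * np.log(x / y))[pos]
    out[zero] = 0.0
    return out
```

For example, `rel_entr_ref(0.0, 1.0)` falls in the second branch and returns 0, while `rel_entr_ref(1.0, 0.0)` falls in the "otherwise" branch and returns infinity.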
Parameters
• x (array_like) – Input array

• y (array_like) – Input array

• out (ndarray, optional) – Optional output array for the function results

Returns

Relative entropy of the inputs

Return type

scalar or ndarray

See also

entr, kl_div

Notes

This function is jointly convex in x and y.

The origin of this function is in convex programming; see [1]. Given two discrete probability distributions $p_1, \ldots, p_n$ and $q_1, \ldots, q_n$, their relative entropy (Kullback-Leibler divergence) is the sum

$\sum_{i = 1}^n \mathrm{rel\_entr}(p_i, q_i).$

See [2] for details.
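As an illustration of the sum above (hypothetical values, plain NumPy rather than mars.tensor): because every entry of both distributions here is positive, each term reduces to the first branch of the piecewise definition, and the sum is the Kullback-Leibler divergence.

```python
import numpy as np

# Two hypothetical discrete probability distributions.
p = np.array([0.5, 0.25, 0.25])
q = np.array([0.25, 0.5, 0.25])

# All entries are positive, so rel_entr(p_i, q_i) = p_i * log(p_i / q_i);
# summing the elementwise terms gives D(p || q).
kl = np.sum(p * np.log(p / q))
```

Here the first term contributes `0.5 * log(2)`, the second `-0.25 * log(2)`, and the third vanishes, so `kl` equals `0.25 * log(2)`.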

References

[1] Grant, Boyd, and Ye, "CVX: Matlab Software for Disciplined Convex Programming", http://cvxr.com/cvx/

[2] Kullback-Leibler divergence, https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence