Introduction
Relative entropy (also called KL divergence) measures the difference between two positive functions or probability distributions.
Cross entropy measures the effort needed to resolve the uncertainty of a system when the coding strategy is derived from a non-true distribution q while the data actually follow the true distribution p. It equals the entropy of p plus the relative entropy between p and q.
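A minimal sketch of both quantities in plain Python (function names and the example distributions are illustrative, not from the original source):

```python
import math

def relative_entropy(p, q):
    # D_KL(p || q) = sum_i p_i * log(p_i / q_i); terms with p_i == 0 contribute 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    # H(p) = -sum_i p_i * log(p_i)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]   # assumed "true" distribution
q = [0.25, 0.5, 0.25]   # assumed "non-true" (model) distribution

print(relative_entropy(p, q))
print(cross_entropy(p, q))
```

Note the identity stated above: H(p, q) = H(p) + D_KL(p || q), so cross entropy is minimized exactly when q matches p.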