
ON `USEFUL' R-NORM RELATIVE INFORMATION
AND J-DIVERGENCE MEASURES

Satish Kumar$^1$, Gurdas Ram$^2$, Vishal Gupta$^3$
$^1$Department of Mathematics
Geeta Institute of Management and Technology
Kanipla, (Kurukshetra), INDIA
$^2$Department of Applied Sciences
M.M. University
Solan (H.P.), INDIA
$^3$Department of Mathematics
M.M. University
Mullana (Ambala), INDIA


Abstract. In this paper, some new generalized R-norm measures of useful relative information are defined and their particular cases are studied. From these measures, new useful R-norm information measures are also derived. The J-divergence corresponding to each measure of useful relative R-norm information is obtained. In the end, an equality satisfied by the useful J-divergence of type $\beta$ is proved.
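
For orientation, the following is a minimal sketch of the baseline definitions these measures generalize, assuming the standard R-norm entropy of Boekee and van der Lubbe and the usual relative (divergence) form found in the literature; the paper's own normalizations may differ. For complete probability distributions $P = (p_1, \ldots, p_n)$ and $Q = (q_1, \ldots, q_n)$ with $R > 0$, $R \neq 1$:
\[
H_R(P) \;=\; \frac{R}{R-1}\left[\,1 - \Big(\sum_{i=1}^{n} p_i^{R}\Big)^{1/R}\right],
\qquad
D_R(P\,\|\,Q) \;=\; \frac{R}{R-1}\left[\Big(\sum_{i=1}^{n} p_i^{R}\, q_i^{\,1-R}\Big)^{1/R} - 1\right].
\]
The symmetric J-divergence is then $J_R(P,Q) = D_R(P\,\|\,Q) + D_R(Q\,\|\,P)$, and a `useful' variant attaches a positive utility $u_i$ to each outcome in the spirit of Belis and Guiaşu, for instance by replacing the inner sum with a utility-weighted one such as $\sum_i u_i\, p_i^{R} q_i^{\,1-R} \big/ \sum_i u_i\, p_i$; this weighting is one common choice in the literature, not necessarily the one adopted in the paper.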

Received: December 1, 2011

AMS Subject Classification: 94A15, 94A24, 26D15

Key Words and Phrases: Rényi's entropy, concave and convex functions, R-norm relative information measure, homogeneous function, J-divergence of type $\beta$




Source: International Journal of Pure and Applied Mathematics
ISSN printed version: 1311-8080
ISSN on-line version: 1314-3395
Year: 2012
Volume: 77
Issue: 3