Measures of information and their applications / J.N. Kapur

Bibliographic Details
Author / Creator: Kapur, J. N. (Jagat Narain), 1923-2002.
Imprint: New York : Wiley, 1994.
Description: xi, 572 p. : ill. ; 24 cm.
Language: English
Subject: Entropy (Information theory)
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/1720940
ISBN: 0470220643 (John Wiley & Sons Inc.)
8122404847 (Wiley Eastern Ltd.)
Notes: Includes bibliographical references and index.

MARC

LEADER 00000nam a2200000 a 4500
001 1720940
003 ICU
005 20230925132637.9
008 950317s1994 nyua b 001 0 eng u
010 |a  92040846  
020 |a 0470220643 (John Wiley & Sons Inc.) 
020 |a 8122404847 (Wiley Eastern Ltd.) 
035 |a (ICU)BID19734026 
040 |a DLC  |c DLC  |d DLC  |d OrLoB-B  |d OCoLC 
041 0 |a eng 
050 0 0 |a Q370  |b .K38 1994 
082 0 0 |a 003/.54  |2 20 
100 1 |a Kapur, J. N.  |q (Jagat Narain),  |d 1923-2002.  |1 http://viaf.org/viaf/79097229 
245 0 0 |a Measures of information and their applications /  |c J.N. Kapur. 
260 |a New York :  |b Wiley,  |c 1994. 
300 |a xi, 572 p. :  |b ill. ;  |c 24 cm. 
336 |a text  |b txt  |2 rdacontent  |0 http://id.loc.gov/vocabulary/contentTypes/txt 
337 |a unmediated  |b n  |2 rdamedia  |0 http://id.loc.gov/vocabulary/mediaTypes/n 
338 |a volume  |b nc  |2 rdacarrier  |0 http://id.loc.gov/vocabulary/carriers/nc 
504 |a Includes bibliographical references and index. 
505 0 0 |g 1.  |t An Unorthodox Measure of Entropy --  |g 2.  |t A New Parametric Measure of Entropy --  |g 3.  |t On the Use of Maximum-entropy Principle for Generating New Entropies --  |g 4.  |t Characterisation Theorems for Shannon and Havrda-Charvat Measures of Entropy --  |g 5.  |t On Recursivity Property of Measures of Entropy --  |g 6.  |t Are Additivity and Recursivity Identical? --  |g 7.  |t On Weighted Entropies and Entropic Means --  |g 8.  |t Measures of Entropy for Continuous-variate Probability Distributions --  |g 9.  |t On Measures of Entropy for Continuous-variate Probability Distributions --  |g 10.  |t Discrete Analogues of Measures of Entropy for Continuous Variate Distributions and their Applications to Parameter Estimation --  |g 11.  |t Correct Measures of Weighted Directed Divergence --  |g 12.  |t Some New Measures of Directed Divergence --  |g 13.  |t Two New Measures of Entropy and Directed Divergence --  |g 14.  |t Some New Additive Measures of Entropy and Directed Divergence --  |g 15.  |t On Equivalent Sets of Measures of Entropy and Cross-entropy --  |g 16.  |t Entropy and Directed Divergence Measures which Lead to Positive Probabilities on Optimisation Subject to Linear Constraints --  |g 17.  |t Generating Measures of Cross-entropy by Using Measures of Weighted Entropy --  |g 18.  |t Normalised Measures of Information --  |g 19.  |t Generating Functions for Information Measures --  |g 20.  |t On Generating Appropriate Measures of Weighted Entropy and Weighted Directed Divergence --  |g 21.  |t Measuring the Uncertainty of a Set of Probability Distributions --  |g 22.  |t Measuring the Uncertainty of a Birth-death Queueing Process --  |g 23.  |t On Minimum Entropy Probability Distributions --  |g 24.  |t The Most Feasible Probability Distributions --  |g 25.  |t The Most Likely Probability Distribution --  |g 26.  |t Duals of Entropy Optimization Problems --  |g 27.  |t Maximum-Entropy Probability Distribution when Mean of a Random Variate is Prescribed and Burg's Entropy Measure is Used --  |g 28.  |t Some Theorems Concerning Maximum-entropy and Minimum Cross-entropy Probability Distributions --  |g 29.  |t On Concavity of Maximum Entropy when Burg's Entropy is Maximised Subject to Its Mean being Prescribed --  |g 30.  |t The Principle of Weighted Minimum Information --  |g 31.  |t Maximum Weighted Entropy Principle --  |g 32.  |t On Lind and Solana's Principle of Minimum Information --  |g 33.  |t On a Principle of Minimum Information --  |g 34.  |t Maximum-Entropy and Minimum Cross-entropy Probability Distribution when There are Inequality Constraints on Probabilities --  |g 35.  |t Maximum Entropy Probability Distributions when There are Inequality Constraints on Probabilities --  |g 36.  |t Maximum-Entropy Probability Distributions when There are Inequality Constraints on Probabilities: An Alternative Approach --  |g 37.  |t On Maximum Entropy Probability Distribution when the Prescribed Values of Moments are Functions of One or More Parameters --  |g 38.  |t Characterising Moments of a Probability Distribution --  |g 39.  |t Maximum Entropy Estimation of Missing Values --  |g 40.  |t On the Use of Minimum Entropy Principles in Estimation of Missing Values --  |g 41.  |t Higher Order Moments of Distributions of Statistical Mechanics --  |g 42.  |t On Maximum Entropy Derivation of Distributions of Statistical Mechanics --  |g 43.  |t Some New Statistical Mechanics Distributions and Their Applications --  |g 44.  
|t Maximum Entropy Principle and Parameter Estimation --  |g 45.  |t Inverse Maxent and Minxent Principles and Their Applications to Spectral Analysis --  |g 46.  |t Maximum-entropy Solutions of Some Assignment Problems --  |g 47.  |t An Important Application of Generalised Minimum Cross-entropy Principle: Generating Measures of Directed Divergence --  |g 48.  |t Some New Applications of Generalised Maximum Entropy Principle --  |g 49.  |t Approximating a Given Probability Distribution by a Maximum Entropy Distribution --  |g 50.  |t Closest Approximation to a Mixture of Distributions --  |g 51.  |t Some Entropy-Based Logistic Type Growth Models --  |g 52.  |t Information Theoretic Proofs of Some Algebraic Geometric and Trigonometric Inequalities --  |g 53.  |t On Use of Information-Theoretic Concepts in Proving Inequalities --  |g 54.  |t The Three Laws of Information --  |g 55.  |t A Paradox in Information Theory and Its Resolution Through the Concept of Bayesian Entropy --  |g 56.  |t Some Theorems in Information Theory --  |g 57.  |t Probability Distributions for Entropy Functions --  |g 58.  |t The Inverse Problems on Entropic Means --  |g 59.  |t The Mean-Entropy Frontier --  |g 60.  |t Maximum and Minimum Values of Some Characteristic Parameters of a Probability Distribution When One of These has a Prescribed Value. 
650 0 |a Entropy (Information theory)  |0 http://id.loc.gov/authorities/subjects/sh85044152 
650 7 |a Entropy (Information theory)  |2 fast  |0 http://id.worldcat.org/fast/fst00912828 
901 |a ToCBNA 
903 |a HeVa 
035 |a (OCoLC)27012387 
929 |a cat 
999 f f |i ca0de795-08c1-578d-9c9b-b6d71f79f78b  |s 256211ce-a5dc-59aa-a532-3e11df54ad00 
928 |t Library of Congress classification  |a Q370.K380 1994  |l JCL  |c JCL-Sci  |i 2881026 
927 |t Library of Congress classification  |a Q370.K380 1994  |l JCL  |c JCL-Sci  |e CRERAR  |b 42498835  |i 3233706