



Statistical Inference Based on Divergence Measures




Availability: normally available within 20 days
Due to Brexit-related supply issues, delivery delays are possible.


Price: €136.98
NICEPRICE: €130.13
Discount: 5%



This product qualifies for FREE SHIPPING when the Corriere Veloce (express courier) option is selected at checkout.


Also payable with the Carta della cultura giovani e del merito, the 18App Bonus Cultura and the Carta del Docente.





Details

Genre: Book
Language: English
Published: 10/2005
Edition: 1st edition





Description

Organized in a systematic way, Statistical Inference Based on Divergence Measures presents classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence, with applications to multinomial and general populations. Building on these divergence measures, the book introduces minimum divergence estimators as well as divergence test statistics and compares them to the classical maximum likelihood estimator, chi-square test statistics, and the likelihood ratio test in different statistical problems. The text includes over 120 exercises with solutions, making it ideal for students with a basic knowledge of statistical methods.
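For orientation, the phi-divergences the title refers to have a standard textbook form (a sketch for readers new to the term, not quoted from this listing): for discrete distributions P = (p_1, ..., p_M), Q = (q_1, ..., q_M) and a convex function phi with phi(1) = 0,

    D_{\phi}(P, Q) = \sum_{i=1}^{M} q_i \, \phi\!\left(\frac{p_i}{q_i}\right),

which reduces to the Kullback-Leibler divergence for phi(x) = x log x and to half the Pearson chi-square distance for phi(x) = (x - 1)^2 / 2.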




Publisher's Note

The idea of using functionals of information theory, such as entropies or divergences, in statistical inference is not new. However, although divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.

Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones such as the Wald, Rao, and likelihood ratio statistics. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions.

Clear, comprehensive, and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistics problems, but also the tools to put it into practice.
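As a concrete, minimal illustration of the kind of statistic involved, here is a sketch using SciPy's power-divergence family, which contains the Pearson and likelihood ratio statistics mentioned above; the counts are invented for the example, and the book's general phi-divergence statistics go beyond this particular subfamily.

import numpy as np
from scipy.stats import power_divergence

# Invented observed counts for a 4-category multinomial sample (n = 100)
observed = np.array([18, 32, 25, 25])
# Simple null hypothesis: all four categories are equally likely
expected = np.full(4, observed.sum() / 4)

# Pearson's X^2 (lambda = 1), the likelihood ratio G^2 (lambda = 0) and the
# Cressie-Read statistic (lambda = 2/3) are all members of the
# power-divergence family of goodness-of-fit statistics.
for lam, name in [(1.0, "Pearson X^2"), (0.0, "Likelihood ratio G^2"), (2/3, "Cressie-Read")]:
    stat, pval = power_divergence(observed, f_exp=expected, lambda_=lam)
    print(f"{name}: statistic = {stat:.3f}, p-value = {pval:.3f}")

Under the simple null hypothesis, all members of this family share the same asymptotic chi-square distribution with M - 1 degrees of freedom, which is the kind of result the goodness-of-fit chapters extend to general phi-divergence test statistics.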




Contents

DIVERGENCE MEASURES: DEFINITION AND PROPERTIES
Introduction; Phi-divergence Measures between Two Probability Distributions: Definition and Properties; Other Divergence Measures between Two Probability Distributions; Divergence among k Populations; Phi-disparities; Exercises; Answers to Exercises

ENTROPY AS A MEASURE OF DIVERSITY: SAMPLING DISTRIBUTIONS
Introduction; Phi-entropies: Asymptotic Distribution; Testing and Confidence Intervals for Phi-entropies; Multinomial Populations: Asymptotic Distributions; Maximum Entropy Principle and Statistical Inference on Condensed Ordered Data; Exercises; Answers to Exercises

GOODNESS-OF-FIT: SIMPLE NULL HYPOTHESIS
Introduction; Phi-divergences and Goodness-of-fit with Fixed Number of Classes; Phi-divergence Test Statistics under Sparseness Assumptions; Nonstandard Problems: Test Statistics Based on Phi-divergences; Exercises; Answers to Exercises

OPTIMALITY OF PHI-DIVERGENCE TEST STATISTICS IN GOODNESS-OF-FIT
Introduction; Asymptotic Efficiency; Exact and Asymptotic Moments: Comparison; A Second Order Approximation to the Exact Distribution; Exact Powers Based on Exact Critical Regions; Small Sample Comparisons for the Phi-divergence Test Statistics; Exercises; Answers to Exercises

MINIMUM PHI-DIVERGENCE ESTIMATORS
Introduction; Maximum Likelihood and Minimum Phi-divergence Estimators; Properties of the Minimum Phi-divergence Estimator; Normal Mixtures: Minimum Phi-divergence Estimator; Minimum Phi-divergence Estimator with Constraints: Properties; Exercises; Answers to Exercises

GOODNESS-OF-FIT: COMPOSITE NULL HYPOTHESIS
Introduction; Asymptotic Distribution with Fixed Number of Classes; Nonstandard Problems: Test Statistics Based on Phi-divergences; Exercises; Answers to Exercises

TESTING LOGLINEAR MODELS USING PHI-DIVERGENCE TEST STATISTICS
Introduction; Loglinear Models: Definition; Asymptotic Results for Minimum Phi-divergence Estimators in Loglinear Models; Testing in Loglinear Models; Simulation Study; Exercises; Answers to Exercises

PHI-DIVERGENCE MEASURES IN CONTINGENCY TABLES
Introduction; Independence; Symmetry; Marginal Homogeneity; Quasi-symmetry; Homogeneity; Exercises; Answers to Exercises

TESTING IN GENERAL POPULATIONS
Introduction; Simple Null Hypotheses: Wald, Rao, Wilks and Phi-divergence Test Statistics; Composite Null Hypothesis; Multi-sample Problem; Some Topics in Multivariate Analysis; Exercises; Answers to Exercises

References
Index




Author

Leandro Pardo










Additional Information

ISBN: 9781584886006
Condition: New
Series: Statistics: A Series of Textbooks and Monographs
Dimensions: 9.25 x 6.25 in, 1.75 lb
Format: Hardcover
Illustration notes: 22 b/w images, 25 tables and 500 equations
Pages: 512

