Lavoisier S.A.S.
14 rue de Provigny
94236 Cachan cedex

Opening hours: 08:30-12:30 / 13:30-17:30
Tel.: +33 (0)1 47 40 67 00
Fax: +33 (0)1 47 40 67 02


Applied Multivariate Data Analysis, Volume 1: Regression and Experimental Design (Springer Texts in Statistics). Softcover reprint of the original 1st ed. 1991, corrected 4th printing (print on demand).

Language: English

Author:

An easy-to-read survey of data analysis, linear regression models, and analysis of variance. The extensive development of the linear model, including the use of the linear model approach to analysis of variance, provides a strong link to statistical software packages and is complemented by a thorough overview of theory. The reader is assumed to have a background equivalent to an introductory course in statistical inference; the book can be read easily by those who have had brief exposure to calculus and linear algebra. It is intended for first-year graduate students in business, the social sciences, and the biological sciences, and provides the necessary statistics background for a course in research methodology. Undergraduate statistics majors will also find this text useful as a survey of linear models and their applications.
Contents:

1 Introduction
1.1 Multivariate Data Analysis, Data Matrices and Measurement Scales
1.1.1 Data Matrices
1.1.2 Measurement Scales: Quantitative Scales, Qualitative Scales, Measurement Scales and Analysis
1.2 The Setting
1.2.1 Data Collection and Statistical Inference: Probability Samples and Random Samples, Exploratory and Confirmatory Analysis
1.2.2 An Outline of the Techniques to Be Studied: Topics in Volume I
1.3 Review of Statistical Inference for Univariate Distributions
1.3.1 Populations, Samples and Sampling Distributions: Probability Distribution and Density, Discrete Distribution, Expectation, Parameters, Sampling
1.3.2 Statistical Inference, Estimation and Hypothesis Testing: Estimation, Confidence Intervals, Hypothesis Testing
1.3.3 Three Useful Sampling Distributions: The χ² Distribution, The t Distribution, The F Distribution
1.3.4 Some Inference Procedures for Normal Populations: Inference for the Mean, Inference for the Variance
1.3.5 Inference Procedures for Non-Normal Populations: Statistical Inference for a Population Proportion, Multinomial Distribution, Hypergeometric Distribution, Poisson Distribution
Exercises for Chapter 1. Questions for Chapter 1
2 Univariate Data Analysis
2.1 Data Analysis for Univariate Samples
2.1.1 Stem and Leaf Plots: Optimum Number of Intervals
2.1.2 Frequency Distributions: Relative Frequency, Cumulative Frequency
2.1.3 Quantile Plots: Quantile Plot for a Normal Distribution
2.2 Characteristics of Sample Distributions
2.2.1 Measures of Location: Mean, Median and Mode, Geometric Mean, Average Rate of Change
2.2.2 Measures of Spread: Variance, Standard Deviation, Median Absolute Deviation, Range, Interquartile Range, Box Plots
2.2.3 Measures of Skewness: Coefficient of Skewness, A Symmetry Plot, Index of Skewness, Examples of Skewed Distributions, Sample Form of Index of Skewness
2.2.4 Measures of Kurtosis: Measuring Kurtosis Using the Spread, An Index of Kurtosis, Some Examples of Distributions with Kurtosis, Sample Measures of Kurtosis
2.2.5 Impact of Shape on Inference for μ and σ
2.3 Outliers
2.3.1 Detection of Outliers: Inference for Outliers, Tests for Outliers Using Order Statistics, Using Indices of Skewness and Kurtosis
2.3.2 Robust Estimation
2.4 Assessing Normality
2.4.1 K-S Test for Normality
2.4.2 Graphical Techniques: The Q-Q Plot
2.4.3 Other Tests for Normality
2.5 Transformations
2.5.1 Removing Skewness: Box-Cox λ Method, Approximating λ, An Alternative Approximation, Negative Observations
2.5.2 Removing Kurtosis: A Modified Power Transformation
2.5.3 Transformations in the Presence of Other Variables
Cited Literature for Chapter 2. Exercises for Chapter 2. Questions for Chapter 2
3 Bivariate Analysis for Quantitative Random Variables
3.1 Joint Distributions
3.1.1 Discrete Bivariate Distributions: Moments for Joint Distributions, Linear Combinations, Conditional Distributions and Independence, Conditional Densities, Moments for Conditional Distributions
3.1.2 Continuous Bivariate Distributions and the Bivariate Normal: The Bivariate Normal, Elliptical Contours for the Bivariate Normal Distribution, Some Geometry for Elliptical Contours, Mahalanobis Distance, Conditional Density and Regression Functions, Partitioning the Variance σ_Y², Regression Function for X on Y, Regression Functions, Control Variables, Alternative Linear Relationships, Determining the True Parameters
3.2 Statistical Inference for Bivariate Random Variables
3.2.1 The Correlation Coefficient: Pearson Correlation Coefficient, Inference when ρ = 0, Fisher Transformation for ρ ≠ 0, Other Measures of Correlation
3.2.2 Goodness of Fit and Outlier Detection for a Bivariate Normal: Elliptical Contours, An Alternative Method, Robust Estimation of Covariance and Correlation
3.2.3 Comparison of Marginal Distributions: Graphical Comparisons, Paired Comparisons
3.3 The Simple Linear Regression Model
3.3.1 The Theoretical Model and Estimation: Theory, Assumptions, Least Squares Estimation, Residuals
3.3.2 Inference for the Regression Coefficients: Confidence Intervals, Joint Inferences, Bonferroni Approximation
3.3.3 Goodness of Fit and Prediction: Analysis of Variance, Prediction Intervals, Confidence Interval for E[Y], Confidence Interval for a Particular Y, Confidence Band for the Entire Regression Line, Using Prediction Intervals
3.3.4 Repeated Observations and a Test for Lack of Fit
3.3.5 An Alternative Geometric Interpretation
3.3.6 Scatterplots, Transformations and Smoothing: Scatterplots, Smoothing, Simple Moving Average, Weighted Moving Average, Using Medians, Other Plots, Transformations, Estimating a Power Transformation Using the Box-Cox Family
3.3.7 Influence, Outliers and Leverage: Leverage, Residuals and Deleted Residuals, Press Statistic, Standardized Residuals, Studentized Residuals, Influence Diagnostics, Cook's D, The DF Family, Robust Linear Fit
3.3.8 Residual Plots, Assumption Violations and Some Remedies: Residual Plots, Heteroscedasticity, Autocorrelation, First Order Autocorrelation, Cochrane-Orcutt Procedure, Higher Order Autoregressive Models
3.3.9 Measurement Error in the Regression Model: Measurement Error, Measurement Error in Bivariate Analysis, Estimator for the Simple Linear Regression Model With Measurement Error, Grouping Method
3.4 Regression and Correlation in a Multivariate Setting
3.4.1 The Partial Correlation Coefficient: Joint Distribution of Three Variables and Partial Correlation, Sample Partial Correlation, Inference for Partial Correlation, Some Relationships Among ρ_XZ, ρ_YZ, ρ_XY, Suppressor Variables, Possible Range for ρ_XY Given ρ_XZ and ρ_YZ
3.4.2 Partial Correlation and Regression: Partial Regression Coefficients, Sample Relationships, Multiple Correlation, Suppressor Variables and Multiple Correlation, Geometric Explanation of Suppressor Variable Effect, Partial Coefficient of Determination, Sample Inference and Variance Inflation
3.4.3 Missing Variable Plots
3.4.4 Empirical vs Structural Relationships and Omitted Variable Bias
3.4.5 Qualitative Lurking Variables
Cited Literature for Chapter 3. Exercises for Chapter 3. Questions for Chapter 3
4 Multiple Linear Regression
4.1 The Multiple Linear Regression Model
4.1.1 The Model, Assumptions and Least Squares Estimation: The Model and Assumptions, Least Squares Estimation, Properties of Least Squares Estimators, Sums of Squares, Coefficient of Multiple Determination, Coefficient of Multiple Correlation, Some Population Parameters
4.1.2 Statistical Inference in Multiple Linear Regression: An Additional Assumption, Inferences for Regression Coefficients, Inferences for the Model, Inference for Reduced Models, Relation to t-Test, Inferences for β other than 0, Inferences for Linear Functions, Prediction for New Observations, Prediction Using Statistical Software
4.1.3 A Second Regression Example - Air Pollution Data: Estimated Full Model, Some Reduced Models, Inferences for Linear Functions
4.1.4 Standardized Variables, Standardized Regression Coefficients and Partial Correlation: Some Notation, Non-intercept Portion of Parameter Vector, Sample Standardized Values and Mean Corrected Values, Standardized Regression Coefficients, Matrix Notation for Standardized Coefficients, Corrected Sums of Squares and Cross-Products Matrix and Covariance Matrix, Correlation Matrix, Mean Corrected Form of Linear Model, Partial Correlation Coefficients, Higher Order Partial Correlation Coefficients, Orthogonal Columns of X
4.1.5 Omitted Variables, Irrelevant Variables and Incorrect Specification: Notation for Omitted Variable Models, Least Squares Estimators, Properties of Estimators, Properties for Predictors, Irrelevant Variables
4.1.6 Geometric Representations
4.2 Variable Selection
4.2.1 Stepwise Regression Methods: Selection and Deletion Criteria, Forward Selection, A Useful Analogy, Backward Elimination, Stepwise Method, Selecting a Model
4.2.2 Other Selection Criteria and Methods: The R² Measure, Adjusted R² and Mean Square Error, Mallows' Cp Statistic, Expected Mean Square Error of Prediction, Press Statistic, Maximum R² Method, Minimum R² Method, All Possible Regressions
4.2.3 Variable Selection Caution
4.3 Multicollinearity and Biased Regression
4.3.1 Multicollinearity: Multicollinearity and Its Importance, Geometric View of Multicollinearity, Impact of Multicollinearity, Variance Inflation Factor, Multicollinearity Diagnostics, Using the Eigenvalue Structure, Multicollinearity Condition Number, Information from Eigenvectors, Inclusion of the Intercept, Multicollinearity Remedies
4.3.2 Biased Estimation: Ridge Regression, Determination of k, Principal Component Regression, Latent Root Regression
4.4 Residuals, Influence, Outliers and Model Validation
4.4.1 Residuals and Influence: Leverage, Range of h, Leverage and Multicollinearity, Residuals and Deleted Residuals, Measures of Influence, Cook's D, The DFFIT Family, Andrews-Pregibon, Influential Subsets, Cook's D and Influential Subsets, Andrews-Pregibon, DFFITS, Application of Multiple Case Measures, Using Cluster Analysis
4.4.2 Outliers: Mean Shift Model, Multiple Outliers
4.4.3 Missing Variable Plots: Added Variable Plot, Partial Residual Plot, Nonlinear Relationships
4.4.4 Model Validation: Chow Test, Validation by Prediction, Press Statistic, Theoretical Consistency
4.5 Qualitative Explanatory Variables
4.5.1 Dummy Variables
4.5.2 Slope Shifters
4.5.3 An Example from Real Estate Pricing: Comparison of Brokers, Seasonal Effects
4.5.4 Other Methods: Effect Coding, Cell Parameter Coding
4.5.5 Two Qualitative Explanatory Variables and Interaction: Interaction, Interaction and Outliers
4.5.6 An Example from Automobile Fuel Consumption
4.5.7 Other Approaches: Effect Coding, Example for Automobile Data, Cell Parameter Coding, Example for Income vs. Education, Analysis of Variance
4.6 Additional Topics in Linear Regression
4.6.1 Curvilinear Regression Models: Polynomial Models, Grafted Quadratics, Response Surface Models
4.6.2 Generalized Least Squares: The Generalized Least Squares Estimator, Unknown Covariance Matrix, Feasible Generalized Least Squares, Heteroscedasticity, Models for Heteroscedasticity, Tests for Heteroscedasticity, Error Components Models, Autocorrelation
Cited Literature and Additional References for Chapter 4. Exercises for Chapter 4. Questions for Chapter 4
5 Analysis of Variance and Experimental Design
5.1 One-Way Analysis of Variance
5.1.1 The One-Way Analysis of Variance Model: The Sampling Model, The One-Way ANOVA Model, Sums of Squares, Computation Formulae, Estimation of σ², The ANOVA Table, Expected Mean Squares, Group Effects Models, A Base Cell Model, Robustness, Experimental Design and Randomization
5.1.2 Multiple Comparisons: LSD, Per Comparison Error and Experimentwise Error, Fisher's LSD, Bonferroni, Sidak, Scheffé, Tukey, GT2, Gabriel, Duncan, S-N-K, REGW, Contrasts
5.1.3 Multiple Regression Model Approach: Dummy Variables, Linear Model Notation, Least Squares Estimation, Effect Coding, Orthogonal Coding, A Cell Means Model Approach
5.1.4 Nonparametric ANOVA: Kruskal-Wallis, Van Der Waerden Test
5.1.5 Homogeneity of Variance: Bartlett's Test, Hartley's Test, Cochran's Test, Some Nonparametric Alternatives, Levene's Method, A Jackknife Method, A Procedure Based on Ranks
5.2 Two-Way Analysis of Variance
5.2.1 Randomized Block Design: The Randomized Block Design Model, The ANOVA Table, Expected Mean Squares, Test for Equality of Group Means, Test for Equality of Block Means, Efficiency of the Blocked Design, A Cell Means Model, A Base Cell Model, A Nonparametric Approach, A Generalized Randomized Block Design
5.2.2 The Two Factor Factorial: The Two-Way Factorial Model, Interaction, Interactions and Nonlinearity, Estimation, The ANOVA Table, Expected Mean Squares
5.2.3 A Multiple Regression Approach for the Balanced Two-Way Model: Dummy Coding, Effect Coding, Cell Means Model Approach, Orthogonal Coding, Interaction Effects
5.2.4 An Unbalanced Two-Way Analysis: Dummy Variable Coding, Effect Coding, A Cell Means Approach
5.2.5 A Two-Way Nested Treatment Design
5.3 Analysis of Covariance: The Analysis of Covariance Model, Inference for the Model, Multiple Regression Model Approach, Example with an Unbalanced Design
5.4 Some Three-Way Analysis of Variance Models
5.4.1 A Two-Way Factorial in a Randomized Block Design
5.4.2 The Latin Square Design: Multiple Regression Approach to Latin Squares
5.4.3 Three-Way Factorial Model: Three-Way Interactions
5.4.4 An Unbalanced Three-Way Analysis Example: Examination of Cell Means, Analysis of Variance Approach, Multiple Regression Estimation, Effect Coding, A Cell Means Model Approach
5.5 Some Basics of Experimental Design: Terminology for Describing Experimental Designs, Treatment Structure, Design Structure
5.6 Multifactor Factorials, Fractional Replication, Confounding and Incomplete Blocks
5.6.1 The 2^p Design: Notation for Treatment Combinations, Determining Interaction Effects Using Effect Coding, Analysis of Variance Table for 2^p Design
5.6.2 Single and Fractional Replication: Single Replication, Degrees of Freedom for Error in a 2^p Design, Effect Coding for a 2^4 Design, Fractional Replication and Aliases, Aliasing in a 2^4 Design, Experimental Units Requirements for Various Fractional Replications of 2^p Designs, A Fractional Replication for a 3^5 Factorial
5.6.3 Confounding and Incomplete Block Designs: Incomplete Block Designs
5.7 Random Effects Models and Variance Components
5.7.1 The One-Way Random Effects Model: Analysis of Variance, Intraclass Correlation
5.7.2 Two-Way Models with Random Effects and Mixed Models: A Two-Way Random Effects Model, Analysis of Variance, Intraclass Correlation, Two-Way Mixed Model, Intraclass Correlation
5.7.3 The Variance Components Model
5.7.4 Nested Designs and Variance Components
5.8 Repeated Measures and Split Plots Designs
5.8.1 Repeated Measures Designs: Applications, The Single Factor Repeated Measures Design, Analysis of Variance Model, The Correlation Structure, Compound Symmetry and Type H Covariance Matrices, Some Modified Test Procedures, Test for Compound Symmetry, Test for Huynh-Feldt Pattern (Type H)
5.8.2 A Two-Way Split Plot Design: Model for the Split Plot Design
5.8.3 A Repeated Measures Split Plot Design
Cited Literature for Chapter 5. Exercises for Chapter 5. Questions for Chapter 5
1. Matrix Algebra
1.1 Matrices: Matrix, Transpose of a Matrix, Row Vector and Column Vector, Square Matrix, Symmetric Matrix, Diagonal Elements, Trace of a Matrix, Null or Zero Matrix, Identity Matrix, Diagonal Matrix, Submatrix
1.2 Matrix Operations: Equality of Matrices, Addition of Matrices, Additive Inverse, Scalar Multiplication of a Matrix, Product of Two Matrices, Multiplicative Inverse, Idempotent Matrix, Kronecker Product
1.3 Determinants and Rank: Determinant, Nonsingular, Relation Between Inverse and Determinant, Rank of a Matrix
1.4 Quadratic Forms and Positive Definite Matrices: Quadratic Form, Congruent Matrix, Positive Definite, Positive Semidefinite, Negative Definite, Non-negative Definite
1.5 Partitioned Matrices: Product of Partitioned Matrices, Inverse of a Partitioned Matrix, Determinant of a Partitioned Matrix
1.6 Expectations of Random Matrices
1.7 Derivatives of Matrix Expressions
2. Linear Algebra
2.1 Geometric Representation for Vectors
2.2 Linear Dependence and Linear Transformations
2.3 Systems of Equations: Solution Vector for a System of Equations, Homogeneous Equations - Trivial and Nontrivial Solutions
2.4 Column Spaces, Projection Operators and Least Squares: Column Space, Orthogonal Complement, Projection, Ordinary Least Squares Solution Vector, Idempotent Matrix - Projection Operator
3. Eigenvalue Structure and Singular Value Decomposition
3.1 Eigenvalue Structure for Square Matrices: Eigenvalues and Eigenvectors, Characteristic Polynomial, Characteristic Roots, Latent Roots, Eigenvalues, Eigenvalues and Eigenvectors for Real Symmetric Matrices and Some Properties, Spectral Decomposition, Matrix Approximation, Eigenvalues for Nonnegative Definite Matrices
3.2 Singular Value Decomposition: Left and Right Singular Vectors, Complete Singular Value Decomposition, Generalized Singular Value Decomposition, Relationship to Spectral Decomposition and Eigenvalues
Data Appendix for Volume I: Data Set D1, Data Set D2, Data Set D3, Data Set D4, Data Set D5, Data Set D6, Data Set D7, Data Set D8, Data Set D
Table D1. Table D2. Table D3. Table D4. Table D5. Table D6. Table D7. Table D8. Table D9
Table Appendix: Table 1 The Cumulative Distribution Function for the Standard Normal; Table 3 Critical Values for the Chi-Square Distribution; Table 5 Critical Values for the Studentized Range Distribution
Author Index

Publication date:

622 pages

15.5x23.5 cm

Available from the publisher (supply time: 15 days).

94,94 €


