Multivariate methods are now widely used in the quantitative sciences as well as in statistics because of the ready availability of computer packages for performing the calculations. While access to suitable computer software is essential to using multivariate methods, using the software still requires a working knowledge of these methods and how they can be applied. Multivariate Statistical Methods: A Primer, Third Edition introduces these methods and provides a general overview of the techniques without overwhelming you with comprehensive details. This thoroughly revised, updated edition of a best-selling introductory text retains the author's trademark clear, concise style but includes a range of new material, new exercises, and supporting materials on the Web. New in the Third Edition:
* Fully updated references
* Additional examples and exercises from the social and environmental sciences
* A comparison of the various statistical software packages, including Stata, Statistica, SAS, Minitab, and Genstat, particularly in terms of their ease of use by beginners
By keeping the book as short as possible while enabling you to begin using multivariate methods intelligently, the author has produced a succinct and handy reference. With updated information on multivariate analyses, new examples using the latest software, and updated references, this book provides a timely introduction to useful tools for statistical analysis.
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated "A Primer of Multivariate Statistics" to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why one should consider diving into more detailed treatments of computer-modeling and latent-variable techniques, such as non-recursive path analysis, confirmatory factor analysis, and hierarchical linear modeling.
This comprehensive, flexible text is used in both one- and two-semester courses to review introductory through intermediate statistics. Instructors select the topics that are most appropriate for their course. Its conceptual approach helps students more easily understand the concepts and interpret SPSS and research results. Key concepts are simply stated and occasionally reintroduced and related to one another for reinforcement. Numerous examples demonstrate their relevance. This edition features more explanation to increase understanding of the concepts. Only crucial equations are included. In addition to updating throughout, the new edition features:
* A new co-author, Debbie L. Hahs-Vaughn, the 2007 recipient of the University of Central Florida's College of Education Excellence in Graduate Teaching Award
* A new chapter on logistic regression models for today's more complex methodologies
* More on computing confidence intervals and conducting power analyses using G*Power
* Many more SPSS screenshots to assist with navigating SPSS, and annotated SPSS output to assist in the interpretation of results
* Extended sections on how to write up statistical results in APA format
* New learning tools, including chapter-opening vignettes, outlines, and lists of key concepts; many more examples, tables, figures, and boxes; and chapter summaries
* More tables of assumptions and the effects of their violation, including how to test them in SPSS
* 33% new conceptual and computational problems, and all-new interpretive problems
* A website that features PowerPoint slides, answers to the even-numbered problems, and test items for instructors, and, for students, the chapter outlines, key concepts, datasets that can be used in SPSS and other packages, and more
Each chapter begins with an outline, a list of key concepts, and a vignette related to those concepts. Realistic examples from education and the behavioral sciences illustrate those concepts.
Each example examines the procedures and assumptions and provides instructions for how to run SPSS, including annotated output, and tips to develop an APA-style write-up. Useful tables of assumptions and the effects of their violation are included, along with how to test assumptions in SPSS. 'Stop and Think' boxes provide helpful tips for better understanding the concepts. Each chapter includes computational, conceptual, and interpretive problems. The data sets used in the examples and problems are provided on the web. Answers to the odd-numbered problems are given in the book. The first five chapters review descriptive statistics, including ways of representing data graphically, statistical measures, the normal distribution, and probability and sampling. The remainder of the text covers inferential statistics involving means, proportions, variances, and correlations, as well as basic and advanced analysis of variance and regression models. Topics not dealt with in other texts, such as robust methods, multiple comparison and nonparametric procedures, and advanced ANOVA and multiple and logistic regression models, are also reviewed. Intended for one- or two-semester statistics courses taught in education and/or the behavioral sciences at the graduate or advanced undergraduate level, the text assumes no prior knowledge of statistics; only a rudimentary knowledge of algebra is required.
Praise for the Second Edition “This book should be an essential part of the personal library of every practicing statistician.” —Technometrics Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given situation. Written by leading statisticians, Nonparametric Statistical Methods, Third Edition provides readers with crucial nonparametric techniques in a variety of settings, emphasizing the assumptions underlying the methods. The book provides an extensive array of examples that clearly illustrate how to use nonparametric approaches for handling one- or two-sample location and dispersion problems, dichotomous data, and one-way and two-way layout problems. In addition, the Third Edition features:
* The use of the freely available R software to aid in computation and simulation, including many new R programs written explicitly for this new edition
* New chapters that address density estimation, wavelets, smoothing, ranked set sampling, and Bayesian nonparametrics
* Problems that illustrate examples from agricultural science, astronomy, biology, criminology, education, engineering, environmental science, geology, home economics, medicine, oceanography, physics, psychology, sociology, and space science
Nonparametric Statistical Methods, Third Edition is an excellent reference for applied statisticians and practitioners who seek a review of nonparametric methods and their relevant applications. The book is also an ideal textbook for upper-undergraduate and first-year graduate courses in applied nonparametric statistics.
Theories and practices to assess critical information in a complex adaptive system Organized for readers to follow along easily, The Fitness of Information: Quantitative Assessments of Critical Evidence provides a structured outline of the key challenges in assessing crucial information in a complex adaptive system. Illustrating a variety of computational and explanatory challenges, the book demonstrates principles and practical implications of exploring and assessing the fitness of information in an extensible framework of adaptive landscapes. The book’s first three chapters introduce fundamental principles and practical examples in connection to the nature of aesthetics, mental models, and the subjectivity of evidence. In particular, the underlying question is how these issues can be addressed quantitatively, not only computationally but also explanatorily. The next chapter illustrates how one can reduce the level of complexity in understanding the structure and dynamics of scientific knowledge through the design and use of the CiteSpace system for visualizing and analyzing emerging trends in scientific literature. The following two chapters explain the concepts of structural variation and the fitness of information in a framework that builds on the idea of fitness landscape originally introduced to study population evolution. The final chapter presents a dual-map overlay technique and demonstrates how it supports a variety of analytic tasks for a new type of portfolio analysis.
The Fitness of Information: Quantitative Assessments of Critical Evidence also features:
* In-depth case studies and examples that characterize far-reaching concepts, illustrate underlying principles, and demonstrate profound challenges and complexities at various levels of analytic reasoning
* Wide-ranging topics that underline the common theme, from the subjectivity of evidence in criminal trials to detecting early signs of critical transitions and mechanisms behind radical patents
* An extensible and unifying framework for visual analytics that transforms analytic reasoning tasks into the assessment of critical evidence
The Fitness of Information: Quantitative Assessments of Critical Evidence is a suitable reference for researchers, analysts, and practitioners who are interested in analyzing evidence and making decisions with incomplete, uncertain, and even conflicting information. The book is also an excellent textbook for upper-undergraduate and graduate-level courses on visual analytics, information visualization, and business analytics and decision support systems.
The concise yet authoritative presentation of key techniques for basic mixture experiments Inspired by the author's bestselling advanced book on the topic, A Primer on Experiments with Mixtures provides an introductory presentation of the key principles behind experimenting with mixtures. Outlining useful techniques through an applied approach with examples from real research situations, the book supplies a comprehensive discussion of how to design and set up basic mixture experiments, then analyze the data and draw inferences from the results. Drawing from his extensive experience teaching the topic at various levels, the author presents mixture experiments in an easy-to-follow manner that is free of unnecessary formulas and theory. Succinct presentations explore key methods and techniques for carrying out basic mixture experiments, including:
* Designs and models for exploring the entire simplex factor space, with coverage of simplex-lattice and simplex-centroid designs, canonical polynomials, the plotting of individual residuals, and axial designs
* Multiple constraints on the component proportions in the form of lower and/or upper bounds, introducing L-pseudocomponents, multicomponent constraints, and multiple lattice designs for major and minor component classifications
* Techniques for analyzing mixture data, such as model reduction and screening components, as well as additional topics such as measuring the leverage of certain design points
* Models containing ratios of the components, Cox's mixture polynomials, and the fitting of a slack-variable model
* A review of least squares and the analysis of variance for fitting data
Each chapter concludes with a summary and appendices with details on the technical aspects of the material. Throughout the book, exercise sets with selected answers allow readers to test their comprehension of the material, and References and Recommended Reading sections outline further resources for study of the presented topics.
A Primer on Experiments with Mixtures is an excellent book for one-semester courses on mixture designs and can also serve as a supplement for design of experiments courses at the upper-undergraduate and graduate levels. It is also a suitable reference for practitioners and researchers who have an interest in experiments with mixtures and would like to learn more about the related mixture designs and models.
Intended for a second course in stationary processes, Stationary Stochastic Processes: Theory and Applications presents the theory behind the field’s widely scattered applications in engineering and science. In addition, it reviews sample function properties and spectral representations for stationary processes and fields, including a portion on stationary point processes. Features:
* Presents and illustrates the fundamental correlation and spectral methods for stochastic processes and random fields
* Explains how the basic theory is used in special applications like detection theory and signal processing, spatial statistics, and reliability
* Motivates mathematical theory from a statistical model-building viewpoint
* Introduces a selection of special topics, including extreme value theory, filter theory, long-range dependence, and point processes
* Provides more than 100 exercises with hints to solutions and selected full solutions
This book covers key topics such as ergodicity, crossing problems, and extremes, and includes many exercises and examples to illustrate the theory. Precise in mathematical details without being pedantic, Stationary Stochastic Processes: Theory and Applications is for the student with some experience with stochastic processes and a desire for deeper understanding without getting bogged down in abstract mathematics.
Mathematics of Chance utilizes simple, real-world problems, some of which have only recently been solved, to explain fundamental probability theorems, methods, and statistical reasoning. Jiri Andel begins with a basic introduction to probability theory and its important points before moving on to more specific sections on vital aspects of probability, using both classic and modern problems. Each chapter begins with easy, realistic examples before covering the general formulations and mathematical treatments used. The reader will find ample use for a chapter devoted to matrix games and problem sets concerning waiting, probability calculations, expectation calculations, and statistical methods. A special chapter utilizes problems that relate to areas of mathematics outside of statistics and considers certain mathematical concepts from a probabilistic point of view. Sections and problems cover topics including:
* Random walks
* Principle of reflection
* Probabilistic aspects of records
* Geometric distribution
* Optimization
* The LAD method, and more
Knowledge of the basic elements of calculus will be sufficient for understanding most of the material presented here, and little knowledge of pure statistics is required. Jiri Andel has produced a compact reference for applied statisticians working in industry and the social and technical sciences, and a book that suits the needs of students seeking a fundamental understanding of probability theory.
An indispensable guide to understanding and designing modern experiments The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the subject, beginning with basic concepts of DOE and a review of elementary normal theory statistical methods. Subsequent chapters present a uniform, model-based approach to DOE. Each design is presented in a comprehensive format and is accompanied by a motivating example, discussion of the applicability of the design, and a model for its analysis using statistical methods such as graphical plots, analysis of variance (ANOVA), confidence intervals, and hypothesis tests. Numerous theoretical and applied exercises are provided in each chapter, and answers to selected exercises are included at the end of the book. An appendix features three case studies that illustrate the challenges often encountered in real-world experiments, such as randomization, unbalanced data, and outliers. Minitab® software is used to perform analyses throughout the book, and an accompanying FTP site houses additional exercises and data sets. With its breadth of real-world examples and accessible treatment of both theory and applications, Statistical Analysis of Designed Experiments is a valuable book for experimental design courses at the upper-undergraduate and graduate levels. It is also an indispensable reference for practicing statisticians, engineers, and scientists who would like to further their knowledge of DOE.
Praise for the Third Edition “. . . an easy-to-read introduction to survival analysis which covers the major concepts and techniques of the subject.” —Statistics in Medical Research Updated and expanded to reflect the latest developments, Statistical Methods for Survival Data Analysis, Fourth Edition continues to deliver a comprehensive introduction to the most commonly used methods for analyzing survival data. Authored by a uniquely well-qualified author team, the Fourth Edition is a critically acclaimed guide to statistical methods with applications in clinical trials, epidemiology, areas of business, and the social sciences. The book features many real-world examples to illustrate applications within these various fields, although special consideration is given to the study of survival data in the biomedical sciences. Emphasizing the latest research and providing the most up-to-date information regarding software applications in the field, Statistical Methods for Survival Data Analysis, Fourth Edition also includes:
* Marginal and random effect models for analyzing correlated censored or uncensored data
* Multiple types of two-sample and K-sample comparison analysis
* Updated treatment of parametric methods for regression model fitting with a new focus on accelerated failure time models
* Expanded coverage of the Cox proportional hazards model
* Exercises at the end of each chapter to deepen knowledge of the presented material
Statistical Methods for Survival Data Analysis is an ideal text for upper-undergraduate and graduate-level courses on survival data analysis. The book is also an excellent resource for biomedical investigators, statisticians, and epidemiologists, as well as researchers in every field in which the analysis of survival data plays a role.