
Other
- Author: Jim Fowler, Lou Cohen, Phil Jarvis
- Edition: 2
- Publication date: 1998-08-18
- Printing allowed: 10 pages
- Copying allowed: 2 pages
- Format: Page Fidelity
- ISBN 13: 9781118300558
- Print ISBN: 9780471982968
- ISBN 10: 1118300556
Table of contents
- Preface
- 1 Introduction
- 1.1 What do we mean by statistics?
- 1.2 Why is statistics necessary?
- 1.3 Statistics in field biology
- 1.4 The limitations of statistics
- 1.5 The purpose of this text
- 2 Measurement and Sampling Concepts
- 2.1 Populations, samples and observations
- 2.2 Counting things – the sampling unit
- 2.3 Random sampling
- 2.4 Random numbers
- 2.5 Independence
- 2.6 Statistics and parameters
- 2.7 Descriptive and inferential statistics
- 2.8 Parametric and non-parametric statistics
- 3 Processing Data
- 3.1 Scales of measurement
- 3.2 The nominal scale
- 3.3 The ordinal scale
- 3.4 The interval scale
- 3.5 The ratio scale
- 3.6 Conversion of interval observations to an ordinal scale
- 3.7 Derived variables
- 3.8 The precision of observations
- 3.9 How precise should we be?
- 3.10 The frequency table
- 3.11 Aggregating frequency classes
- 3.12 Frequency distribution of count observations
- 3.13 Dispersion
- 3.14 Bivariate data
- 4 Presenting Data
- 4.1 Introduction
- 4.2 Dot plot or line plot
- 4.3 Bar graph
- 4.4 Histogram
- 4.5 Frequency polygon and frequency curve
- 4.6 Scattergram (scatter plot)
- 4.7 Circle or pie graph
- 5 Measuring the Average
- 5.1 What is an average?
- 5.2 The mean
- 5.3 The median – a resistant statistic
- 5.4 The mode
- 5.5 Relationship between the mean, median and mode
- 6 Measuring Variability
- 6.1 Variability
- 6.2 The range
- 6.3 The standard deviation
- 6.4 Calculating the standard deviation
- 6.5 Calculating the standard deviation from grouped data
- 6.6 Variance
- 6.7 An alternative formula for calculating the variance and standard deviation
- 6.8 Obtaining the standard deviation, variance and the sum of squares from a calculator
- 6.9 Degrees of freedom
- 6.10 The coefficient of variation
- 7 Probability
- 7.1 The meaning of probability
- 7.2 Compound probabilities
- 7.3 Probability distribution
- 7.4 Models of probability distribution
- 7.5 The binomial probability distribution
- 7.6 The Poisson probability distribution
- 7.7 The negative binomial probability distribution
- 7.8 Critical probability
- 8 Probability Distributions as Models of Dispersion
- 8.1 Dispersion
- 8.2 An Index of Dispersion
- 8.3 Choosing a model of dispersion
- 8.4 The binomial model
- 8.5 Poisson model
- 8.6 The negative binomial model
- 8.7 Deciding the goodness of fit
- 9 The Normal Distribution
- 9.1 The normal curve
- 9.2 Some mathematical properties of the normal curve
- 9.3 Standardizing the normal curve
- 9.4 Two-tailed or one-tailed?
- 9.5 Small samples: the t-distribution
- 9.6 Are our data 'normal'?
- 10 Data Transformation
- 10.1 The need for transformation
- 10.2 The logarithmic transformation
- 10.3 When there are zero counts – the arcsinh transformation
- 10.4 The square root transformation
- 10.5 The arcsine transformation
- 10.6 Back-transforming transformed numbers
- 10.7 Is data transformation really necessary?
- 11 How Good are our Estimates?
- 11.1 Sampling error
- 11.2 The distribution of a sample mean
- 11.3 The confidence interval of the mean of a large sample
- 11.4 The confidence interval of the mean of a small sample
- 11.5 The confidence interval of the mean of a sample of count data
- 11.6 The difference between the means of two large samples
- 11.7 The difference between the means of two small samples
- 11.8 Estimating a proportion
- 11.9 Estimating a Lincoln Index
- 11.10 Estimating a diversity index
- 11.11 The distribution of a variance – chi-square distribution
- 12 The Basis of Statistical Testing
- 12.1 Introduction
- 12.2 The experimental hypothesis
- 12.3 The statistical hypothesis
- 12.4 Test statistics
- 12.5 One-tailed tests and two-tailed tests
- 12.6 Hypothesis testing and the normal curve
- 12.7 Type 1 and type 2 errors
- 12.8 Parametric and non-parametric statistics: some further observations
- 12.9 The power of a test
- 13 Analysing Frequencies
- 13.1 The chi-square test
- 13.2 Calculating the χ² test statistic
- 13.3 A practical example of a test for homogeneous frequencies
- 13.4 The problem of independence
- 13.5 One degree of freedom – Yates' correction
- 13.6 Goodness of fit tests
- 13.7 Tests for association – the contingency table
- 13.8 The r × c contingency table
- 13.9 The G-test
- 13.10 Applying the G-test to a one-way classification of frequencies
- 13.11 Applying the G-test to a 2 × 2 contingency table
- 13.12 Applying the G-test to an r × c contingency table
- 13.13 Advice on analysing frequencies
- 14 Measuring Correlations
- 14.1 The meaning of correlation
- 14.2 Investigating correlation
- 14.3 The strength and significance of a correlation
- 14.4 Covariance
- 14.5 The Product Moment Correlation Coefficient
- 14.6 The coefficient of determination r²
- 14.7 The Spearman Rank Correlation Coefficient r_s
- 14.8 Advice on measuring correlations
- 15 Regression Analysis
- 15.1 Introduction
- 15.2 Gradients and triangles
- 15.3 Dependent and independent variables
- 15.4 A perfect rectilinear relationship
- 15.5 The line of least squares
- 15.6 Simple linear regression
- 15.7 Fitting the regression line to the scattergram
- 15.8 The error of a regression line
- 15.9 Confidence limits of an individual estimate
- 15.10 The significance of the regression line
- 15.11 The difference between two regression lines
- 15.12 Dealing with curved relationships
- 15.13 Transformation of both axes
- 15.14 Regression through the origin
- 15.15 An alternative line of best fit
- 15.16 Advice on using regression analysis
- 16 Comparing Averages
- 16.1 Introduction
- 16.2 Matched and unmatched observations
- 16.3 The Mann–Whitney U-test for unmatched samples
- 16.4 Advice on using the Mann–Whitney U-test
- 16.5 More than two samples – the Kruskal–Wallis test
- 16.6 Advice on using the Kruskal–Wallis test
- 16.7 The Wilcoxon test for matched pairs
- 16.8 Advice on using the Wilcoxon test for matched pairs
- 16.9 Comparing means – parametric tests
- 16.10 The F-test (two-tailed)
- 16.11 The z-test for comparing the means of two large samples
- 16.12 The t-test for comparing the means of two small samples
- 16.13 The t-test for matched pairs
- 16.14 Advice on comparing means
- 17 Analysis of Variance – ANOVA
- 17.1 Why do we need ANOVA?
- 17.2 How ANOVA works
- 17.3 Procedure for computing one-way ANOVA
- 17.4 Procedure for computing the Tukey test
- 17.5 Two-way ANOVA
- 17.6 Procedure for computing two-way ANOVA
- 17.7 Procedure for computing the Tukey test in two-way ANOVA
- 17.8 Two-way ANOVA with single observations
- 17.9 The randomized block design
- 17.10 The Latin square
- 17.11 Analysis of variance in regression
- 17.12 Advice on using ANOVA
- 18 Multivariate Analysis
- 18.1 Introduction
- 18.2 What is information?
- 18.3 Making large problems manageable
- 18.4 Are there three groups or four?
- 18.5 Learning from experience?
- 18.6 Variations on a theme
- 18.7 Summary
- Appendices
- Appendix 1: Table of random numbers
- Appendix 2: t-distribution
- Appendix 3: χ²-distribution
- Appendix 4: Critical values of Spearman's Rank Correlation Coefficient
- Appendix 5: Product moment correlation values at the 0.05 and 0.01 levels of significance
- Appendix 6: Mann–Whitney U-test values (two-tailed test) P = 0.05
- Appendix 7: Critical values of T in the Wilcoxon test for two matched samples
- Appendix 8: F-distribution, 0.05 level of significance, two-tailed test
- Appendix 9: Critical values of F_max at the 0.05 level of significance
- Appendix 10: F-distribution
- Appendix 11: Tukey test
- Appendix 12: Symbols
- Appendix 13: Matrices and vectors
- Appendix 14: Computer packages
- Bibliography and further reading
- Index
ABOUT E-BOOKS ON HEIMKAUP.IS
Your bookshelf is your own space, where your books are stored. You can access your bookshelf anywhere and at any time, on a computer or smart device. Simple and convenient!
E-book to own
An e-book to own must be downloaded to the devices you want to use within one year of purchase.
Access your books anywhere
You can reach all your e-textbooks in an instant, anywhere and at any time, from your bookshelf. No bag, no Kindle and no hassle (let alone excess baggage).
Easy to browse and search
You can move between pages and chapters however suits you best, and jump straight to a given chapter from the table of contents. The search finds words, chapters or pages in a single click.
Notes and highlights
You can highlight passages of text in different colours and write notes freely in the e-book. You can even see the notes and highlights of classmates and teachers, if they allow it. Everything in one place.
You decide how the page looks
Adapt the page to your needs. Enlarge or shrink images and text with multi-level zoom to view the page however best suits your studies.
More good features
- You can print pages from the book (within the limits set by the publisher)
- Option to link to other digital and interactive content, such as videos or questions on the material
- Easy to copy and paste content/text for e.g. homework or essays
- Supports assistive technology for students with visual or hearing impairments