9.1 Classical Parameter Estimation

  • Classical Parameter Estimation: In the classical framework, parameters are considered unknown constants rather than random variables. Estimators are functions of observed data that provide estimates for these parameters.
    • Maximum Likelihood (ML) Estimation: This method chooses the parameter value θ that maximizes the likelihood of the observed data x, i.e., the estimate is the θ that maximizes p(x; θ).
    • Confidence Intervals: A confidence interval for a parameter θ is a range of values, computed from the data, within which the true value of θ lies with at least a certain probability (e.g., a 95% confidence level).
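The two ideas above can be sketched together in a small example: for normally distributed data with known standard deviation, the ML estimate of the mean is the sample mean, and a 95% confidence interval is the sample mean plus or minus 1.96 standard errors. The data values and the known σ here are made-up assumptions for illustration.

```python
# Sketch: ML estimation and a 95% confidence interval for the mean of a
# normal distribution with known standard deviation (made-up data).
import math

sigma = 2.0                      # assumed known standard deviation
data = [4.1, 5.3, 3.8, 6.0, 4.7, 5.5, 4.9, 5.2]

n = len(data)
mu_hat = sum(data) / n           # ML estimate of the mean = sample mean

# 95% confidence interval: mu_hat +/- 1.96 * sigma / sqrt(n)
half_width = 1.96 * sigma / math.sqrt(n)
ci = (mu_hat - half_width, mu_hat + half_width)
print(f"ML estimate: {mu_hat:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```

In the classical interpretation, it is the interval that is random, not θ: across repeated experiments, intervals constructed this way cover the true mean about 95% of the time.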

9.2 Linear Regression

  • Linear Regression: A method to estimate the relationship between a dependent variable y and one or more independent variables x. In the single-variable case, the goal is to find the coefficients θ0 and θ1 that minimize the sum of squared errors: sum over i of (yi − θ0 − θ1 xi)^2.
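The single-variable least-squares problem has a closed-form solution, which the sketch below implements on made-up data (roughly y = 2x):

```python
# Sketch: simple least-squares linear regression y ~ theta0 + theta1 * x,
# using the closed-form solution that minimizes the sum of squared errors.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]   # made-up data, roughly y = 2x

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# theta1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
# theta0 = y_bar - theta1 * x_bar
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
sxx = sum((x - x_bar) ** 2 for x in xs)
theta1 = sxy / sxx
theta0 = y_bar - theta1 * x_bar
print(f"intercept theta0 = {theta0:.3f}, slope theta1 = {theta1:.3f}")
```

The formulas follow from setting the partial derivatives of the squared-error sum with respect to θ0 and θ1 to zero.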

9.3 Binary Hypothesis Testing

  • Binary Hypothesis Testing: Involves deciding between two hypotheses: the null hypothesis H0 and the alternative hypothesis H1.
    • Likelihood Ratio Test (LRT): The decision rule is based on the likelihood ratio L(x) = p(x; H1) / p(x; H0). The hypothesis H0 is accepted if L(x) < ξ (and rejected otherwise), where ξ is a threshold chosen to control the error probabilities.
    • Type I and Type II Errors:
      • Type I Error: Rejecting H0 when it is true.
      • Type II Error: Accepting H0 when H1 is true.
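A minimal sketch of an LRT, assuming a single observation x drawn from N(0,1) under H0 and N(1,1) under H1. The threshold value ξ = 1 here is a made-up choice for illustration; in practice ξ is set to achieve a desired Type I error probability.

```python
# Sketch: likelihood ratio test for one observation x,
# H0: x ~ N(0, 1)  versus  H1: x ~ N(1, 1).
import math

def normal_pdf(x, mu, sigma=1.0):
    # Density of the normal distribution with mean mu and std sigma.
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def lrt_decision(x, xi=1.0):
    # L(x) = p(x; H1) / p(x; H0); accept H0 when L(x) < xi.
    L = normal_pdf(x, 1.0) / normal_pdf(x, 0.0)
    return "accept H0" if L < xi else "reject H0"

print(lrt_decision(0.2))   # small x favors H0
print(lrt_decision(1.4))   # large x favors H1
```

In this Gaussian case the ratio simplifies to L(x) = exp(x − 1/2), so the rule reduces to comparing x itself against a threshold, which is why observations near 0 lead to accepting H0 and observations near 1 lead to rejecting it.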

9.4 Significance Testing

  • Significance Testing: Used when testing a specific hypothesis (e.g., testing whether a coin is fair). The hypothesis H0 is rejected if the observed data fall into a predefined rejection region.
    • P-value: The probability of observing data as extreme as, or more extreme than, the actual observed data, under the assumption that H0 is true.
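The coin-fairness example can be sketched with an exact two-sided binomial p-value; the observed count of 41 heads in 50 tosses is a made-up illustration.

```python
# Sketch: significance test of H0: "the coin is fair" via an exact
# two-sided binomial p-value for the observed number of heads.
from math import comb

def binom_pmf(k, n, p=0.5):
    # Probability of exactly k heads in n fair tosses under H0.
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def two_sided_p_value(heads, n):
    # Sum the probabilities of all outcomes at least as extreme as the
    # observed one, measured by distance from the expected count n/2.
    dist = abs(heads - n / 2)
    return sum(binom_pmf(k, n) for k in range(n + 1) if abs(k - n / 2) >= dist)

p = two_sided_p_value(41, 50)   # e.g., 41 heads observed in 50 tosses
print(f"p-value: {p:.6f}")
# Reject H0 at the 5% significance level when p < 0.05.
```

A small p-value means data this extreme would be rare under H0, so H0 is rejected; here 41 heads out of 50 is far enough from 25 that the fair-coin hypothesis is rejected at the 5% level.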

9.5 Summary and Discussion

  • Classical Inference: Treats parameters as fixed unknown constants. In parameter estimation, the goal is to generate accurate estimates across all possible values of the parameter.
  • Hypothesis Testing: Aims to minimize the error probabilities when choosing between competing hypotheses. Significance testing focuses on controlling the probability of false rejection (Type I error).

This chapter provides a foundation for methods like Maximum Likelihood Estimation, linear regression, and hypothesis testing, which are central to classical statistics.