Karl Friedrich Gauss
- Student of: Johann Friedrich Pfaff
- Influenced by:
- Students: Friedrich Bessel, Christian Gerling, Johann Listing, Christoph Gudermann, Johann Encke, Bernhard Riemann, J. W. Richard Dedekind
- Influenced: Galton, Laplace
- Time Period: Modern Foundations
- Collegium Carolinum; studied languages and the works of Newton, Euler, and Lagrange (1792-1795)
- University of Göttingen (1795-1798)
- University of Helmstedt, doctor of philosophy in absentia (1799)
- Director of the Göttingen Observatory from 1807 until his death in 1855
- Numerous innovations in physics and mathematics, including the method of least squares
Ideas and Interests
The German mathematician Carl (or Karl) Gauss was born into humble circumstances in 1777; his mother had some basic reading skills but could not write, while his father was a competent, honest businessman known to be a very capable accountant. Gauss was a mathematically precocious child, and in later years he would joke that he could do computations in his head before he could talk. The true extent of his gifts became apparent at age three, when he announced, loudly and in public, that his father's payroll calculations were incorrect. His early education was accelerated with the help of special books and private tutoring, and he entered the University of Göttingen at age 18. By this time he had already made some important mathematical discoveries (Dunnington, 1955/2004).
Intelligence theorists are indebted to Carl Gauss in particular for his discovery of the method of least squares, a mathematical technique used in statistical analyses such as regression. Other mathematicians, including Daniel Huber (1768-1829) and Adrien-Marie Legendre (1752-1833), are credited with independent discoveries of least squares, but it was Gauss who fully developed the idea, which he published in 1809 (Dunnington, 1955/2004). The Gauss-Markov Theorem is named jointly for Gauss and Andrey Markov (1856-1922). It states that in a linear regression model whose errors are uncorrelated, have equal variances, and have expected values of zero, the least squares coefficient estimates of the resulting regression line are BLUE (Best Linear Unbiased Estimators).
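The least squares estimation the theorem describes can be sketched in a few lines. The following is a minimal illustration with made-up data (the true intercept and slope of 2.0 and 3.0, and the noise level, are arbitrary choices for demonstration): a line is recovered from noisy observations by minimizing the sum of squared residuals.

```python
import numpy as np

# Made-up data for illustration: a true line y = 2.0 + 3.0*x plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(0, 1, size=x.size)

# Design matrix with an intercept column. The closed-form least squares
# solution is beta = (X'X)^{-1} X'y; lstsq computes it stably.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# beta[0] estimates the intercept (near 2.0), beta[1] the slope (near 3.0)
print(beta)
```

Under the Gauss-Markov assumptions, no other linear unbiased estimator of the intercept and slope has smaller variance than the `beta` computed here.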
Ordinary Least Squares (OLS) Regression is a method used to predict the value of one variable from one or more predictor variables. The simplest model is called simple regression, in which a single predictor variable (X) predicts an outcome (Y). For example, an intelligence researcher might use simple regression to predict individuals' full-scale IQ scores from their scores on the Scholastic Aptitude Test (SAT). It is, of course, desirable for these predictions to be unbiased and to have the least error possible; the Gauss-Markov Theorem guarantees that, when its assumptions hold, least squares delivers exactly this.
Dunnington, G. W. (1955/2004). Gauss: Titan of science. Washington, DC: The Mathematical Association of America, Inc.
Gauss, C. F. (1986). Disquisitiones arithmeticae (A. A. Clarke, Trans.). New York: Springer-Verlag. (Original work published 1801)
Gauss, C. F. (1857). Theoria motus corporum coelestium in sectionibus conicis solem ambientium (C. H. Davis, Trans.). Boston: Little, Brown and Company. (Original work published 1809)
Copyright © 2013
20 December 2016