Abstract

Current research on statistical topics spans a wide range of ideas and fields of application. Much is connected with the systematic theory of methods for the analysis of empirical data, especially for situations, common in many areas of science and technology, where the random or haphazard element in the data is too strong to be ignored. Modern computer technology is important in enabling large amounts of data to be explored quickly and efficiently, in allowing methods involving iterative calculations to be handled as a matter of routine, and in facilitating display of the results of analysis via sophisticated graphical devices. This has allowed both the development of new methods and the application of ideas long known in principle but until relatively recently too complicated for other than occasional use. Some of the key ideas involved go back in essence to the 19th century, for example to Gauss, and others are more strongly associated with the first half of the 20th century, in particular with the pioneering work of R. A. Fisher, although many of the detailed developments are, of course, much more recent.
