Applied mathematics

The advent of the computer has enabled new applications: studying and using the new computer technology itself (computer science); using computers to study problems arising in other areas of science (computational science); and studying the mathematics of computation (for example, theoretical computer science, computer algebra, numerical analysis). Statistics is probably the most widespread mathematical science used in the social sciences, but other areas of mathematics are proving increasingly useful in these disciplines, most notably in economics. Academic institutions are not consistent in the way they group and label courses, programs, and degrees in applied mathematics.

Linear regression

In Canada, the Environmental Effects Monitoring Program uses statistical analyses of fish and benthic surveys to measure the effects of pulp mill or metal mine effluent on aquatic ecosystems. Linear regression also plays an important role in the subfield of artificial intelligence known as machine learning.
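As a minimal sketch of the underlying computation (the data and coefficients below are invented purely for illustration), an ordinary least squares fit takes only a few lines of NumPy:

```python
import numpy as np

# Minimal ordinary least squares sketch on synthetic data; the true
# coefficients (3.0, 1.5, -2.0) are invented for this illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))  # two predictors
y = 3.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# Add an intercept column, then solve the least squares problem.
Xd = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(beta)  # approximately [3.0, 1.5, -2.0]
```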

Biostatistics

Nowadays, the increasing size and complexity of molecular datasets has led to the use of powerful statistical methods provided by computer science algorithms developed in the machine learning field. Data mining and machine learning therefore allow the detection of patterns in data with a complex structure, such as biological data, using methods of supervised and unsupervised learning, regression, cluster detection, and association rule mining, among others. To name a few, self-organizing maps and k-means are examples of clustering algorithms, while neural network implementations and support vector machine models are examples of common machine learning algorithms.
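For concreteness, the sketch below clusters synthetic two-group "expression profiles" with k-means via scikit-learn; the data and cluster count are illustrative assumptions, not drawn from any real study:

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative sketch: cluster synthetic "expression profiles" with k-means.
# The data is invented; real molecular datasets would be far larger.
rng = np.random.default_rng(1)
profiles = np.vstack([
    rng.normal(loc=0.0, scale=0.3, size=(50, 4)),   # synthetic group A
    rng.normal(loc=2.0, scale=0.3, size=(50, 4)),   # synthetic group B
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
print(km.labels_[:5], km.labels_[-5:])  # the two groups land in two clusters
```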

Adaptive website

Collaborative filtering techniques such as recommender systems, generate-and-test methods such as A/B testing, and machine learning techniques such as clustering and classification do not, on their own, make a website adaptive; they are all tools and techniques that may be used toward engineering an adaptive website. In the collaborative filtering method, collected user data may be assessed in aggregate (across multiple users) using machine learning techniques to cluster interaction patterns into user models and to classify specific user patterns against those models. The website may then be adapted to target clusters of users.
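The sketch below illustrates one simple form of collaborative filtering (user-user cosine similarity over a tiny ratings matrix); the matrix, the similarity measure, and the weighting scheme are illustrative assumptions, not a method prescribed by this article:

```python
import numpy as np

# Hypothetical sketch: recommend an item via user-user collaborative
# filtering. Rows are users, columns are items, 0 means "unseen".
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
], dtype=float)

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

target = 0  # recommend for user 0
sims = np.array([cosine(R[target], R[u]) for u in range(len(R))])
sims[target] = 0.0                # exclude the user themself
scores = sims @ R                 # similarity-weighted item scores
scores[R[target] > 0] = -np.inf   # mask items the user already rated
print(int(np.argmax(scores)))     # index of the recommended item
```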

List of statistical packages

* Statistical Lab – R-based and focusing on educational purposes.
* Torch (machine learning) – a deep learning software library written in the Lua programming language.
* Weka (machine learning) – a suite of machine learning software written at the University of Waikato.
* CSPro
* Epi Info
* X-12-ARIMA
* BV4.1
* GeoDA
* MaxStat Lite – general statistical software.
* MINUIT
* WinBUGS – Bayesian analysis using Markov chain Monte Carlo methods.
* Winpepi – package of statistical programs for epidemiologists.
* Analytica – visual analytics and statistics package.
* Angoss – its KnowledgeSEEKER and KnowledgeSTUDIO products incorporate several data mining algorithms.
* ASReml – for restricted maximum likelihood analyses.

History of statistics

Despite the growth of Bayesian research, most undergraduate teaching is still based on frequentist statistics. Nonetheless, Bayesian methods are widely accepted and used, for example in the field of machine learning.

Computational science

* Computational statistics
* Computational sustainability
* Computer algebra
* Computer simulation
* Financial modeling
* Geographic information system (GIS)
* High-performance computing
* Machine learning
* Network analysis
* Neuroinformatics
* Numerical linear algebra
* Numerical weather prediction
* Pattern recognition
* Scientific visualization
* Simulation
* Computer simulations in science
* Computational science and engineering
* Comparison of computer algebra systems
* List of molecular modeling software
* List of numerical analysis software
* List of statistical packages
* Timeline of scientific computing
* Simulated reality
* Extensions for Scientific Computation (XSC)

Formal science

In the early 1800s, Gauss and Laplace developed the mathematical theory of statistics, which also explained the use of statistics in insurance and governmental accounting. Mathematical statistics was recognized as a mathematical discipline in the early 20th century. In the mid-20th century, mathematics was broadened and enriched by the rise of new mathematical sciences and engineering disciplines such as operations research and systems engineering. These sciences benefited from basic research in electrical engineering and then from the development of electrical computing, which also stimulated information theory, numerical analysis (scientific computing), and theoretical computer science.

Statistica

Statistica provides data analysis, data management, statistics, data mining, machine learning, text analytics, and data visualization procedures. Statistica originally derives from a set of software packages and add-ons initially developed during the mid-1980s by StatSoft. Following the 1986 release of Complete Statistical System (CSS) and the 1988 release of Macintosh Statistical System (MacSS), the first DOS version (trademarked in capitals as STATISTICA) was released in 1991. In 1992, the Macintosh version of Statistica was released. Statistica 5.0 was released in 1995; it ran on both the new 32-bit Windows 95/NT and the older Windows 3.1.

Heuristic (computer science)

* Metaheuristic: methods for controlling and tuning basic heuristic algorithms, usually with the use of memory and learning (a minimal sketch follows this list).
* Matheuristics: optimization algorithms made by the interoperation of metaheuristics and mathematical programming (MP) techniques.
* Reactive search optimization: methods using online machine learning principles for self-tuning of heuristics.
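As a minimal sketch of one classic metaheuristic, the following applies simulated annealing to a toy one-dimensional objective; the objective function, cooling schedule, and move size are all illustrative assumptions:

```python
import math
import random

# Toy objective with many local minima; global minimum near x ≈ -1.3.
def objective(x: float) -> float:
    return x * x + 10.0 * math.sin(x)

def simulated_annealing(steps: int = 10_000, temp0: float = 10.0) -> float:
    random.seed(0)
    x = random.uniform(-10, 10)
    best = x
    for k in range(steps):
        temp = temp0 * (1.0 - k / steps) + 1e-9   # linear cooling schedule
        cand = x + random.gauss(0.0, 0.5)         # local random move
        delta = objective(cand) - objective(x)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature drops.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
            if objective(x) < objective(best):
                best = x
    return best

print(simulated_annealing())  # should land near the global minimum
```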

Jerome H. Friedman

In 1982 he was appointed Professor of Statistics at Stanford University. In 1984 he was elected as a Fellow of the American Statistical Association. In 2002 he was awarded the SIGKDD Innovation Award by the ACM. In 2010 he was elected as a member of the National Academy of Sciences (applied mathematical sciences section). Friedman has authored and co-authored many publications in the field of data mining, including work on nearest neighbor classification, logistic regression, and high-dimensional data analysis. His primary research interest is in the area of machine learning.

Email filtering

Some more advanced filters, particularly anti-spam filters, use statistical document classification techniques such as the naive Bayes classifier (a minimal sketch follows below). Image filtering can use complex image-analysis algorithms to detect skin tones and specific body shapes normally associated with pornographic images. Microsoft Outlook includes user-generated email filters called "rules". Related topics include Bayesian spam filtering, CRM114, information filtering, Markovian discrimination, outbound spam protection, Sieve (an RFC-standard language for describing mail filters), and email whitelists.
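The sketch below shows the core of a naive Bayes spam classifier with Laplace smoothing; the training messages are invented and far too few for real use:

```python
import math
from collections import Counter

# Toy training corpora, invented purely for illustration.
spam = ["win money now", "cheap money offer", "win offer now"]
ham = ["meeting schedule today", "project status today", "lunch today"]

def train(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam)
ham_counts, ham_total = train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_prob(msg, counts, total, prior):
    # Laplace smoothing (+1) avoids zero probabilities for unseen words.
    lp = math.log(prior)
    for w in msg.split():
        lp += math.log((counts[w] + 1) / (total + len(vocab)))
    return lp

msg = "cheap offer now"
spam_score = log_prob(msg, spam_counts, spam_total, 0.5)
ham_score = log_prob(msg, ham_counts, ham_total, 0.5)
print("spam" if spam_score > ham_score else "ham")
```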

Mathematical optimization

An optimization algorithm is a procedure that is executed iteratively, comparing various candidate solutions until an optimum or a satisfactory solution is found. In machine learning, optimization algorithms are used to minimize or maximize an objective function E(x), a mathematical function of the model's internal parameters, which are used in computing the target values (Y) from the set of predictors (X) used in the model. Widely used optimization algorithms are commonly grouped into zero-order, first-order, and second-order algorithms, according to the highest derivative of the objective they use.
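As a first-order example, the sketch below minimizes a toy quadratic E(x) by gradient descent; the target vector, step size, and iteration count are invented for the illustration:

```python
import numpy as np

# Gradient descent on E(x) = ||x - target||^2, a simple convex objective.
target = np.array([3.0, -2.0])

def grad_E(x):
    return 2.0 * (x - target)   # gradient of ||x - target||^2

x = np.zeros(2)
lr = 0.1                        # illustrative step size
for _ in range(100):
    x -= lr * grad_E(x)         # step against the gradient

print(x)  # converges toward [3.0, -2.0]
```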

Boolean data type

In computer science, the Boolean data type is a data type that has one of two possible values (usually denoted true and false), intended to represent the two truth values of logic and Boolean algebra. It is named after George Boole, who first defined an algebraic system of logic in the mid-19th century. The Boolean data type is primarily associated with conditional statements, which allow different actions by changing control flow depending on whether a programmer-specified Boolean condition evaluates to true or false. It is a special case of a more general logical data type (see probabilistic logic): logic need not always be Boolean.
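As a trivial illustration in Python, whose built-in bool type plays this role, a Boolean condition selects between control-flow branches:

```python
# A Boolean value driving a conditional statement.
threshold_exceeded: bool = (42 > 40)   # evaluates to True

if threshold_exceeded:
    print("taking the 'true' branch")
else:
    print("taking the 'false' branch")
```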

Function (mathematics)

In mathematics, a function was originally the idealization of how a varying quantity depends on another quantity. For example, the position of a planet is a function of time. Historically, the concept was elaborated with the infinitesimal calculus at the end of the 17th century, and, until the 19th century, the functions that were considered were differentiable (that is, they had a high degree of regularity). The concept of function was formalized at the end of the 19th century in terms of set theory, and this greatly enlarged the domains of application of the concept.
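For concreteness, the modern set-theoretic formalization (a standard textbook formulation, not quoted from this article) can be written as:

```latex
% A function f from a set X to a set Y is a relation f \subseteq X \times Y
% pairing every element of X with exactly one element of Y:
f : X \to Y \quad\text{with}\quad
\forall x \in X \;\exists!\, y \in Y : (x, y) \in f .
```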

Statistical inference

However, some elements of frequentist statistics, such as statistical decision theory, do incorporate utility functions. In particular, frequentist developments of optimal inference (such as minimum-variance unbiased estimators, or uniformly most powerful testing) make use of loss functions, which play the role of (negative) utility functions. Loss functions need not be explicitly stated for statistical theorists to prove that a statistical procedure has an optimality property.
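As a standard illustration (not drawn from this article): under squared-error loss, the risk of an estimator is its mean squared error, which decomposes into variance plus squared bias:

```latex
L(\theta, \hat\theta) = (\hat\theta - \theta)^2, \qquad
R(\theta, \hat\theta) = \mathbb{E}\bigl[(\hat\theta - \theta)^2\bigr]
  = \operatorname{Var}(\hat\theta)
  + \bigl(\operatorname{Bias}(\hat\theta)\bigr)^2 .
```

Minimum-variance unbiased estimation is then the special case of minimizing this risk subject to zero bias.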

Backpropagation

In the context of learning, backpropagation is commonly used by the gradient descent optimization algorithm to adjust the weight of neurons by calculating the gradient of the loss function. The goal of any supervised learning algorithm is to find a function that best maps a set of inputs to their correct output. The motivation for backpropagation is to train a multi-layered neural network such that it can learn the appropriate internal representations to allow it to learn any arbitrary mapping of input to output.
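A minimal sketch of these ideas, assuming a tiny 2-4-1 sigmoid network learning XOR with squared-error loss; the architecture, learning rate, and iteration count are illustrative choices, not prescribed by the text:

```python
import numpy as np

# Tiny network trained by backpropagation and plain gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of squared-error loss, layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
```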

Linguistics

Researchers are drawn to the field from a variety of backgrounds, bringing along a variety of experimental techniques as well as widely varying theoretical perspectives. Much work in neurolinguistics is informed by models in psycholinguistics and theoretical linguistics, and is focused on investigating how the brain can implement the processes that theoretical linguistics and psycholinguistics propose are necessary in producing and comprehending language. Neurolinguists study the physiological mechanisms by which the brain processes information related to language, and evaluate linguistic and psycholinguistic theories using aphasiology, brain imaging, electrophysiology, and computer modelling.

Scientific control

The blinding eliminates effects such as confirmation bias and wishful thinking that might occur if the samples were evaluated by someone who knew which samples were in which group. In double-blind experiments, at least some participants and some experimenters do not possess full information while the experiment is being carried out. Double-blind experiments are most often used in clinical trials of medical treatments, to verify that the supposed effects of the treatment are produced only by the treatment itself. Trials are typically randomized and double-blinded, with two (statistically) identical groups of patients being compared.

Stock

The fields of fundamental analysis and technical analysis attempt to understand market conditions that lead to price changes, or even predict future price levels. A recent study shows that customer satisfaction, as measured by the American Customer Satisfaction Index (ACSI), is significantly correlated to the market value of a stock. Stock price may be influenced by analysts' business forecast for the company and outlooks for the company's general market segment. Stocks can also fluctuate greatly due to pump and dump scams. At any given moment, an equity's price is strictly a result of supply and demand.

Number theory

While Asian mathematics influenced Greek and Hellenistic learning, it seems that Greek mathematics is also an indigenous tradition.