# Central Research Theme: Algorithmic Decision Processes

Research in algorithmic decision processes is the study of the dynamics of systems that process information to make decisions. Decision systems are ubiquitous, arising everywhere from the strategies deployed by sports teams to investment decisions made by fund managers, and from regulatory signals deployed in metabolic networks to policy decisions made by governments. These processes become algorithmic when the decisions are made scientifically, that is, on the basis of data.

Historically, such processes have been decomposed into a sequence of stages: modeling, parameter estimation, control, and verification. Although different disciplines use different names for these stages, virtually all scientific decision processes generate a model, parametrize that model using part of the available observation data, use the model to select a decision that is acceptable or optimal according to some objective, and then use the remaining data to verify that the model and/or decision is adequate.
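The four stages can be sketched end-to-end on a toy problem. Everything in the sketch below is a hypothetical illustration, not a prescribed method: the model is a scalar linear system x_{t+1} = a·x_t + u_t, the probe input and the deadbeat controller are arbitrary choices.

```python
# Illustrative four-stage decision loop for the toy scalar system
# x_{t+1} = a*x_t + u_t. All names and numbers are made up for the example.

def fit_parameter(xs, us):
    """Stage 2: estimate a in x_{t+1} = a*x_t + u_t by least squares."""
    num = sum(x * (x_next - u) for x, u, x_next in zip(xs, us, xs[1:]))
    den = sum(x * x for x in xs[:-1])
    return num / den

def make_controller(a_hat):
    """Stage 3: deadbeat feedback u = -a_hat*x drives the *model* to zero."""
    return lambda x: -a_hat * x

def simulate(a_true, policy, x0, steps):
    """Stage 4: roll the true system forward under a feedback policy."""
    xs = [x0]
    for _ in range(steps):
        xs.append(a_true * xs[-1] + policy(xs[-1]))
    return xs

# Stages 1-2: collect data from the true system with a constant probe input.
a_true, xs_data, us_data = 0.8, [1.0], []
for _ in range(10):
    us_data.append(0.1)
    xs_data.append(a_true * xs_data[-1] + 0.1)
a_hat = fit_parameter(xs_data, us_data)

# Stages 3-4: run the controller designed on the model against the true system.
traj = simulate(a_true, make_controller(a_hat), x0=1.0, steps=5)
```

With noiseless data the estimate recovers a almost exactly, so the closed loop settles near zero; with noisy data the verification step would expose the mismatch.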

As students strip away the jargon of various fields and focus on these scientific decision processes, they will discover the universality and importance of abstraction in the mathematical sciences. Through the algorithmic development of decision processes, students will also uncover the fundamental relationships between information, uncertainty, and complexity that govern the transformation of data into useful decisions.

## Mathematical Modeling

The need for mathematical models to characterize one's understanding of the consequences of various possible decisions builds on differential equations and applied probability, but it can also lead to problems in complex analysis (for the study of transfer functions and transform theory), combinatorics, graph theory, differential geometry (dynamics on manifolds), and beyond.
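As a minimal sketch of the differential-equation side, a fixed-step Euler integrator can be checked against the known solution of exponential decay; the equation, rate constant, and step count here are illustrative choices, not anything specific to the research program.

```python
import math

def euler(f, x0, t0, t1, n):
    """Fixed-step Euler integration of dx/dt = f(t, x) from t0 to t1."""
    x, t = x0, t0
    h = (t1 - t0) / n
    for _ in range(n):
        x += h * f(t, x)
        t += h
    return x

# dx/dt = -k*x has the exact solution x(t) = x0 * exp(-k*t),
# so the discretization error of the model can be measured directly.
k = 0.5
approx = euler(lambda t, x: -k * x, x0=1.0, t0=0.0, t1=2.0, n=10_000)
exact = math.exp(-k * 2.0)
```

The gap between `approx` and `exact` shrinks as `n` grows, a first taste of the accuracy-versus-computation tradeoff the computational section returns to.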

## Parameter Estimation

Regression quickly leads to statistics and to more advanced topics in time-series analysis and system identification. The mathematics of learning also motivates functional analysis, measure theory, Bayesian analysis, and subjective probability.
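As a hedged illustration of the first step on that path, here is ordinary least squares for a line fit written out in closed form; the data points are made up for the example.

```python
def linreg(xs, ys):
    """Ordinary least squares for y ≈ b0 + b1*x, in closed form."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Synthetic observations scattered around y = 1 + 2x.
xs = [0, 1, 2, 3, 4]
ys = [1.0, 3.1, 4.9, 7.2, 8.8]
b0, b1 = linreg(xs, ys)
```

The fitted slope and intercept land near the generating values; quantifying how near, and with what confidence, is exactly where the statistics begins.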

## Control

With a particular, albeit approximate, model of the consequences of available choices, we can compute the best decision for a particular objective. Since the model is approximate, “robust” decisions must account for the possibility that the predicted consequences differ from the actual ones. Such problems are typically identified as control problems. They can be classified by the characteristics of the model, such as linear or nonlinear control problems, or by solution technique, such as stochastic control, robust control, or model-predictive control. Adaptive control problems attempt to integrate the learning and feedback-design steps of the decision process explicitly, and they lead to techniques such as reinforcement learning.
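The robustness idea can be made concrete with a toy min-max calculation: for the scalar closed loop x_{t+1} = (a − k)·x_t, where a is known only to lie in an interval, search for the gain k that minimizes the worst-case contraction rate |a − k|. The interval and grid resolutions below are arbitrary choices for illustration.

```python
def worst_case_rate(k, a_values):
    """Worst-case closed-loop rate |a - k| over the uncertainty set."""
    return max(abs(a - k) for a in a_values)

# The plant parameter a is known only to lie in [0.6, 1.0].
a_values = [0.6 + 0.01 * i for i in range(41)]

# Min-max robust design: brute-force a grid of candidate gains
# for the one with the smallest worst-case rate.
candidate_gains = [0.01 * j for j in range(201)]
best_k = min(candidate_gains, key=lambda k: worst_case_rate(k, a_values))
best_rate = worst_case_rate(best_k, a_values)
```

The best gain lands at the midpoint of the interval, and the resulting worst-case rate is well below one, so the loop is guaranteed stable for every plant in the uncertainty set; that guarantee, rather than nominal optimality, is the point of robust control.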

## Verification

Given an implementation of a particular control design, and access to the phenomenon it was designed to control, verification is the process of characterizing the success or failure of the design effort. While sudden catastrophic failure is typically unacceptable, some degradation in performance, whether due to errors in the controller implementation or to uncertainty in the phenomenon being controlled, is to be expected and important to understand; gradual degradation in performance is the hallmark of a good design.
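A toy verification check in that spirit: evaluate a feedback gain designed for a nominal model on increasingly mismatched “true” systems, and confirm that the terminal error grows gradually with the mismatch rather than jumping to failure. The system and all numbers are illustrative.

```python
def terminal_error(a_true, k, x0=1.0, steps=20):
    """Run the closed loop x_{t+1} = (a_true - k) * x_t; report |x_steps|."""
    x = x0
    for _ in range(steps):
        x = (a_true - k) * x
    return abs(x)

k = 0.8  # gain designed for the nominal model a = 0.8
mismatches = (0.0, 0.05, 0.1, 0.2)
errors = [terminal_error(0.8 + d, k) for d in mismatches]
# A good design degrades gracefully: the terminal error grows with the
# mismatch d, and the loop remains stable as long as |a_true - k| < 1.
```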

## Computational Focus

If mathematics and statistics are the backbone of research in algorithmic decision processes, computation provides the muscle. At each stage of a decision algorithm there are not only mathematical and statistical challenges but also computational issues, usually associated with the complexity of the problem. Indeed, many algorithms are “good” in the sense that they produce quality results, yet they do not scale well in high-dimensional settings because of constraints on spatial and/or temporal complexity. This “curse of dimensionality” requires new decision algorithms that are usually weaker approximations than the original but are more readily computable. This interplay between complexity and uncertainty appears to be a fundamental tradeoff in the empirical sciences, and it needs to be studied and understood by students.
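The spatial side of this curse admits a back-of-the-envelope demonstration: the number of points needed to discretize a state space on a uniform grid grows exponentially with dimension. The resolution and the evaluation rate below are arbitrary illustrative figures.

```python
def grid_points(points_per_axis, dims):
    """Size of a uniform grid over a dims-dimensional box."""
    return points_per_axis ** dims

# Even at a modest 10 points per axis, the grid explodes with dimension.
sizes = {d: grid_points(10, d) for d in (1, 2, 3, 6, 10)}

# Time to sweep each grid once at a hypothetical 10^8 evaluations/second.
seconds = {d: n / 1e8 for d, n in sizes.items()}
```

A 10-dimensional sweep already takes minutes at this rate, and realistic decision problems can have far more dimensions; this is precisely the pressure that forces the weaker but computable approximations described above.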