Glossary

Adam

An SGD-based optimization algorithm that adapts a per-parameter step size from running estimates of the first and second moments of the gradient. See [GLO-KB15] for details.
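
For reference, one step of the update from [GLO-KB15], with gradient g_t, decay rates \beta_1 and \beta_2, step size \alpha, and a small constant \epsilon:

    m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
    v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
    \hat{m}_t = m_t / (1 - \beta_1^t), \quad \hat{v}_t = v_t / (1 - \beta_2^t)
    \theta_t = \theta_{t-1} - \alpha \, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)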

ADMM

The alternating direction method of multipliers (ADMM), a decomposition method for convex optimization that alternates between subproblem solves coupled through an augmented Lagrangian. See [GLO-BPC+11] for details.
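
As a sketch, for a problem of the form minimize f(x) + g(z) subject to Ax + Bz = c, the scaled-form iterations from [GLO-BPC+11] are

    x^{k+1} = \arg\min_x \left( f(x) + (\rho/2) \| A x + B z^k - c + u^k \|_2^2 \right)
    z^{k+1} = \arg\min_z \left( g(z) + (\rho/2) \| A x^{k+1} + B z - c + u^k \|_2^2 \right)
    u^{k+1} = u^k + A x^{k+1} + B z^{k+1} - c

where u is the scaled dual variable and \rho > 0 is the penalty parameter.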

BFGS

The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, a quasi-Newton method that maintains an approximation to the inverse Hessian built from successive gradient differences.
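
The standard update of the inverse Hessian approximation H_k, with s_k = x_{k+1} - x_k, y_k = \nabla f(x_{k+1}) - \nabla f(x_k), and \rho_k = 1 / (y_k^\top s_k), is

    H_{k+1} = (I - \rho_k s_k y_k^\top) H_k (I - \rho_k y_k s_k^\top) + \rho_k s_k s_k^\top

and the search direction at each iterate is -H_k \nabla f(x_k).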

IIS

The improved iterative scaling (IIS) algorithm, a classic method for estimating the parameters of maximum entropy (MaxEnt) models.
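
In one common formulation for conditional MaxEnt models (a sketch of the standard scheme, not tied to a specific reference here), each iteration updates \lambda_i \leftarrow \lambda_i + \delta_i, where \delta_i solves

    \sum_{x,y} \tilde{p}(x) \, p_\lambda(y|x) \, f_i(x,y) \, e^{\delta_i f^\#(x,y)} = \sum_{x,y} \tilde{p}(x,y) \, f_i(x,y)

with f^\#(x,y) = \sum_i f_i(x,y) the total feature count and \tilde{p} the empirical distribution.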

MaxEnt

The maximum entropy (MaxEnt) model, a log-linear model defined as the distribution of maximum entropy among all distributions that match a given set of feature expectations.
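
In the conditional case the model takes the familiar log-linear form

    p_\lambda(y|x) = \frac{1}{Z_\lambda(x)} \exp\left( \sum_i \lambda_i f_i(x,y) \right), \quad Z_\lambda(x) = \sum_y \exp\left( \sum_i \lambda_i f_i(x,y) \right)

where the f_i are feature functions and the weights \lambda_i are estimated from data (e.g., by IIS).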

RProp

The resilient backpropagation (RProp) algorithm, a gradient-based optimization method that adapts a per-weight step size using only the sign of the partial derivative. See [GLO-RB93] for details.
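
The core of the scheme in [GLO-RB93]: the step size \Delta_i grows when the partial derivative keeps its sign and shrinks when it flips, with 0 < \eta^- < 1 < \eta^+ (the paper recommends \eta^+ = 1.2 and \eta^- = 0.5):

    \Delta_i^{(t)} = \eta^+ \Delta_i^{(t-1)}  if  (\partial E / \partial w_i)^{(t)} \cdot (\partial E / \partial w_i)^{(t-1)} > 0
    \Delta_i^{(t)} = \eta^- \Delta_i^{(t-1)}  if  (\partial E / \partial w_i)^{(t)} \cdot (\partial E / \partial w_i)^{(t-1)} < 0
    w_i^{(t+1)} = w_i^{(t)} - \mathrm{sign}\left( (\partial E / \partial w_i)^{(t)} \right) \Delta_i^{(t)}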

References

[GLO-KB15]

Diederik P. Kingma and Jimmy Lei Ba. Adam: a method for stochastic optimization. In Proceedings of the International Conference on Learning Representations. 2015.

[GLO-BPC+11]

Stephen Boyd, Neal Parikh, Eric Chu, Borja Peleato, and Jonathan Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends® in Machine Learning, 3(1):1–122, 2011. URL: https://doi.org/10.1561/2200000016.

[GLO-RB93]

Martin Riedmiller and Heinrich Braun. A direct adaptive method for faster backpropagation learning: the RPROP algorithm. In Proceedings of the IEEE International Conference on Neural Networks, 586–591. 1993. URL: https://doi.org/10.1109/ICNN.1993.298623.