ABAGAIL
the Absolute Best Andrew Guillory Artificial Intelligence Library
This library is the result of close to a year of research
and class work in artificial intelligence (AI). It contains
a number of interconnected Java packages that implement
machine learning and artificial intelligence algorithms.
These are artificial intelligence
algorithms implemented for the kind of people who
like to implement algorithms themselves.
June 27, 2005: Fixed a bug in the SVM code and expanded
the reinforcement learning package.
Download Java Source Code
Features
Hidden Markov Models
- Baum-Welch reestimation algorithm, scaled forward-backward
algorithm, Viterbi algorithm
- Support for Input-Output Hidden Markov Models
- Write your own output or transition probability distribution
or use the provided distributions, including neural network
based conditional probability distributions
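As a rough illustration of the Viterbi algorithm listed above, here is a minimal,
self-contained Java sketch for a discrete HMM. This is not the library's code or
API; the class and variable names are made up for the example.

public class ViterbiSketch {
    // pi[i]: initial probability of state i
    // a[i][j]: transition probability from state i to state j
    // b[i][k]: probability that state i emits observation symbol k
    static int[] viterbi(double[] pi, double[][] a, double[][] b, int[] obs) {
        int n = pi.length, t = obs.length;
        double[][] logProb = new double[t][n];  // best log-probability of ending in state j at time s
        int[][] back = new int[t][n];           // backpointer to the best previous state
        for (int j = 0; j < n; j++)
            logProb[0][j] = Math.log(pi[j]) + Math.log(b[j][obs[0]]);
        for (int s = 1; s < t; s++) {
            for (int j = 0; j < n; j++) {
                double best = Double.NEGATIVE_INFINITY;
                int arg = 0;
                for (int i = 0; i < n; i++) {
                    double p = logProb[s - 1][i] + Math.log(a[i][j]);
                    if (p > best) { best = p; arg = i; }
                }
                logProb[s][j] = best + Math.log(b[j][obs[s]]);
                back[s][j] = arg;
            }
        }
        // Recover the most likely state sequence by following the backpointers.
        int[] path = new int[t];
        double best = Double.NEGATIVE_INFINITY;
        for (int j = 0; j < n; j++)
            if (logProb[t - 1][j] > best) { best = logProb[t - 1][j]; path[t - 1] = j; }
        for (int s = t - 1; s > 0; s--)
            path[s - 1] = back[s][path[s]];
        return path;
    }

    public static void main(String[] args) {
        // Toy two-state model emitting two symbols (0 or 1).
        double[] pi = {0.6, 0.4};
        double[][] a = {{0.7, 0.3}, {0.4, 0.6}};
        double[][] b = {{0.8, 0.2}, {0.3, 0.7}};
        int[] path = viterbi(pi, a, b, new int[]{0, 1, 1});
        System.out.println(java.util.Arrays.toString(path));
    }
}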
Neural Networks
- Feed-forward backpropagation neural networks of
arbitrary topology
- Configurable error functions, including sum of squares and
weighted sum of squares
- Multiple activation functions, including logistic sigmoid,
linear, tanh, and softmax
- Choose your weight update rule: standard update rule,
standard update rule with momentum, Quickprop, or RPROP
- Online and batch training
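To illustrate the standard update rule with momentum in online mode, here is a
small sketch of a single logistic-sigmoid unit learning the AND function. It is
not the library's network classes; the learning rate and momentum values are
arbitrary choices for the example.

public class MomentumSketch {
    public static void main(String[] args) {
        double[][] x = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[] y = {0, 0, 0, 1};
        double[] w = {0.1, -0.1};
        double bias = 0.0;
        double[] dwPrev = {0, 0};        // previous weight changes, reused by the momentum term
        double dbPrev = 0;
        double rate = 0.5, momentum = 0.9;
        for (int epoch = 0; epoch < 2000; epoch++) {
            for (int p = 0; p < x.length; p++) {
                double net = bias + w[0] * x[p][0] + w[1] * x[p][1];
                double out = 1.0 / (1.0 + Math.exp(-net));       // logistic sigmoid
                double delta = (y[p] - out) * out * (1 - out);   // gradient of the sum-of-squares error
                for (int i = 0; i < w.length; i++) {
                    double dw = rate * delta * x[p][i] + momentum * dwPrev[i];
                    w[i] += dw;
                    dwPrev[i] = dw;
                }
                double db = rate * delta + momentum * dbPrev;
                bias += db;
                dbPrev = db;
            }
        }
        for (double[] in : x) {
            double out = 1.0 / (1.0 + Math.exp(-(bias + w[0] * in[0] + w[1] * in[1])));
            System.out.printf("%.0f AND %.0f -> %.3f%n", in[0], in[1], out);
        }
    }
}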
Support Vector Machines
- Fast training with the sequential minimal optimization
algorithm
- Support for linear, polynomial, tanh, and radial basis
function kernels
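The kernels listed above follow their standard definitions. The sketch below
shows those definitions only; it is not the library's kernel classes, and the
parameterizations (degree, gamma, and so on) are arbitrary for the example.

public class KernelSketch {
    static double dot(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
    }
    static double linear(double[] a, double[] b)            { return dot(a, b); }
    static double polynomial(double[] a, double[] b, int d) { return Math.pow(dot(a, b) + 1, d); }
    static double tanhKernel(double[] a, double[] b, double k, double c) {
        return Math.tanh(k * dot(a, b) + c);
    }
    static double rbf(double[] a, double[] b, double gamma) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.exp(-gamma * s);   // exp(-gamma * squared Euclidean distance)
    }
    public static void main(String[] args) {
        double[] u = {1, 2}, v = {2, 1};
        System.out.println(linear(u, v));          // 4.0
        System.out.println(polynomial(u, v, 2));   // 25.0
        System.out.println(tanhKernel(u, v, 0.5, 0));
        System.out.println(rbf(u, v, 0.5));
    }
}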
Decision Trees
- Information gain or Gini index split criteria
- Binary or all attribute value splitting
- Chi-square significance test pruning with configurable
confidence levels
- Boosted decision stumps with AdaBoost
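The two split criteria above are computed from class counts using the standard
formulas. Here is a small sketch of both (entropy-based information gain and
the Gini index); it is not the library's decision tree code.

public class SplitCriteriaSketch {
    static double entropy(int[] counts) {
        int total = 0;
        for (int c : counts) total += c;
        double h = 0;
        for (int c : counts) {
            if (c == 0) continue;
            double p = (double) c / total;
            h -= p * (Math.log(p) / Math.log(2));
        }
        return h;
    }
    static double gini(int[] counts) {
        int total = 0;
        for (int c : counts) total += c;
        double g = 1;
        for (int c : counts) {
            double p = (double) c / total;
            g -= p * p;
        }
        return g;
    }
    public static void main(String[] args) {
        int[] parent = {9, 5};                  // class counts before the split
        int[][] children = {{6, 2}, {3, 3}};    // class counts in each branch
        double weighted = 0;
        for (int[] child : children) {
            int n = child[0] + child[1];
            weighted += (double) n / (parent[0] + parent[1]) * entropy(child);
        }
        System.out.println("information gain = " + (entropy(parent) - weighted));
        System.out.println("gini(parent)     = " + gini(parent));
    }
}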
K Nearest Neighbors
- Fast kd-tree implementation for instance-based algorithms of
all kinds
- KNN Classifier with weighted or non-weighted classification,
customizable distance function
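The classification rule itself is easy to show without the kd-tree. Below is a
brute-force k-nearest-neighbor sketch with a Euclidean distance function and an
unweighted majority vote; it is only an illustration of the rule, not the
library's kd-tree-backed classifier.

import java.util.Arrays;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;

public class KnnSketch {
    static int classify(double[][] points, int[] labels, double[] query, int k) {
        Integer[] idx = new Integer[points.length];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        // Sort training points by distance to the query (a kd-tree avoids this full scan).
        Arrays.sort(idx, Comparator.comparingDouble(i -> distance(points[i], query)));
        // Unweighted majority vote over the k closest points.
        Map<Integer, Integer> votes = new HashMap<>();
        for (int i = 0; i < k; i++) votes.merge(labels[idx[i]], 1, Integer::sum);
        return votes.entrySet().stream().max(Map.Entry.comparingByValue()).get().getKey();
    }
    static double distance(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }
    public static void main(String[] args) {
        double[][] points = {{0, 0}, {0, 1}, {5, 5}, {6, 5}};
        int[] labels = {0, 0, 1, 1};
        System.out.println(classify(points, labels, new double[]{1, 1}, 3));  // prints 0
    }
}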
Linear Algebra Algorithms
- Basic matrix and vector math, plus
a variety of matrix decompositions based on
the standard algorithms
- Solve square systems, upper triangular systems,
lower triangular systems, least squares
- Singular Value Decomposition, QR Decomposition,
LU Decomposition, Schur Decomposition, Symmetric Eigenvalue Decomposition,
Cholesky Factorization
- Make your own matrix decomposition with the easy-to-use
Householder Reflection and Givens Rotation classes
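One of the solvers mentioned above, back substitution for an upper triangular
system Ux = b, is simple enough to sketch directly. This uses the standard
algorithm on raw arrays, not the library's matrix and vector classes.

public class BackSubstitutionSketch {
    static double[] solveUpper(double[][] u, double[] b) {
        int n = b.length;
        double[] x = new double[n];
        // Work from the last row upward, substituting already-solved components.
        for (int i = n - 1; i >= 0; i--) {
            double sum = b[i];
            for (int j = i + 1; j < n; j++) sum -= u[i][j] * x[j];
            x[i] = sum / u[i][i];
        }
        return x;
    }
    public static void main(String[] args) {
        double[][] u = {{2, 1, 1}, {0, 3, 2}, {0, 0, 4}};
        double[] b = {5, 8, 4};
        // Solution is (1, 2, 1).
        System.out.println(java.util.Arrays.toString(solveUpper(u, b)));
    }
}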
Optimization Algorithms
- Randomized hill climbing, simulated annealing,
genetic algorithms, and discrete dependency tree MIMIC
- Make your own crossover functions, mutation functions, neighbor
functions, and probability distributions, or use the provided ones.
- Optimize the weights of neural networks and solve
travelling salesman problems
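As an illustration of simulated annealing (one of the randomized optimizers
above), here is a sketch on a toy "count the ones" bit-string problem, with a
neighbor function that flips one random bit. The temperature schedule and
problem are arbitrary choices for the example; this is not the library's
optimization interfaces.

import java.util.Arrays;
import java.util.Random;

public class SimulatedAnnealingSketch {
    public static void main(String[] args) {
        Random rng = new Random(0);
        int n = 40;
        int[] current = new int[n];                  // start at all zeros
        double temperature = 10.0, cooling = 0.995;
        for (int iter = 0; iter < 20000; iter++) {
            int[] neighbor = current.clone();
            neighbor[rng.nextInt(n)] ^= 1;           // flip one random bit
            int delta = fitness(neighbor) - fitness(current);
            // Always accept improvements; accept worse moves with
            // probability exp(delta / T), which shrinks as T cools.
            if (delta >= 0 || rng.nextDouble() < Math.exp(delta / temperature)) {
                current = neighbor;
            }
            temperature *= cooling;
        }
        System.out.println("best fitness: " + fitness(current) + " / " + n);
    }
    static int fitness(int[] bits) {
        return Arrays.stream(bits).sum();
    }
}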
Graph Algorithms
Clustering Algorithms
- EM with Gaussian mixtures, K-means
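K-means alternates between assigning points to the nearest centroid and moving
each centroid to the mean of its points. The sketch below shows that loop on
one-dimensional data; it is illustrative only, not the library's clustering
classes.

import java.util.Arrays;

public class KMeansSketch {
    public static void main(String[] args) {
        double[] data = {1.0, 1.2, 0.8, 5.0, 5.3, 4.9};
        double[] centroids = {0.0, 6.0};             // initial guesses for k = 2
        int[] assignment = new int[data.length];
        for (int iter = 0; iter < 10; iter++) {
            // Assignment step: each point joins the closest centroid.
            for (int i = 0; i < data.length; i++) {
                assignment[i] = Math.abs(data[i] - centroids[0])
                        <= Math.abs(data[i] - centroids[1]) ? 0 : 1;
            }
            // Update step: each centroid moves to the mean of its cluster.
            for (int c = 0; c < centroids.length; c++) {
                double sum = 0;
                int count = 0;
                for (int i = 0; i < data.length; i++) {
                    if (assignment[i] == c) { sum += data[i]; count++; }
                }
                if (count > 0) centroids[c] = sum / count;
            }
        }
        System.out.println(Arrays.toString(centroids));   // roughly [1.0, 5.07]
        System.out.println(Arrays.toString(assignment));
    }
}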
Data Preprocessing
- PCA, ICA, LDA, Randomized Projections
- Convert from continuous to discrete, discrete to binary
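One common way to convert a continuous attribute to a discrete one is
equal-width binning, sketched below. This is a standard technique shown for
illustration; it is not necessarily the discretization the library's filters
use.

import java.util.Arrays;

public class DiscretizeSketch {
    static int[] equalWidthBins(double[] values, int bins) {
        double min = Arrays.stream(values).min().getAsDouble();
        double max = Arrays.stream(values).max().getAsDouble();
        double width = (max - min) / bins;
        int[] result = new int[values.length];
        for (int i = 0; i < values.length; i++) {
            int bin = (int) ((values[i] - min) / width);
            result[i] = Math.min(bin, bins - 1);   // clamp the maximum value into the last bin
        }
        return result;
    }
    public static void main(String[] args) {
        double[] values = {0.1, 0.4, 1.7, 2.9, 3.0};
        System.out.println(Arrays.toString(equalWidthBins(values, 3)));  // [0, 0, 1, 2, 2]
    }
}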
Reinforcement Learning
- Value and policy iteration for Markov decision processes
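Value iteration repeatedly backs up each state's value with the best one-step
lookahead until the values converge, then the greedy policy can be read off.
Here is a sketch on a tiny deterministic four-state chain MDP; the problem and
class names are made up for the example, and this is not the library's MDP
classes.

import java.util.Arrays;

public class ValueIterationSketch {
    public static void main(String[] args) {
        int states = 4, actions = 2;
        double gamma = 0.9;
        // next[s][a]: deterministic next state; reward[s]: reward for being in state s.
        int[][] next = {{0, 1}, {1, 2}, {2, 3}, {3, 3}};
        double[] reward = {0, 0, 0, 1};
        double[] value = new double[states];
        for (int sweep = 0; sweep < 100; sweep++) {
            double[] updated = new double[states];
            for (int s = 0; s < states; s++) {
                double best = Double.NEGATIVE_INFINITY;
                for (int a = 0; a < actions; a++) {
                    best = Math.max(best, reward[s] + gamma * value[next[s][a]]);
                }
                updated[s] = best;
            }
            value = updated;
        }
        System.out.println(Arrays.toString(value));
        // Greedy policy: in each state pick the action with the highest backup.
        int[] policy = new int[states];
        for (int s = 0; s < states; s++) {
            policy[s] = value[next[s][1]] >= value[next[s][0]] ? 1 : 0;
        }
        System.out.println(Arrays.toString(policy));
    }
}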