Most of the algorithms described below are implemented in utils.py. Each algorithm is accompanied by examples below its class definition, allowing for quick testing. Some datasets are provided; plot them to see what they look like.
The optimizer is Newton's method.
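A minimal sketch of what such a Newton update can look like, assuming the model is logistic regression (the function names here are illustrative, not necessarily the ones in utils.py):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def newton_logistic(X, y, n_iter=10):
    """Fit logistic regression with Newton's method.
    X: (n, d) design matrix, y: (n,) labels in {0, 1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ w)                 # predicted probabilities
        g = X.T @ (p - y)                  # gradient of the log-loss
        R = p * (1 - p)                    # diagonal Hessian weights
        H = (X * R[:, None]).T @ X         # Hessian: X^T R X
        H += 1e-8 * np.eye(d)              # small ridge in case the data is separable
        w -= np.linalg.solve(H, g)         # Newton step: w <- w - H^{-1} g
    return w
```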
Adds an attention mechanism to linear regression so it can fit non-linear data.
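One common reading of this is Nadaraya-Watson style kernel regression, where each query attends to every training point through softmax weights; a minimal sketch under that assumption:

```python
import numpy as np

def attention_regression(x_train, y_train, x_query, bandwidth=0.5):
    """Each query attends to all training points with softmax
    weights derived from a Gaussian kernel over distances."""
    scores = -(x_query[:, None] - x_train[None, :]) ** 2 / (2 * bandwidth ** 2)
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over training points
    return weights @ y_train                        # weighted average of the values

x = np.linspace(0, 5, 50)
y = np.sin(x) + 0.1 * np.random.randn(50)
print(attention_regression(x, y, np.array([1.0, 2.5])))
```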
The supervised-learning version of the Gaussian Mixture Model.
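This setup is usually known as Gaussian Discriminant Analysis: fit one Gaussian per class plus class priors, then classify with Bayes' rule. A minimal sketch with a shared covariance (an assumption; the class in utils.py may differ):

```python
import numpy as np

def gda_fit(X, y):
    """One Gaussian per class with a pooled (shared) covariance."""
    classes = np.unique(y)
    priors = {c: np.mean(y == c) for c in classes}
    means = {c: X[y == c].mean(axis=0) for c in classes}
    cov = sum(np.cov(X[y == c].T, bias=True) * np.sum(y == c)
              for c in classes) / len(y)
    return classes, priors, means, cov

def gda_predict(X, classes, priors, means, cov):
    # with a shared covariance the log-determinant term is constant
    # across classes, so only the Mahalanobis distance and prior matter
    inv = np.linalg.inv(cov)
    scores = np.stack([
        -0.5 * np.einsum('nd,dk,nk->n', X - means[c], inv, X - means[c])
        + np.log(priors[c]) for c in classes])
    return classes[np.argmax(scores, axis=0)]
```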
Used for language processing, e.g., spam classification or sentiment analysis.
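A minimal multinomial naive Bayes sketch on a toy bag-of-words spam example (the vocabulary and counts are made up for illustration):

```python
import numpy as np

def nb_fit(X, y, alpha=1.0):
    """Multinomial naive Bayes with Laplace smoothing.
    X: (n_docs, vocab) word counts, y: (n_docs,) labels in {0, 1}."""
    log_prior = np.log(np.bincount(y) / len(y))
    counts = np.stack([X[y == c].sum(axis=0) for c in (0, 1)]) + alpha
    log_lik = np.log(counts / counts.sum(axis=1, keepdims=True))
    return log_prior, log_lik

def nb_predict(X, log_prior, log_lik):
    return np.argmax(X @ log_lik.T + log_prior, axis=1)

# toy example: columns = ["free", "win", "meeting", "report"]
X = np.array([[3, 2, 0, 0], [2, 1, 0, 1], [0, 0, 2, 3], [0, 1, 3, 2]])
y = np.array([1, 1, 0, 0])                                   # 1 = spam, 0 = ham
print(nb_predict(np.array([[1, 2, 0, 0]]), *nb_fit(X, y)))   # -> [1]
```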
A single perceptron with the kernel trick.
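A minimal sketch of the dual (kernelized) perceptron, here with an RBF kernel (the kernel choice is an assumption):

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """RBF kernel matrix between row sets a and b."""
    d = ((a[:, None] - b[None, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def kernel_perceptron(X, y, epochs=20, gamma=1.0):
    """Dual perceptron: alpha[i] counts mistakes on sample i.
    y must be in {-1, +1}."""
    K = rbf(X, X, gamma)
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for i in range(len(X)):
            if np.sign((alpha * y) @ K[:, i]) != y[i]:
                alpha[i] += 1            # mistake -> weight this sample up
    return alpha

def predict(X_train, y_train, alpha, X_new, gamma=1.0):
    return np.sign((alpha * y_train) @ rbf(X_train, X_new, gamma))
```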
- hard margin: the hard-margin version only handles linearly separable data
- soft margin: the soft-margin version is implemented following Platt's paper, which introduces the SMO algorithm for handling the KKT conditions (a simplified sketch appears after this list)
- SVR: the SMO algorithm applied to regression tasks
- Decision Tree: uses the gain ratio to pick the best split feature (see the gain-ratio sketch after this list)
- Random Forest: uses bootstrap sampling to pick training samples and random feature subsets to grow the decision trees
- XGBoost: gradient boosting over decision trees (unlike Random Forest's bagging); regression only, with MSE loss
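Since the soft-margin item above cites Platt's SMO, here is a heavily simplified sketch of it (linear kernel, random choice of the second index; Platt's working-pair heuristics are omitted):

```python
import numpy as np

def smo_simplified(X, y, C=1.0, tol=1e-3, max_passes=10):
    """Very simplified SMO for a linear soft-margin SVM (y in {-1, +1})."""
    n = len(X)
    K = X @ X.T
    alpha, b = np.zeros(n), 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # update only pairs that violate the KKT conditions
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                if y[i] != y[j]:
                    L, H = max(0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] += y[i] * y[j] * (aj_old - alpha[j])
                # update the bias so a support vector sits on the margin
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b
```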
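And a minimal sketch of the gain-ratio criterion mentioned for the decision tree, as in C4.5 (categorical features assumed):

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def gain_ratio(feature, labels):
    """Information gain of splitting on `feature`, normalized by the
    split's own entropy (penalizes many-valued features, as in C4.5)."""
    values, counts = np.unique(feature, return_counts=True)
    weights = counts / counts.sum()
    cond = sum(w * entropy(labels[feature == v]) for v, w in zip(values, weights))
    gain = entropy(labels) - cond
    split_info = -(weights * np.log2(weights)).sum()
    return gain / split_info if split_info > 0 else 0.0

# pick the feature with the highest gain ratio
X = np.array([[0, 1], [0, 0], [1, 1], [1, 0]])
y = np.array([0, 0, 1, 1])
print(max(range(X.shape[1]), key=lambda j: gain_ratio(X[:, j], y)))  # -> 0
```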
Can also deal with semi-supervised learning problems
Well suited to compressing images.
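Assuming this refers to K-means color quantization (cluster the pixel colors, then replace each pixel with its centroid), a minimal sketch; `img` stands in for any H×W×3 image array:

```python
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # recompute centers, keeping old ones for empty clusters
        centers = np.stack([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, labels

def compress(img, k=16):
    """Quantize an (H, W, 3) image to k colors."""
    pixels = img.reshape(-1, 3).astype(float)
    centers, labels = kmeans(pixels, k)
    return centers[labels].reshape(img.shape).astype(img.dtype)
```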
Well suited to separating mixed audio signals (blind source separation).
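A quick demonstration of the idea using scikit-learn's FastICA; the synthetic signals here stand in for real audio:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
s1, s2 = np.sin(2 * t), np.sign(np.sin(3 * t))   # two source signals
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.5, 1.0]])           # mixing matrix
X = S @ A.T                                      # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                     # recovered sources (up to scale/order)
```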
Reduces redundant (correlated) features, i.e. dimensionality reduction.
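A minimal PCA-via-SVD sketch (the interface is illustrative):

```python
import numpy as np

def pca(X, n_components):
    """Project X onto its top principal components via SVD."""
    Xc = X - X.mean(axis=0)                  # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]           # directions of maximal variance
    explained = s[:n_components] ** 2 / (len(X) - 1)
    return Xc @ components.T, components, explained
```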
Has three hidden layers; supports only binary classification, with sigmoid as the only activation function.
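A minimal sketch matching that description: three sigmoid hidden layers, a sigmoid output, and binary cross-entropy loss (the layer sizes are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_net(sizes, seed=0):
    """sizes e.g. [2, 8, 8, 8, 1]: input, three hidden layers, output."""
    rng = np.random.default_rng(seed)
    return [(rng.normal(0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, X):
    acts = [X]
    for W, b in params:
        acts.append(sigmoid(acts[-1] @ W + b))
    return acts

def train_step(params, X, y, lr=0.5):
    """One gradient step on binary cross-entropy; y is (n, 1) in {0, 1}."""
    acts = forward(params, X)
    delta = (acts[-1] - y) / len(X)          # output error (sigmoid + BCE)
    for l in range(len(params) - 1, -1, -1):
        W, b = params[l]
        gW, gb = acts[l].T @ delta, delta.sum(axis=0)
        delta = (delta @ W.T) * acts[l] * (1 - acts[l])   # backprop through sigmoid
        params[l] = (W - lr * gW, b - lr * gb)
    return params
```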
The architecture is described in the comments of the class; intended for image classification problems.