 Leif Peterson - March 25, 2020
There is an ever-growing need for researchers to have access to the fundamental theory and development of machine learning methods.  The majority of time-proven methods are not new by any means, but what is new is the growing number of people wanting access to learning materials.  In this blog there are no run examples involving Python scripts, the R Project, NumPy, Scikit-learn, Anaconda, Keras, Jupyter, and the like, intended to show you how to get on the fast track to landing a job.

From an applied computer science perspective, I instead provide links to PDF files that introduce the supporting numerical methods for each technique, but without example runs.  The presentation materials can therefore be thought of as the "lecture" component of a combined lecture/computer-lab tutorial in an academic setting.  The various methods discussed have all been implemented in the NXG Logic Explorer package using a compiled programming language, without any third-party add-ins or plug-ins, or calls to e.g. NumPy or other math libraries.  Over time, if you peruse the web pages at www.nxglogic.com and look at the features of the machine learning and statistical methods implemented, you will begin to appreciate the "home-made blueberry muffin approach" (compiled-language approach) to software development, where the "eggs, flour, milk and butter" (constituent components) all originate from our "own farm" (compiled resources).  There is nothing "instant" that can be microwaved, such as calling third-party or open-source math libraries or rapidly slapping together interpreted-language code to perform the numerical methods.

Random Matrix Theory (RMT)
- Introduces RMT with a detailed history of Wishart ensembles, eigendecomposition, component subtraction, and the covariance matrix shrinkage techniques used in the NXG Logic Explorer package.
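
As a quick illustrative aside (not taken from the linked PDF, and not the Explorer implementation), the following self-contained C++ sketch shows one common flavor of covariance matrix shrinkage: the sample covariance is pulled toward a scaled identity target. The toy data and the fixed shrinkage weight gamma are assumptions chosen only for illustration; practical estimators choose the weight from the data.

```cpp
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // Toy data: n observations of p variables (invented for illustration).
    vector<vector<double>> X = {{2.0, 1.1}, {1.5, 0.7}, {3.2, 1.9}, {2.8, 1.4}, {2.1, 1.0}};
    int n = X.size(), p = X[0].size();

    // Column means.
    vector<double> mu(p, 0.0);
    for (auto& row : X) for (int j = 0; j < p; ++j) mu[j] += row[j] / n;

    // Sample covariance S (denominator n - 1).
    vector<vector<double>> S(p, vector<double>(p, 0.0));
    for (auto& row : X)
        for (int j = 0; j < p; ++j)
            for (int k = 0; k < p; ++k)
                S[j][k] += (row[j] - mu[j]) * (row[k] - mu[k]) / (n - 1);

    // Shrinkage target: scaled identity carrying the average variance on the diagonal.
    double avgVar = 0.0;
    for (int j = 0; j < p; ++j) avgVar += S[j][j] / p;

    // Shrunken estimate: Sigma* = (1 - gamma) * S + gamma * avgVar * I.
    double gamma = 0.3;  // fixed illustrative weight; data-driven choices exist
    for (int j = 0; j < p; ++j)
        for (int k = 0; k < p; ++k) {
            double target = (j == k) ? avgVar : 0.0;
            double shrunk = (1.0 - gamma) * S[j][k] + gamma * target;
            cout << "Sigma*[" << j << "][" << k << "] = " << shrunk << (k + 1 < p ? "  " : "\n");
        }
    return 0;
}
```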

Feed-Forward Back-Propagation Artificial Neural Networks (ANN)
- Introduces background history and theory on single-hidden-layer feed-forward, back-propagation artificial neural networks (ANN) used in the NXG Logic Explorer package.
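
For readers who want a concrete picture of the forward and backward passes, here is a minimal C++ sketch of a single-hidden-layer, feed-forward network trained by back-propagation on the XOR problem. It is illustrative only, not the Explorer implementation; the network size, learning rate, and epoch count are arbitrary choices for this example.

```cpp
#include <cmath>
#include <cstdlib>
#include <iostream>
using namespace std;

double sigmoid(double z) { return 1.0 / (1.0 + exp(-z)); }

int main() {
    // XOR problem: 2 binary inputs, 1 binary target.
    double X[4][2] = {{0,0},{0,1},{1,0},{1,1}};
    double T[4]    = {0, 1, 1, 0};

    const int H = 4;                 // hidden units
    double wih[2][H], bh[H];         // input-to-hidden weights and hidden biases
    double who[H], bo = 0.0;         // hidden-to-output weights and output bias
    srand(7);
    for (int h = 0; h < H; ++h) {
        bh[h] = 0.0;
        who[h] = (double)rand() / RAND_MAX - 0.5;
        for (int i = 0; i < 2; ++i) wih[i][h] = (double)rand() / RAND_MAX - 0.5;
    }

    double eta = 0.7;                // learning rate
    for (int epoch = 0; epoch < 20000; ++epoch) {
        for (int n = 0; n < 4; ++n) {
            // Forward pass through the single hidden layer.
            double hOut[H], yIn = bo;
            for (int h = 0; h < H; ++h) {
                double a = bh[h];
                for (int i = 0; i < 2; ++i) a += X[n][i] * wih[i][h];
                hOut[h] = sigmoid(a);
                yIn += hOut[h] * who[h];
            }
            double y = sigmoid(yIn);

            // Backward pass: propagate the output error back through the network
            // and adjust the weights by gradient descent on the squared error.
            double dOut = (T[n] - y) * y * (1.0 - y);
            for (int h = 0; h < H; ++h) {
                double dHid = dOut * who[h] * hOut[h] * (1.0 - hOut[h]);
                who[h] += eta * dOut * hOut[h];
                for (int i = 0; i < 2; ++i) wih[i][h] += eta * dHid * X[n][i];
                bh[h]  += eta * dHid;
            }
            bo += eta * dOut;
        }
    }

    // Trained network outputs for the four XOR patterns.
    for (int n = 0; n < 4; ++n) {
        double yIn = bo;
        for (int h = 0; h < H; ++h) {
            double a = bh[h];
            for (int i = 0; i < 2; ++i) a += X[n][i] * wih[i][h];
            yIn += sigmoid(a) * who[h];
        }
        cout << X[n][0] << " XOR " << X[n][1] << " -> " << sigmoid(yIn) << "\n";
    }
    return 0;
}
```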

Supervised and Unsupervised Random Forests
- Introduces supervised RF for classification and unsupervised RF for cluster analysis, both used in the NXG Logic Explorer package.
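
A short hedged sketch of the key idea behind unsupervised RF clustering (Breiman's synthetic-data construction) follows: the real, unlabeled data form class 1, and a synthetic class 2 is built by permuting each feature column independently, which keeps the marginal distributions but destroys the joint dependence structure. Only this data-construction step is shown; the forest training and the proximity matrix it yields are not. The toy data are invented for illustration and this is not the Explorer implementation.

```cpp
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>
using namespace std;

int main() {
    // Unlabeled data: n rows x p features (invented for illustration).
    vector<vector<double>> X = {{1.0, 10.0}, {1.2, 11.0}, {0.9, 9.5},
                                {5.0, 2.0},  {5.2, 2.3},  {4.8, 1.8}};
    int n = X.size(), p = X[0].size();

    mt19937 rng(42);

    // Synthetic class: permute each feature column independently, preserving
    // the marginals while breaking the dependence between features.
    vector<vector<double>> synth = X;
    for (int j = 0; j < p; ++j) {
        vector<double> col(n);
        for (int i = 0; i < n; ++i) col[i] = X[i][j];
        shuffle(col.begin(), col.end(), rng);
        for (int i = 0; i < n; ++i) synth[i][j] = col[i];
    }

    // Stack the two classes: real data = class 1, synthetic data = class 2.
    // A supervised RF trained on (rows, labels) then yields a proximity matrix
    // (how often two real rows share a terminal node), which is clustered.
    vector<vector<double>> rows;
    vector<int> labels;
    for (int i = 0; i < n; ++i) { rows.push_back(X[i]);     labels.push_back(1); }
    for (int i = 0; i < n; ++i) { rows.push_back(synth[i]); labels.push_back(2); }

    for (size_t i = 0; i < rows.size(); ++i)
        cout << "class " << labels[i] << ": (" << rows[i][0] << ", " << rows[i][1] << ")\n";
    return 0;
}
```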

Naïve Bayes Classifier (NBC) with k-Fold Cross Validation
- Introduces NBC with k-fold CV used in the NXG Logic Explorer package.
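
The following minimal C++ sketch, assuming a single Gaussian feature and two classes, illustrates the combination described above: naive Bayes classification wrapped in k-fold cross-validation. It is for illustration only, not the Explorer implementation; the data and the fold assignment scheme are invented for this example.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

const double PI = 3.14159265358979323846;

// Gaussian density used as the naive Bayes class-conditional model.
double gauss(double x, double mu, double var) {
    return exp(-(x - mu) * (x - mu) / (2.0 * var)) / sqrt(2.0 * PI * var);
}

int main() {
    // Toy data: one feature, two classes (0 and 1), ten observations.
    vector<double> x = {1.0, 1.2, 0.8, 1.1, 0.9, 3.0, 3.2, 2.8, 3.1, 2.9};
    vector<int>    y = {0,   0,   0,   0,   0,   1,   1,   1,   1,   1};
    int n = x.size(), K = 5, correct = 0;

    // k-fold cross-validation: each fold is held out once for testing.
    for (int k = 0; k < K; ++k) {
        // Class-wise sums over the training portion only.
        double sum[2] = {0,0}, sumsq[2] = {0,0};
        int cnt[2] = {0,0};
        for (int i = 0; i < n; ++i) {
            if (i % K == k) continue;                  // skip the held-out fold
            sum[y[i]] += x[i]; sumsq[y[i]] += x[i] * x[i]; ++cnt[y[i]];
        }
        double mu[2], var[2], prior[2];
        for (int c = 0; c < 2; ++c) {
            mu[c]    = sum[c] / cnt[c];
            var[c]   = sumsq[c] / cnt[c] - mu[c] * mu[c] + 1e-6;   // small variance floor
            prior[c] = (double)cnt[c] / (cnt[0] + cnt[1]);
        }
        // Classify the held-out fold by the larger posterior (prior x likelihood).
        for (int i = 0; i < n; ++i) {
            if (i % K != k) continue;
            double p0 = prior[0] * gauss(x[i], mu[0], var[0]);
            double p1 = prior[1] * gauss(x[i], mu[1], var[1]);
            if (((p1 > p0) ? 1 : 0) == y[i]) ++correct;
        }
    }
    cout << "5-fold CV accuracy: " << (double)correct / n << "\n";
    return 0;
}
```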

Learning Vector Quantization (LVQ1)
- Introduces the theoretical background on prototype ("punish-reward") learning methods (LVQ) from a numerical methods standpoint, as implemented in the NXG Logic Explorer package.
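
To make the "punish-reward" idea concrete, here is a minimal C++ sketch of the LVQ1 update rule with one prototype per class: the nearest prototype is attracted toward a training case when its class label matches and repelled when it does not. The data, initialization, and learning-rate schedule are assumptions for illustration, not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // Labeled training data: one feature, two classes.
    vector<double> x = {1.0, 1.3, 0.8, 3.1, 2.9, 3.3};
    vector<int>    y = {0,   0,   0,   1,   1,   1};

    // One prototype (codebook vector) per class, crudely initialized.
    double proto[2] = {0.0, 4.0};

    double alpha = 0.1;                       // learning rate
    for (int epoch = 0; epoch < 50; ++epoch) {
        for (size_t i = 0; i < x.size(); ++i) {
            // Find the prototype nearest to the training case.
            int w = (fabs(x[i] - proto[0]) <= fabs(x[i] - proto[1])) ? 0 : 1;
            // LVQ1 "punish-reward" rule: attract the winner if its class
            // matches the case's label, repel it if the class is wrong.
            if (w == y[i]) proto[w] += alpha * (x[i] - proto[w]);   // reward
            else           proto[w] -= alpha * (x[i] - proto[w]);   // punish
        }
        alpha *= 0.95;                        // decay the learning rate
    }
    cout << "class-0 prototype: " << proto[0] << "\n";
    cout << "class-1 prototype: " << proto[1] << "\n";
    return 0;
}
```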

Covariance Matrix Self-Adaptation Evolution Strategies and Other Metaheuristic Techniques for Neural Adaptive Learning
- Covers background theory on employing metaheuristics (covariance matrix self-adaptation, genetic algorithms, ant colony optimization, particle swarm optimization) to train neural networks. Particle swarm optimization is widely used in the Explorer package for function minimization/maximization.
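
Since particle swarm optimization is the workhorse mentioned above, here is a minimal C++ sketch of PSO minimizing a simple quadratic objective: each particle's velocity blends inertia with pulls toward its personal best and the swarm's global best. The objective function, swarm size, and coefficients are illustrative assumptions only, not the Explorer implementation.

```cpp
#include <iostream>
#include <random>
#include <vector>
using namespace std;

// Objective to minimize (illustrative only): a simple quadratic bowl with minimum at (3, -1).
double f(double x, double y) { return (x - 3.0) * (x - 3.0) + (y + 1.0) * (y + 1.0); }

int main() {
    const int N = 20, ITER = 200;
    const double w = 0.7, c1 = 1.5, c2 = 1.5;   // inertia and acceleration weights
    mt19937 rng(1);
    uniform_real_distribution<double> upos(-10.0, 10.0), u01(0.0, 1.0);

    vector<double> px(N), py(N), vx(N, 0.0), vy(N, 0.0), bx(N), by(N), bf(N);
    double gx = 0, gy = 0, gf = 1e300;          // global best position and value

    for (int i = 0; i < N; ++i) {               // random initial swarm
        px[i] = upos(rng); py[i] = upos(rng);
        bx[i] = px[i]; by[i] = py[i]; bf[i] = f(px[i], py[i]);
        if (bf[i] < gf) { gf = bf[i]; gx = bx[i]; gy = by[i]; }
    }
    for (int t = 0; t < ITER; ++t) {
        for (int i = 0; i < N; ++i) {
            // Velocity update: inertia + pull toward personal best + pull toward global best.
            double r1 = u01(rng), r2 = u01(rng);
            vx[i] = w * vx[i] + c1 * r1 * (bx[i] - px[i]) + c2 * r2 * (gx - px[i]);
            vy[i] = w * vy[i] + c1 * r1 * (by[i] - py[i]) + c2 * r2 * (gy - py[i]);
            px[i] += vx[i]; py[i] += vy[i];
            double fi = f(px[i], py[i]);
            if (fi < bf[i]) { bf[i] = fi; bx[i] = px[i]; by[i] = py[i]; }
            if (fi < gf)    { gf = fi; gx = px[i]; gy = py[i]; }
        }
    }
    cout << "best found: f(" << gx << ", " << gy << ") = " << gf << "\n";
    return 0;
}
```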

Non-Linear Manifold Learning (NLML)
- Introduces background theory on NLML from a numerical methods approach. Covers Kernel PCA (distance-based, Gaussian-based, Tanimoto distance-based), Diffusion Maps (DM), Laplacian Eigenmaps (LEM), Local Linear Embedding (LLE), Locality Preserving Projections (LPP), Sammon Mapping (Sammon).
All of the methods listed above are implemented in the NXG Logic Explorer package, along with t-SNE.
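
As a concrete taste of one NLML method, the sketch below implements Gaussian-kernel PCA for a toy data set: build the kernel matrix, double-center it, extract the leading eigenvector by power iteration, and project each observation onto the first kernel principal component. The bandwidth, the data, and the use of power iteration are illustrative assumptions, not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // Small 2-D data set; the goal is a 1-D nonlinear embedding.
    vector<vector<double>> X = {{0.0,0.1},{0.2,0.0},{0.1,0.3},{2.0,2.1},{2.2,1.9},{1.9,2.2}};
    int n = X.size();
    double sigma2 = 1.0;                        // Gaussian kernel bandwidth (illustrative)

    // Gaussian (RBF) kernel matrix.
    vector<vector<double>> K(n, vector<double>(n));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) {
            double d2 = 0.0;
            for (int m = 0; m < 2; ++m) d2 += (X[i][m]-X[j][m])*(X[i][m]-X[j][m]);
            K[i][j] = exp(-d2 / (2.0 * sigma2));
        }

    // Double-center the kernel matrix (subtract row means, column means, add grand mean).
    vector<double> rmean(n, 0.0); double tmean = 0.0;
    for (int i = 0; i < n; ++i) { for (int j = 0; j < n; ++j) rmean[i] += K[i][j] / n; tmean += rmean[i] / n; }
    vector<vector<double>> Kc(n, vector<double>(n));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            Kc[i][j] = K[i][j] - rmean[i] - rmean[j] + tmean;

    // Power iteration for the leading eigenvector of the centered kernel.
    // (Initialized away from the constant vector, which centering annihilates.)
    vector<double> v(n);
    for (int i = 0; i < n; ++i) v[i] = i + 1.0;
    double lambda = 0.0;
    for (int it = 0; it < 500; ++it) {
        vector<double> w(n, 0.0);
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j) w[i] += Kc[i][j] * v[j];
        double norm = 0.0;
        for (int i = 0; i < n; ++i) norm += w[i] * w[i];
        norm = sqrt(norm);
        for (int i = 0; i < n; ++i) v[i] = w[i] / norm;
        lambda = norm;                          // approaches the largest eigenvalue of Kc
    }

    // First kernel principal component score for each observation:
    // Kc * (v / sqrt(lambda)), the standard KPCA projection of training points.
    for (int i = 0; i < n; ++i) {
        double score = 0.0;
        for (int j = 0; j < n; ++j) score += Kc[i][j] * v[j] / sqrt(lambda);
        cout << "observation " << i << " -> KPC1 = " << score << "\n";
    }
    return 0;
}
```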

Principal Components Analysis (PCA)
- Introduces background theory on PCA from a numerical methods approach. PCA is implemented in the NXG Logic Explorer package.
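
For a two-variable case the PCA eigenproblem has a closed form, which the following self-contained C++ sketch uses to compute the eigenvalues of the 2 x 2 sample covariance matrix, the leading eigenvector (PC1), and each observation's PC1 score. The toy data are invented for illustration; this is not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // Two correlated variables; PCA rotates them onto orthogonal axes of maximal variance.
    vector<double> x = {2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1};
    vector<double> y = {2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9};
    int n = x.size();

    // Means and the 2 x 2 sample covariance matrix [[a, b], [b, c]].
    double mx = 0, my = 0;
    for (int i = 0; i < n; ++i) { mx += x[i] / n; my += y[i] / n; }
    double a = 0, b = 0, c = 0;
    for (int i = 0; i < n; ++i) {
        a += (x[i] - mx) * (x[i] - mx) / (n - 1);
        b += (x[i] - mx) * (y[i] - my) / (n - 1);
        c += (y[i] - my) * (y[i] - my) / (n - 1);
    }

    // Eigenvalues of a 2 x 2 symmetric matrix have a closed form.
    double tr = a + c, det = a * c - b * b;
    double disc = sqrt(tr * tr / 4.0 - det);
    double l1 = tr / 2.0 + disc, l2 = tr / 2.0 - disc;     // l1 >= l2

    // Eigenvector for the leading eigenvalue (direction of PC1).
    double v1 = b, v2 = l1 - a;
    double norm = sqrt(v1 * v1 + v2 * v2);
    if (norm < 1e-12) { v1 = 1.0; v2 = 0.0; norm = 1.0; }  // covariance already diagonal
    v1 /= norm; v2 /= norm;

    cout << "eigenvalues: " << l1 << ", " << l2 << "\n";
    cout << "proportion of variance on PC1: " << l1 / (l1 + l2) << "\n";
    cout << "PC1 direction: (" << v1 << ", " << v2 << ")\n";

    // Score of each observation on PC1 (projection of the centered data).
    for (int i = 0; i < n; ++i)
        cout << "PC1 score " << i << ": " << (x[i] - mx) * v1 + (y[i] - my) * v2 << "\n";
    return 0;
}
```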

Text Mining and Document Clustering (DOCC)
- Introduces background theory on DOCC, which is implemented in the NXG Logic Explorer package.
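
As a hedged illustration of the text-mining side, the sketch below builds TF-IDF weighted term vectors for a tiny invented corpus and prints pairwise cosine similarities; a clustering routine would then group the similar documents. The tokenization is deliberately naive (no stemming or stop-word removal), and nothing here reflects the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <vector>
using namespace std;

int main() {
    // Tiny corpus; real document clustering would also lower-case, stem, and drop stop words.
    vector<string> docs = {
        "gene expression cancer study",
        "cancer tumor gene mutation",
        "stock market price index",
        "market price trading index"
    };
    int n = docs.size();

    // Term frequencies per document and document frequency per term.
    vector<map<string,double>> tf(n);
    map<string,int> df;
    for (int d = 0; d < n; ++d) {
        istringstream iss(docs[d]);
        string w;
        while (iss >> w) tf[d][w] += 1.0;
        for (auto& kv : tf[d]) df[kv.first] += 1;
    }

    // TF-IDF weighting: down-weight terms that appear in many documents.
    vector<map<string,double>> tfidf(n);
    for (int d = 0; d < n; ++d)
        for (auto& kv : tf[d])
            tfidf[d][kv.first] = kv.second * log((double)n / df[kv.first]);

    // Cosine similarity between document vectors; hierarchical or k-means
    // clustering would then group the documents from these similarities.
    for (int i = 0; i < n; ++i)
        for (int j = i + 1; j < n; ++j) {
            double dot = 0, ni = 0, nj = 0;
            for (auto& kv : tfidf[i]) {
                ni += kv.second * kv.second;
                auto it = tfidf[j].find(kv.first);
                if (it != tfidf[j].end()) dot += kv.second * it->second;
            }
            for (auto& kv : tfidf[j]) nj += kv.second * kv.second;
            double sim = (ni > 0 && nj > 0) ? dot / (sqrt(ni) * sqrt(nj)) : 0.0;
            cout << "cos(doc" << i << ", doc" << j << ") = " << sim << "\n";
        }
    return 0;
}
```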

Gaussian Mixture Models (GMM)
- Introduces background theory on GMM, which is implemented in the NXG Logic Explorer package.
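
A minimal C++ sketch of the EM algorithm for a one-dimensional, two-component Gaussian mixture follows: the E-step computes responsibilities and the M-step re-estimates mixing weights, means, and variances. The initial values and data are assumptions for illustration only, not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

const double PI = 3.14159265358979323846;
double gauss(double x, double mu, double var) {
    return exp(-(x - mu) * (x - mu) / (2.0 * var)) / sqrt(2.0 * PI * var);
}

int main() {
    // One-dimensional data forming two loose groups (invented for illustration).
    vector<double> x = {1.0, 1.2, 0.8, 1.4, 0.9, 1.1, 4.8, 5.1, 5.3, 4.9, 5.0, 5.2};
    int n = x.size();

    // Initial guesses for the two-component mixture.
    double w[2]   = {0.5, 0.5};       // mixing proportions
    double mu[2]  = {0.0, 6.0};       // component means
    double var[2] = {1.0, 1.0};       // component variances

    vector<vector<double>> r(n, vector<double>(2));   // responsibilities
    for (int iter = 0; iter < 100; ++iter) {
        // E-step: posterior probability that each point belongs to each component.
        for (int i = 0; i < n; ++i) {
            double p0 = w[0] * gauss(x[i], mu[0], var[0]);
            double p1 = w[1] * gauss(x[i], mu[1], var[1]);
            r[i][0] = p0 / (p0 + p1);
            r[i][1] = 1.0 - r[i][0];
        }
        // M-step: re-estimate weights, means, and variances from the responsibilities.
        for (int k = 0; k < 2; ++k) {
            double nk = 0, sum = 0, sq = 0;
            for (int i = 0; i < n; ++i) { nk += r[i][k]; sum += r[i][k] * x[i]; }
            mu[k] = sum / nk;
            for (int i = 0; i < n; ++i) sq += r[i][k] * (x[i] - mu[k]) * (x[i] - mu[k]);
            var[k] = sq / nk + 1e-6;   // small floor to avoid variance collapse
            w[k]   = nk / n;
        }
    }
    for (int k = 0; k < 2; ++k)
        cout << "component " << k << ": weight " << w[k] << ", mean " << mu[k]
             << ", variance " << var[k] << "\n";
    return 0;
}
```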

Hierarchical Cluster Analysis (HCA)
- Introduces background theory on HCA, which is implemented in the NXG Logic Explorer package.
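
The sketch below illustrates agglomerative HCA with single linkage on five invented points: start with singleton clusters and repeatedly merge the closest pair, reporting the merge height each time. The linkage choice and the data are illustrative assumptions, not the Explorer implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // Five observations in two dimensions.
    vector<vector<double>> X = {{1.0,1.0},{1.2,1.1},{5.0,5.0},{5.1,4.9},{9.0,1.0}};
    int n = X.size();

    // Pairwise Euclidean distance matrix.
    vector<vector<double>> D(n, vector<double>(n, 0.0));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            D[i][j] = sqrt(pow(X[i][0]-X[j][0], 2) + pow(X[i][1]-X[j][1], 2));

    // Start with every observation in its own cluster.
    vector<vector<int>> clusters(n);
    for (int i = 0; i < n; ++i) clusters[i] = {i};

    // Agglomerate: repeatedly merge the two closest clusters. Single linkage
    // defines cluster distance as the smallest member-to-member distance.
    while (clusters.size() > 1) {
        int bi = 0, bj = 1;
        double best = 1e300;
        for (int a = 0; a < (int)clusters.size(); ++a)
            for (int b = a + 1; b < (int)clusters.size(); ++b) {
                double d = 1e300;
                for (int i : clusters[a])
                    for (int j : clusters[b])
                        d = min(d, D[i][j]);
                if (d < best) { best = d; bi = a; bj = b; }
            }
        cout << "merge cluster " << bi << " with cluster " << bj << " at height " << best << "\n";
        for (int j : clusters[bj]) clusters[bi].push_back(j);
        clusters.erase(clusters.begin() + bj);
    }
    return 0;
}
```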

Unsupervised Neural Gas (UNG)
- Introduces background theory on UNG, which is implemented in the NXG Logic Explorer package.
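
To illustrate the neural gas idea, the following C++ sketch adapts prototypes by a step that decays exponentially with each prototype's distance rank for the presented input, with the step size and neighborhood range annealed over time. All constants and data are invented for illustration; this is not the Explorer implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>
#include <random>
#include <vector>
using namespace std;

int main() {
    // Two-dimensional data with two obvious groups.
    vector<vector<double>> X = {{1,1},{1.2,0.9},{0.8,1.1},{5,5},{5.2,4.8},{4.9,5.1}};
    const int M = 2;                            // number of neural-gas prototypes
    vector<vector<double>> W = {{0,0},{6,6}};   // crude initial prototype positions

    double eps0 = 0.5, lambda0 = 1.0;           // initial step size and neighborhood range
    int T = 200;
    mt19937 rng(3);

    for (int t = 0; t < T; ++t) {
        // Anneal the step size and neighborhood range over time.
        double eps = eps0 * pow(0.01 / eps0, (double)t / T);
        double lambda = lambda0 * pow(0.1 / lambda0, (double)t / T);

        // Present one randomly chosen input.
        const vector<double>& x = X[rng() % X.size()];

        // Rank every prototype by its distance to the input.
        vector<pair<double,int>> ranked(M);
        for (int m = 0; m < M; ++m) {
            double d2 = pow(x[0]-W[m][0], 2) + pow(x[1]-W[m][1], 2);
            ranked[m] = {d2, m};
        }
        sort(ranked.begin(), ranked.end());

        // Neural-gas rule: every prototype moves toward the input, with a step
        // that decays exponentially in its distance rank (winner moves most).
        for (int r = 0; r < M; ++r) {
            int m = ranked[r].second;
            double h = exp(-(double)r / lambda);
            W[m][0] += eps * h * (x[0] - W[m][0]);
            W[m][1] += eps * h * (x[1] - W[m][1]);
        }
    }
    for (int m = 0; m < M; ++m)
        cout << "prototype " << m << ": (" << W[m][0] << ", " << W[m][1] << ")\n";
    return 0;
}
```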

Self-Organizing Maps (SOM)
- Introduces background theory on SOM, which is implemented in the NXG Logic Explorer package.
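
Here is a minimal C++ sketch of a one-dimensional SOM: for each input, find the best-matching unit and pull it and its grid neighbors toward the input with a Gaussian neighborhood that shrinks over time. The grid size, learning schedules, and data are illustrative assumptions only, not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <random>
#include <vector>
using namespace std;

int main() {
    // Input data in two dimensions.
    vector<vector<double>> X = {{1,1},{1.1,0.9},{3,3},{3.1,2.9},{5,1},{5.1,0.9}};
    const int M = 4;                             // nodes on a 1-D output grid
    vector<vector<double>> W(M, vector<double>(2));
    mt19937 rng(11);
    uniform_real_distribution<double> u(0.0, 5.0);
    for (auto& w : W) { w[0] = u(rng); w[1] = u(rng); }

    int T = 500;
    for (int t = 0; t < T; ++t) {
        // Decaying learning rate and neighborhood radius (in grid units).
        double alpha = 0.5 * (1.0 - (double)t / T);
        double radius = 2.0 * (1.0 - (double)t / T) + 0.5;

        const vector<double>& x = X[rng() % X.size()];

        // Best-matching unit: the node whose weight vector is closest to the input.
        int bmu = 0; double bestD = 1e300;
        for (int m = 0; m < M; ++m) {
            double d = pow(x[0]-W[m][0], 2) + pow(x[1]-W[m][1], 2);
            if (d < bestD) { bestD = d; bmu = m; }
        }

        // Move the BMU and its grid neighbors toward the input; the pull decays
        // with distance on the grid (Gaussian neighborhood function).
        for (int m = 0; m < M; ++m) {
            double gridDist = (m > bmu) ? (m - bmu) : (bmu - m);
            double h = exp(-(gridDist * gridDist) / (2.0 * radius * radius));
            W[m][0] += alpha * h * (x[0] - W[m][0]);
            W[m][1] += alpha * h * (x[1] - W[m][1]);
        }
    }
    for (int m = 0; m < M; ++m)
        cout << "node " << m << ": (" << W[m][0] << ", " << W[m][1] << ")\n";
    return 0;
}
```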

Fuzzy K-Means Cluster Analysis (FKM)
- Introduces background theory on FKM, which is implemented in the NXG Logic Explorer package.
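
A minimal sketch of fuzzy k-means (fuzzy c-means) in one dimension follows, assuming the usual fuzzifier m = 2: memberships in [0,1] are updated from each point's relative distances to the centers, and centers are updated as membership-weighted means. The data and initial centers are invented; this is not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // One-dimensional data and two fuzzy clusters.
    vector<double> x = {1.0, 1.2, 0.9, 1.1, 4.8, 5.0, 5.2, 4.9};
    int n = x.size(), K = 2;
    double m = 2.0;                        // fuzzifier (m > 1); m = 2 is the common choice

    double c[2] = {0.0, 6.0};              // initial cluster centers
    vector<vector<double>> u(n, vector<double>(K));

    for (int iter = 0; iter < 100; ++iter) {
        // Membership update: each point belongs to every cluster with a degree
        // in [0,1] that depends on its relative distances to the centers.
        for (int i = 0; i < n; ++i) {
            for (int k = 0; k < K; ++k) {
                double dik = fabs(x[i] - c[k]) + 1e-12;
                double denom = 0.0;
                for (int j = 0; j < K; ++j) {
                    double dij = fabs(x[i] - c[j]) + 1e-12;
                    denom += pow(dik / dij, 2.0 / (m - 1.0));
                }
                u[i][k] = 1.0 / denom;
            }
        }
        // Center update: membership-weighted means (weights raised to the fuzzifier).
        for (int k = 0; k < K; ++k) {
            double num = 0.0, den = 0.0;
            for (int i = 0; i < n; ++i) {
                double w = pow(u[i][k], m);
                num += w * x[i]; den += w;
            }
            c[k] = num / den;
        }
    }
    cout << "centers: " << c[0] << ", " << c[1] << "\n";
    for (int i = 0; i < n; ++i)
        cout << "x = " << x[i] << "  memberships: " << u[i][0] << ", " << u[i][1] << "\n";
    return 0;
}
```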

Crisp K-Means Cluster Analysis (CKM)
- Introduces background theory on numerical methods for CKM, which is implemented in the NXG Logic Explorer package.
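
Finally, a minimal C++ sketch of crisp k-means (Lloyd's algorithm): alternate between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. The data, the choice of K, and the crude initialization are assumptions for illustration only, not the Explorer implementation.

```cpp
#include <cmath>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    // Two-dimensional data and K = 2 clusters.
    vector<vector<double>> X = {{1,1},{1.2,0.8},{0.9,1.1},{5,5},{5.1,4.8},{4.9,5.2}};
    int n = X.size(), K = 2;

    // Crude initial centroids (real implementations choose these more carefully).
    vector<vector<double>> C = {{0.0, 0.0}, {6.0, 6.0}};
    vector<int> label(n, 0);

    for (int iter = 0; iter < 20; ++iter) {
        // Assignment step: each point joins its nearest centroid.
        for (int i = 0; i < n; ++i) {
            double best = 1e300;
            for (int k = 0; k < K; ++k) {
                double d = pow(X[i][0]-C[k][0], 2) + pow(X[i][1]-C[k][1], 2);
                if (d < best) { best = d; label[i] = k; }
            }
        }
        // Update step: each centroid moves to the mean of its assigned points.
        for (int k = 0; k < K; ++k) {
            double sx = 0, sy = 0; int cnt = 0;
            for (int i = 0; i < n; ++i)
                if (label[i] == k) { sx += X[i][0]; sy += X[i][1]; ++cnt; }
            if (cnt > 0) { C[k][0] = sx / cnt; C[k][1] = sy / cnt; }
        }
    }
    for (int i = 0; i < n; ++i)
        cout << "(" << X[i][0] << ", " << X[i][1] << ") -> cluster " << label[i] << "\n";
    return 0;
}
```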