Toronto Metropolitan University

Artificial Neural Network Hyperparameter Effectiveness Determination And Optimization Algorithm

thesis
posted on 2021-05-21, 09:40, authored by James VanderVeen
Machine learning models can contain many layers and branches. Each branch and layer contains individual variables, known as hyperparameters, that require manual tuning. Automated approaches to this tuning exist; for instance, the genetic algorithm designed by Unit Amin [2] mimics the reproductive process of living organisms. However, both the genetic algorithm and the Artificial Neural Network (ANN) training process contain inherent randomness that reduces the replicability of results. Combined with the sheer magnitude of hyperparameter permutations, confidence that the model has arrived at the best solution may be low. The algorithm developed for this thesis isolates portions of a complex ANN model and generates results showing the effect each hyperparameter has on the performance of the model as a whole. The results of this thesis show that the algorithm effectively generates data correlating model performance to hyperparameter selection. This is evident in sections 3.1 and 3.2, where using the sigmoid activation function in the CNN layers produces superior RMSE scores regardless of the number of filters or the hyperparameters used in the subsequent LSTM layers. Section 3.2 also reveals that model performance does not improve as additional CNN and LSTM layers are added. Finally, the results in section 3.3 show that the RMSprop optimizer generates superior results regardless of the hyperparameters used in the rest of the model.
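The sketch below (not the thesis code) illustrates the one-hyperparameter-at-a-time idea described in the abstract: a small CNN-LSTM model is held fixed while a single hyperparameter, here the CNN activation function, is varied, and the RMSE of each variant is recorded. The layer sizes, synthetic data, and candidate values are illustrative assumptions, not taken from the thesis.

    # Minimal sketch of isolating one hyperparameter of a CNN-LSTM model and
    # measuring its effect on RMSE. All shapes, sizes, and candidate values are
    # assumptions for illustration only.
    import numpy as np
    import tensorflow as tf

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30, 1)).astype("float32")   # 200 sequences, 30 timesteps
    y = X.sum(axis=(1, 2)) + rng.normal(scale=0.1, size=200).astype("float32")

    def build_model(cnn_activation="sigmoid", filters=32, lstm_units=16, optimizer="rmsprop"):
        # CNN feature extraction followed by an LSTM layer and a scalar output.
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(30, 1)),
            tf.keras.layers.Conv1D(filters, kernel_size=3, activation=cnn_activation),
            tf.keras.layers.LSTM(lstm_units),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer=optimizer, loss="mse")
        return model

    # Vary one hyperparameter while the rest of the model stays fixed, then compare RMSE.
    for activation in ["sigmoid", "relu", "tanh"]:
        model = build_model(cnn_activation=activation)
        model.fit(X, y, epochs=5, verbose=0)
        rmse = float(np.sqrt(np.mean((model.predict(X, verbose=0).ravel() - y) ** 2)))
        print(f"activation={activation:8s}  RMSE={rmse:.4f}")

The same loop could be repeated for the number of filters, the LSTM units, or the optimizer to build up the kind of performance-versus-hyperparameter data the thesis describes.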

History

Editor

Ryerson University

Language

eng

Degree

  • Bachelor of Engineering

Program

  • Aerospace Engineering

Granting Institution

Ryerson University

LAC Thesis Type

  • Thesis Project
