IEEE/ICACT20230209 Slide 12 — Oral Presentation
The intent classification model developed in this work was trained with the Adam (Adaptive Moment Estimation) optimizer [8], minimizing a categorical cross-entropy cost function. Adam's hyperparameters were tuned with Bayesian sequential model-based optimization (SMBO), a hyperparameter-optimization method that minimizes a given objective function by building a surrogate model from the objective function's previous evaluations. The hyperparameters selected for the optimizer were: learning rate 0.0001, beta-1 0.93241, and decay 0.0000024.
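As a minimal sketch of how the reported hyperparameters enter the Adam update rule, the following single-parameter example applies the standard Adam step with the slide's learning rate, beta-1, and decay. Note that beta-2 and epsilon are not stated in the source, so the defaults from the Adam paper (0.999 and 1e-8) are assumed, and the decay term is interpreted as Keras-style inverse-time learning-rate decay, which is also an assumption.

```python
import math

# Hyperparameters reported on the slide.
LR, BETA_1, DECAY = 0.0001, 0.93241, 0.0000024
# NOT given in the source -- assumed Adam-paper defaults.
BETA_2, EPS = 0.999, 1e-8

def adam_step(theta, grad, m, v, t):
    """One Adam update at step t (1-indexed), with inverse-time LR decay."""
    lr_t = LR / (1.0 + DECAY * (t - 1))            # decayed learning rate
    m = BETA_1 * m + (1.0 - BETA_1) * grad         # biased 1st-moment estimate
    v = BETA_2 * v + (1.0 - BETA_2) * grad ** 2    # biased 2nd-moment estimate
    m_hat = m / (1.0 - BETA_1 ** t)                # bias-corrected 1st moment
    v_hat = v / (1.0 - BETA_2 ** t)                # bias-corrected 2nd moment
    theta = theta - lr_t * m_hat / (math.sqrt(v_hat) + EPS)
    return theta, m, v

# Toy usage: minimise f(theta) = theta**2 (gradient 2*theta) for 100 steps.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 101):
    theta, m, v = adam_step(theta, 2.0 * theta, m, v, t)
```

Because Adam normalizes each step by the second-moment estimate, early updates move by roughly the learning rate regardless of the gradient's magnitude, which is why a small rate such as 0.0001 yields stable fine-grained convergence.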
