References
[1] M. Sheykhmousa, M. Mahdianpari, H. Ghanbari, F. Mohammadimanesh, P. Ghamisi, and S. Homayouni, “Support vector machine versus random forest for remote sensing image classification: A meta-analysis and systematic review,” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 13, pp. 6308–6325, 2020, doi: 10.1109/jstars.2020.3026724.
[2] Y. Li, H. Zhang, X. Xue, Y. Jiang, and Q. Shen, “Deep learning for remote sensing image classification: A survey,” Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, vol. 8, no. 6, p. e1264, Nov. 2018, doi: 10.1002/widm.1264.
[3] C. Sitaula and M. B. Hossain, “Attention-based VGG-16 model for COVID-19 chest X-ray image classification,” Applied Intelligence, vol. 51, no. 5, pp. 2850–2863, 2021, doi: 10.1007/s10489-020-02055-x.
[4] J. Liu and X. Wang, “Plant diseases and pests detection based on deep learning: a review,” Plant Methods, vol. 17, no. 1, p. 22, Feb. 2021, doi: 10.1186/s13007-021-00722-9.
[5] J. Manjunath, Mohana, M. M S, D. G D, M. R K, and A. S, “Feature extraction using convolution neural networks (CNN) and deep learning,” in 2018 3rd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), Bangalore, India, May 2018, doi: 10.1109/rteict42901.2018.9012507.
[6] A. Akbari, M. Awais, M. Bashar, and J. Kittler, “How Does Loss Function Affect Generalization Performance of Deep Learning? Application to Human Age Estimation,” in International Conference on Machine Learning, Jul. 2021, pp. 141–151. Accessed: Mar. 14, 2022. [Online]. Available: https://proceedings.mlr.press/v139/akbari21a.html
[7] “Impact of dataset size and variety on the effectiveness of deep learning and transfer learning for plant disease classification,” Computers and Electronics in Agriculture, vol. 153, pp. 46–53, Oct. 2018, doi: 10.1016/j.compag.2018.08.013.
[8] S. Vani and T. V. M. Rao, “An experimental approach towards the performance assessment of various optimizers on convolutional neural network,” in 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, Apr. 2019, doi: 10.1109/icoei.2019.8862686.
[9] R. Akut and S. Kulkarni, “NeuroEvolution: Using Genetic Algorithm for optimal design of Deep Learning models,” in 2019 IEEE International Conference on Electrical, Computer and Communication Technologies (ICECCT), Coimbatore, India, Feb. 2019, doi: 10.1109/icecct.2019.8869233.
[10] H. A. N. Chong, W. Jun-li, W. U. Yu-xi, and Z. Chao-bo, “A Review of Deep Learning Models Based on Neuroevolution,” Acta Electronica Sinica, vol. 49, no. 2, p. 372, Feb. 2021, doi: 10.12263/DZXB.20200139.
[11] P. Verbancsics and J. Harguess, “Image classification using generative neuro evolution for deep learning,” in 2015 IEEE Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, Jan. 2015, doi: 10.1109/wacv.2015.71.
[12] A. Lambora, K. Gupta, and K. Chopra, “Genetic algorithm: A literature review,” in 2019 International Conference on Machine Learning, Big Data, Cloud and Parallel Computing (COMITCon), Faridabad, India, Feb. 2019, doi: 10.1109/comitcon.2019.8862255.
[13] K. M. Hamdia, X. Zhuang, and T. Rabczuk, “An efficient optimization approach for designing machine learning models based on genetic algorithm,” Neural Computing and Applications, vol. 33, no. 6, pp. 1923–1933, Mar. 2021, doi: 10.1007/s00521-020-05035-x.
[14] S. Li, H. Wu, D. Wan, and J. Zhu, “An effective feature selection method for hyperspectral image classification based on genetic algorithm and support vector machine,” Knowledge-Based Systems, vol. 24, no. 1, pp. 40–48, Feb. 2011, doi: 10.1016/j.knosys.2010.07.003.
[15] Y. Sun, B. Xue, M. Zhang, G. G. Yen, and J. Lv, “Automatically Designing CNN Architectures Using the Genetic Algorithm for Image Classification,” IEEE Transactions on Cybernetics, vol. 50, no. 9, pp. 3840–3854, Sep. 2020, doi: 10.1109/TCYB.2020.2983860.
[16] D. Newton, R. Pasupathy, and F. Yousefian, “Recent trends in stochastic gradient descent for machine learning and big data,” in 2018 Winter Simulation Conference (WSC), Gothenburg, Sweden, Dec. 2018, doi: 10.1109/wsc.2018.8632351.
[17] R. Zaheer and H. Shaziya, “A study of the optimization algorithms in deep learning,” in 2019 Third International Conference on Inventive Systems and Control (ICISC), Coimbatore, India, Jan. 2019, doi: 10.1109/icisc44355.2019.9036442.
[18] A. Kumar, S. Sarkar, and C. Pradhan, “Malaria disease detection using CNN technique with SGD, RMSprop and ADAM optimizers,” in Studies in Big Data, Cham: Springer International Publishing, 2020, pp. 211–230, doi: 10.1007/978-3-030-33966-1_11.
[19] S. Colianni, “MNIST as .jpg.” May 15, 2017. Accessed: Oct. 28, 2022. [Online]. Available: https://www.kaggle.com/scolianni/mnistasjpg
[20] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, “Dropout: A Simple Way to Prevent Neural Networks from Overfitting,” Journal of Machine Learning Research, vol. 15, no. 56, pp. 1929–1958, 2014. Accessed: Oct. 28, 2022. [Online]. Available: http://jmlr.org/papers/v15/srivastava14a.html
[21] S. Ioffe and C. Szegedy, “Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift,” arXiv preprint arXiv:1502.03167, Feb. 2015, doi: 10.48550/arXiv.1502.03167.