International Journal on Advanced Science, Engineering and Information Technology, Vol. 12 (2022) No. 3, pages: 1244-1252, DOI:10.18517/ijaseit.12.3.15348

Convolutional Neural Networks for Herb Identification: Plain Background and Natural Environment

Supawadee Chaivivatrakul, Jednipat Moonrinta, Suchada Chaiwiwatrakul


Convolutional neural networks have achieved success in resolving object identification problems. This study contributes a new approach to herb identification, suitable for educational and research purposes, based on a small dataset and small-sized images. Two self-collected Thai herb datasets with either plain or natural environment backgrounds were used for experimentation. The plain background dataset includes 4,400 images of 11 leaf types, and the natural dataset contains 1,620 images of nine leaf types. The images were divided into a training set containing 75% of the images and a separate test set with the remaining 25%. The experiments applied five-fold cross-validation to the training set and compared the InceptionV3, MobileNetV2, ResNet50V2, VGG16, and Xception convolutional neural network models with the RMSprop and Adam optimizers. Further, dropout rates of 0.3, 0.5, and 0.7 were considered along with five and ten epochs. Transfer learning was applied using pre-trained weights. The model with the best outcome, based on the average cross-validation accuracy on both datasets (94.55% on the plain background dataset and 90.37% on the natural dataset), was VGG16 with the RMSprop optimizer, a dropout rate of 0.5, and ten epochs of training. This model achieved 96.64% and 92.00% accuracy on the plain background training and test sets, and 99.59% and 91.36% on the natural environment training and test sets, respectively. The results show that the method has high potential for practical application in identifying herbs from visual leaf information.
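The evaluation protocol described above (a 75/25 train/test split followed by five-fold cross-validation on the training portion) can be sketched in plain Python. The function names and the index-based folding scheme below are illustrative assumptions, not taken from the paper:

```python
import random


def split_train_test(items, test_fraction=0.25, seed=0):
    """Shuffle and split a dataset into a 75% training set and a 25% held-out test set."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]  # (train, test)


def five_fold_indices(n, k=5):
    """Yield (train_indices, validation_indices) pairs for k-fold cross-validation.

    Each of the k folds serves once as the validation set while the
    remaining folds form the training set, as in the paper's protocol.
    """
    indices = list(range(n))
    fold_size = n // k
    for f in range(k):
        start = f * fold_size
        end = start + fold_size if f < k - 1 else n  # last fold takes the remainder
        val = indices[start:end]
        val_set = set(val)
        train = [i for i in indices if i not in val_set]
        yield train, val
```

For the 4,400-image plain background dataset, `split_train_test` would hold out 1,100 images for testing, and `five_fold_indices` would then rotate validation over the remaining 3,300.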


Deep learning; leaf identification; transfer learning.
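The best-performing configuration reported in the abstract (a pre-trained VGG16 backbone, RMSprop optimizer, and 0.5 dropout) can be approximated with a minimal transfer-learning sketch, assuming TensorFlow/Keras. The classification head below (global average pooling, dropout, softmax) is an assumed design, since the abstract does not specify the head architecture; `weights="imagenet"` loads the pre-trained weights used for transfer learning:

```python
import tensorflow as tf
from tensorflow.keras import layers, models


def build_herb_classifier(num_classes, input_shape=(224, 224, 3),
                          dropout_rate=0.5, weights="imagenet"):
    """VGG16 backbone with a small classification head (transfer learning sketch)."""
    base = tf.keras.applications.VGG16(
        weights=weights, include_top=False, input_shape=input_shape)
    base.trainable = False  # freeze the pre-trained convolutional layers

    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(dropout_rate),  # best-performing rate in the paper: 0.5
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.RMSprop(),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With `num_classes=11` for the plain background dataset (or 9 for the natural environment dataset), the model could then be trained with `model.fit(...)` for the ten epochs reported as best.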

