Abstract:
Although Deep Learning (DL) is revolutionising practices across
many fields, it demands large amounts of data and computing
resources as well as considerable training time, and is thus expensive. This
study proposes a transfer learning approach that adopts a simplified
version of a standard Convolutional Neural Network (CNN)
that has proven successful in another domain. We explored three transfer
learning schemes on our modified CNN: (1) freezing all layers except the first and the
last; (2) freezing only the first layer and updating the weights of the
remaining layers; and (3) fine-tuning the entire network.
Furthermore, we trained a DL model
from scratch to act as a baseline. We performed the experiments
on the Edge Impulse platform. We evaluated the models based on
the PlantVillage, tea disease and land-use datasets. Fine-tuning
the entire network produced the best precision, accuracy,
recall (sensitivity) and F-measure across the datasets. All three transfer
learning schemes reduced the training time by more
than half. Further, we deployed the fine-tuned model to detect
tea diseases two months after the idea's conception, and its
predictions correlated well with the experts' decisions. The evaluation
results showed that transfer learning across domains is a viable
way to accelerate solution deployment. Additionally,
Edge Impulse is well suited to resource-constrained environments,
especially in developing countries that lack the computing resources and
expertise to train DL models from scratch. This insight can propel
the development and rollout of applications addressing
Sustainable Development Goals such as zero hunger and no
poverty, among others.