Manara - Qatar Research Repository
10.1371_journal.pone.0239746.pdf (1.88 MB)

On the performance of fusion based planet-scope and Sentinel-2 data for crop classification using inception inspired deep convolutional neural network

journal contribution
submitted on 2024-06-10, 07:55 and posted on 2024-06-11, 09:52 authored by Nasru Minallah, Mohsin Tariq, Najam Aziz, Waleed Khan, Atiq ur Rehman, Samir Brahim Belhaouari

This research work aims to develop a deep learning-based crop classification framework for remotely sensed time-series data. Tobacco is a major revenue-generating crop of the Khyber Pakhtunkhwa (KP) province of Pakistan, which accounts for over 90% of the country's tobacco production. To analyze the performance of the developed classification framework, a pilot sub-region named Yar Hussain is selected for experimentation. Yar Hussain is a tehsil of district Swabi, within KP province, and makes the highest contribution to the gross production of the KP tobacco crop. KP generally consists of diverse cropland with different varieties of vegetation of similar phenology, which makes crop classification a challenging task.

In this study, a temporal convolutional neural network (TempCNN) model is implemented for crop classification using remotely sensed imagery of the selected pilot region, with specific focus on the tobacco crop. To improve the performance of the proposed classification framework, instead of using the prevailing approach of relying on a single satellite source, Sentinel-2 and PlanetScope images are stacked together to provide more diverse features to the classifier. Furthermore, instead of using single-date satellite imagery, multiple images spanning the phenological cycle of the tobacco crop are temporally stacked, yielding a higher temporal resolution for the employed imagery. The developed framework is trained using ground-truth data. The final output is obtained from the softmax function of the developed model, in the form of probabilistic values for the selected classes. The proposed deep learning-based crop classification framework, utilizing multi-satellite temporally stacked imagery, achieved an overall classification accuracy of 98.15%. Furthermore, as the classification framework was developed with specific focus on the tobacco crop, it achieved a best tobacco-crop classification accuracy of 99%.
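The spectral and temporal stacking described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' code: the band counts, number of dates, tile size, and class labels are assumptions for demonstration, and random arrays stand in for real imagery.

```python
import numpy as np

# Assumed dimensions (illustrative, not from the paper):
# 4 spectral bands per sensor, 6 acquisition dates, a 100x100-pixel tile.
BANDS, DATES, H, W = 4, 6, 100, 100

rng = np.random.default_rng(0)
sentinel2 = rng.random((DATES, BANDS, H, W))    # Sentinel-2 time series
planetscope = rng.random((DATES, BANDS, H, W))  # PlanetScope time series

# Spectral stacking: concatenate both sensors' bands at each date,
# giving every pixel a richer feature vector per time step.
stacked = np.concatenate([sentinel2, planetscope], axis=1)  # (DATES, 2*BANDS, H, W)

# Temporal stacking: reshape to per-pixel time series of shape
# (pixels, dates, features), the input layout a temporal CNN consumes.
series = stacked.transpose(2, 3, 0, 1).reshape(H * W, DATES, 2 * BANDS)

# The model's final layer applies a softmax over class scores, turning
# them into per-class probabilities (shown here on dummy logits).
def softmax(logits, axis=-1):
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

logits = rng.random((H * W, 3))  # e.g. 3 classes: tobacco, other vegetation, bare soil
probs = softmax(logits)
print(series.shape, probs.shape)  # (10000, 6, 8) (10000, 3)
```

The stacking itself is just array concatenation; the classification quality reported in the paper comes from the TempCNN operating on these enriched per-pixel time series.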

Other Information

Published in: PLOS ONE

Open Access funding provided by the Qatar National Library.

This work is funded in part by the National Center of Big Data and Cloud Computing (NCBC), University of Engineering and Technology, Peshawar, under the auspices of the Higher Education Commission, Pakistan, and the Pakistan Tobacco Board (PTB).



Language

  • English


Publisher

  • Public Library of Science (PLoS)

Publication Year

  • 2020

License statement

This item is licensed under the Creative Commons Attribution 4.0 International License.

Institution affiliated with

  • Hamad Bin Khalifa University
  • College of Science and Engineering - HBKU

Geographic coverage