Manara - Qatar Research Repository

Self-Distillation for Randomized Neural Networks

Journal contribution, submitted and posted on 2024-02-19, authored by Minghui Hu, Ruobin Gao, and Ponnuthurai Nagaratnam Suganthan.

Knowledge distillation (KD) is a well-established technique in deep learning that transfers dark knowledge from a teacher model to a student model, thereby improving the student's performance. In randomized neural networks, however, KD fails to improve performance because the network topology is simple and model performance depends only weakly on model size. In this work, we propose a self-distillation pipeline for randomized neural networks: the network's own predictions are treated as an additional target and mixed with the weighted original target to form a distillation target containing dark knowledge, which supervises the training of the model. All predictions produced during the multi-generation self-distillation process can be integrated through a multi-teacher method. By induction, we further derive a method for infinite self-distillation (ISD) of randomized neural networks. We then provide a theoretical analysis of the self-distillation method for randomized neural networks and demonstrate its effectiveness in practical applications on several benchmark datasets.
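The pipeline described in the abstract can be sketched in a few lines. Below is a minimal, illustrative implementation assuming an ELM/RVFL-style network with a fixed random hidden layer and a closed-form ridge readout; the mixing weight `alpha`, the number of generations, and all function names are assumptions made for illustration, not the authors' exact formulation.

```python
# Sketch of multi-generation self-distillation for a randomized neural
# network. Assumptions: ReLU random features, ridge-regression readout,
# one-hot targets Y; `alpha` and `generations` are hypothetical parameters.
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, W, b):
    """Fixed random hidden layer: ReLU(XW + b); weights are never trained."""
    return np.maximum(X @ W + b, 0.0)

def fit_readout(H, T, lam=1e-2):
    """Closed-form ridge solution for the output weights beta."""
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ T)

def self_distill(X, Y, n_hidden=256, alpha=0.5, generations=5):
    """Each generation mixes the current predictions (dark knowledge)
    with the weighted original targets to form the next training target."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = random_features(X, W, b)
    T = Y.astype(float)                # generation 0: original targets only
    betas = []
    for _ in range(generations):
        beta = fit_readout(H, T)
        betas.append(beta)
        T = alpha * Y + (1.0 - alpha) * (H @ beta)   # distillation target
    return W, b, betas

def predict_ensemble(X, W, b, betas):
    """Multi-teacher integration: average the predictions of all generations."""
    H = random_features(X, W, b)
    return np.mean([H @ beta for beta in betas], axis=0)
```

Each generation's readout acts as the teacher for the next; averaging the predictions of all generations is one simple way to realize the multi-teacher integration mentioned in the abstract.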

Other Information

Published in: IEEE Transactions on Neural Networks and Learning Systems
License: https://creativecommons.org/licenses/by/4.0/
See article on publisher's website: https://dx.doi.org/10.1109/tnnls.2023.3292063

Funding

Open Access funding provided by the Qatar National Library.

Language

  • English

Publisher

  • IEEE

Publication Year

  • 2023

License statement

This item is licensed under the Creative Commons Attribution 4.0 International License.

Affiliated institutions

  • Qatar University
  • College of Engineering - QU
  • KINDI Center for Computing Research - CENG