SMFSwap: Student-aware multi-teacher knowledge distillation for fast face-swapping
| Article Title | SMFSwap: Student-aware multi-teacher knowledge distillation for fast face-swapping |
|---|---|
| ERA Journal ID | 18092 |
| Article Category | Article |
| Authors | Ding, Yifeng, Yang, Gaoming, Yin, Shuting, Zhang, Ji, Fang, Xianjin and Yang, Wencheng |
| Journal Title | Neurocomputing |
| Journal Citation | 649 |
| Article Number | 130807 |
| Number of Pages | 21 |
| Year | 2025 |
| Publisher | Elsevier |
| Place of Publication | Netherlands |
| ISSN | 0925-2312 |
| eISSN | 1872-8286 |
| Digital Object Identifier (DOI) | https://doi.org/10.1016/j.neucom.2025.130807 |
| Web Address (URL) | https://www.sciencedirect.com/science/article/pii/S0925231225014791 |
| Abstract | Recent face swapping frameworks have performed admirably in generating high-fidelity results. However, most of these methods require huge memory footprints and computational resources, making them difficult to apply in real-time applications. To overcome this problem, we propose a student-aware multi-teacher knowledge distillation for fast face swapping, called SMFSwap. Specifically, we use a compressed version of the existing network Simple Swap as the fast swap network (FSN) and introduce multi-teacher knowledge distillation to stabilize the training of FSN. In addition, we propose a novel multi-teacher weight assignment strategy for the face-swapping task without ground truth. It adaptively assigns different weights for each teacher’s result with the help of the matching degree between teacher and FSN results. During the training, teacher results that are close to the FSN result are assigned large weights to better guide the FSN training. Moreover, we also integrate teacher intermediate layer features to stabilize the knowledge transfer process. Extensive quantitative and qualitative experiments on various datasets demonstrate that the proposed SMFSwap achieves comparable results to the teacher network and other state-of-the-art face-swapping methods while having a lower model complexity. The code will be available at https://github.com/EifelTing/SMFSwap. |
| Keywords | Face swapping; Fast swap network; Multi-teacher knowledge distillation; Student-aware weight aggregation |
| Contains Sensitive Content | Does not contain sensitive content |
| ANZSRC Field of Research 2020 | 460101. Applications in arts and humanities |
| Public Notes | Files associated with this item cannot be displayed due to copyright restrictions. |
| Byline Affiliations | Anhui University of Science and Technology, China |
| | School of Science, Engineering & Digital Technologies - Maths, Physics & Computing |
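The abstract describes a student-aware weighting scheme: teachers whose outputs closely match the fast swap network (FSN) result receive larger weights during distillation. The paper's exact formulation is not given in this record, so the following is only a minimal sketch of that idea under assumptions: matching degree is approximated by mean-squared distance between flattened outputs, and weights come from a softmax over the negative distances. The function name `teacher_weights` and the `temperature` parameter are hypothetical, not from the paper.

```python
import math

def teacher_weights(student_out, teacher_outs, temperature=1.0):
    """Assign one weight per teacher output, favoring teachers whose
    result is close to the student's (illustrative sketch only).

    student_out: flat list of floats (the FSN result).
    teacher_outs: list of flat float lists, one per teacher.
    Returns a list of weights that sums to 1.
    """
    def mse(a, b):
        # Mean-squared distance as a stand-in "matching degree".
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

    # Closer teacher => higher score; temperature controls sharpness.
    scores = [-mse(student_out, t) / temperature for t in teacher_outs]

    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

With a student output of `[0.0, 0.0]` and teachers `[0.0, 0.0]` and `[1.0, 1.0]`, the first (closer) teacher receives the larger weight, matching the abstract's description that near-matching teachers guide training more strongly.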
https://research.usq.edu.au/item/zyq6x/smfswap-student-aware-multi-teacher-knowledge-distillation-for-fast-face-swapping