Fast face swapping with high-fidelity lightweight generator assisted by online knowledge distillation
Article
Yang, Gaoming, Ding, Yifeng, Fang, Xianjin, Zhang, Ji and Chu, Yan. 2024. "Fast face swapping with high-fidelity lightweight generator assisted by online knowledge distillation." The Visual Computer. https://doi.org/10.1007/s00371-024-03414-2
Article Title | Fast face swapping with high-fidelity lightweight generator assisted by online knowledge distillation |
---|---|
ERA Journal ID | 18149 |
Article Category | Article |
Authors | Yang, Gaoming, Ding, Yifeng, Fang, Xianjin, Zhang, Ji and Chu, Yan |
Journal Title | The Visual Computer |
Number of Pages | 21 |
Year | 2024 |
Publisher | Springer |
Place of Publication | Germany |
ISSN | 0178-2789 |
eISSN | 1432-2315 |
Digital Object Identifier (DOI) | https://doi.org/10.1007/s00371-024-03414-2 |
Web Address (URL) | https://link.springer.com/article/10.1007/s00371-024-03414-2 |
Abstract | Advanced face swapping approaches have achieved high-fidelity results. However, the success of most methods hinges on heavy parameter counts and high computational costs. With the popularity of real-time face swapping, these factors have become obstacles restricting their swap speed and application. To overcome these challenges, we propose a high-fidelity lightweight generator (HFLG) for face swapping, a compressed version of the existing network Simple Swap that retains only 1/4 of its channels. Moreover, to stabilize the learning of HFLG, we introduce feature map-based online knowledge distillation into our training process and improve the teacher–student architecture. Specifically, we first enhance our teacher generator to provide more efficient guidance, minimizing the loss of detail on the lower face. In addition, a new identity-irrelevant similarity loss is proposed to improve the preservation of non-facial regions in the teacher generator's results. Furthermore, HFLG uses an extended identity injection module to inject identity more efficiently. It gradually learns face swapping by imitating the feature maps and outputs of the teacher generator online. Extensive experiments on faces in the wild demonstrate that our method achieves results comparable with other methods while having fewer parameters, lower computations, and faster inference speed. The code is available at https://github.com/EifelTing/HFLFS. |
Keywords | Face swapping; Online knowledge distillation; Identity-irrelevant similarity loss; High-fidelity lightweight generator |
Contains Sensitive Content | Does not contain sensitive content |
ANZSRC Field of Research 2020 | 460306. Image processing |
Public Notes | Files associated with this item cannot be displayed due to copyright restrictions. |
Byline Affiliations | Anhui University of Science and Technology, China; School of Mathematics, Physics and Computing; Harbin University of Engineering, China |
Permalink | https://research.usq.edu.au/item/z85wy/fast-face-swapping-with-high-fidelity-lightweight-generator-assisted-by-online-knowledge-distillation |
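The abstract describes the student generator learning by imitating the teacher's intermediate feature maps and final outputs online. The paper's exact losses and weights are not given in this record; the following is a minimal NumPy sketch of a generic feature-map distillation objective (L1 imitation terms), with all function and parameter names hypothetical:

```python
import numpy as np

def distillation_loss(student_feats, teacher_feats, student_out, teacher_out,
                      feat_weight=1.0, out_weight=1.0):
    """Illustrative feature-map-based distillation objective (not the
    paper's exact formulation): the student matches the teacher's
    intermediate feature maps and final output via mean L1 distances."""
    # Sum of per-layer L1 distances between corresponding feature maps.
    feat_loss = sum(np.abs(s - t).mean()
                    for s, t in zip(student_feats, teacher_feats))
    # L1 distance between the student's and teacher's swapped-face outputs.
    out_loss = np.abs(student_out - teacher_out).mean()
    return feat_weight * feat_loss + out_weight * out_loss
```

In an online-distillation setting, this loss would be added to the student's ordinary face-swapping losses each training step, with the teacher updated in the same run rather than frozen beforehand.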