Calendar
Master's thesis presentation
Karl-Johan Petersson and Emil Holm present their master's thesis. All are warmly welcome!
Title: Knowledge Distillation for Improved Spoof Detection
Abstract: As deep neural networks grow more powerful, they also require more computational resources, which becomes a challenge when deploying them on edge devices with limited computational capacity. This thesis investigates knowledge distillation, a model compression technique where a large "teacher" network transfers its learned features to a smaller "student" network. Our goal is to maintain high accuracy in spoof detection for fingerprint biometric identification while reducing model size and computational cost. We focus on distilling knowledge from a ResNet18 teacher model into lightweight MobileNet-based students (TinyNet and MicroNet), testing logit-based and feature-based distillation strategies as well as projection methods between teacher and student layers. Experiments on both public and internal datasets with varied cropping sizes show that distillation improves the performance of the smaller models, with feature-based distillation using convolutional projections giving the best results. These results demonstrate the potential of knowledge distillation for deploying robust spoof detection models in real-world, resource-constrained environments.
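For readers unfamiliar with the two strategies named in the abstract, the following is a minimal PyTorch sketch of what logit-based distillation and a feature-based convolutional projection typically look like. The function and class names, the temperature value, and the 1x1-convolution choice are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of the two distillation losses mentioned in the abstract.
# All names, shapes, and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def logit_distillation_loss(student_logits, teacher_logits, T=4.0):
    """Logit-based distillation: KL divergence between the
    temperature-softened class distributions of teacher and student."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # The T^2 factor rescales gradients to the magnitude of the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

class ConvProjection(nn.Module):
    """Feature-based distillation helper: a 1x1 convolution projects a
    student feature map onto the teacher's channel count so intermediate
    features of the two networks can be compared directly."""
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        projected = self.proj(student_feat)
        # Match spatial size if the two backbones downsample differently.
        if projected.shape[-2:] != teacher_feat.shape[-2:]:
            projected = F.adaptive_avg_pool2d(projected, teacher_feat.shape[-2:])
        return F.mse_loss(projected, teacher_feat)

if __name__ == "__main__":
    # Toy shapes: batch of 8, two classes (live vs. spoof), small feature maps.
    kd = logit_distillation_loss(torch.randn(8, 2), torch.randn(8, 2))
    fd = ConvProjection(32, 128)(torch.randn(8, 32, 14, 14), torch.randn(8, 128, 7, 7))
    print(kd.item(), fd.item())
```

In a full training loop, losses like these would be added as weighted terms alongside the student's ordinary classification loss.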
Supervisor: Ivar Persson
Examiner: Alexandros Sopasakis
About the event
Time: 2025-01-20, 10:15 to 12:00
Location: MH:309A
Contact: ivar [dot] persson [at] math [dot] lth [dot] se