Identifying Pufferfish Specie Using Deep Neural Networks And Face Embedding Method

Authors

  • Yuan Lin
  • Shaomin Xie
  • Jari Korhonen
  • Juan Liu
  • Xiangrong Liu
  • Junyong You
  • Debasish Ghose

Abstract

Pufferfish, acclaimed for its distinctive texture and extraordinary delicacy, is nevertheless notorious for its highly toxic poison. Identifying individual pufferfish can be very beneficial for the aquaculture and food processing industries, as it tackles the challenges of food security and nutrition strategies, as well as the maintenance of a sustainable ecosystem. Current methods of identifying and tracking pufferfish rely mainly on heuristic visual recognition or manual intervention such as RFID tagging. The rapid advances in deep learning, together with the availability of large-scale databases, now make it possible to solve complex tasks that previously required human expertise. In this work, we have implemented a deep learning framework based on deep face recognition (deep FR) techniques to identify individual pufferfish. First, we created a dataset of labeled and augmented Takifugu bimaculatus fish images, which is publicly accessible as a benchmark for interested researchers. Second, we conducted an extensive evaluation of state-of-the-art building blocks of deep FR, in particular segmentation and loss functions, and performed an ablation study of their applicability to pufferfish recognition. Third, we proposed a framework named FishIR, composed of four deep FR stages. Experiments verified the effectiveness of this framework in learning useful representations of individual pufferfish based on the back-skin texture pattern. We believe that this approach can generalize to other similar individual recognition tasks, and can contribute to the rapid growth of smart farming and deep-ocean fishery.
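To illustrate the face-embedding idea the abstract refers to, the sketch below shows only the final matching stage of such a pipeline: comparing a query embedding against a gallery of known individuals by cosine similarity, with open-set rejection. The embeddings here are random stand-ins, and all names (`identify`, `fish_3`, the 128-dimensional size, the 0.5 threshold) are illustrative assumptions, not the paper's actual FishIR implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(query, gallery, threshold=0.5):
    """Return (id, score) for the best-matching gallery entry,
    or (None, threshold) if no score clears the threshold."""
    best_id, best_score = None, threshold
    for fish_id, emb in gallery.items():
        score = cosine_similarity(query, emb)
        if score > best_score:
            best_id, best_score = fish_id, score
    return best_id, best_score

# Toy stand-in embeddings; a real pipeline would produce these with a
# trained CNN applied to the segmented back-skin region of each fish.
rng = np.random.default_rng(0)
gallery = {f"fish_{i}": rng.normal(size=128) for i in range(5)}
query = gallery["fish_3"] + 0.1 * rng.normal(size=128)  # noisy re-capture
print(identify(query, gallery))
```

Because embeddings of the same individual cluster tightly under a well-trained metric-learning loss, the small perturbation added to the re-captured sample still yields a score far above those of the other gallery entries.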

Published

2023-03-09

How to Cite

[1]
Y. Lin, “Identifying Pufferfish Specie Using Deep Neural Networks And Face Embedding Method”, NIKT, no. 1, Mar. 2023.