HSANet: A Hybrid Self-Attention Network Method for Recognizing Faces after Micro Plastic Surgery
- 1 January 2023
- journal article
- Published by Hans Publishers in Computer Science and Application
- Vol. 13 (03), 301-310
- https://doi.org/10.12677/csa.2023.133029
Abstract
Because facial features can change substantially, recognition accuracy against the original face is low. To address this, this work proposes a hybrid self-attention block structure for recognizing faces whose features have changed. For this purpose, a small-sample image data set covering 26 kinds of micro plastic surgery was constructed by the authors. Integrating self-attention into the bottleneck block of the residual network improves the hybrid self-attention block's ability to capture features from each region of the image. Experiments on the small-sample micro-plastic data set show that the proposed hybrid self-attention network achieves a higher recognition accuracy of 89.70%, an improvement of 2.65% over ResNet50; the hybrid self-attention model with the improved connection is 1.12% more accurate than the version without it, and overall network performance also improves.
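The abstract describes replacing part of a ResNet bottleneck block with self-attention while keeping the residual connection. The paper's own code is not given here; the following is a minimal NumPy sketch of that general idea (in the style of BoTNet-like hybrid blocks), with single-head attention standing in for the full design. All function and parameter names (`hybrid_bottleneck`, `wq`, `wk`, `wv`, etc.) are illustrative assumptions, not the authors' API.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_2d(x, wq, wk, wv):
    """Global single-head self-attention over the spatial positions
    of a feature map. x: (H, W, C); wq/wk/wv: (C, C) projections."""
    h, w, c = x.shape
    tokens = x.reshape(h * w, c)              # flatten grid to HW tokens
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    attn = softmax(q @ k.T / np.sqrt(c))      # (HW, HW) attention weights
    return (attn @ v).reshape(h, w, c)        # weighted sum, back to a map

def hybrid_bottleneck(x, w_in, wq, wk, wv, w_out):
    """Bottleneck-style block with the middle convolution replaced by
    self-attention, plus the residual (identity) connection."""
    y = np.maximum(x @ w_in, 0)               # 1x1 pointwise projection + ReLU
    y = self_attention_2d(y, wq, wk, wv)      # spatial self-attention stage
    y = y @ w_out                             # 1x1 projection back to input width
    return np.maximum(x + y, 0)               # residual add + ReLU

# Usage on a toy 4x4 feature map with 8 channels:
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 4, 8))
weights = [0.1 * rng.standard_normal((8, 8)) for _ in range(5)]
out = hybrid_bottleneck(x, *weights)          # shape preserved: (4, 4, 8)
```

Because attention is computed over all HW positions, every output location can aggregate features from every image region, which is the capability the abstract credits to the hybrid block; the residual add preserves the original ResNet identity path.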