Research on the Coordinate Attention Mechanism Fuse in a YOLOv5 Deep Learning Detector for the SAR Ship Detection Task
2022-05-01
Journal: SENSORS
ISSN: 1424-8220
EISSN: 1424-8220
Volume: 22, Issue: 9
Publication Status: Published
DOI: 10.3390/s22093370
Abstract

Real-time performance is an important index for ship detection in the marine remote sensing task. Because the computing resources on a satellite are limited by the solar array size and by radiation-resistant electronic components, information extraction is usually performed after the imagery has been transmitted to the ground. In recent years, however, one-stage target detectors such as the You Only Look Once version 5 (YOLOv5) deep learning framework have shown powerful performance while remaining lightweight, providing an implementation scheme for on-orbit inference that shortens the time delay of ship detection. Optimizing such a lightweight model is of significant research value for onboard SAR image processing. In this paper, we study the fusion of two lightweight components, the Coordinate Attention (CA) mechanism module and the YOLOv5 detector. We propose a novel lightweight end-to-end object detection framework, YOLO Coordinate Attention SAR Ship (YOLO-CASS), which fuses a CA module at a suitable position in the backbone for the SAR ship detection task. Experimental results on the SSDD synthetic aperture radar (SAR) remote sensing imagery indicate that our method achieves significant gains in both efficiency and performance and has the potential to be developed into onboard processing on the SAR satellite platform. The techniques we explore provide a solution for improving the performance of lightweight deep learning-based object detection frameworks. © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
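To make the fusion concrete, below is a minimal PyTorch sketch of a Coordinate Attention block of the kind YOLO-CASS inserts into the YOLOv5 backbone, following the published CA formulation (Hou et al., CVPR 2021). The class name, reduction ratio, and feature-map size used in the example are illustrative assumptions and are not taken from the paper.

import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Coordinate Attention block (sketch): factorizes global pooling into two
    1-D poolings so the attention keeps positional information along the
    height and width axes separately."""

    def __init__(self, channels: int, reduction: int = 32):  # reduction ratio is an assumed default
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))   # pool over width  -> (N, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))   # pool over height -> (N, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, _, h, w = x.shape
        x_h = self.pool_h(x)                          # (N, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)      # (N, C, W, 1)
        y = self.act(self.bn1(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # attention along height
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # attention along width
        return x * a_h * a_w                          # reweight the backbone feature map

# Usage sketch: wrap a backbone feature map (channel count and spatial size are assumed).
feat = torch.randn(1, 256, 40, 40)
out = CoordinateAttention(256)(feat)
assert out.shape == feat.shape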

Keywords: Deep learning; Object detection; Orbits; Radar imaging; Remote sensing; Satellite imagery; Ships; Synthetic aperture radar; Tracking radar; Vehicle performance; Attention mechanisms; Coordinate attention mechanism; Detection tasks; On-board processing; Onboard computing; Performance; Real time performance; Ship detection; Ship object detection; You only look once version 5
URL: View Full Text
Indexed In: SCI; SCIE; EI
Language: English
WOS Research Areas: Chemistry; Engineering; Instruments & Instrumentation
WOS Categories: Chemistry, Analytical; Engineering, Electrical & Electronic; Instruments & Instrumentation
WOS Accession Number: WOS:000794488400001
Publisher: MDPI
EI Accession Number: 20221812051734
EI Main Heading: Object recognition
EI Classification Codes: 461.4 Ergonomics and Human Factors Engineering; 655.2 Satellites; 662.1 Automobiles; 663.1 Heavy Duty Motor Vehicles; 716.2 Radar Systems and Equipment; 723.2 Data Processing and Image Processing
Original Document Type: Journal article (JA)
Document Type: Journal article
Item Identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/180923
Collection: School of Information Science and Technology_Distinguished Professor Group_Lin Baojun Group
Corresponding Author: Lin, Baojun
Author Affiliations
1.Chinese Acad Sci, Aerosp Informat Res Inst, Beijing 100094, Peoples R China
2.Univ Chinese Acad Sci, Sch Optoelect, Beijing 100094, Peoples R China
3.Chinese Acad Sci, Innovat Acad Microsatellites, Shanghai 201210, Peoples R China
4.Shanghai Engn Ctr Microsatellites, Shanghai 201304, Peoples R China
5.ShanghaiTech Univ, Sch Informat Sci & Technol, Shanghai 201210, Peoples R China
Corresponding Author Affiliation: School of Information Science and Technology
Recommended Citation
GB/T 7714
Xie, Fang, Lin, Baojun, Liu, Yingchun. Research on the Coordinate Attention Mechanism Fuse in a YOLOv5 Deep Learning Detector for the SAR Ship Detection Task[J]. SENSORS, 2022, 22(9).
APA Xie, Fang, Lin, Baojun, & Liu, Yingchun. (2022). Research on the Coordinate Attention Mechanism Fuse in a YOLOv5 Deep Learning Detector for the SAR Ship Detection Task. SENSORS, 22(9).
MLA Xie, Fang, et al. "Research on the Coordinate Attention Mechanism Fuse in a YOLOv5 Deep Learning Detector for the SAR Ship Detection Task". SENSORS 22.9 (2022).