Forward-looking Imaging Algorithm Based on CNN-LSTM Neural Network
Authors:

SUN Xiaohan, LI Lianghai, ZHANG Bin

Affiliations:

1. Beijing Research Institute of Telemetry, Beijing 100076, China; 2. China Academy of Aerospace Electronics Technology, Beijing 100094, China

About the authors:

SUN Xiaohan, born in 1998, master's degree candidate.
LI Lianghai, born in 1965, M.S., research fellow.
ZHANG Bin, born in 1981, M.S., research fellow.

Corresponding author:

CLC number:

TP18; TP75

Funding:


Abstract:

As a difficult and important problem in the field of radar imaging, forward-looking radar imaging has broad application prospects in autonomous driving, navigation, precision guidance, and other fields. Traditional forward-looking imaging algorithms are limited by the width of the antenna aperture and cannot achieve high-resolution imaging. In this paper, a Convolutional Neural Network (CNN) is combined with a Long Short-Term Memory (LSTM) network to estimate the azimuth profile in forward-looking imaging. First, the convolution-like model of the scanning forward-looking echo signal and its ill-posedness are introduced. The echo signal is then preprocessed by pulse compression and range migration correction and fed into the CNN-LSTM network, which performs azimuth estimation one range unit at a time. Simulation results show that the algorithm effectively improves the azimuth resolution of forward-looking imaging and achieves super-resolution forward-looking imaging.
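
For concreteness, the convolution-like azimuth model and the pulse-compression step mentioned in the abstract can be sketched as follows. This is a minimal illustration, assuming a Gaussian antenna pattern, a linear-frequency-modulated (LFM) pulse, and an arbitrary point-target scene; the values are illustrative, not taken from the paper, and range migration correction is omitted for brevity.

```python
# Minimal sketch of the convolution-like azimuth model and pulse-compression
# preprocessing. Antenna pattern, chirp parameters, and scene are assumed.
import numpy as np

rng = np.random.default_rng(0)

# --- Azimuth: echo = antenna pattern (*) target scattering + noise ---
n_az = 256                                  # azimuth samples in one scan line
theta = np.linspace(-3.0, 3.0, n_az)        # scan angle grid (degrees)
pattern = np.exp(-theta**2 / (2 * 0.8**2))  # assumed Gaussian antenna pattern
scene = np.zeros(n_az)
scene[[100, 118, 190]] = [1.0, 0.8, 0.6]    # three point targets in azimuth

echo_az = np.convolve(scene, pattern, mode="same")   # beam-smeared echo
echo_az += 0.01 * rng.standard_normal(n_az)          # additive noise
# Recovering `scene` from `echo_az` is the ill-posed azimuth deconvolution
# that the CNN-LSTM network is trained to solve.

# --- Range: pulse compression = matched filtering of an LFM chirp ---
fs, T, B = 100e6, 10e-6, 50e6               # sample rate, pulse width, bandwidth
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * (t - T / 2) ** 2)     # LFM reference
rx = np.concatenate([np.zeros(300), chirp, np.zeros(300)])  # delayed echo
compressed = np.convolve(rx, np.conj(chirp[::-1]), mode="same")
peak = np.argmax(np.abs(compressed))        # peak marks the target's range bin
print("range bin:", peak)
```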
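The per-range-unit azimuth estimation network can likewise be sketched. The abstract does not specify the network configuration, so the layer sizes, kernel widths, and loss below are illustrative assumptions in PyTorch, not the authors' architecture.

```python
# Hypothetical CNN-LSTM sketch for per-range-unit azimuth estimation.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """Maps one range unit's blurred azimuth echo profile to a
    super-resolved azimuth scattering estimate."""
    def __init__(self, hidden=64):
        super().__init__()
        # 1-D CNN extracts local features from the azimuth profile
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
        )
        # LSTM models the sequential, angle-ordered structure of the scan
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)   # per-angle scattering estimate

    def forward(self, x):                       # x: (batch, n_az)
        f = self.cnn(x.unsqueeze(1))            # (batch, 32, n_az)
        seq, _ = self.lstm(f.transpose(1, 2))   # (batch, n_az, 2*hidden)
        return self.head(seq).squeeze(-1)       # (batch, n_az)

# Usage: after pulse compression and range migration correction, feed the
# echo one range unit at a time and train against the known scene.
model = CNNLSTM()
echo = torch.randn(8, 256)          # 8 range units, 256 azimuth samples each
estimate = model(echo)
loss = nn.functional.mse_loss(estimate, torch.zeros_like(estimate))
loss.backward()
```

The bidirectional LSTM is one plausible design choice: it lets each angle's estimate draw on echoes from both sides of the scan, matching the sequential nature of a mechanically scanned beam.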

Cite this article:

SUN Xiaohan, LI Lianghai, ZHANG Bin. Forward-looking imaging algorithm based on CNN-LSTM neural network[J]. 遥测遥控(Journal of Telemetry, Tracking and Command), 2024, 45(2): 29-36.

History
  • Received: 2023-12-25
  • Revised: 2024-01-03
  • Accepted:
  • Online: 2024-04-02
  • Published:
  • Priority publication: 2024-04-02