CIOMP OpenIR
Event-Based Optical Flow Estimation with Spatio-Temporal Backpropagation Trained Spiking Neural Network
Y. Zhang, H. Lv, Y. Zhao, Y. Feng, H. Liu and G. Bi
2023
Journal: Micromachines
ISSN: 2072-666X
Volume: 14, Issue: 1
Abstract: The advantages of an event camera, such as low power consumption, large dynamic range, and low data redundancy, enable it to shine in extreme environments where traditional image sensors are not competent, especially in high-speed moving-target capture and under extreme lighting conditions. Optical flow reflects a target's movement information, and the target's detailed motion can be obtained from the event camera's optical flow. However, existing neural network methods for optical flow prediction with event cameras suffer from extensive computation and high energy consumption in hardware implementation. The spiking neural network has spatiotemporal coding characteristics, so it is naturally compatible with the spatiotemporal data of an event camera. Moreover, its sparse coding characteristic allows it to run with ultra-low power consumption on neuromorphic hardware. However, because of the algorithmic and training complexity, the spiking neural network has not yet been applied to optical flow prediction for event cameras. To address this, this paper proposes an end-to-end spiking neural network that predicts optical flow from the discrete spatiotemporal data stream of an event camera. The network is trained with the spatio-temporal backpropagation method in a self-supervised way, which fully exploits the spatiotemporal characteristics of the event camera while improving network performance. Compared with existing methods on a public dataset, the experimental results show that the proposed method matches the best existing methods in optical flow prediction accuracy while consuming 99% less power than the existing algorithm, laying the groundwork for future low-power hardware implementation of optical flow prediction for event cameras. © 2023 by the authors.
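The abstract credits spatio-temporal backpropagation (STBP) with making the spiking network trainable end to end. The sketch below is only an illustration of the core mechanism such training relies on: a leaky integrate-and-fire neuron unrolled over time, with a surrogate gradient standing in for the non-differentiable spike so gradients can flow through both the spatial and temporal dimensions. The time constant `tau`, threshold `v_th`, surrogate window `width`, and tensor shapes are assumptions made for the example, not values or architecture from the paper.

```python
# Illustrative sketch only: a leaky integrate-and-fire (LIF) neuron with a
# rectangular surrogate gradient, the basic mechanism behind spatio-temporal
# backpropagation (STBP). Hyperparameters here are assumptions, not the paper's.
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate in backward."""

    @staticmethod
    def forward(ctx, v, v_th, width):
        ctx.save_for_backward(v)
        ctx.v_th = v_th
        ctx.width = width
        return (v >= v_th).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Gradient is 1/width inside a window around the threshold, 0 outside.
        surrogate = (torch.abs(v - ctx.v_th) < ctx.width / 2).float() / ctx.width
        return grad_output * surrogate, None, None


def lif_forward(inputs, tau=0.5, v_th=1.0, width=1.0):
    """Unroll a LIF neuron over time.

    inputs: tensor of shape (T, N) holding the synaptic current at each step,
    e.g. accumulated event-camera spikes. Returns spikes of the same shape.
    The membrane potential decays by `tau` and is reset to 0 after a spike.
    """
    v = torch.zeros_like(inputs[0])
    spikes = []
    for t in range(inputs.shape[0]):
        v = tau * v + inputs[t]                   # leaky integration
        s = SurrogateSpike.apply(v, v_th, width)  # differentiable spike
        v = v * (1.0 - s)                         # hard reset on spike
        spikes.append(s)
    return torch.stack(spikes)


if __name__ == "__main__":
    x = torch.rand(10, 4, requires_grad=True)  # 10 time steps, 4 neurons
    out = lif_forward(x)
    out.sum().backward()  # gradients flow through time and space via the surrogate
    print(out.shape, x.grad.shape)
```

In the paper's setting, the per-time-step inputs would come from the event stream and the training signal would be the self-supervised objective mentioned in the abstract; the sketch only demonstrates that gradients can propagate through the spiking dynamics.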
DOI: 10.3390/mi14010203
Indexed by: SCI; EI
Document type: Journal article
Identifier: http://ir.ciomp.ac.cn/handle/181722/68210
Department: Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences
Recommended citation formats:
GB/T 7714: Y. Zhang, H. Lv, Y. Zhao, Y. Feng, H. Liu and G. Bi. Event-Based Optical Flow Estimation with Spatio-Temporal Backpropagation Trained Spiking Neural Network[J]. Micromachines, 2023, 14(1).
APA: Y. Zhang, H. Lv, Y. Zhao, Y. Feng, H. Liu and G. Bi. (2023). Event-Based Optical Flow Estimation with Spatio-Temporal Backpropagation Trained Spiking Neural Network. Micromachines, 14(1).
MLA: Y. Zhang, H. Lv, Y. Zhao, Y. Feng, H. Liu and G. Bi. "Event-Based Optical Flow Estimation with Spatio-Temporal Backpropagation Trained Spiking Neural Network." Micromachines 14.1 (2023).
Files in this item:
File name/size: Event-Based Optical (4246 KB)
Document type: Journal article
Version: Published version
Access: Open access
License: CC BY-NC-SA
File name: Event-Based Optical Flow Estimation with Spati.pdf
Format: Adobe PDF
 

Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.