• Opto-Electronic Engineering
  • Vol. 51, Issue 11, 240208-1 (2024)
Miao Li, Nuo Chen, Wei An*, Boyang Li, Qiang Ling and Weixing Li
Author Affiliations
  • College of Electronic Science and Technology, National University of Defense Technology, Changsha, Hunan 410073, China
    DOI: 10.12086/oee.2024.240208
    Miao Li, Nuo Chen, Wei An, Boyang Li, Qiang Ling, Weixing Li. Dual view fusion detection method for event camera detection of unmanned aerial vehicles[J]. Opto-Electronic Engineering, 2024, 51(11): 240208-1

    Abstract

    With the widespread application of low-altitude drones, real-time detection of such slow, small targets is crucial for maintaining public safety. Traditional cameras capture image frames with a fixed exposure time, which makes it difficult to adapt to changing lighting conditions and leads to detection blind spots in scenes such as those with intense light. Event cameras, a new type of neuromorphic sensor, sense changes in scene brightness pixel by pixel and can still generate high-frequency sparse event data under complex lighting conditions. Because image-based detection methods are difficult to adapt to the sparse, irregular data produced by event cameras, this paper models the two-dimensional object detection task as a semantic segmentation task in a three-dimensional spatiotemporal point cloud and proposes a drone object segmentation model based on dual-view fusion. On an accurate drone detection dataset collected with an event camera, experimental results show that the proposed method achieves the best detection performance while maintaining real-time operation, enabling stable detection of drone targets.
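    The core idea of treating detection as spatiotemporal point-cloud segmentation can be sketched as follows. This is a minimal illustration, not the paper's method: the event stream, sensor resolution, time scaling, and per-point labels below are all hypothetical stand-ins (a real system would obtain labels from the trained dual-view segmentation model), and the final 2-D box is recovered from the points segmented as drone.

```python
import numpy as np

# Hypothetical event stream: each event is (x, y, t, polarity).
# Background events are scattered; "drone" events cluster in a small region.
rng = np.random.default_rng(0)
n_bg, n_drone = 500, 80
bg = np.column_stack([rng.uniform(0, 346, n_bg),       # x (pixels)
                      rng.uniform(0, 260, n_bg),       # y (pixels)
                      rng.uniform(0, 0.05, n_bg),      # t (seconds)
                      rng.integers(0, 2, n_bg)])       # polarity
drone = np.column_stack([rng.uniform(100, 140, n_drone),
                         rng.uniform(60, 90, n_drone),
                         rng.uniform(0, 0.05, n_drone),
                         rng.integers(0, 2, n_drone)])
events = np.vstack([bg, drone])

# Treat events as a 3-D spatiotemporal point cloud: scale time so it is
# commensurate with the pixel axes (illustrative choice, not from the paper).
t_scale = 260 / 0.05
cloud = events[:, :3].copy()
cloud[:, 2] *= t_scale

# A segmentation model would predict a per-point label; here we mock one.
labels = np.concatenate([np.zeros(n_bg), np.ones(n_drone)])  # 1 = drone

# Detection result: a 2-D bounding box around points labeled as drone.
pts = cloud[labels == 1]
x0, y0 = pts[:, 0].min(), pts[:, 1].min()
x1, y1 = pts[:, 0].max(), pts[:, 1].max()
print(f"drone bbox: ({x0:.1f}, {y0:.1f}) - ({x1:.1f}, {y1:.1f})")
```

    Once every event carries a semantic label, the 2-D detection output follows directly from the spatial extent of the drone-labeled points, which is why the segmentation formulation sidesteps frame construction entirely.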