Enhancing Drone Safety with Onboard Vision Systems

1. Vision & Challenge

Autonomous aerial vehicles are evolving fast. But what happens when something goes wrong mid-flight? Our project explores how onboard vision systems can help drones identify safe crash zones in real time, even in dense urban environments.

2. GPS 4D Integration

GPS 4D already handles spatio-temporal data: latitude, longitude, altitude, and time. But when an emergency occurs, real-time visual feedback becomes crucial. We aim to integrate camera input into GPS 4D so the system keeps situational awareness when position data alone is not enough.
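
To make the integration concrete, here is a minimal sketch in Python of what a fused state record could look like, with the camera frame attached to the 4D fix. The class and field names are our own illustration, not part of the current GPS 4D codebase.

    # Hypothetical fused record: GPS 4D supplies the spatio-temporal fix,
    # and the matching onboard camera frame is attached to it.
    from dataclasses import dataclass

    @dataclass
    class FusedState:
        lat: float       # latitude, degrees (WGS84)
        lon: float       # longitude, degrees (WGS84)
        alt: float       # altitude, metres
        t: float         # Unix timestamp, seconds
        frame_id: str    # camera frame captured at time t

    # Tag the latest onboard frame with the current 4D fix
    state = FusedState(lat=48.8566, lon=2.3522, alt=120.0,
                       t=1700000000.0, frame_id="cam0_000417")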

3. Real-Time Crash Zone Detection

Using detection and segmentation models such as YOLO or Mask R-CNN, a drone can identify open zones like rooftops, parking lots, or emergency-designated areas. These zones are tagged as potential landing sites in case of propulsion failure or GNSS signal loss.
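
As a sketch of the detection step, the snippet below runs a YOLO model on a single onboard frame using the ultralytics Python package. The weights file landing_zones.pt and its class names are hypothetical: such a model would first have to be trained on urban landing-zone imagery (see Next Steps).

    # Sketch of single-frame crash-zone detection with a YOLO model.
    # "landing_zones.pt" is a hypothetical custom-trained weights file whose
    # classes include open zones such as "rooftop" and "parking_lot".
    from ultralytics import YOLO
    import cv2

    model = YOLO("landing_zones.pt")
    frame = cv2.imread("onboard_frame.jpg")   # one frame from the onboard camera

    results = model(frame)[0]
    for box in results.boxes:
        label = model.names[int(box.cls)]
        conf = float(box.conf)
        if conf > 0.6:                        # keep only confident candidates
            x1, y1, x2, y2 = (float(v) for v in box.xyxy[0])
            print(f"candidate zone: {label} ({conf:.2f}) "
                  f"bbox=({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")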

4. Urban Scenarios and Use Cases

  • Delivery drone rerouting to a rooftop in case of engine failure
  • VTOL emergency descent through visually verified corridors
  • Safe autonomous landing after GNSS disruption near buildings

5. Architecture & Technologies

We leverage edge computing (Jetson Nano, Coral TPU) and fuse data from GPS, LiDAR, and computer vision. The vision module is connected to our WebSocket-based telemetry system, and the results are visualized in 3D with CesiumJS.
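
The snippet below sketches how a tagged zone could be pushed into that telemetry feed, assuming the Python websockets package; the endpoint URL and message schema are illustrative placeholders, not the project's actual protocol. A CesiumJS client subscribed to the same feed would then render the zone as a 3D entity.

    # Sketch: publish a candidate landing zone over WebSocket telemetry.
    # Endpoint URL and JSON schema are hypothetical placeholders.
    import asyncio
    import json
    import websockets

    async def publish_zone(zone: dict) -> None:
        async with websockets.connect("ws://localhost:8765/telemetry") as ws:
            await ws.send(json.dumps({"type": "candidate_zone", **zone}))

    asyncio.run(publish_zone({
        "lat": 48.8566, "lon": 2.3522, "alt": 35.0,  # zone centroid (WGS84)
        "label": "rooftop",
        "confidence": 0.83,
    }))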

6. Collaboration & Open Source

The project is developed under the Lesser Open Bee License 1.3 and benefits from contributions by ESTACA, ENSTA Paris, and CY Tech. We welcome partners, developers, and researchers to join the project and extend the system to other cities.

7. Next Steps

  • Urban dataset creation for machine learning
  • Simulation with BlenderGIS and Gazebo
  • Validation tests with regulatory entities

Join us in shaping a safer, smarter aerial mobility system.