Lead - Perception (UAVs)
Unmannd Autonomy
About Us
We're building deeply intelligent UAVs for defense forces, designed to out-think, outmaneuver, and dominate even in contested environments. Our mission is to give the armed forces an undeniable edge and help safeguard peace across borders. We move fast, challenge limits, and obsess over building technology that redefines what's possible. If you are driven to build something truly transformational, this is it.
Job Description
The Perception Lead will own the perception architecture for Unmannd UAVs, directing development across vision, LiDAR, radar, and sensor-fusion pipelines. You will build high-performance perception systems that integrate seamlessly with autonomy and control. This role requires both deep technical expertise and the leadership skills to guide a cross-functional team.
- Perception System Development
Lead development of UAV perception pipelines for object detection, tracking, mapping, and obstacle avoidance.
Architect and optimize multi-sensor fusion frameworks (e.g., combining cameras, LiDAR, radar, GPS, IMU).
Develop algorithms for SLAM (Simultaneous Localization & Mapping), 3D reconstruction, and terrain awareness.
Ensure the perception stack runs in real time on embedded hardware.
- Algorithm Design & Optimization
Develop and deploy ML/DL models for vision-based detection, semantic segmentation, and scene understanding.
Implement robust algorithms for adverse environments (low-light, rain, fog, GPS-denied zones).
Optimize for latency, robustness, and power efficiency in embedded systems.
- Integration & Testing
Work with autonomy engineers to feed perception outputs into planning and navigation modules.
Collaborate with controls engineers to enable perception-informed control loops (e.g., obstacle avoidance).
Define perception testing protocols in simulation (Gazebo, AirSim, Isaac Sim) and real-world flight tests.
Analyze UAV flight logs to improve perception performance and robustness.
- Research & Innovation
Stay at the cutting edge of computer vision, robotics perception, and AI for autonomy.
Evaluate and integrate state-of-the-art techniques (e.g., deep learning for perception, graph-SLAM, event-based vision).
Contribute to the long-term strategy for making the UAV perception stack certifiable for safety-critical use.
Requirements
Master's or Ph.D. in Robotics, Computer Vision, Machine Learning, or a related field.
6–10 years of experience in perception for robotics, drones, or autonomous vehicles.
Strong expertise in:
Sensor fusion (EKF, UKF, particle filters).
SLAM and mapping frameworks (ORB-SLAM, Cartographer, RTAB-Map, LOAM).
Computer vision & ML/DL (OpenCV, PyTorch/TensorFlow, ROS).
Hands-on experience with embedded systems & GPU acceleration (CUDA, TensorRT, Jetson, etc.).
Experience in C++ and Python for robotics software development.
Track record of deploying perception algorithms on real robots/UAVs.
Benefits
Ownership of Unmannd’s perception architecture and roadmap.
Direct impact on enabling safe autonomous flight in real-world environments.
Access to UAV hardware, flight testing, and high-performance compute resources.
A startup environment with significant growth opportunities.
Competitive salary and stock options.
