Perception Engineering Intern
Anduril Industries is a defense technology company with a mission to transform U.S. and allied military capabilities with advanced technology. By bringing the expertise, technology, and business model of the 21st century's most innovative companies to the defense industry, Anduril is changing how military systems are designed, built, and sold. Anduril's family of systems is powered by Lattice OS, an AI-powered operating system that turns thousands of data streams into a real-time, 3D command and control center. As the world enters an era of strategic competition, Anduril is committed to bringing cutting-edge autonomy, AI, computer vision, sensor fusion, and networking technology to the military in months, not years.
About the Job
We are seeking a perception engineer with a strong background in computer vision to join our rapidly growing team in Costa Mesa, CA. In this role, you will be at the forefront of developing advanced perception systems for complex autonomous aerial platforms. Your expertise in computer vision algorithms, combined with your understanding of robotics principles, will be crucial in solving a wide variety of challenges involving visual perception, SLAM, motion planning, controls, and state estimation.

This role requires not only technical expertise in computer vision and robotics but also the ability to make pragmatic engineering tradeoffs given the unique constraints of aerial platforms. Your work will directly contribute to the seamless integration of Anduril's products, achieving critical outcomes in autonomous operations. The position demands strong systems-level knowledge and experience, as you will be working at the intersection of computer vision, robotics, and autonomous systems. If you are passionate about pushing the boundaries of computer vision in robotics, possess a "Whatever It Takes" mindset, and can execute in an expedient, scalable, and pragmatic way while keeping the mission top of mind and making sound engineering decisions, then this role is for you.
What You'll Do
- Work at the intersection of 3D perception and computer vision, developing robust algorithms that power real-time decision-making for autonomous aerial systems.
- Develop and implement advanced structure from motion and SLAM algorithms to create accurate 3D models from multiple camera inputs in real-time.
- Integrate perception outputs with path planning algorithms to enable autonomous navigation in complex, unstructured environments.
- Design experiments and data collection efforts, and curate training and evaluation sets to develop insights for both internal teams and customers.
- Collaborate closely with robotics, software, and hardware teams to integrate perception algorithms into autonomous aerial systems.
- Work with vendors and government stakeholders to advance the state-of-the-art in perception and world modeling for autonomous aerial systems.
Required Qualifications
- BS in Robotics, Computer Science, Mechatronics, Electrical Engineering, Mechanical Engineering, or related field.
- Strong knowledge of 3D computer vision concepts, including multi-view geometry, camera models, photogrammetry, depth estimation, and 3D reconstruction techniques.
- Fluency in standard domain libraries (e.g., NumPy, OpenCV, PyTorch).
- Proven understanding of data structures, algorithms, concurrency, and code optimization.
- Experience working with the Python or C++ programming languages.
- Experience deploying software to end customers, internal or external.
- Must be willing to travel 25%.
- Eligible to obtain and maintain an active U.S. Secret security clearance.
Preferred Qualifications
- MS or PhD in Robotics, Computer Science, Engineering, or related field.
- Experience with perception systems for aerial robotics or other highly dynamic platforms.
- Experience with real-world sensor integrations, including LiDAR, RGB-D cameras, IR cameras, stereo cameras, or time-of-flight (ToF) cameras.
- Experience with GPU / CUDA programming for accelerated computer vision processing.
- Knowledge of path planning algorithms and their integration with perception systems in dynamic environments.