Job Title: Autonomous Systems Perception Engineer
Job Description:
Our edge AI team builds sensing systems that change how machines perceive and interact with the physical world, fusing advanced hardware with real-time intelligence.
We are seeking a Principal Engineer with expertise in visual odometry, sensor fusion, SLAM, and AI-based perception to lead the development of real-world localization and mapping solutions. You will work at the intersection of embedded sensing, robotics, and machine learning, creating systems that can understand, navigate, and adapt to their environments autonomously.
Requirements:
* 10+ years of experience in robotics, computer vision, or AI, including 5+ years in SLAM, visual odometry (VO), or sensor fusion, and 3+ years in a technical leadership role.
* M.S. or Ph.D. in Robotics, Computer Science, Electrical Engineering, or related field.
* Demonstrated expertise in at least one of the following:
  * Visual-inertial odometry (VIO)
  * Multi-sensor fusion (camera, LiDAR, IMU, encoders)
  * 3D SLAM and mapping in dynamic environments
  * AI-based perception models for real-time localization
* Proven track record of deploying perception algorithms in real-world systems (e.g., autonomous robots, drones, AR/VR, self-driving platforms).
Responsibilities:
* Lead the design and deployment of SLAM, VIO, and multi-sensor fusion systems.
* Drive algorithm development and evaluation.
* Architect AI perception pipelines.
* Collaborate cross-functionally with embedded, hardware, and systems engineers.
Why Join Us?
Join us in shaping the future of edge intelligence. Your work will bridge the gap between sensing and understanding, helping machines not only gather data but act meaningfully in the world.