We are looking for a Robotics Autonomy Engineer with strong experience in SLAM, localization, mapping, and navigation to help build robots that can operate reliably in complex real-world environments.
In this role, you will develop and deploy autonomy systems that enable robots to perceive their surroundings, understand where they are, and move safely and efficiently through dynamic spaces. You should be comfortable working across software, sensing, and on-robot deployment, with strong hands-on experience in ROS / ROS 2 and real robotic systems.
What you'll do
- Design, develop, and deploy autonomy stacks for mobile robots
- Build and improve SLAM, localization, mapping, path planning, and navigation systems
- Work with ROS / ROS 2 to integrate autonomy pipelines across perception, planning, and control
- Develop and tune robot behavior in real-world indoor and outdoor environments
- Integrate sensors such as LiDAR, IMU, depth cameras, wheel encoders, and GPS where applicable
- Improve robustness in dynamic, cluttered, and partially observable environments
- Debug autonomy issues related to mapping drift, localization loss, obstacle handling, and route planning
- Collaborate with mechanical, electrical, and software teams to bring robots from prototype to deployment
- Evaluate system performance through testing, field trials, and continuous iteration
What we're looking for
- Degree in Robotics, Computer Science, Electrical Engineering, Mechanical Engineering, or a related field
- Strong experience in SLAM, localization, mapping, and robot navigation
- Solid working knowledge of ROS / ROS 2
- Experience with motion planning, obstacle avoidance, and autonomous robot behaviors
- Familiarity with robotic sensors and sensor fusion
- Strong software engineering skills in C++ and/or Python
- Experience deploying and debugging robotics systems on real hardware
- Strong problem-solving skills and ability to work across disciplines
Nice to have
- Experience with Nav2, RTAB-Map, Cartographer, SLAM Toolbox, AMCL, or similar navigation and SLAM frameworks
- Familiarity with multi-sensor fusion, visual-inertial odometry, or GPS-denied navigation
- Experience with dynamic costmaps, behavior trees, or fleet-level navigation systems
- Exposure to outdoor autonomy, multi-floor navigation, or semantic mapping
- Experience with simulation tools such as Gazebo or Isaac Sim
Who you are
- Hands-on, practical, and deeply technical
- Comfortable working at the intersection of algorithms and real-world deployment
- Excited by solving messy autonomy problems outside controlled lab conditions
- Able to move quickly, test often, and improve systems through iteration